
So "X" - nothing to see here. Elon's in control - Part XXX **Threadbans in OP**


Comments

  • Registered Users Posts: 5,964 ✭✭✭Cordell


    Again, for death threats and other forms of violence and harassment there need to be criminal proceedings, not only social media bans.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    I'm not in favour of lies and disinformation; I'm opposed to having a call-centre-level employee decide what counts as lies and disinformation.



  • Registered Users Posts: 40,427 ✭✭✭✭ohnonotgmail


    You don't seem to be in favour of any Twitter employees fact-checking.



  • Registered Users Posts: 40,427 ✭✭✭✭ohnonotgmail


    And for other lies and disinformation that don't reach a criminal standard, you are happy just to let them be?



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,467 CMod ✭✭✭✭pixelburp


    All smelling faintly of warmed-up libertarianism at this stage.

    And as anyone who has been stalked, harassed in real life or had former teachers appear on their premises without permission will tell you, legal or criminal proceedings don't make the perpetrators run off and hide. It's not some press-button, get-response workflow that makes the pain stop.

    Social media is a unique system in that it allows immediate, direct and intimate access to a person's being (for lack of a better word). That's the entire point of the abuse: that they can get in your face. Going to the police to report 1,000 death threats? No, let's at least keep it simple: 10. What are the police gonna do - that famously understaffed institution chronically incapable of helping with even "modest" domestic crimes like rape? Log into `@police` and DM the bully? "Don't do it again TrutherMaga7632. Or else we'll do absolutely nothing cos we have 0 jurisdiction!"

    Here's a wild idea: let the media outlet have a system to help genuine victims turn off the tap of abuse. It's not a sacrifice of some higher purpose of free speech; it's a basic, pain-free bit of empathy for those whose experience on social media is "not great".



  • Registered Users Posts: 29,934 ✭✭✭✭odyssey06


    What are these serious consequences then that would stop the interference? Declaration of war? Trade sanctions?

    How would that stop the interference in a global online world? How do you regulate the acts of interference?

    What utter nonsense.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    Yes, trade sanctions and diplomatic sanctions. Let's actually name the country that does interfere: it's Russia. And it's now facing trade and diplomatic sanctions - not officially for election interference, but the message is clear: the world has had enough of their shite. So it can be done, and it must be done, but governments need to take these actions, not social media.



  • Registered Users Posts: 29,934 ✭✭✭✭odyssey06


    How would that stop them from doing what they were doing?

    So instead of social media doing something, you now want to put obligations on trading companies. So how is that the government taking action? How is that any different from the government putting obligations on social media?

    Makes no sense.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    Yes. No one is forcing anyone to read lies and disinformation, and it's dangerous to come to depend on a third party to do the thinking and filtering.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    What actually makes no sense is to create obligations and consequences only for the tool, and not for the perpetrator.



  • Registered Users Posts: 23,625 ✭✭✭✭pjohnson


    No one is forcing them, sure, but generally it's people of limited intellectual ability who see the rubbish and believe it due to their own issues.

    It's far easier to just remove the rubbish, unless you realise you can try to monetize it by being the only place refusing to remove it.



  • Registered Users Posts: 40,427 ✭✭✭✭ohnonotgmail


    Why doesn't it make sense? Why can there not be obligations and consequences on both?



  • Registered Users Posts: 29,934 ✭✭✭✭odyssey06


    Why? Why does that make no sense?

    If you have a lever / point of action within your legal domain that will shut off the problem, then of course you use it.

    You were OK with creating obligations and consequences for the tool when it was companies trading with Russia; the sanctions consequences there may harm those companies too, as they lose a sales outlet.

    The 'perpetrator' isn't someone you can get your hands on and bring to court - it is a nebulous global network, behind that shell game, a state actor.

    "To follow knowledge like a sinking star..." (Tennyson's Ulysses)



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,467 CMod ✭✭✭✭pixelburp


    Something something free speech, take responsibility, report to the police, slippery slope. Like I said, warmed-up libertarianism making sure private enterprise has no responsibility to at least help those in trouble. God forbid Saint Musk show a modicum of empathy towards those who use his tool but find themselves targeted by the ásshole demographic.

    I've asked this question before and have yet to see a sensible explanation of what the Gardaí are expected to do if I received abuse from, say, a dozen teenagers in Akron, Ohio. Should a task force be sent there to wag their finger and ask nicely to stop sending me DMs? Oh, and that's assuming the abuser is dumb enough to share any breadcrumbs as to their location.

    It's never quite clear how this magical report to the police is supposed to just make it better, instantly. The pivot I guess will then become blaming the victim; just put down your phone, Twitter's not real life etc. etc.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    I think the proper solution is to educate and trust people more, not less.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    what the Gardaí are expected to do if I received abuse from, say, a dozen teenagers in Akron, Ohio

    Again, regulation of abuse and harassment is not the same as deciding on your behalf what is and what isn't misinformation.



  • Registered Users Posts: 2,739 ✭✭✭eightieschewbaccy


    Thing is, the social media sites that don't attempt some kind of moderation are the more niche and unsuccessful ones. Twitter is effectively the only major social media site that isn't adopting the policy... The same platform can't get advertisers due to its attitude to these issues, be it racism, anti-semitism, disinformation or conspiracy theories. So this discounting of these issues, which you might view as genius on Musk's part, doesn't look like the smartest move overall. Similar to all of Musk's moves on the platform.



  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,467 CMod ✭✭✭✭pixelburp


    Not answering the question. Exactly as I said. Stick to the example, please.

    Your response to the removal of basic empathetic moderation from Twitter was that victims of abuse et al. report it to the Gardaí; this is your idea, so you tell me: what can the Gardaí do that's equivalent to Twitter's former moderation policy? Have you ever actually rung the Gardaí, BTW, or taken legal action? Neither is quick, nor, depending on the issue, effective.



  • Registered Users Posts: 2,739 ✭✭✭eightieschewbaccy


    Sure, why should the families of mass shooting victims be afforded basic decency? This stuff is also still frequently posted on there, but that's all fine...





  • Moderators, Category Moderators, Entertainment Moderators Posts: 36,467 CMod ✭✭✭✭pixelburp


    Just this week we've seen that GB News spat, the end result being a journalist (female, of course) receiving an avalanche of directed online abuse 'cos ... lemme check the notes here ... she was a female news journalist who Laurence Fox didn't wanna shag and was too woke. Hahaha. What a cow.

    She should just ring the police then, get it all sorted. Or just put up with it: her fault for being female and a journalist and using Twitter and having Woke opinions anyway. Though given the issue is with GB News, there's a good chance its 12 viewers and fans are probably all based in the same English village, so yeah; the police probably could arrest them all for harassment.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    OK, to the example: it depends on the abuse. If they are just calling you names, block them and that's the end of it; move on, it's just words from insignificant twats across the world. If there is an actual threat of violence, they won't be in Ohio and X won't be able to protect you; otherwise, again, it's just words from insignificant twats across the world. Yes, X can block them, but that won't change anything.

    But I never referred to abuse, and I never implied that abuse should go free. My point was about misinformation, not abuse; this is where I have an issue with having someone decide on others' behalf what is true and what isn't.



  • Registered Users Posts: 7,220 ✭✭✭HalloweenJack


    It's a question of fact-checking. You can't 'decide' what's misinformation or not; it either is or it isn't: What's true can be proved; if it can't be proved, it shouldn't be posted.

    Your wording reveals a belief that these moderation teams are acting as sinister gatekeepers who control the narrative, when in fact they would be ensuring accuracy in the debate taking place on their platforms.

    Due to libel laws, a lot of traditional media have fact-checking departments, because they can get into a lot of trouble otherwise. It's not always sufficiently thorough, but a lot more is done to ensure that stories aren't just plucked out of thin air and have something to support them.

    People in positions of responsibility and with an audience to influence public debate (say people looking to get elected or those who report the news) should be held accountable for what they say and should be required to provide evidence of their claims.

    I don't see why social media should be exempt in this case, and passing the torch to the authorities is just giving them more work that would require more specification. Also, given you think people in this area have biases, what's to say the authorities wouldn't 'decide' what is or isn't misinformation?

    I agree with another post of yours that people should be educated and trusted more, and having social media challenge and deny misinformation is one way of ensuring that the people using it are trusted sources.



  • Registered Users Posts: 5,964 ✭✭✭Cordell


    What's true can be proved; if it can't be proved, it shouldn't be posted.

    No, we don't apply this rule in real life, so we shouldn't apply it to social media either. If we were to live by this rule, the first thing that would need to go is religion.



  • Registered Users Posts: 23,625 ✭✭✭✭pjohnson


    What is and isn't misinformation isn't "decided".

    It's using actual facts.



  • Registered Users Posts: 43,311 ✭✭✭✭K-9


    Seriously, it's gone long past throwing a few insults at people with "limited intellectual ability" and calling them gullible eejits.

    We've had Donald Trump and Brexit, which lies and disinformation played a big part in.

    How we root out all the lies and disinfo is a whole other thing, but it has long gone past people with limited intellectual ability getting sold bridges.

    But anyway, the point is moot on a thread about Elon, because he'll do f*ck all about it anyway. The only hope is that he keeps doing what he is doing and Twitter becomes irrelevant and eventually goes bust.

    Mad Men's Don Draper : What you call love was invented by guys like me, to sell nylons.



  • Registered Users Posts: 7,220 ✭✭✭HalloweenJack


    And I would gladly welcome such a move.

    My point is that people who seek to influence public debate and hold positions of power should be held to a higher standard: Whatever they are claiming should be backed up by proper evidence.

    Being able to poison the well should not be tolerated and this responsibility should lie with social media (and all media) as they are the ones giving a platform to these people. Social media companies should have a conscience and aim to encourage truthful debate.

    Whether they want to or not is another debate but I have no problem with there being repercussions for companies that do nothing to counter misinformation that they profit from.



  • Registered Users Posts: 2,739 ✭✭✭eightieschewbaccy


    Also worth noting that Elon is himself a prominent spreader of conspiracy theories. At the rate he's going, he's not far off from Henry Ford's nastier ones. He frequently eggs on the likes of Andy Ngo. Recently enough he claimed a spree shooter was a psyop. He got a user unsuspended for sharing CSA content because he politically aligns with that user's conspiracy theories.



  • Registered Users Posts: 885 ✭✭✭I.R.Y.E.D


    "first thing that will need to go is religions"

    Personally, I have no problem with that; it would help reduce child abuse and calls for violence against people who are different, something some people are trying to increase on Musk's version of Twitter.


