
The_Tired_Horizon , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business
@The_Tired_Horizon@lemmy.world avatar

I gave up reporting on major sites where I saw abuse. Stuff that, if you said it in public, witnessed by others, you'd be investigated for. Twitter was also bad for responding to reports with "this doesn't break our rules" when a) it clearly did and b) probably broke a few laws.

reverendsteveii , (edited )

I gave up after I was told that people DMing me photographs of people committing suicide was not harassment, but me referencing Yo La Tengo's album "I Am Not Afraid Of You And I Will Beat Your Ass" was worthy of a 30-day ban.

PiratePanPan ,

I remember one time somebody tweeted asking what the third track off Whole Lotta Red was, and I watched at least 50 people get perma'd before my eyes.

The third track is named Stop Breathing.

LiveLM ,

I TAKE MY SHIRT OFF AND ALL THE HOES STOP BREATHIN' accessing their Twitter accounts WHEH? 🧛‍♂️🦇🩸

PiratePanPan ,

sLatt! +*

The_Tired_Horizon ,
@The_Tired_Horizon@lemmy.world avatar

On youtube I had a persistent one who only stopped threatening to track me down and kill me (for a road safety video) when I posted the address of a local police station and said "pop in, any time!"

BaardFigur ,

[Thread, post or comment was deleted by the author]

    cows_are_underrated ,

    That's true, but a lot of things are illegal everywhere. Sexual harassment or death threats will get you a lawsuit in probably every single country in the world.

    prole ,
    @prole@sh.itjust.works avatar

    Lawsuits are for civil cases. If someone breaks a law, they're charged by authorities at their discretion.

    The_Tired_Horizon ,
    @The_Tired_Horizon@lemmy.world avatar

    Laws against threats to kill, rape and assault tend to be pretty constant across the world.. 🤷‍♂️

    otp ,

    Americans online regularly tell me that that's protected free speech down there! Haha

    yarr , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Are the platforms guilty or are the users that supplied the radicalized content guilty? Last I checked, most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves.

    KneeTitts ,
    @KneeTitts@lemmy.world avatar

    most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves

    It's their job to block that content before it reaches an audience, but since that's how they make their money, they don't or won't do that. The monetization of evil is the problem, and those platforms are the biggest perpetrators.

    yarr ,

    Its their job to block that content before it reaches an audience

    The problem is (or isn't, depending on your perspective) that it is NOT their job. Facebook, YouTube, and Reddit are private companies that have the right to develop and enforce their own community guidelines or terms of service, which dictate what type of content can be posted on their platforms. This includes blocking or removing content they deem harmful, objectionable, or radicalizing. While these platforms are protected under Section 230 of the Communications Decency Act (CDA), which provides immunity from liability for user-generated content, this protection does not extend to knowingly facilitating or encouraging illegal activities.

    There isn't specific U.S. legislation requiring social media platforms like Facebook, YouTube, and Reddit to block radicalizing content. However, many countries, including the United Kingdom and Australia, have enacted laws that hold platforms accountable if they fail to remove extremist content. In the United States, there have been proposals to amend or repeal Section 230 of CDA to make tech companies more responsible for moderating the content on their sites.

    ITGuyLevi ,

    The argument could be made (and probably will be) that they promote those activities by allowing their algorithms to promote that content. It's a dangerous precedent to set, but not unlikely given the recent rulings.

    FlyingSpaceCow ,

    Any precedent here regardless of outcome will have significant (and dangerous) impact, as the status quo is already causing significant harm.

    For example, Meta/Facebook used to prioritize content that generates an angry-face emoji (over that of a "like"), as it results in more engagement and revenue.

    However the problem still exists. If you combat problematic content with a reply of your own (because you want to push back against hatred, misinformation, or disinformation), then they have even more incentive to show similar content. And they justify it by saying "if you engaged with content, then you've clearly indicated that you WANT to engage with content like that".

    The financial incentives as they currently exist run counter to the public good.

    joel_feila ,
    @joel_feila@lemmy.world avatar

    Yeah, I have made that argument before. By pushing content via user recommendation lists and autoplay, YouTube becomes a publisher and needs to be held accountable.

    hybridhavoc ,
    @hybridhavoc@lemmy.world avatar

    Not how it works. Also your use of "becomes a publisher" suggests to me that you are misinformed - as so many people are - that there is some sort of a publisher vs platform distinction in Section 230. There is not.

    joel_feila ,
    @joel_feila@lemmy.world avatar

    Oh no, I am aware of that distinction. I just think it needs to go away and be replaced.

    Currently Section 230 treats websites as not responsible for user-generated content. For example, if I made a video defaming someone, I'd get sued, but YouTube would be in the clear. But if The New York Times publishes an article defaming someone, they get sued, not just the writer.

    Why? Because the NYT published that article, but YouTube just hosts it. This publisher/platform distinction is not stated in Section 230, but it is part of U.S. law.

    hybridhavoc ,
    @hybridhavoc@lemmy.world avatar

    This is frankly bizarre. I don't understand how you can even write that and reasonably think that the platform hosting the hypothetical defamation should have any liability there. Like this is actually a braindead take.

    reverendsteveii ,

    this protection does not extend to knowingly facilitating or encouraging illegal activities.

    if it's illegal to encourage illegal activities it's illegal to build an algorithm that automates encouraging illegal activities

    afraid_of_zombies ,

    What if I built software to build software to do it?

    hybridhavoc ,
    @hybridhavoc@lemmy.world avatar

    Repealing Section 230 would actually have the opposite effect, and lead to less moderation as it would incentivize not knowing about the content in the first place.

    afraid_of_zombies ,

    I can't see that. Not knowing about it would be an impossible position to maintain, since you would be getting reports. Now you might say they'll disable reports, which they might try, but they have to do business with other companies who will require that they don't. Apple isn't going to let your social media app on if people are yelling at Apple about the child porn and bomb threats on it, AWS will kick you as well, and even Cloudflare might consider you not worth the legal risk. This has already happened multiple times, even with Section 230 providing a lot of immunity to these companies. Without that immunity they would be even more likely to block.

    baru ,

    Those sites determine what they promote. Such sites often promote extreme views as it gets people to watch or view the next thing. Facebook for instance researched this outcome, then ignored that knowledge.

    reverendsteveii ,

    is the pusher guilty? last time I checked he didn't grow the poppies or process them into heroin.

    yarr ,

    That's why we have separate charges for drug manufacturing and distribution.

    afraid_of_zombies ,

    I never liked that logic; it's basically "success has many fathers but failure is an orphan" applied.

    Are you involved with something immoral? To the extent of your involvement is the extent of how immoral your actions are. Same goes for doing the right thing.

    Binthinkin , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Goddamn right they do. Meta should be sued to death for the genocides too.

    Jaysyn , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business
    @Jaysyn@kbin.social avatar

    Good.

    There should be no quarter for fascists, violent racists, or their enablers.

    Conspiracy for cash isn't a free speech issue.

    Morefan ,

    for fascists, violent racist or their enablers.

    Take a good long look in the mirror (and a dictionary printed before 2005) before you say things like this.

    Jaysyn ,
    @Jaysyn@kbin.social avatar

    Fuck off, symp.

    Morefan ,

    Glow harder.

    roguetrick , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    They're appealing the denial of the motion to dismiss, huh? I agree that this case really doesn't have legs, but I didn't know that was an interlocutory appeal that they could do. They'd win at summary judgment regardless.

    Phanatik , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    I don't understand the comments suggesting this is "guilty by proxy". These platforms have algorithms designed to keep you engaged and through their callousness, have allowed extremist content to remain visible.

    Are we going to ignore all the anti-vaxxer groups who fueled vaccine hesitancy which resulted in long dead diseases making a resurgence?

    To call Facebook anything less than complicit in the rise of extremist ideologies and conspiratorial beliefs is extremely short-sighted.

    "But Freedom of Speech!"

    If that speech causes harm, like convincing a teenager that walking into a grocery store and gunning people down is a good idea, you don't deserve to have that speech. Sorry, you've violated the social contract and those people's blood is on your hands.

    SuperSaiyanSwag ,

    This may seem baseless, but I have seen this over years of experience in online forums. You don't have to take it seriously, but maybe you can relate. We have seen time and time again that if there is no moderation, the shit floats to the top. The reason being that when people can't post something creative or fun, but they still want the attention, they will post negativity. It's the loud minority, but it's a very dedicated loud minority. Let's say we have 5 people, and 4 of them are very creative and funny, but 1 of them complains all the time. If they make posts to the same community, there is a very good chance that the one negative person will make a lot more posts than the 4 creative types.

    Kyatto ,
    @Kyatto@leminal.space avatar

    Oh absolutely, and making something creative takes days, weeks, months.

    Drama, complaining, conspiracy theorizing, and hate videos take only a few minutes more to make than the video itself lasts.

    firadin ,

    Not just "remain visible": actively promoted. There's a reason people talk about YouTube's right-wing content pipeline. If you start watching anything male-oriented, YouTube will start slowly promoting more and more right-wing content to you until you're watching Ben Shapiro and Andrew Tate.

    BeMoreCareful ,

    YouTube is really bad about trying to show you right wing crap. It's overwhelming. The shorts are even worse. Every few minutes there's some new suggestion for some stuff that is way out of the norm.

    Tiktok doesn't have this problem and is being attacked by politicians?

    reverendsteveii ,

    it legit took youtube's autoplay about half an hour after I searched "counting macros" to bring me to american monarchist content

    Wogi ,

    Oh are we doing kings now?

    Vote for me for King, I'll make sure there's a taco truck on every corner.

    captainlezbian ,

    I’ll vote for you if you make me the Marquess of Michigan

    Wogi ,

    ... Deal. But I reserve the right to turn the Upper Peninsula into my own personal resort.

    captainlezbian ,

    Deal, but Michigan gets Cleveland then. I feel like Cleveland for the UP is a fair trade

    Wogi ,

    That's actually a great idea. I've always been a fan of the Browns.

    Ragnarok314159 ,

    I got into painting mini Warhammer 40k figurines during covid, and thought the lore was pretty interesting.

    Every time I watch a video, my suggested feed goes from videos related to my hobbies to entirely replaced with red pill garbage. The right wing channels have to be highly profitable to YouTube to funnel people into, just an endless tornado of rage and constant viewing.

    Gullible ,

    The algorithm is, after all, optimized for nothing other than advertisements per unit of time. So long as the algorithm believes that a video suggestion will keep you on the website for a minute more, it will suggest it. I occasionally wonder about the implications of one topic leading to another. Is WH40k suggested as part of the pipeline by demographics alone, or something more?

    Irritation at suggestions was actually what originally led me to invidious. I just wanted to watch a speech without hitting the “____ GETS DUNKED ON LIKE A TINY LITTLE BITCH” zone. Fuck me for trying to verify information.

    r3df0x ,

    One thing to consider is that conservatives are likely paying for progressives to see their content, and geeks tend to have liberal views and follow the harm principle without many conditions.

    Otherwise, it really shows the demographics of the people who play Warhammer. Before my sister transitioned, she played Warhammer and was a socialist but had a lot of really wehraboo interests. She has been talking about getting back into it, but she passes really well and imagines how it would go with the neckbeards.

    driving_crooner ,
    @driving_crooner@lemmy.eco.br avatar

    What about YouTube? They actually paid those people to spread their sick ideas, making the world a worse place and getting rich while doing it.

    Phanatik ,

    YouTube will actually take action and has done in most instances. I won't say they're the fastest but they do kick people off the platform if they deem them high risk.

    cows_are_underrated ,

    "But freedom of speech"

    If that speech causes harm like convincing a teenager walking into a grocery store and gunning people down is a good idea, you don't deserve to have that speech.

    In Germany we have a very good rule for this (it's not written down, but it's something you can usually count on): your freedom ends where it violates the freedom of others. For example: everyone has the right to live a healthy life, and everyone has the right to walk wherever they want. If I now use my right to walk wherever I want to cause a car accident in which people get hurt (and it was only my fault), my freedom has violated the injured person's right to live a healthy life. That's not freedom.

    RaoulDook ,

    Very reasonable, close to the "Golden Rule" concept of being excellent to each other

    Syringe ,

    In Canada, they have an idea called "right to peace". It means that you can't stand outside of an abortion clinic and scream at people because your right to free speech doesn't exceed that person's right to peace.

    I don't know if that's 100% how it works so someone can sort me out, but I kind of liked that idea

    afraid_of_zombies ,

    Ok...don't complain to me later when the thing you like gets taken down.

    antidote101 , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Can we stop letting the actions of a few bad people be used to curtail our freedom on platforms we all use?

    I don't want the internet to end up being policed by corporate AIs and poorly implemented bots (looking at you auto-mod).

    The internet is already a husk of what it used to be, what it could be. It used to be personal, customisable... Dare I say it; messy and human...

    .... maybe that was serving a need that now people feel alienated from. Now we live as corporate avatars who risk being banned every time we comment anywhere.

    It's tiresome.

    tocopherol ,
    @tocopherol@lemmy.dbzer0.com avatar

    Facebook and others actively promote harmful content because they know it drives interactions, I believe it's possible to punish corps without making the internet overly policed.

    tbs9000 ,

    I agree with you in spirit. The most common sentiment I see among the comments is not to limit what people can share but how actively platforms move people down rabbit holes. If there is not action on the part of the platforms to correct for this, they risk regulation which in turn puts freedom of speech at risk.

    Embarrassingskidmark , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    The trifecta of evil. Especially Reddit, fuck Reddit... Facebook too.

    echodot ,

    Facebook will have actively pushed this stuff. Reddit will have just ignored it, and YouTube just feeds your own bubble back to you.

    YouTube doesn't radicalize people; it only amplifies their existing radicalization. The process must start elsewhere. To be completely fair, they do put warnings and links to further information at the bottom of questionable videos, and they also delist quite a lot of stuff.

    I don't know what's better: to completely block conspiracy theory videos, or to allow them and then have other people mock them.

    jkrtn ,

    Hard disagree that YouTube doesn't radicalize people. It's far too easy to have Ben Shapiro show up in the recommendations.

    echodot ,

    Well I don't know who that is, which is my point really. I'm assuming he's some right-wing conspiracy theorist, but because I'm not already predisposed to listen to that kind of stuff, I don't get it in my recommendations.

    Meanwhile Facebook would actively promote that stuff.

    afraid_of_zombies ,

    Well I don’t know who that is,

    Consider yourself lucky.

    echodot ,

    Yeah, I feel like people are missing my point: I don't know who it is, and I don't get recommended his content.

    The only people who get recommended his content are people who are already going to be thinking along those lines and watching videos along those lines.

    YouTube does not radicalize people; they do it to themselves.

    afraid_of_zombies ,

    Right except people are telling you, repeatedly, that this isn't true.

    ultranaut ,

    Why do you believe "the process must start elsewhere"? I've literally had YouTube start feeding me this sort of content, which I have no interest in at all and actively try to avoid. It seems very obvious that YouTube is a major factor in inculcating these belief systems in people who would otherwise not be exposed to them without YouTube ensuring they reach an audience.

    afraid_of_zombies ,

    YouTube would hit me hard with religious messaging and rightwing stuff. Which is not at all reflective of what content I want to view.

    Zuberi , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Fuck Reddit, can't wait to see the IPO burn

    RealFknNito , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business
    @RealFknNito@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    RatBin ,

    Completely different cases, questionable comparison;

    • Social media are the biggest cultural industry at the moment, albeit a silent and unnoticed one. Cultural industries like this are means of propaganda, information, and socialization, all of which is impactful and heavily personal and personalized to everyone's opinions.

    • Thus the role of such an impactful business is huge and can move opinions and whole movements; the choices that people make are driven by their media consumption and the communities they take part in.

    • In other words, policy, algorithms, and GUI are all factors that drive users to engage in specific ways with harmful content.

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    biggest cultural industry at the moment

    I wish you guys would stop making me defend corporations. Doesn't matter how big they are, doesn't matter their influence, claiming that they are responsible for someone breaking the law because someone else wrote something that set them off and they, as overlords, didn't swoop in to stop it is batshit.

    Since you don't like those comparisons, I'll do one better. This is akin to a man shoving someone over a railing and trying to hold the landowners responsible for not having built a taller railing or more gradual drop.

    You completely fucking ignore the fact someone used what would otherwise be a completely safe platform because another party found a way to make it harmful.

    policy and algorithms are factors that drive users to engage

    Yes. Engage. Not in harmful content specifically, that content just so happens to be the content humans react to the strongest. If talking about fields of flowers drove more engagement, we'd never stop seeing shit about flowers. It's not them maliciously pushing it, it's the collective society that's fucked.

    The solution is exactly what it has always been. Stop fucking using the sites if they make you feel bad.

    RatBin ,

    Again, there's no such thing as a neutral space or platform. Case in point: Reddit, with its gated communities and its lack of control over what people do with the platform, is in fact creating safe spaces for these kinds of things. This may not be intentional, but it ultimately leads to the radicalization of many people. It's a design choice, backed by the internal policy of the admins, who can decide to let these communities stay on one of the mainstream websites. If you're unsure about what to think, delving deep into these subreddits has the effect of radicalizing you, whereas in a normal space you wouldn't be able to do it as easily. Since this counts as engagement, Reddit can suggest similar forums, leading via algorithms to a path of radicalization. This is why a site that claims to be neutral isn't truly neutral.

    This is an example of the alt-right pipeline that Reddit successfully mastered:

    The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups (*https://en.wikipedia.org/wiki/Alt-right_pipeline*)

    And yet you keep comparing cultural and media consumption to physical infrastructure, which is regulated to prevent what you mentioned, such as unsafe management of the terrain, for instance. So taking your examples as you intended them, you may just prove that regulations can in fact exist, and private companies or citizens are supposed to follow them. Since social media started to use personalization and predictive algorithms, they also behave as editors, handling and selecting the content that users see. Why would they not be partly responsible, based on your argument?

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    No such thing as neutral space

    it may not be intentional, but

    They can suggest similar [communities] so it can't be neutral

    My guy, what? If all you did was look at cat pictures, you'd get communities to share fucking cat pictures. These sites aren't to blame for "radicalizing" people into sharing cat pictures any more than they are for actually harmful communities. By your logic, Lemmy can also radicalize people. I see anarchist bullshit all the time; I had to block those communities and curate my own experience. I took responsibility and, instead of engaging with every post that pissed me off, removed that content or avoided it. Should the instance I'm on be responsible for not defederating radical instances? Should these communities be made to pay for radicalizing others?

    Fuck no. People are not victims because of the content they're exposed to, they choose to allow themselves to become radical. This isn't a "I woke up and I really think Hitler had a point." situation, it's a gradual decline that isn't going to be fixed by censoring or obscuring extreme content. Companies already try to deal with the flagrant forms of it but holding them to account for all of it is truly and completely stupid.

    Nobody should be responsible because cat pictures radicalized you into becoming a furry. That's on you. The content changed you and the platform suggesting that content is not malicious nor should it be held to account for that.

    cophater69 ,

    This is an extremely childish way of looking at the world, IT infrastructure, social media content algorithms, and legal culpability.

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    As neutral platforms that will push cat pictures as readily as far-right extremism, with the only difference being how much the user personally engages with it?

    Whatever you say, CopHater69. You're definitely not extremely childish and radical.

    cophater69 ,

    Oh I'm most certainly a radical, but I understand what that means because I got a college degree, and now engineer the internet.

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    I doubt you could engineer a plug into your own asshole but sure, I'll take your word that you're not just lying and have expert knowledge on this field yet still refused to engage with the point to sling insults instead.

    cophater69 ,

    So triggered

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    Always something about radicals and their need to point out "Ur triggered"

    herpaderp , (edited )

    I’ve literally watched friends of mine descend into far right thinking and I can point to the moment when they started having algorithms suggest content that puts them down a “rabbit hole”

    Like, you're not wrong that they were right-wing initially, but they became the "lmao I'm an unironic fascist and you should be pilled like me" variety over a period of six months or so. Started stockpiling guns, etc.

    This phenomenon is so commonly reported that it makes you start to wonder where all these people deciding to "radicalize themselves" all at once seemingly came from.

    Additionally, these companies are responsible for their content-serving algorithms, and if those algorithms did not matter for affecting the thoughts of users, why would propaganda efforts from nation states target their narratives and interests to appear within them? Did we forget the spawning and ensuing fallout of the Arab Spring?

    Bonesince1997 ,

    The examples you came up with hit that last line to a T!

    JustZ ,
    @JustZ@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    cophater69 ,

    Just say you dropped out of high school.

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    cophater69 ,

    It's obvious bud. Awesome talk.

    Remember kids -- if you want to talk like a big kid, stay in school!

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    JustZ ,
    @JustZ@lemmy.world avatar

    [Thread, post or comment was deleted by the moderator]

    Passerby6497 ,

    Yeah, good thing we don't have evidence of any social media company's algorithms radicalizing and promoting more and more extreme content to people.

    Could you imagine? Companies actively radicalizing people for money??

    RealFknNito ,
    @RealFknNito@lemmy.world avatar

    Fuck, it's almost like they promote things that have high engagement, and rage and fear happen to be our most reactive emotions.

    Could you imagine? A coincidence without a malicious conspiracy??

    Socsa , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Please let me know if you want me to testify that reddit actively protected white supremacist communities and even banned users who engaged in direct activism against these communities

    FenrirIII ,
    @FenrirIII@lemmy.world avatar

    I was banned for activism against genocide. Reddit is a shithole.

    fine_sandy_bottom ,

    Well yeah it is but... what did you think would happen?

    misspacific ,
    @misspacific@lemmy.blahaj.zone avatar

    I was banned for similar reasons.

    seems like a lot of mods just have the ability to say whatever about whoever and the admins just nuke any account they target.

    Ragnarok314159 ,

    I have noticed a massive drop in the quality of posting in Reddit over the last year. It was on a decline, but there was a massive drop off.

    It's anecdotal, based on what I have read on Lemmy, but a lot of high-karma accounts have been nuked due to mods and admins being ridiculously overzealous in handing out permabans.

    jkrtn ,

    Send your evidence to the lawyers, couldn't hurt.

    0x0 , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    It's never the parents, is it?

    geogle ,
    @geogle@lemmy.world avatar

    Ask those parents in the Michigan case

    ButtCheekOnAStick ,

    Ask the parents of the Menendez brothers, oh wait.

    PoliticalAgitator ,

    You mean the "responsible gun owners" who don't properly secure their weapons from a child?

    echodot ,

    I couldn't work this out from the article: is it the parents raising this suit, or the victims' families?

    Simulation6 , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Add Fox news and Trump rallies to the list.

    0x0 ,

    Don't forget Marilyn Manson and videogames.

    /s

    cophater69 ,

    Marilyn Manson led a charge to overthrow the government??

    Passerby6497 ,

    Doubt it. Last time I saw him on stage, he made trump look like an eloquent speaker.

    cophater69 ,

    [Thread, post or comment was deleted by the author]

    Passerby6497 ,

    Because I don't like that an artist I once enjoyed is a drugged-out and drunken mess? Based on the reaction, it definitely sounds like it.

    Didn't think that many Lemmings liked washed-up has-been metal acts, but to each their own, I guess.

    cophater69 ,

    I actually responded to the wrong person and I apologize. I've actually heard the same thing about MM lately -- just washed-up and sad.

    TheDarksteel94 ,

    Idk why you're getting downvoted for an obvious joke lol

    cophater69 ,

    Because it's not funny or relevant, and it's an attempt to conflate two things: the satanic panic and legal culpability of social media platforms.

    TheBat ,
    @TheBat@lemmy.world avatar

    Not relevant?

    Metal music and videos games have been blamed for mass shootings before.

    allcopsarebad ,

    And this is neither of those things. This is something much more tangible, with actual science behind it.

    TheBat ,
    @TheBat@lemmy.world avatar

    Yes, that exactly is the point.

    People who supposedly care about children's safety are willing to ignore science and instead choose to raise a hue and cry about bullshit they perceive (or are told by their favourite TV personality) to be evil.

    Have you got it now? Or should I explain it further?

    Didn't expect Lemmy to have people who lack reading comprehension.

    isles ,

    People don't appreciate having spurious claims attached to their legitimate claims, even in jest. It invokes the idea that since the previous targets of blame were false that these likely are as well.

    0x0 ,

    They're all external factors. Music and video games have been (wrongly, imo) blamed in the past. Media, especially nowadays, is probably more "blameable" than music and games, but I still think it's BS to use external factors as an excuse to justify mass shootings.

    isles ,

    What are the internal factors of a person that are not influenced by the environment or culture?

    Not_mikey , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Sweet, I'm sure this won't be used by AIPAC to sue all the tech companies for somehow causing October 7th, like UNRWA, and force them to shut down or suppress all talk of Palestine. Hearing about a genocide happening might radicalize people; maybe we could get away with allowing discussion, but better safe than sorry: to the banned-words list it goes.

    This isn't going to end with the tech companies hiring a team of skilled moderators who understand the nuance between passion and radical intention, trying to preserve a safe space for political discussion; that costs money. This is going to end up with a dictionary of banned and suppressed words.

    glovecraft ,

    This is going to end up with a dictionary of banned and suppressed words

    Do you have some examples?

    Alpha71 ,

    It's already out there. For example you can't use the words "Suicide" or "rape" or "murder" in YouTube, TikTok etc. even when the discussion is clearly about trying to educate people. Heck, you can't even mention Onlyfans on Twitch...

    Makhno ,

    Heck, you can't even mention Onlyfans on Twitch...

    They don't like users mentioning their direct competition

    0x0 ,

    Plus more demands for backdoors in encryption.

    Nomad , to Technology in Judge rules YouTube, Facebook and Reddit must face lawsuits claiming they helped radicalize a mass shooter | CNN Business

    Nice, now do all religions and churches next.
