• Echo Dot@feddit.uk · 10 months ago

      Facebook will have actively pushed this stuff. Reddit will have just ignored it, and YouTube just feeds your own bubble back to you.

      YouTube doesn’t radicalize people; it only amplifies their existing radicalization, but the process must start elsewhere. To be completely fair, they do put warnings and links to further information at the bottom of questionable videos, and they also delist quite a lot of stuff.

      I don’t know which is better: to completely block conspiracy theory videos, or to allow them and then have other people mock them.

      • jkrtn@lemmy.ml · 10 months ago

        Hard disagree that YouTube doesn’t radicalize people. It’s far too easy to have Ben Shapiro show up in the recommendations.

        • Echo Dot@feddit.uk · 10 months ago

          Well, I don’t know who that is, which is my point, really. I’m assuming he’s some right-wing conspiracy theorist, but because I’m not already predisposed to listen to that kind of stuff, I don’t get it in my recommendations.

          Meanwhile Facebook would actively promote that stuff.

            • Echo Dot@feddit.uk · 10 months ago

              Yeah, I feel like people are missing my point: I don’t know who he is, and I don’t get recommended his content.

              The only people who get recommended his content are people who are already thinking along those lines and watching videos along those lines.

              YouTube does not radicalize people; they do it to themselves.

      • afraid_of_zombies@lemmy.world · 10 months ago

        YouTube would hit me hard with religious messaging and right-wing stuff, which is not at all reflective of the content I want to view.

      • ultranaut@lemmy.world · 10 months ago

        Why do you believe “the process must start elsewhere”? I’ve literally had YouTube start feeding me this sort of content, which I have no interest in at all and actively try to avoid. It seems very obvious that YouTube is a major factor in inculcating these belief systems in people who would otherwise not be exposed to them without YouTube ensuring they reach an audience.

    • Lath@kbin.earth · 10 months ago

      So if some random hacker takes over your network connection and publishes illegal content which then leads back to you, you should be held responsible. It’s your platform after all.

    • conciselyverbose@sh.itjust.works · 10 months ago

      Then user generated content completely disappears.

      Without the basic protection of section 230, it’s not possible to allow users to exist or interact with anything. I’m not sure you could even pay for web hosting without it.

    • 0x0@programming.dev · 10 months ago

      Content creators should be held responsible for their content. Platforms are, generally speaking, mere distributors; otherwise you’re blaming the messenger.

      Specific to social media (and television): yes, they bank on hate, it’s known - so don’t use them, or do so with that ever-dwindling human quality called critical thinking. Wanting to hold them accountable for murder just dismisses the real underlying issues, like unsupervised impressionable people watching content, easy access to guns, human nature itself, societal issues…

  • RealFknNito@lemmy.world · 10 months ago

    Lol no. Social media isn’t responsible; the people on it are. I fucking hate this brain-dead logic of “Well, punishing the bad person isn’t enough, go for the manufacturer!”

    Yeah, fuck it, next time someone is beaten to death with a power tool hold DeWalt accountable. Next time someone plays loud music during their murder hold Spotify accountable. So fucking retarded.

      • RealFknNito@lemmy.world · 10 months ago

        Fuck, it’s almost like they promote things that have high engagement, and rage and fear happen to be our most reactive emotions.

        Could you imagine? A coincidence without a malicious conspiracy??

    • RatBin@lemmy.world · 10 months ago

      Completely different cases, questionable comparison:

      • Social media are the biggest cultural industry at the moment, albeit a silent and unnoticed one. Cultural industries like this are means of propaganda, information and socialization, all of which is impactful and heavily personal and personalised to each person’s opinions.

      • Thus the role of such an impactful business is huge and can move opinions and whole movements; the choices that people make are driven by their media consumption and the communities they take part in.

      • In other words, policy, algorithms and GUI are all factors that drive users to engage in specific ways with harmful content.

      • RealFknNito@lemmy.world · 10 months ago

        biggest cultural industry at the moment

        I wish you guys would stop making me defend corporations. Doesn’t matter how big they are, doesn’t matter their influence, claiming that they are responsible for someone breaking the law because someone else wrote something that set them off and they, as overlords, didn’t swoop in to stop it is batshit.

        Since you don’t like those comparisons, I’ll do one better. This is akin to a man shoving someone over a railing and the landowner being held responsible for not having built a taller railing or a more gradual drop.

        You completely fucking ignore the fact someone used what would otherwise be a completely safe platform because another party found a way to make it harmful.

        policy and algorithms are factors that drive users to engage

        Yes. Engage. Not with harmful content specifically; that content just happens to be the content humans react to most strongly. If talking about fields of flowers drove more engagement, we’d never stop seeing shit about flowers. It’s not them maliciously pushing it; it’s the collective society that’s fucked.

        The solution is exactly what it has always been. Stop fucking using the sites if they make you feel bad.

        • RatBin@lemmy.world · 10 months ago

          Again, there’s no such thing as a neutral space or platform. Case in point: Reddit, with its gated communities and its lack of control over what people do with the platform, is in fact creating safe spaces for these kinds of things. This may not be intentional, but it ultimately leads towards the radicalization of many people; it’s a design choice, followed by the internal policy of the admins, who can decide to let these communities stay on one of the mainstream websites. If you’re unsure about what to think, delving deep into these subreddits has the effect of radicalising you, whereas in a normal space you wouldn’t be able to do so as easily. Since this counts as engagement, Reddit can suggest similar forums, leading via algorithms to a path of radicalisation. This is why a site that claims to be neutral isn’t truly neutral.

          This is an example of the alt-right pipeline that Reddit successfully mastered:

          The alt-right pipeline (also called the alt-right rabbit hole) is a proposed conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming provocative right-wing political content, such as antifeminist or anti-SJW ideas, gradually increases exposure to the alt-right or similar far-right politics. It posits that this interaction takes place due to the interconnected nature of political commentators and online communities, allowing members of one audience or community to discover more extreme groups (https://en.wikipedia.org/wiki/Alt-right_pipeline)

          And yet you keep comparing cultural and media consumption to physical infrastructure, which is regulated precisely to prevent what you mentioned - unsafe management of the terrain, for instance. So taking your examples as you intended, you may just prove that regulations can in fact exist and that private companies and citizens are supposed to follow them. And since social media started using personalisation and predictive algorithms, they also behave as editors, handling and selecting the content that users see. Why would they not be partly responsible, based on your argument?

          • RealFknNito@lemmy.world · 10 months ago

            No such thing as neutral space

            it may not be intentional, but

            They can suggest similar [communities] so it can’t be neutral

            My guy, what? If all you did was look at cat pictures, you’d get recommended communities that share fucking cat pictures. These sites aren’t any more to blame for “radicalizing” people into sharing cat pictures than they are for actually harmful communities. By your logic, Lemmy can also radicalize people. I see anarchist bullshit all the time; I had to block those communities and curate my own experience. I took responsibility, and instead of engaging with every post that pissed me off, I removed or avoided that content. Should the instance I’m on be responsible for not defederating radical instances? Should these communities be made to pay for radicalizing others?

            Fuck no. People are not victims because of the content they’re exposed to; they choose to allow themselves to become radical. This isn’t an “I woke up and I really think Hitler had a point” situation; it’s a gradual decline that isn’t going to be fixed by censoring or obscuring extreme content. Companies already try to deal with the flagrant forms of it, but holding them to account for all of it is truly and completely stupid.

            Nobody should be responsible because cat pictures radicalized you into becoming a furry. That’s on you. The content changed you, and the people posting that content are not malicious, nor should they be held to account for that.

            • herpaderp@lemmynsfw.com · 10 months ago

              I’ve literally watched friends of mine descend into far-right thinking, and I can point to the moment when algorithms started suggesting content that put them down a “rabbit hole”.

              Like, you’re not wrong that they were right-wing, but they became the “lmao, I’m an unironic fascist and you should be pilled like me” variety over a period of six months or so. Started stockpiling guns, etc.

              This phenomenon is so commonly reported that it makes you start to wonder how all these people decided to “radicalize themselves” all at once.

              Additionally, these companies are responsible for their content-serving algorithms. If those algorithms did not matter for shaping users’ thoughts, why would nation-state propaganda efforts target them to get their narratives and interests in front of people? Did we forget the spawning and ensuing fallout of the Arab Spring?

            • CopHater69@lemm.ee · 10 months ago

              This is an extremely childish way of looking at the world, IT infrastructure, social media content algorithms, and legal culpability.

              • RealFknNito@lemmy.world · 10 months ago

                As neutral platforms that will push cat pictures as readily as far-right extremism, where the only difference is how much the user personally engages with it?

                Whatever you say, CopHater69. You’re definitely not extremely childish and radical.

                • CopHater69@lemm.ee · 10 months ago

                  Oh I’m most certainly a radical, but I understand what that means because I got a college degree, and now engineer the internet.

  • Socsa@sh.itjust.works · 10 months ago

    Please let me know if you want me to testify that Reddit actively protected white supremacist communities and even banned users who engaged in direct activism against those communities.

      • misspacific@lemmy.blahaj.zone · 10 months ago

        I was banned for similar reasons.

        Seems like a lot of mods just have the ability to say whatever about whomever, and the admins just nuke any account they target.

        • Ragnarok314159@sopuli.xyz · 10 months ago

          I have noticed a massive drop in the quality of posting on Reddit over the last year. It was on a decline, but then there was a massive drop-off.

          It’s anecdotal, but consistent with what I have read on Lemmy: a lot of high-karma accounts have been nuked due to mods and admins being ridiculously overzealous in handing out permabans.

  • Krudler@lemmy.world · 10 months ago

    I just would like to show something about Reddit. Below is a post I made about how Reddit was literally harassing and specifically targeting me after I let slip in a comment one day that I was sober. I had previously never made such a comment, because my sobriety journey was personal and I never wanted to define or pigeonhole myself as a “recovering person”.

    I reported the recommended subs and ads to Reddit Admins multiple times and was told there was nothing they could do about it.

    I posted a screenshot to DangerousDesign and it flew up to like 5K+ votes in like 30 minutes before admins removed it. I later reposted it to AssholeDesign where it nestled into 2K+ votes before shadow-vanishing.

    Yes, Reddit and similar sites are definitely responsible for a lot of suffering and pain, at the expense of humans, in the pursuit of profit. After it blew up and front-paged, “magically” my home page didn’t have booze-related ads/subs/recs any more! What a total mystery how that happened /s

    The post in question, and a perfect “outing” of how Reddit continually tracks and tailors the User Experience specifically to exploit human frailty for their own gains.

    • mlg@lemmy.world · 10 months ago

      It’s not Reddit if posts don’t get nuked or shadowbanned by literal sitewide admins.

      • Krudler@lemmy.world · 10 months ago

        Yes, I was advised in the removal notice that it had been removed by the Reddit administrators so that they could keep Reddit “safe”.

        I guess their idea of “safe” isn’t 4+ million users going into their privacy panel and turning off exploitative sub recommendations.

        Idk though I’m just a humble bird lawyer.

    • KairuByte@lemmy.dbzer0.com · 10 months ago

      Yeah this happens a lot more than people think. I used to work at a hotel, and when the large sobriety group got together yearly, they changed bar hours from the normal hours, to as close to 24/7 as they could legally get. They also raised the prices on alcohol.

  • PorkSoda@lemmy.world · 10 months ago

    Back when I was on reddit, I subscribed to about 120 subreddits. Starting a couple years ago though, I noticed that my front page really only showed content for 15-20 subreddits at a time and it was heavily weighted towards recent visits and interactions.

    For example, if I hadn’t visited r/3DPrinting in a couple of weeks, it slowly faded from my front page until it disappeared altogether. It was so bad that I ended up writing a browser-automation script to visit all 120 of my subreddits at night and click the top link. This ended up giving me a more balanced front page that mixed in all of my subreddits and interests.
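
    For illustration, a minimal sketch of what such a nightly script could look like. This is an assumption, not the commenter’s actual code: the subreddit names are stand-ins, the old-reddit URL pattern is a guess, and a real version would drive a logged-in browser (e.g. Selenium) rather than plain HTTP so the visit counts against the account.

```python
# Hypothetical sketch of a nightly "touch every subreddit" script.
# A real version would use browser automation with a logged-in session;
# here the fetch step is pluggable so the logic is testable offline.
import time
import urllib.request

SUBREDDITS = ["3Dprinting", "woodworking", "askscience"]  # stand-ins for ~120 subs


def top_url(name: str) -> str:
    """URL for a subreddit's top posts, which the script opens each night."""
    return f"https://old.reddit.com/r/{name}/top/"


def visit_all(fetch=urllib.request.urlopen, delay: float = 1.0) -> list[str]:
    """Visit every subscribed subreddit once; returns the URLs visited."""
    visited = []
    for name in SUBREDDITS:
        url = top_url(name)
        fetch(url)         # the real script also clicked the top link
        visited.append(url)
        time.sleep(delay)  # throttle between requests to stay polite
    return visited
```

    The point of the pluggable `fetch` is that the scheduling and URL logic can be tested without touching the network; only the nightly run needs a real browser session.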

    My point is, these algorithms are fucking toxic. They’re focused 100% on increasing time-on-page and interaction, with zero consideration for side effects. I would love to see social media algorithms required by law to be open source. We have a public interest in knowing how we’re being manipulated.

      • Corhen@lemmy.world · 10 months ago

        That’s why I always browse YouTube by subscriptions first, and only delve into the regular front page if there’s nothing interesting in my subscriptions.

    • Carlo@lemmy.ca · 10 months ago

      Yeah, social media algorithms are doing a lot of damage. I wish there was more general awareness of this. Based on personal experience, I think many people actually like being fed relevant content, and are blind to the consequences. I think Lemmy is great, because you have to curate your own feed, but many people would never use it for that very reason. I don’t know what the solution is.

    • Fedizen@lemmy.world · 10 months ago

      I used the Google News phone widget years ago and clicked on one giant-asteroid article, and for whatever reason my entire feed became asteroid/meteor articles. It’s also just such a dumb way to populate a feed.

    • r3df0x ✡️✝☪️@7.62x54r.ru · 10 months ago

      I agree. It’s important to remember the only “conspiracy” is making money and keeping people on the platform. That said, it will cause people to go down rabbit holes. The solution isn’t as simple as “show people content they disagree with” because they either ignore it or it creates another rabbit hole. For example, it would mean that progressives start getting bombarded with Tim Pool videos. I don’t believe Tim is intentionally “alt right” but that’s exactly why his videos are the most dangerous. They consist of nothing but conservative rage bait with a veneer of progressiveness that allows his viewers to believe they aren’t being manipulated.

  • Melllvar@startrek.website · 10 months ago

    I think there’s definitely a case to be made that recommendation algorithms, etc. constitute editorial control and thus the platform may not be immune to lawsuits based on user posts.

  • casual_turtle_stew_enjoyer@sh.itjust.works · 10 months ago

    I will testify under oath, with evidence, that Reddit, the company, has not only turned a blind eye to but also encouraged and intentionally enabled radicalization on its platform. It is the entire reason I am on Lemmy. It is the entire reason for my username. It is the reason I questioned my allyship with certain marginalized communities. It is the reason I tense up at the mention of turtles.

  • UsernamesAreDifficult@lemmy.dbzer0.com · 10 months ago

    Honestly, good, they should be held accountable and I hope they will be. They shouldn’t be offering extremist content recommendations in the first place.