I made sure to remove cookies and not sign in, so I think these are the baseline suggestions made by YouTube.

  • captainlezbian@lemmy.world · 7 points · 1 year ago

    The YouTube algorithm is biased towards content that encourages further engagement; anger drives engagement, and right-wing propaganda is designed to make people angry.

  • rm_dash_r_star@lemm.ee · 2 points · 1 year ago

    This is the reason all corporate media, not just social media, has become the dumpster fire it is. They use negative emotions like fear and anger to promote engagement, so all you get as a viewer is stuff that gets you fired up. The quality of journalism is so low now that they’re fabricating stuff to engage the viewer. Then there’s no journalistic accountability when they do get caught with their hand in the cookie jar.

    • Asafum@lemmy.world · 1 point · 1 year ago

      “Then there’s no journalistic accountability when they do get caught with their hand in the cookie jar.”

      Literally Fox News Entertainment when brought to court over Tucker Carlson’s bullshit…

      “We’re not news, we’re Entertainment™©®; no reasonable person would believe what Mr. Carlson is saying is true.” So they don’t need to have any integrity whatsoever…

  • emeralddawn45@discuss.tchncs.de · 1 point · 1 year ago

    It’s definitely heavily weighted towards right-wing content, even though that content should be less popular. It’s like how right-wing states always complain about “liberal spending” while their entire states are subsidized by blue states because their policies are trash. Right-wing ideology can’t stand on its own without ‘training wheels’, i.e. funding from massively wealthy donors who want to manipulate people’s reality.

  • dottedgreenline@lemmy.ml · 0 points · edited · 1 year ago

    I was looking for Captain Marvel movie content back when it came out and accidentally clicked on one of those pitiful right-wing, woman-hating nerdbro videos, and my suggestions suddenly became quite heavily peppered with similar horrible content. It feels like the same thing doesn’t happen with non-conservative content quite so much. I wonder if there’s just so much right-wing and liberal content that the algorithm’s percentage calculations can’t compensate for this sort of political nuance, or is it something more insidious?

    • kali@lemmy.world · 1 point · 1 year ago

      I’m trans and watch OneTopicAtATime and similar pro-trans and pro-gay channels, and I still get recommended Matt Walsh somehow.

  • Thorny_Thicket@sopuli.xyz · −1 points · 1 year ago

    It’s probably based on what people have searched for before. I mean, why would anyone watch peaceful protest videos on YouTube?

  • Candelestine@lemmy.world · −7 points · 1 year ago

    Did you change your IP address as well?

    At any rate, Google shows people things that other people are looking at; that’s how its algorithm decides what is popular. So the conclusion that can be drawn from this is that most people who search for videos of protestors are looking to see them get owned. If you consider internet demographics, this really shouldn’t be that unusual.