this post was submitted on 23 Jan 2025
897 points (97.4% liked)

Technology

60800 readers
3901 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
 

TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, each time changing their location to a random city in the US.

Below is the number of shorts after which alt-right content was first recommended (a minimal sketch of the counting procedure follows the list). Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)
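
For concreteness, here's a minimal Python sketch of that counting procedure. It is not Benaminute's actual setup: the real experiment used fresh YouTube accounts and human judgment, so the feed generator and category labels below are made-up stand-ins purely to make the bookkeeping explicit.

```python
import random

def fake_feed(seed):
    """Stand-in for a fresh account's shorts feed (made-up categories)."""
    rng = random.Random(seed)
    while True:
        yield rng.choice(
            ["non-political", "ai-jesus", "alt-right-adjacent", "alt-right"]
        )

def count_until_alt_right(feed, limit=250):
    """Watch shorts in order without skipping; return how many shorts
    were served up to and including the first explicitly alt-right one,
    or None if the limit is reached first (as with San Francisco)."""
    for count, short in enumerate(feed, start=1):
        if short == "alt-right":
            return count
        if count >= limit:
            return None

for city in ["Houston", "Chicago", "Atlanta", "NYC", "San Francisco"]:
    print(city, count_until_alt_right(fake_feed(city)))
```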

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that came AI Jesus shorts (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). Then came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content at the beginning was a lot of Gen Alpha brainrot; Benaminute said this seemed to be the norm for Chicago, as they had observed the same in another, similar experiment (which dealt with long-form content instead of shorts). After some shorts, one came up in which an AI Gru (the main character from Despicable Me) told viewers to vote for Trump, going on about how voting for "Kamilia" would lose you "10000 rizz" and voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and therefore rank higher in the algorithm. They argue the algorithm isn't necessarily left wing or right wing, but that alt-right creators have simply understood better how to capture and grow their audience.
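
To make the hypothesis concrete, here's a toy illustration (my own sketch, not anything from the video and not YouTube's actual ranker): if a recommender scores candidates purely by predicted engagement, the most emotionally provocative item wins regardless of its politics. All names and weights below are invented.

```python
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    watch_fraction: float  # predicted fraction of the clip watched
    reaction_rate: float   # predicted likes/comments per view

def engagement_score(s: Short) -> float:
    # Made-up weights; the point is that only engagement matters,
    # not the politics of the clip.
    return 0.7 * s.watch_fraction + 0.3 * s.reaction_rate

candidates = [
    Short("calm explainer", 0.40, 0.02),
    Short("cute cat", 0.60, 0.05),
    Short("outrage bait", 0.85, 0.20),  # high emotion -> high engagement
]

# The most emotionally provocative clip floats to the top.
for s in sorted(candidates, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):.2f}  {s.title}")
```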

49 comments
[–] SamuelRJankis@lemmy.world 47 points 19 hours ago (3 children)

Instagram is probably notably worse. I have a very established account that should be very much anti that sort of thing, and it keeps serving up idiotic guru garbage.

TikTok is by far the best in this respect, at least until recent weeks.

[–] IllNess@infosec.pub 16 points 18 hours ago (1 children)

A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control over what content I see and when. One was mostly for combat sports, other sports, and fitness. The second one was just food.

The first one, right off the bat, showed me girls with OnlyFans accounts on the discovery page. Then, after a few days, it began showing me right-wing content and alpha-male garbage.

The second one, the food account, showed me alternative holistic remedies: stuff like 10 different accounts of people suggesting I consume raw milk. Then it started sending me a mix of people who only eat meat, and vegans.

It's really wild what these companies show you while trying to complete a profile of you.

[–] SamuelRJankis@lemmy.world 13 points 18 hours ago

I saw a TikTok video talking about how Instagram starts the redpill/incel stuff early for young people; then, when they become failures in life, it pushes the guru stuff for "guidance".

The EU and even China have at least made an attempt at holding these companies accountable for their algorithms, but the US and Canadian governments just sat there and did nothing.

[–] LandedGentry@lemmy.zip 9 points 15 hours ago

This is basically the central thesis of The Social Dilemma.

[–] MITM0@lemmy.world 4 points 12 hours ago (2 children)

So... in the US then?

[–] ohlaph@lemmy.world 15 points 18 hours ago (1 children)

If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only. I don't want that trash propaganda.

[–] KariKariCrunch@lemmy.world 4 points 15 hours ago

Same. I watched one Rogan video in, like, 2019, and it was like opening a floodgate. Almost immediately, almost every other recommendation was some right-wing personality's opinion about "cancel culture" or "political correctness." It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.

[–] thezeesystem@lemmy.blahaj.zone 11 points 17 hours ago

All platforms are now excessively catering to Elon Nazi trump America. It's pretty much propaganda. And it's extreme and excessive.

[–] jared@mander.xyz 15 points 19 hours ago

Don't let the algorithm feed you!

[–] bulwark@lemmy.world 10 points 19 hours ago

I noticed my feed almost immediately changed after Trump was elected. I didn't change my viewing habits. I'm positive YouTube tweaked the algorithm to lean more right.

[–] ohellidk@sh.itjust.works 9 points 19 hours ago

Crazy stuff. So not only does YouTube make you generally dumber, it's now pushing the audience toward more conservative viewpoints because of the "emotional engagement" that keeps 'em watching. And YouTube probably sells more premium subscriptions that way. Fuck Google!

[–] Hope@lemmy.world 8 points 19 hours ago* (last edited 19 hours ago) (1 children)

Just scrolling through shorts on a given day, I'm usually recommended at least one short by someone infamously hostile to the LGBTQIA+ community. I get that it could be from people with my interests hate-watching, but I don't want to be shown any of it. (Nearly all of my YouTube subscriptions are to LGBTQIA+ creators. I'm not subscribed to anyone who has ever even mentioned support for right leaning policies on anything.)

[–] UraniumBlazer@lemm.ee 1 points 13 hours ago

Oh same! There's also casual hate towards the queer community in really random videos.

[–] random_character_a@lemmy.world 0 points 14 hours ago (1 children)

You get what you usually click?

[–] psx_crab@lemmy.zip 3 points 13 hours ago* (last edited 13 hours ago)

I didn't watch the video, but these are YT shorts; you just swipe like on TikTok. The few ways to curate the algorithm are to swipe away quickly, click the "not interested" button, downvote, or delete watched shorts from your history. If you don't interact with any of these and watch the full length of a video, the algorithm is going to assume you like that kind of content. It will also introduce content you've never watched before to gauge your interest; a lot of the time it's not even related to what you currently watch, and if you don't do any curation, it's going to feed you that exact type for a while. I don't know exactly how they manage the curation, but that's the gist of it from my experience. My feed has zero politics, mostly cats. I control the feed strictly, so I get what I demand.
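
The feedback loop psx_crab describes can be sketched as a simple weighted-signal update. The signal names, weights, and update rule below are assumptions for illustration only, not TikTok's or YouTube's real scoring.

```python
# Made-up signal weights; the real platforms' scoring is unknown.
SIGNAL_WEIGHTS = {
    "watched_full": +1.0,
    "quick_swipe": -0.5,
    "not_interested": -2.0,
    "downvote": -1.5,
    "deleted_from_history": -1.0,
}

def update_interest(score: float, signal: str, rate: float = 0.1) -> float:
    """Nudge a topic's interest score by the weighted feedback signal."""
    return score + rate * SIGNAL_WEIGHTS[signal]

# Passively watching to the end pushes a topic's score up;
# active curation pushes it back down.
score = 0.0
for signal in ["watched_full", "watched_full", "quick_swipe",
               "not_interested", "downvote"]:
    score = update_interest(score, signal)
    print(f"{signal:>20}: {score:+.2f}")
```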
