You most likely viewed these types of videos a few times and the algorithm started recommending them to you. It only takes a couple of videos for the algorithm to start recommending videos on the same topic. You can easily solve this by clicking the three dots next to the video and selecting "Not interested"; do it enough times and they'll be gone from your feed.
Asklemmy
A loosely moderated place to ask open-ended questions
This, the algorithm doesn't care whether you ~~like~~ enjoy it or not, it cares whether you engage with it or not. Even dislikes are engagement.
I'm not talking about liking the video, I'm talking about viewing the video. Viewing the video, especially a large part of it, will contribute to the watch time, therefore affecting the algorithm's recommendations.
It's also kinda dodgy what YouTube considers "viewing" a video. 3 seconds of autoplay? You viewed it. Misclick an ad trying to skip? Viewed it. AndroidTV previewed the video when you left it highlighted for too long? You guessed it, viewed.
Fortunately figuring out what it considers you to have viewed is pretty straightforward: if it's in your watch history it considers it viewed. You can also go through that and manually purge things.
Of course there's still the question of why YouTube will so consistently recommend videos that get people started down this downhill slide of absolute dogshit atrocious content.
I meant like as in "enjoy", not as in clicking the thumbs up button, sorry for the confusion. Even a video that pisses you off, that you thumb down along with every comment, is considered engagement; it not only feeds you more of the same but also boosts the video as a whole.
All it cares about is whether it keeps you on YouTube. It's like the news: outrage is profitable even if everyone hates it.
I am constantly bombarded with Jordan Peterson videos despite disliking them and telling the algorithm to show less like this.
I'm not sure how it profiles people, but it sucks.
Disliking is engagement. Instead tap the three dots and select "not interested"
Go into your watch history and remove them after disliking it. I find removing stuff from my watch history to have a bigger impact over disliking stuff. You can also have YouTube stop recommending specific channels to you. Odds are if they're posting Jordan Peterson content, you're not missing anything of value by blocking them
If I accidentally watch a Linus Tech Tips video, that will be all it recommends me for the next month.
I watched a Some More News video criticizing Jordan Peterson, and Google thought "did I hear Jordan Peterson? Well in that case, here's 5 of his videos!"
Almost all content algorithms are hot garbage that aren't interested in serving you what you want, just what makes money. They always end up serving right wing nut jobs because conspiracy theorists watch a lot of scam videos.
Edit: my little jab at Linus has nothing to do with politics. I have no idea what his views are. I only mentioned it to point out how YouTube will annihilate my recommendations if I watch a single one of his videos.
I watch Linus from time to time, but don't get that sort of recommendation (unless I watch some gun videos!). I only watch his tech stuff and don't know anything about his politics. Now I'm worried.
It sees that you like standup comedy (POTENTIALLY ANTI-WOKE) and gaming (POTENTIALLY INCEL) and are in your 30s (LIKELY HAS SOME DISPOSABLE INCOME)
So it's pipelining you to the annals of YouTube that check those boxes, have very good viewer retention and have good user engagement.
The ones with good user engagement are the ones that raise your blood pressure. They make you angry. So, politics, or dirty cops, interrogations, murder stories, stuff like that.
You can resist them all you want, but stuff that makes people angry and chatty and commenty and re-watchy is like cocaine to the algorithm. It makes them money, so they spam it everywhere it even remotely makes sense to try to get you stuck in their quicksand.
Because 'conservative' content gets a lot of engagement (i.e. ad money). The more they recommend it, the bigger the audience, and the bigger the ad payout. They're literally monetizing hate.
Maybe you have the same problem I have: my wife is still a republican. When that kind of stuff shows up, I know she has been watching it on the family PC. She's not that tech savvy, so I usually go in later and block or limit some of it. It's a pain to fight the algorithms.
As a woman, I can never understand how other women can be Republican.
My mom was a Dem growing up, but then she fell down the religious rabbit hole after I left home and it was all downhill from there.
My little sister though? I have zero idea how she ended up Republican. It's fucking bizarre. 🤷‍♀️
"The Leopards won't eat my face"
It's the "Leopards eating people's faces party", not the "Leopards eating my face party". GOSH!
PS - is that a community here yet?
Conservative propaganda is a highly coordinated and well-funded network. They pay for preference.
I get them all the time. Pro-gun crap, Jordan Peterson, Joe Rogan, and other trash. I constantly hit "don't recommend this channel" or "show me less of this", but they still come. I honestly think YouTube is getting paid off.
Machining and blacksmithing are highly correlated with right wing BS in the US. Check my uname and ask me how I know.
For me, if I ever look up Warhammer 40k it immediately starts sending me losers like The Quartering or other channels like him.
Like, no YouTube, I don't want to hear about how feminism and "wokes" are ruining Warhammer. I also don't want to be sent Sargon of Carl videos. Ugh. Lmao
My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, and blacksmithing.
Because a lot of the type of guys who like watching those stupid conservative videos also like many of the same things as you. Gaming™ is well known to have a problem with the alt-right, react videos have a very similar structure to conservative "libs destroyed with facts and logic" types of videos, and finally a lot of conservatives like to think of themselves as an old-fashioned man's man, so they enjoy things like metalwork and other typically "manly" trades.
Not to be that guy, but you can say "don't recommend this channel again" to YouTube. I haven't seen The Quartering, Asmongold, etc. for years now. Unless you search for them.
I had the issue that I got Andrew Tate shit recommended. I said don't recommend that, and blocked the uploader. YouTube still suggested that video to me. Exactly that video.
Your interests have a strong correlation with people on the right aside from maybe react videos.
But even if your interests were not so strongly correlated with the right, you would probably still get right wing ads or videos suggested. They garner the highest engagement because they are often outrage porn. Google gets their money that way. My subscriptions are to left wing political channels, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.
Right-wing media is well-funded.
Algorithms. It's why I use NewPipe on mobile and FreeTube on desktop. Both allow me to import and export my subscription list, block ads, and have SponsorBlock support.
But most importantly, they don't use the Google algorithm to recommend me videos that might be radical based on a misinterpretation of my past viewing history.
three dots -> don't recommend channel
use it on anything even a little sus
One thing I noticed about browsing the youtube homepage on PC is if your mouse hovers over a video, it starts playing, and that puts it in your watch history. So you might be accidentally adding a trash video it recommended to your watch history while looking at the other offerings. You can disable the "mouse hover auto play" by clicking your profile pic in the top right > settings > playback and performance > inline playback.
It's because you are a guy in your 30s. You (and me) are in a shitty demographic, so the algorithm peddles us shit. It doesn't matter how many times I flag shitheel vids by the people you and I mentioned as "do not recommend channel", I will always have Rogan or Peterson popping up in my feed to ruin my day. It drives me crazy.
It can't just be that, though. I'm a guy in my 30s and I never get any of these channels recommended to me, just videos from channels I've subscribed to or similar to what I've been watching. Even after watching political videos, I don't get those shysters recommended to me.
You've probably watched the other side quite often, so YouTube is recommending this to enrage you and keep you engaged. Just good old rage baiting.
You need to train your algorithm.
When you hover over those videos there will be three dots in the lower right hand corner; use either "Not interested" or "Don't recommend this channel" to clear the right wing trash from your feed.
On videos you do like, you need to do some engagement: watch complete videos, thumbs up (or down, it doesn't matter), and comment.
Do that for a few days and your feed should clear up.
I've been vegetarian since 2009 and in the past 2-3 years I keep getting all kinds of bow hunting videos.
It's particularly amusing when my plan was to watch an old episode of "The Golden Girls."
They REALLY misread my demographic.
I'm 30 and have a small family, too. When I watch Shorts on YouTube I get the exact same content you're describing. None of the long videos I'm watching are political, yet the algo keeps throwing them at me. I get a lot of Jordan Peterson crap or Lil Wayne explaining how there's no racism. I hate it.
A while back the conservative party of Canada was caught inserting MGTOW / Ben Shapiro tags on all their YouTube uploads. In other words, they were poisoning people's social graph in order to cause exactly what you're talking about.
This is why I use a Google account that is only for Youtube entertainment. I keep it on a separate chromium profile. I turn on all the privacy toggles in the Google account. Only Youtube history is turned on. I curate the watch history.
You cannot tell what content might have breadcrumbs that eventually open the floodgates of far right echo chambers. They do this intentionally. So it requires active measures on your part to counter them. You've got to manage your account with intention. I do not use that account at all for random browsing. I usually do that in incognito on a different browser.
Clear your history! I limit a lot of what Google collects. If I see videos trending in that direction, I clear youtube and start fresh.
I think it might be the opposite. If extreme content is kind of the "default" when the algorithm doesn't know what to give you, then starving it of history might push it to that default more often. I have a very used YouTube account with a metric ton of history and honestly I very rarely see that kind of content. (From Europe though, so it might be different.)
Conservatives and fascists are the same group, so I'll refer to them as fascists.
You are talking about one of the core criticisms of corporate secret algorithms that determine what to influence you with. Fascism is forced to creep into everyone's world view when you use standard social media, and the average person wouldn't have the slightest idea. Certain key topics will be more related to fascist content, like philosophy, psychology, guns, comedy. If you think about what fascists enjoy, or what they need to slander, then what I said makes more sense.
Jordan Peterson does a lot of vids around psych/philosophy to redirect curious people to false answers that are close to true but more agreeable for fascists. An example of a psychological co-option is "mass psychosis" being co-opted into "mass formation psychosis" by fascists. Mass psychosis explains too many true things, whereas mass formation psychosis redirects people in a direction more palatable to them.
This is why I want to be nowhere near corporate media if possible. If you delete your cookies (or private browse for the same effect), then YouTube will promote the things most adjacent to what you watch, like old YouTube used to, although it'll still promote fascism when it's directly adjacent. With cookies, though, they have an excuse to let questionable content linger statistically too often.
Because the algorithms favor alt right garbage heaps and the companies will never bother to fix them. Hence why we need some regulations.
I bought a new phone a few months back, and just for shits and giggles I tried looking at YouTube Shorts without any account logged in. I clicked on a food Short, and within 5 Shorts I got a "manosphere" video, and within 10 Shorts I got Ben Shapiro.
Unfortunately, fear and rage drive engagement, and these Conservative grifters are more than happy to supply it and more. That's on top of corporations being Capitalists and thus right-wing by default.
Those types of videos have the most engagement. YouTube is trying to show you whatever it thinks will keep you there longer.
Turns out conservative radicalization keeps people there longer. I've never Googled or watched any Andrew Tate videos, but my recommendations have at least 3 of his videos front and center.
"Hi, I'm a guy in early thirties--"
Well there's your answer.
I very rarely use YouTube, but I've noticed that when I start listening to older top 40 or contemporary bluegrass, those videos drop in.
Youtube really said "you're old, you must be racist" lmao
you are a closet conservative. the algorithm has spoken.