this post was submitted on 10 Jul 2023
36 points (100.0% liked)

Hi, I'm a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never let it take me to that type of video, and when I do it's either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, and blacksmithing, with the occasional song or video about funny bullshit. I'm not from America, and I'd consider myself pretty liberal if I had to put it in terms used in America. But European liberal, so by American standards a socialist.

Why does it recommend this shit to me? Is this some kind of vector for radicalization of guys in my category? Do you have a similar experience?

top 50 comments
[–] const_void@lemmy.ml 10 points 1 year ago

Because 'conservative' content gets a lot of engagement (i.e. ad money). The more they recommend it, the bigger the audience and the bigger the ad payout. They're literally monetizing hate.

[–] SeaJ@lemm.ee 7 points 1 year ago (1 children)

Your interests correlate strongly with those of people on the right, aside from maybe react videos.

But even if your interests were not so strongly correlated with the right, you would probably still get right-wing ads or videos suggested. They garner the highest engagement because they're often outrage porn, and Google gets its money that way. My subscriptions are to left-wing political channels, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.

[–] cuppaconcrete@aussie.zone 3 points 1 year ago

Yeah big tech loves to throw dumb stuff your way to piss you off and keep you engaged, even if you've never shown an interest before.

[–] josephsh98@lemmy.kde.social 7 points 1 year ago* (last edited 1 year ago) (2 children)

You most probably viewed these types of videos a few times and the algorithm started recommending them to you. It only takes a couple of videos for the algorithm to start recommending videos on the same topic. You could easily solve this by clicking the three dots next to the video and selecting "Not interested"; do it enough times and they'll be gone from your feed.

[–] Max_P@lemmy.max-p.me 2 points 1 year ago* (last edited 1 year ago)

This. The algorithm doesn't care whether you ~~like~~ enjoy it or not; it cares whether you engage with it. Even dislikes are engagement.

[–] Viper_NZ@lemmy.nz 1 points 1 year ago

I am constantly bombarded with Jordan Peterson videos despite disliking them and telling the algorithm to show less like this.

I’m not sure how it profiles people, but it sucks.

[–] Poob@lemmy.ca 7 points 1 year ago* (last edited 1 year ago) (2 children)

If I accidentally watch a Linus Tech Tips video, that's all it will recommend me for the next month.

I watched a Some More News video criticizing Jordan Peterson, and Google thought "did I hear Jordan Peterson? Well in that case, here's 5 of his videos!"

Almost all content algorithms are hot garbage; they're not interested in serving you what you want, just what makes money. That always ends up serving right-wing nutjobs, because conspiracy theorists watch a lot of scam videos.

[–] nekat_emanresu@lemmy.ml 6 points 1 year ago (1 children)

Conservatives and fascists are the same group, so I'll refer to them as fascists.

You are talking about one of the core criticisms of corporate secret algorithms deciding what to influence you with. Fascism creeps into everyone's worldview when you use standard social media, and the average person wouldn't have the slightest idea. Certain key topics, like philosophy, psychology, guns, and comedy, will be more closely related to fascist content. If you think about what fascists enjoy, or what they need to slander, what I said makes more sense.

Jordan Peterson does a lot of videos around psychology/philosophy to redirect curious people to false answers that are close to true but more agreeable to fascists. An example of a psychological co-option is "mass psychosis" being co-opted into "mass formation psychosis" by fascists. Mass psychosis explains too many true things, whereas mass formation psychosis redirects people in a direction more palatable to them.

This is why I want to be nowhere near corporate media if possible. If you delete your cookies (or private browse, for the same effect), YouTube will promote the things most adjacent to what you watch, like old YouTube used to do, although it'll still promote fascism when it's directly adjacent. With cookies, though, they have an excuse to let questionable content linger in your recommendations far too often.

[–] HappyHam@lemm.ee 1 points 1 year ago (1 children)

You actually believe that every single conservative is a fascist? Jesus Christ, political literacy is dead.

[–] nekat_emanresu@lemmy.ml 7 points 1 year ago (3 children)

Maybe you need to go have a read about fascism, then tell me: are the points presented becoming more or less similar to current conservative views?

[–] PowerCrazy@lemmy.ml 1 points 1 year ago

I just use the term capitalism since there is no difference in the goals of either.

[–] Cyder@lemm.ee 6 points 1 year ago (1 children)

Maybe you have the same problem I have: my wife is still a republican. When that kind of stuff shows up, I know she has been watching it on the family PC. She's not that tech savvy, so I usually go in later and block or limit some of it. It's a pain to fight the algorithms.

[–] stappern@lemmy.one 2 points 1 year ago

condolences...

[–] bobthened@feddit.uk 5 points 1 year ago (1 children)

My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing

Because a lot of the guys who like those stupid conservative videos also like many of the same things as you. Gaming^(TM) is well known to have a problem with the alt-right, react videos have a structure very similar to conservative “libs destroyed with facts and logic” videos, and a lot of conservatives like to think of themselves as an old-fashioned man's man, which means things like metalwork and other typically “manly” trades.

[–] MooseBoys@programming.dev 1 points 1 year ago

Pretty sure this is it - I see them as well but my interests are science education, gaming, and 3D printing.

[–] RocksForBrains@lemm.ee 4 points 1 year ago

Conservative propaganda is a highly coordinated and well-funded network. They pay for preference.

[–] masquenox@lemmy.ml 3 points 1 year ago

Right-wing media is well-funded.

[–] macwinux@lemmy.ml 3 points 1 year ago

I bought a new phone a few months back, and just for shits and giggles I tried looking at YouTube Shorts without any account logged in. I clicked on a food Short, and within 5 Shorts I got a "manosphere" video, and within 10 Shorts I got Ben Shapiro.

Unfortunately, fear and rage drive engagement, and these conservative grifters are more than happy to supply them, and then some. That's on top of corporations being capitalist and thus right-wing by default.

[–] yoz@aussie.zone 3 points 1 year ago

This can't be any clearer:

- Android TV: install SmartTubeNext.
- Android phone: set private DNS to NextDNS and block all ads, plus install Firefox with uBlock Origin. Or, if you want an app, install NewPipe from the F-Droid app store.

[–] HubbleST@lemm.ee 2 points 1 year ago

One thing I noticed about browsing the YouTube homepage on PC: if your mouse hovers over a video, it starts playing, and that puts it in your watch history. So you might be accidentally adding a trash video it recommended to your watch history while looking at the other offerings. You can disable the mouse-hover autoplay by clicking your profile pic in the top right > Settings > Playback and performance > Inline playback.

[–] bownt@lemmy.ml 2 points 1 year ago

You are a closet conservative. The algorithm has spoken.

[–] NoMoreCocaine@lemm.ee 2 points 1 year ago (1 children)

Not to be that guy, but you can tell YouTube "don't recommend this channel again." I haven't seen The Quartering, Asmongold, etc. for years now. Unless you search for them.

[–] Tartas1995@discuss.tchncs.de 1 points 1 year ago

I had the issue that I got Andrew Tate shit recommended. I said don't recommend that and blocked the uploader. YouTube still suggested that video to me. Exactly that video.

[–] AlexWIWA@lemmy.ml 2 points 1 year ago

If you watch any kind of gaming videos, and haven't trained your algorithm, then you'll get flooded with this shit

[–] PerCarita@discuss.tchncs.de 2 points 1 year ago

You can view what Google "knows" about you in your account settings. I made my account when I was very young, I lied about my age and gender, and then it made assumptions about my professional situation based on my interests. I guess many people in my gender and age group who share my actual interests (tech, movies, culture, food) are also interested in the kind of content you described (Joe Rogan, Jordan Peterson, Yiannopoulos, etc.). I keep clicking "not interested", but the algorithm keeps suggesting these videos to me. I don't mind that Google doesn't know my politics. I'm a feminist, but there's really not a lot of interesting discourse about feminism on YouTube, so I just read and attend real-life lectures instead.

[–] Bencodec@waveform.social 2 points 1 year ago

The algorithm is clever enough to know that people who watch a few of those videos are likely to watch a whole lot more, so it's good business to recommend them as often as possible. If it can convince you to dive in, the stats say you'll start watching a ton more YouTube content.

[–] MiloSquirrel@lemmy.ml 2 points 1 year ago

For me, if I ever look up Warhammer 40k, it immediately starts sending me losers like The Quartering or other channels like his.

Like, no YouTube, I don't want to hear about how feminism and "wokes" are ruining Warhammer. I also don't want to be sent Sargon of Carl videos. Ugh. Lmao

[–] SaltySalamander@lemmy.fmhy.ml 2 points 1 year ago (2 children)

Why does it recommend this shit to me

Because you, or someone using your account, has watched this type of shit in the past.

[–] elkaki@lemmy.dbzer0.com 2 points 1 year ago

Not necessarily. Although YouTube Shorts may be its own thing in terms of the algorithm, I frequently see Andrew Tate, Ben Shapiro, and Jordan Peterson clips despite disliking them and immediately scrolling past when I see their faces. I've also encountered a lot of 2014-style anti-feminism content this year, where someone is shown mocking "feminists" making what seems like a stupid remark and getting owned, set to some sigma-face meme of the American Psycho guy and music.

It has been the case multiple times that the YouTube algorithm makes weird connections which often lead to right-wing channels being promoted. Sometimes an entire subsection of creators gets linked with the alt-right without being directly connected (the old atheism sphere and gaming channels are common ones).

[–] stappern@lemmy.one 2 points 1 year ago (2 children)

YouTube is a right-wing propaganda machine. I'm convinced of it at this point.

I cannot look at Shorts without getting a fucking Piers Morgan or Andrew Tate or Joe Rogan with the anti-vax guest. FUCK ME.

No matter how many times I click "don't recommend this channel to me" or click dislike, they always come back. It can't be a coincidence. I even tried with a clean account; 30 minutes in you get some shit from that sphere.

[–] dizzy@lemmy.ml 2 points 1 year ago

I’m getting the exact same shit on shorts constantly. Joe Rogan, Andrew Tate, Piers Morgan, some kid explaining why women should stay in the kitchen, another guy screaming that trans people don’t exist, some imam thinking he’s making people look stupid for not believing in god, etc

I'm not sure whether to be impressed that my ad/content blocking and general internet hygiene mean Google/YouTube knows me this badly, or worried that so many people, particularly young and impressionable people who may not have such strongly opposed views to that hateful crap, are getting brainwashed and radicalized into extreme conservative views.

Mine only shows me tech, science, and games related things. I'm glad I am one of the lucky ones.

[–] forgotmylastusername@lemmy.ml 2 points 1 year ago

A while back the Conservative Party of Canada was caught inserting MGTOW / Ben Shapiro tags on all their YouTube uploads. In other words, they were poisoning people's social graph in order to cause exactly what you're talking about.

This is why I use a Google account that is only for YouTube entertainment. I keep it in a separate Chromium profile. I turn on all the privacy toggles in the Google account. Only YouTube history is turned on. I curate the watch history.

You cannot tell what content might have breadcrumbs that eventually open the floodgates of far right echo chambers. They do this intentionally. So it requires active measures on your part to counter them. You've got to manage your account with intention. I do not use that account at all for random browsing. I usually do that in incognito on a different browser.

[–] buckykat@lemmy.fmhy.ml 2 points 1 year ago (1 children)

three dots -> don't recommend channel

use it on anything even a little sus

[–] IDe@lemmy.one 1 points 1 year ago

The best way to tune the algorithm on Youtube is to aggressively prune your watch/search history.
Even just one "stereotypical" video can cause your recommendations to go to shit.

[–] king_dead 1 points 1 year ago

You can get rid of a lot of the bullshit YouTube loves to shove down your throat by telling it not to recommend the channel. I haven't got any of that garbage in years

[–] PowerCrazy@lemmy.ml 1 points 1 year ago

The algorithm wants engagement first and foremost (positive vs. negative is irrelevant); after that, it wants to push viewpoints that preserve the status quo, since change is scary to shareholders. So of course capitalist/fascist propaganda is preferred, especially if the host is wrong about basic facts (being wrong drives engagement).

[–] Glaive0 1 points 1 year ago

I have a lot of ways to curate YouTube and I haven’t seen scummy conservative junk in a while.

One way I curate this is to only watch via watch later. I’ll subscribe to my interests and add videos according to what I actually want to watch either from my sub feed or my home recommendations.

I also actively clear my history of any videos that I don’t want in my recommendations, even if I was interested at some point (and usually all Shorts). It’s manual work, but when I just went to run the “not interested” process (below) so I could better explain it, I couldn’t find a single thing bad enough to get rid of.

On a bad video the meatball menu > “not interested” can also help cut down on unwanted content right from your home feed.

It’s a bit of work for how much I watch, and right now I’m HATING the sort errors for their playlists, but it means I only watch what I want to and don’t get spam and scams.

Bonus tip! You can use the (i) button>”stop showing this ad” during an ad to immediately skip an ad when it shows up even if it’s unskippable. Annoyingly, this doesn’t get rid of it entirely and/or a lot of ads have 3-10 different versions that all count as different ads. You can also get rid of static ads in the home feed by doing the same thing from the meatball menu on the ad.

[–] borlax@lemmy.borlax.com 1 points 1 year ago

Because controversy makes money and conservatism is filled with controversial opinions and purposely obtuse takes intended to spark conversation and promote divisiveness. That’s the grift.

[–] PerogiBoi@lemmy.ca 1 points 1 year ago

Those types of videos have the most engagement. YouTube is trying to show you whatever it thinks will keep you there longer.

Turns out conservative radicalization keeps people there longer. I’ve never googled or watched any Andrew Tate videos, but my recommendations have at least 3 of his videos front and center.

[–] stiephel@feddit.de 1 points 1 year ago

I'm 30 and have a small family, too. When I watch Shorts on YouTube I get the exact same content you're describing. None of the long videos I watch are political, yet the algorithm keeps throwing them at me. I get a lot of Jordan Peterson crap, or Lil Wayne explaining how there's no racism. I hate it.

[–] ChaoticEntropy@feddit.uk 1 points 1 year ago* (last edited 1 year ago)

I almost never allow it

The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.

"Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching."

[–] EssentialCoffee@midwest.social 1 points 1 year ago (2 children)

Like others have said, the things you watch are prime interests for the right wing in the US. You have to train the algorithm to learn that you don't want it.
