How YouTube's content-agnostic algorithm drove hate group recruitment

Ever wonder why the YouTube algorithm is so big on pushing right-wing conspiracies, hate videos, misogynistic “red pill” rants, and (seemingly inexplicably) ghost stories? Spoiler warning: it’s part of my “algorithms written by idiots destroy our world” series.

So pretend you are a team at YouTube charged with building the algorithm that recommends videos. Your goal is to maximize profit and show the most ads, which is a perfectly reasonable goal for YouTube, a company that sells ads. You are also a gullible person who genuinely bought into the libertarian line that everything should be content agnostic, speech must be free as an absolute, and everything can be separated from context.

The methodology you take is simple: it turns out that while most people watch one video and leave, a smaller portion end up going on hours-long YouTube benders. Those people watch the most ads, so the videos they like must be the best, because that’s the content most likely to lead to long binges. The actual content of the videos isn’t relevant here because, again, you’re a free-speech believer, and if you just stay blind to content everything will be fine.

Let’s recap this point because it’s important:
Mathematically, the best videos are the ones that lead people to watch for very long stretches of time. Preferably hours. Ok.
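To make that concrete, here’s a minimal sketch, in Python, of what a purely content-agnostic, watch-time-first ranker looks like. This is not anything YouTube has published; every name and number in it is made up for illustration. The point is what’s absent: nothing in it ever looks at what a video actually says.

```python
# Hypothetical sketch of a content-agnostic, watch-time-maximizing ranker.
# Not YouTube's actual system; names and numbers are invented for illustration.
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    # Average minutes of further watching observed after users click this
    # video -- the only signal this ranker ever looks at.
    expected_session_minutes: float


def recommend(candidates: list[Video], k: int = 10) -> list[Video]:
    """Rank candidates purely by expected downstream watch time.

    Note what is missing: nothing here inspects what the video is about,
    who made it, or what it claims. If hate content happens to produce
    the longest binges, it wins the ranking by design.
    """
    return sorted(
        candidates,
        key=lambda v: v.expected_session_minutes,
        reverse=True,
    )[:k]


# Example: a conspiracy rant that keeps a vulnerable viewer glued for hours
# outranks everything else.
videos = [
    Video("cat-compilation", expected_session_minutes=12.0),
    Video("conspiracy-rant", expected_session_minutes=140.0),
    Video("ghost-stories", expected_session_minutes=95.0),
]
print([v.video_id for v in recommend(videos, k=2)])
# -> ['conspiracy-rant', 'ghost-stories']
```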

What’s the result of this content-agnostic approach? It turns out that in America the people most addicted to YouTube are those without much else going on, probably dealing with some sort of isolation and depression, and most vulnerable to being preyed on by people ready to blame their troubles on “the others”. This is why the anti-women videos bubble up, this is why the right-wing conspiracies bubble up, and, in one famous example outside America, this is why pro-genocide videos bubbled up in Myanmar. I actually don’t know why ghost videos end up in the mix, but that one doesn’t worry me as much. People like ghost videos.

And it turns out those things self-reinforce. People who feel isolated want videos that feed hate, and videos that feed hate lead people to isolate themselves further. So it works great in that regard: you’re selling ads using videos that create people who then go on to watch even more ads. It’s great for short-term profit, and it’s also why YouTube has become such an incredible driver of recruitment for hate groups.

This came up because YouTube is finally admitting it has a problem and is claiming to be redoing the algorithm to try and bury hate content more. We’ll see how it goes, but at least they’re claiming to be doing something, as opposed to just patiently corpsplaining that free speech is in and of itself both the absolute utopian goal and the only means of arriving at it.

So yeah, to reiterate: hiding behind algorithms is always the wrong thing to do, and deplatforming hate speech using actual human moderators is the only real solution.

And as a final footnote: I guess if you must hate someone and blame them for everything, hate ghosts. No one gets hurt and life goes on. Bonus: if you make videos about it you’ll get a great algorithmic YouTube rank.