The Reason Conspiracy Videos Work So Well on YouTube

Cataloging the conspiracies on offer on YouTube is a fool’s errand, but let’s try: fake moon landing, flat earth, 9/11 stuff, the Illuminati, anti-vaxxer propaganda, medical quackery, QAnon, Nikola Tesla and the pyramids, fiat currency, global cooling, lizard people, robot overlords, time travel, and many even odder things you’ve probably never heard of.

Last month, YouTube said it would stop recommending “content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”

But the conspiracy videos continue to burble up in the great seething mass of moving pictures.  Earlier this week, in a report on the continued success of conspiracy videos on the platform, The New York Times’ Kevin Roose observed, “many young people have absorbed a YouTube-centric worldview, including rejecting mainstream information sources in favor of platform-native creators bearing ‘secret histories’ and faux-authoritative explanations.”

YouTube likes to say that this problematic stuff is “less than one percent of the content on YouTube.” This is, undoubtedly, true, simply because there is so much stuff on YouTube. It is an explosion of creativity, wild and invigorating. One exploration from 2015 found that fully half of its videos had fewer than 350 views, and that 90 percent had fewer than roughly 11,000 views. That is to say, YouTube is driven not by the tail of barely viewed videos, but by the head of wildly popular stuff. At the scale of a YouTube, every category of content represents less than one percent of the content, but that doesn’t mean a smallish number of videos can’t assemble a vast audience, some of whom are led further into the lizard-person weirdness of the fringe.
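To get a feel for how both of those things can be true at once, here is a minimal, purely illustrative sketch. It assumes a lognormal view distribution (not YouTube’s actual data) fitted to the two quantiles quoted above, with an arbitrary sample size, and estimates how much of the total audience the most-watched 1 percent of videos would capture.

```python
# Illustrative sketch only: the lognormal shape, the fit, and the sample size
# are assumptions, calibrated to the 2015 figures quoted above
# (median ~350 views, 90th percentile ~11,000 views).
import math
import random

random.seed(0)

MEDIAN_VIEWS = 350      # half of videos fall at or below this (quoted figure)
P90_VIEWS = 11_000      # 90 percent fall at or below this (quoted figure)
N_VIDEOS = 1_000_000    # assumed sample size, for illustration only

# Fit a lognormal to the two quantiles: median = exp(mu), p90 = exp(mu + 1.2816 * sigma).
mu = math.log(MEDIAN_VIEWS)
sigma = (math.log(P90_VIEWS) - mu) / 1.2816

views = sorted(random.lognormvariate(mu, sigma) for _ in range(N_VIDEOS))
total_views = sum(views)
head = views[int(0.99 * N_VIDEOS):]   # the most-watched 1 percent of videos

print(f"median views per video: {views[N_VIDEOS // 2]:,.0f}")
print(f"share of all views captured by the top 1% of videos: {sum(head) / total_views:.0%}")
```

Under these assumed numbers, the most-watched sliver of videos ends up with well over half of all views, which is the head-over-tail dynamic described above: a tiny fraction of the catalog can still assemble an enormous audience.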

The deeper argument that YouTube is making is that conspiracy videos on the platform are just a kind of mistake. But the conspiratorial mindset is threaded through the social fabric of YouTube. In fact, it’s intrinsic to the production economy of the site.

YouTube offers infinite opportunity to create, a closed ecosystem, an opaque algorithm, and the chance for a very small number of people to make a very large amount of money. While these conditions of production—which incentivize content creation at very low cost to YouTube—exist on other modern social platforms, YouTube’s particular constellation of them is special. It’s why conspiracy videos gain purchase on the site, and why they will be very hard to uproot.

***

Inside each content creator on the late capitalist internet, a tiny flame of conspiracy burns.

The internet was supposed to set media free, which, for the content creator, should have removed all barriers to fame. But it did this for everyone, and suddenly, every corner of the internet was a barrel of crabs, a hurlyburly of dumb, fierce competition from which only a select few scrabble out. They are plucked from above by the Recommendation algorithm, which bestows the local currency (views) for reasons that no one can quite explain. This, then, is the central question of the failing YouTuber: Is my content being suppressed?

I’m not above this thinking. No one who has posted on the internet is. Watch your story sink while another similar one rises to the top of Google News, and you, too, will wonder. Watch some stories explode across Facebook while better, worthier ones get sent to the bottom of the feed, and you, too, will wonder: Is some content being suppressed?

Media scholar Taina Bucher calls the folk understanding that people have of these systems the “algorithmic imaginary.” And Bucher found that even random social media users, let alone would-be YouTube stars, were “redesigning their expressions so as to be better recognised and distributed by Facebook's news feed algorithm.” Some go to the next logical step and feel they’ve been targeted by the algorithm or that there is “something weird” going on when their posts don’t get seen.

It’s probably just random flux or luck, but that doesn’t make it feel less weird. As psychologist Rob Brotherton argues in Suspicious Minds, “Our ancestors’ legacy to us is a brain programmed to see coincidence and infer cause.” And what that means, Brotherton says, is that “Sometimes, it would seem, buying into a conspiracy is the cognitive equivalent of seeing meaning in randomness.”

And what place introduces us to a more random distribution of viewlessness and extreme popularity than YouTube?

Google and Twitter spawned verbs, but YouTube created a noun: YouTuber. YouTube mints personalities engaged in great dramas among networks of other YouTubers. It is a George R.R. Martin-level, quasi-fantastical universe, in which there are teams and drama, strategies and tactics, winners (views) and losers (fewer views). Popular YouTubers appear in each other’s videos. They feud. They ride political positions and news to views. They copy each other’s video tricks and types. They fought outside media, purporting to take down the old celebrity establishment; to support a YouTuber in his or her battle for fame was to oppose the powerful forces of Hollywood.

YouTube is conceived as a real community built on top of the business platform. And as time has gone on, popular YouTubers have presented themselves as protecting this community, saving it from YouTube the company as well as inauthentic YouTubers who don’t get it. YouTubers love making videos about the relationship between creators and YouTube corporate, even going into the nitty-gritty details of levels of monetization. It’s part of the meta-drama of the platform—and it is one way that creators wield power against the quasi-governmental regulatory entity of YouTube. Creators are, in fact, responsible for YouTube’s massive revenues, and yet they are individually powerless to dictate the terms of their relationship, even strung together in so-called “multi-channel networks” of creators. YouTube wants views where it makes money; YouTubers want views on their content, whether it is to YouTube’s benefit or not.

Add in certain kinds of grievance politics, and you have a perfect recipe for hundreds of videos about YouTube “censoring” people or suppressing their views in some way.

Hence the wild overreaction to the marketing video that YouTube put out at the end of last year. It received a record 15 million dislikes in response to its makers leaving out the most popular YouTuber, PewDiePie, after he did a string of weird things, including shouting out a proudly anti-Semitic channel. The “real” YouTube community had spoken out against the corporate brand of YouTube. As one commenter put it, “15 million dislikes. I am so proud of this community.”

Crucially, YouTubers must get viewers to emotionally invest in them because they need people to “like, comment, and subscribe.” The dedicated community around a YouTuber has to support them with concrete actions to pull them up the rankings. People who love YouTube have even been found to be “47% more likely than average adults to feel personally connected to characters on their favorite TV programs.” This is an intimate medium that generates real feelings of attachment to the people on the other end of the camera. They’re not some stuck-up movie star; they’re a YouTuber fighting the good fight for views. They give you the real stuff, not whatever has been filtered by the goons of mainstream media.

But because of that very accessibility, many, many people see the videos on YouTube and say, “I could do that.” Viewers become creators by the truckload. For every popular YouTuber, there are thousands of others in the same vein: makeup tutorialists, gadget reviewers, gamer livestreamers, newscaster types, people playing ukulele, comedians (oh so many comedians), fun adventure guys. For someone looking to rise up the ranks, it must be infuriating. Why that guy? Why that lady? How did this all come to be? Why is my content being suppressed?

The content production system has created a kind of conspiracist politics that is native to YouTube. Richard Hofstadter identified “the paranoid style” in American politics decades ago. The “paranoid spokesman” was “overheated, oversuspicious, overaggressive, grandiose, and apocalyptic in expression,” seeing himself as the guardian of “a nation, a culture, a way of life” against “the hostile and conspiratorial world.”

This style now appears in mutated form on YouTube, beginning with the ur-conspiracy of YouTube itself. It’s a creepy circle. Whatever conspiracy is being suppressed outside YouTube is of course also being suppressed by the algorithm inside YouTube. And, likewise, if the algorithm is suppressing your content, then the outside world probably is, too! As the vast majority of YouTubers are failing at YouTube, there is a constant production line minting people who feel wronged.

This audience of the aggrieved just happens to be the perfect group for successful YouTubers to find as conspiratorial viewers, whether or not those creators believe what they’re saying. Which is how YouTube star Logan Paul, not otherwise known for his interest in conspiracies, ended up keynoting a Flat Earth conference. Once something is known to work in the YouTube world, once it’s clear that there is demand out there, the supply side of videomakers kicks in, each trying to find just the right conspiracy, and the right spin on it, to move up the logarithmic scale of YouTube popularity.

Now that YouTube corporate is attempting to use its levers to tamp down the worst conspiratorial thinking, isn’t that exactly what the conspiracists would predict would happen to the truth? “YouTube is now cracking down on conspiracy videos even harder than before and you have to wonder why,” one channel called Truth Center posted. “If conspiracies are so stupid and so easy to debunk, then why make an extreme effort to censor conspiracies? Why not just debunk them? Why not let people have the freedom to speak like YouTube supposedly promotes and why shut it down?”

The very mythology on which the platform was built can now be weaponized by its creators and users. So, it’s not only that conspiracy content made YouTube viewers more prone to believe conspiracies. It’s that the economics and illusions of content production on YouTube, itself, made conspiracy content more likely to be created and viewed. And these forces have reinforced each other for years, hardening them against the forms of control that YouTube can exert.


