YouTube’s conspiracy theory crisis, explained

The three-and-a-half-hour hearing between Google CEO Sundar Pichai and the House Judiciary Committee wasn’t exactly a showcase of deep knowledge of technology. One Republican representative complained that all of the Google results for the Obamacare repeal act and the Republican tax bill were negative. Rep. Steve King (R-IA) had to be told that Google does not make the iPhone. Rep. Louie Gohmert (R-TX) demanded that Google be held liable for Wikipedia’s “political bias.”

But one lawmaker, Rep. Jamie Raskin (D-MD), raised an actually important and pressing issue: the way YouTube’s algorithms can be used to push conspiracy theories.

“The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events,” Raskin said. He was alluding to the Pizzagate conspiracy theory, which led to a gunman showing up at a DC-area pizzeria in 2016 — a conspiracy theory spread, in part, on YouTube.

Raskin asked about another especially strange conspiracy theory that emerged on YouTube — “Frazzledrip,” which has deep ties to the QAnon and Pizzagate conspiracy theories. He asked Pichai, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?” adding, “Are you taking the threats seriously?”

Raskin’s questions were getting at an important issue: YouTube, which Google purchased for $1.65 billion 12 years ago, has a conspiracy theory problem. It’s baked into the way the service works. And it appears that neither Congress nor YouTube itself is anywhere near solving it.

YouTube and conspiracy theories, explained

One billion hours’ worth of content is viewed on YouTube every single day. About 70 percent of those views come from YouTube’s recommendations, according to Algotransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”

YouTube’s content algorithms are incredibly powerful — they determine which videos show up in your search results, in the suggested-videos stream, on the home page, in the trending stream, and under your subscriptions. If you go to the YouTube homepage, algorithms dictate which videos you see and which ones you don’t. And if you search for something, an algorithm decides which videos you get first.

For example, as I write, I am listening to The Nutcracker Suite on YouTube, so YouTube has recommended a list of classical music videos, along with several others based on my viewing history. But the algorithm knows that I probably don’t want to listen to Nine Inch Nails right now, so it isn’t suggesting, say, Nine Inch Nails’ “Broken” album.

But YouTube’s algorithms have an extremism problem.

As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in the New York Times in March, YouTube’s advertising model is based on keeping you watching as many videos as it can show you (along with the ads that appear before and during those videos).

Whether the original video selected was right-leaning, left-leaning, or even nonpolitical, the algorithm tends to recommend increasingly extreme videos — escalating the viewer, Tufekci wrote, from videos of Trump rallies to videos featuring “white supremacist rants, Holocaust denials, and other disturbing content.”
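To make that mechanism concrete, here is a minimal, purely illustrative Python sketch of a ranker that surfaces whichever candidate videos a model predicts will keep the viewer watching longest. The Video class, the titles, and the scores are all invented for illustration; this is not YouTube’s actual system, only a toy demonstration of why optimizing for watch time alone can favor the most engrossing — and often most extreme — content.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical output of an engagement model

def rank_for_watch_time(candidates: List[Video], k: int = 3) -> List[Video]:
    """Toy ranker: order candidates purely by predicted watch time, highest first."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]

# Invented example data: the long, engrossing "documentary" wins under this objective.
candidates = [
    Video("Local news clip on the rally", 3.5),
    Video("Partisan commentary on the rally", 7.2),
    Video("Feature-length conspiracy 'documentary'", 11.8),
]

for video in rank_for_watch_time(candidates):
    print(video.title)
```

If the only signal being maximized is time spent watching, nothing in a ranker like this distinguishes accurate content from a conspiracy video that happens to hold attention longer.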

Watching videos of Hillary Clinton and Bernie Sanders, on the other hand, led to videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11,” Tufekci wrote.

On Algotransparency’s website, which tries to reverse-engineer YouTube’s recommendation algorithm, I entered two terms to find out what the algorithm would recommend, based on those terms, to a user with no search history. First up was “Trump.” (You can try this yourself.)

The first recommended video was from MSNBC, detailing James Comey’s testimony before the House Judiciary and Oversight committees. The second was a QAnon-themed video referencing “D5” — relating to the conspiracy theory alleging that President Donald Trump and Robert Mueller are working together to uncover a vast pedophile network including many prominent Democrats (and actor Tom Hanks). (“D5” refers to December 5, which QAnon believers argued would be the day when thousands of their political enemies would be arrested.)
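If you want to poke at this yourself programmatically, note that the public YouTube Data API does not expose the “Up next” recommendations Algotransparency scrapes. As a rough proxy only, you can pull the top search results for a term using a fresh API key and no watch history. The sketch below assumes you have a Data API key in the placeholder API_KEY variable; the query term is just an example.

```python
# Rough proxy only: the public Data API returns search results,
# not the recommendation feed that Algotransparency tracks.
from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_YOUTUBE_DATA_API_KEY"  # placeholder; supply your own key

youtube = build("youtube", "v3", developerKey=API_KEY)
response = youtube.search().list(
    part="snippet",
    q="Trump",          # example query term
    type="video",
    order="relevance",
    maxResults=5,
).execute()

for item in response["items"]:
    snippet = item["snippet"]
    print(snippet["title"], "—", snippet["channelTitle"])
```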


Next, I tried “Hillary Clinton.” The top three videos recommended by YouTube’s algorithm were all conspiracy-theory-driven, ranging from a video by an anti-Semitic YouTube channel arguing that Freemasons will escape from the United States on private yachts after America’s eventual collapse, to a user alleging that Hillary Clinton has a seizure disorder (she does not), to one alleging that Hillary Clinton has had a number of people murdered (also untrue).


I spend a lot of time consuming content about conspiracy theories — but these results weren’t tailored to me. They were based on a user who had never watched any YouTube videos before.

This isn’t a flaw in YouTube’s system — this is how YouTube works. Which brings us to Frazzledrip.

How YouTube helped spread the weirdest conspiracy theory of them all

The conspiracy theory behind Frazzledrip is this, as “explained” on the fake news website YourNewsWire.com in April: Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping a child’s face off and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice, and that video was then found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name “Frazzledrip.”

For the record: This is not true. There is no such video, and no such thing ever happened. But as Snopes has detailed, multiple conspiracy theories of the Trump era, including QAnon and Pizzagate, overlap, and all of them hold that Hillary Clinton is secretly a pedophile and a murderer of children.

You have probably never heard of Frazzledrip. Most people haven’t heard of Frazzledrip, or QAnon, or perhaps even Pizzagate. But on YouTube, there are hundreds of videos, each with thousands of views, dedicated to a conspiracy theory alleging that a former presidential candidate ripped a child’s face off and wore it as a mask. And there is markedly little that YouTube, or Google, or even Congress seems able to do about it.


“It’s an area we acknowledge there is more work to be done”

Here’s how Pichai answered Raskin’s question: “We are constantly undertaking efforts to deal with misinformation, but we have clearly stated policies, and we have made lots of progress in many of the areas over the past year. … This is a recent thing but I’m following up on it and making sure we are evaluating these against our policies. It’s an area we acknowledge there is more work to be done.”

While explaining that YouTube evaluates problematic videos on a case-by-case basis, he added, “It’s our responsibility, I think, to make sure YouTube is a platform for freedom of expression, but it needs to be responsible in our society.”

But it isn’t easy to balance a platform that claims to be for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that the September 11 attacks were an inside job (they weren’t), that the Sandy Hook shootings never happened (they did), or that Hillary Clinton is a child-eating pedophilic cannibal (this, it must be said, is untrue).

YouTube could radically change its terms of service — in a way that would dramatically limit the freedom of expression Pichai and his colleagues are attempting to provide. Or it could invest much more heavily in moderation, or change its algorithm.

But all of that would be bad for business. As long as YouTube is so heavily reliant on algorithms to keep viewers watching, on a platform where hundreds of hours of video are uploaded every minute of every day, the conspiracy theories will remain. Even if YouTube occasionally bans conspiracy theorists like Alex Jones, users will continue to upload videos about Frazzledrip, or QAnon, or videos arguing that the earth is flat — and YouTube’s algorithms, without any change, will keep recommending them, and other users will watch them.
