YouTube has defended its video recommendation algorithms, amid suggestions that the technology serves up increasingly extreme videos.
On Thursday, a BBC report explored how YouTube had helped the Flat Earth conspiracy theory spread.
But the company’s new managing director for the UK, Ben McOwen-Wilson, said YouTube “does the opposite of taking you down the rabbit hole”.
He told the BBC that YouTube worked to dispel misinformation and conspiracies, but warned that some types of government regulation could start to look like censorship.
YouTube, like other internet giants such as Facebook and Twitter, has some big decisions to make. All must decide where they draw the line between freedom of expression, hateful content and misinformation.
And the government is watching. It has published a White Paper laying out its plans to regulate online platforms.
In his first interview since starting his new role, Ben spoke about the company’s algorithms, its approach to hate speech and what it expects from the UK government’s “online harms” legislation.
YouTube uses algorithms to recommend more videos for you to watch. These video suggestions appear in the app, down the side of the website and also show up when you get to the end of a video.
But YouTube has never explained exactly how its algorithms work. Critics say the platform offers up increasingly sensationalist and conspiratorial videos.
“It’s what’s great about YouTube. It is what brings you from one small area and actually expands your horizon and does the opposite of taking you down the rabbit hole,” he says.
“Very often it doesn’t take you to content that’s exactly like the one you’ve watched before.”
Even so, Ben says YouTube has started adding a sort of “warning” label to certain conspiracy topics.
“If it’s misinformation, we provide correct information around that. We work with Encyclopaedia Britannica and Wikipedia to provide knowledge panels that come up on the side of the screen. So if you’re watching a flat Earth video… we will present to you a link to the facts about that.”
Facebook used to do something similar with fake news. It would flag false stories as “disputed” with a red warning label, and offer up other sources of information. But the social network later said this had often entrenched people’s pre-existing views and made the problem worse.
“We haven’t found that,” says Ben. He says the platform reduces the spread of content designed to mislead people, and promotes “authoritative voices”.
He names BBC News, the Guardian, the Telegraph and the Sun as examples of authoritative sources.
Some conspiracy theories – such as Holocaust denial – have been banned on the platform completely.