Regulation is Coming to YouTube, and It’s Going to be Ugly
For a company focused on the bottom line, running 100 parallel versions of your platform doesn’t make sense. Welcome to the era of lowest common denominator regulation
Mark the date in your calendars: YouTube changed today.
The digital video platform, which has battled repeated negative headlines in the last two years, published new terms of service for its users in the European Union and Switzerland last month. Hardly anyone noticed.
The rules beef up the platform's power to remove access for users who harm the reputation of the service, or harm its users.
The new terms of service for European users also make clearer the requirement that users of the website be 13 years old or older; otherwise their parents bear the responsibility for their actions. That matters given an imminent ruling by the Federal Trade Commission (FTC) on YouTube’s compliance with the Children’s Online Privacy Protection Act (COPPA), which stops online services tracking children under the age of 13. The Washington Post, which first reported the story, said over the weekend that the FTC had settled with YouTube.
Today, the new rules came into force across 29 countries — and at a stroke YouTube became fractured.
Users who access the platform from different corners of the world already see different versions of the site. (YouTube has around 100 localized versions worldwide, covering 95% of the internet’s users — the exception being China, where the ruling Communist party’s censorship was seen as too difficult to comply with.) An analysis of the top trending videos in different countries, published earlier this year by Mitchell Jolly, a computing science student at the University of Glasgow, Scotland, showed that YouTube promotes different videos on its Trending tab depending on that market’s taste. Certain creators transcend the international date line and national boundaries. Others don’t.
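The kind of cross-market comparison Jolly describes can be sketched as a pairwise overlap score between each country’s trending list. The region codes and video IDs below are illustrative placeholders, not real data; in practice the lists would come from YouTube’s Data API (`videos.list` with `chart="mostPopular"` and a `regionCode` parameter).

```python
def overlap(a, b):
    """Jaccard similarity between two trending lists (0 = disjoint, 1 = identical)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical sample: trending video IDs per market.
trending = {
    "US": ["vid_a", "vid_b", "vid_c", "vid_d"],
    "GB": ["vid_a", "vid_b", "vid_e", "vid_f"],
    "IN": ["vid_g", "vid_h", "vid_i", "vid_j"],
}

# Pairwise overlap: some creators cross borders, others don't.
pairs = {}
for r1 in trending:
    for r2 in trending:
        if r1 < r2:  # each unordered pair once
            pairs[(r1, r2)] = overlap(trending[r1], trending[r2])

print(pairs)
```

On this toy data the US and UK lists share a third of their videos, while India’s list is entirely disjoint — the pattern Jolly’s analysis found at scale.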
So your flavor of YouTube depends on which part of the world you log on from. But the underlying principles for accessing the website — the rules by which all users must abide — have until recently been mostly universal.
“A lot of the regulation we see happening now, especially coming out of places like the European Union, has been this idea that if you acknowledge the international superstructure, you will be able to steer its development by exercising regulatory authority that you possess as the government of a geographical nation-state,” says Meredith Filak Rose, policy counsel at Public Knowledge, which promotes a free internet.
Which is why the disparity between the rules European users must follow to use YouTube and those for users elsewhere won’t last long: websites that cross borders and continents find it difficult to enforce piecemeal regulations that depend on the whims of the ruling politicians in each country in which they operate. Instead, we are likely to see a worrying trend: platforms developing their terms of service to fit the lowest common denominator.
Asymmetrical regulation does not work for vast multinational organisations. While companies like YouTube have outreach teams and legal experts operating in different jurisdictions, liaising with and lobbying politicians to keep their platform on the right side of those in power, they find it difficult to create different versions of their platform for different countries. It’s why YouTube steered well clear of China from the outset, and why parent company Google remains reluctant — albeit less so recently — to develop a China-specific version of its search engine.
It becomes impossible to run parallel services with different rules, particularly when operating at scale. YouTube has two billion monthly active users, watching more than a billion hours of content a day, while the firehose spews out 500 hours of video created by users around the world every minute.
We’re going to see the European Union terms of service become the new baseline for all YouTube users in the near future.
It’s already confirmed: last month YouTube said in a tweet it’d “roll out similar updates [to the European one happening today] globally later this year.” (A spokesperson for YouTube initially expressed eagerness to participate in this story in June; in July, they ignored an interview request.)
That’s troubling. Few people disagree that platforms like YouTube and Facebook, both of which are dealing with the threat of regulation from all corners, now require outside intervention. They’ve had more than a decade to police themselves, and in that time have shown little capability to provide a safe environment for some of their most vulnerable users. But when politicians in different countries focus on different areas of concern on platforms like YouTube, we see the threat of death by a thousand cuts.
The European Union’s copyright changes, in the wake of Article 17 being passed by legislators earlier this year, could cause YouTube to become more stringent in how it deals with user-generated content. Because it quickly becomes costly and complicated to have different rules for one person and another, those new copyright rules could become the standard globally.
“The history of the internet has always been a fight for who gets their hands on the steering wheel,” says Rose. “America has traditionally shaped the way the rest of the world looks at the internet. And we basically in many cases set the standard, through soft or hard power.”
If current reports that the FTC is shortly to rule on YouTube’s liabilities under COPPA are correct, the video sharing platform will soon have to rework its approach to minors, and videos appealing to and featuring them. It would make little sense for YouTube to have a version of its website that (for instance) bans all children’s videos from the service for users in Seattle, Washington, while visitors to YouTube 150 miles north in Vancouver, British Columbia, can still see them. A sensible, streamlined organisation would revert to the simplest solution, which is banning all children’s videos for all users, making it compliant with the most draconian laws.
Reverting to a global baseline is problematic because different countries are tackling different aspects of YouTube’s more questionable issues. In April, the UK government released a white paper on “online harms”, which included potential provisions that would require tech companies to disclose more clearly how their algorithms work in order to “establish that companies are adequately fulfilling the duty of care”. A committee of parliamentarians has opened an inquiry into “immersive and addictive technologies” — the framing of which suggests an outcome that may request sites like YouTube make their platforms a little less addictive in the future.
“I think we need to establish the standards we want [sites like YouTube] to adhere to,” explains Damian Collins, the British politician who leads that parliamentary committee.
Collins believes the regulator could not just look at examples of harmful content, but could probe YouTube on the ethics and responsibilities of the user experience, pushing back on the fears that people are being channelled towards inappropriate or misleading content. “At the moment, these sorts of reviews are done largely by the company themselves with very little or any outside scrutiny,” he says. “I think if we have a proper regulatory system, a regulator could go into a company with legal powers and say: ‘Show us you did enough and let’s discuss whether more could have been done to stop this spreading.’”
The politician thinks regulation is necessary because of the way YouTube has become an extension of and mainstream alternative to television. “YouTube, with a big audience of subscribers, is just as much television as many shows on what we call traditional TV,” he says. He’s worried about a loophole where the producers of a reality TV show like Big Brother, which currently is not broadcast on traditional TV in the UK, could shift production to a version distributed on YouTube. “You’d have a TV format in a totally unregulated space.”
He’s not alone in thinking that. In Ireland, the Broadcasting Authority of Ireland (BAI) has announced proposals for a statutory regulator for video sharing platforms like YouTube.
“Changes in audience behaviors in the ten years since the previous iteration of the AVMSD [a Europe-wide regulation on audio and visual content] have been central to the rationale to extend and strengthen the protections for audiences to on-demand audiovisual media services (“on-demand services”) like Netflix and YouTube channels,” the BAI’s submission to the Irish government reads.
In India earlier this year, Delhi High Court judgements in cases brought by T-Series, the world’s most subscribed-to channel, against its nemesis PewDiePie compelled YouTube to temporarily remove content deemed offensive to Indian sensitivities. At the same time, politicians are debating Draft Information Technology Intermediary Guidelines (Amendment) Rules that would compel platforms to police content more proactively.
And in France, the government recently published a report it commissioned in preparation for regulating social media.
“There has always been a dialogue about how technological innovations jive with our fundamental principles as nations and cultures,” says Rose. “And I think that conversation is happening anew right now with particular fervor. I think some of the conversation is much more well-informed than others.”
Make no mistake: regulation is coming. The consultation on the UK government’s white paper closed earlier this month. Collins told me that he believes that the UK will see legislation passed in the next 18 months that would establish a duty of care principle for social media companies including YouTube, and place them under the eye of a regulator with statutory powers to oversee them.
Each one of these actions, in each one of those countries, could have an impact far beyond its borders.
There is currently asymmetry in regulation of sites like YouTube across the world. For the past 18 months Germany has enforced a law — the Netzwerkdurchsetzungsgesetz (NetzDG) — that compels social networks to remove hate speech, fake news and illegal content, as defined by 22 German criminal statutes, within 24 hours of being notified, or face a fine of up to €50 million ($57 million). That has not prompted a step change in YouTube’s approach to censorship worldwide, in large part because the law’s bark is worse than its bite, and the threat of massive fines is not levied in practice. YouTube has failed to act on around 5% of takedown requests under the NetzDG law within the required 24 hours, but has not lost vast sums of money under the legislation.
However, Germany was an outrider in imposing regulation on social networks, acting earlier than most. Other countries are catching up to the need to regulate — and as the burden of complying with tens of different forms of rules rather than a handful grows, it seems likely that YouTube and other platforms will seek to streamline the process.
“I think we’ll see different countries trying different things and learning from our relative experiences over that,” says Collins. “Over time we’ll see a common policy develop.” Like GDPR, the MP believes that the US will adopt the spirit of European regulation, if not the exact letter of the law.
The fear is that we could soon move from one extreme to another: from a wild west devoid of regulation, where hate speech festered and bad actors learned how best to make the rules work for them, to an overly censorious state of play, where social networks are so snowed under with poorly drafted overregulation that they have to default to the harshest rules.
That would be bad for tech firms — and for consumers — in the long run. But ultimately, sites like YouTube had 14 years to tackle the problem themselves and have shown themselves either unwilling or unable to face the substantive problem head-on. “Cyber corporations in a lot of cases have just as much power as a state to govern how this speech happens online,” says Rose. “The price for having a bigger and more powerful megaphone for your speech is agreeing to have the company that made the megaphone police your speech — on behalf of itself and various state actors.”
Prepare yourself for some big changes around the corner — not all of them good.