On the heels of Facebook CEO Mark Zuckerberg’s plea for more government regulation, the UK wants to begin cracking down on harmful content. Declaring the “era of self-regulation for online companies” over, Minister Jeremy Wright states that larger companies could be fined up to 4% of their annual turnover if their efforts to remove harmful content are deemed insufficient.
Regulation has become a hot topic in recent years, as Facebook, YouTube, Apple and other companies have come under fire for fake news and other harmful content. While Facebook is in the midst of implementing further livestreaming safeguards following the Christchurch shootings in New Zealand, the UK wants to place more power in the hands of a third-party regulator that will oversee the new laws.
The Department for Digital, Culture, Media and Sport (DCMS) and the Home Office are in agreement that self-regulation has proven unsuccessful, and are now suggesting that legislation be put in place to give internet-based companies someone to answer to. The Online Harms White Paper is a joint proposal from the two government bodies that intends to realise a new “code of practice,” which is currently undergoing a 12-week public consultation, according to the BBC.
“If you look at the fines available to the Information Commissioner around the GDPR rules, that could be up to 4% of company’s turnover… we think we should be looking at something comparable here,” Wright told BBC Breakfast. Companies could even be given as little as one hour to axe the violating content before they face whatever fine is agreed upon.
That said, the proposal still needs to clarify exactly what constitutes ‘harmful content’; otherwise, Wright risks impeding free speech. Illegal material such as revenge porn, hate crime and terrorism is, of course, encompassed in the proposal, but the paper wants to go beyond that. Citing the case of 14-year-old Molly Russell, who took her own life in 2017 after being exposed to content pertaining to self-harm and suicide, the anticipated legislation also hopes to ban material that advocates such behaviour.
Technology companies are cautiously optimistic about the proposal, with Rebecca Stimson, Facebook’s head of UK policy, agreeing that “new regulations are needed” for a standardised approach, but that they should also “support innovation, the digital economy and freedom of speech.” Katy Minshall, Twitter’s head of UK public policy, also said that the social network looks forward to “striking an appropriate balance between keeping users safe and preserving the open, free nature of the internet.”
KitGuru Says: Government regulation can be hit-or-miss at times, but it’s quite clear that companies have been given too much freedom in the past, and have made plenty of mistakes as a result. Do you think a third-party regulator can get things back on track and protect the public from harmful content?