The Legal Journal covers the most significant legal news in the UK
At long last, after many delays, the UK government is set to introduce the world’s “first online safety laws”. This follows a consultation that began in April of last year. It also responds to concerns about the increased distribution of child sexual abuse images, terrorist propaganda, “lawful but harmful” activity and disinformation.
Ultimately, while the government has asserted that it is “unashamedly” pro-tech, it also said that it’s time to regulate the “wild west” internet. Apparently this time there will be “no more empty gestures”.
The government has outlined that the new legislation will hold big tech to account and that there will be significant penalties for companies that refuse to comply.
The proposals have received mixed reviews. Some MPs believe the legislation does not go far enough. Meanwhile, some civil liberties groups argue that there are not enough safeguards to protect users’ freedom of expression and privacy.
The Online Harms White Paper Consultation began in April 2019. At the time, the then Prime Minister, Theresa May, said that legislative measures were required to protect users from harmful content online. She argued that internet companies "[had] not done enough" for "too long".
Subsequently, it was suggested that a legal duty of care needed to be introduced, to make online companies "[take] responsibility". Ultimately, the government signalled that the days of the self-regulated "wild west" internet were soon to be over.
A number of measures were set out in the white paper, chief among them the proposed legal duty of care.
In response to the consultation, on 15 December 2020, Home Secretary Priti Patel and Culture Secretary Oliver Dowden released a joint ministerial statement advocating the new measures. In it, they said that criminal law needed to be adjusted and made appropriate for the digital age by safeguarding victims and internet users. They also noted that COVID-19 in particular had "shone a light" on the risks of harmful activity and content online, specifically disinformation, misinformation, and child sexual abuse images and videos.
Shockingly, the Internet Watch Foundation reported that during a month-long period over lockdown, it blocked 8.8 million attempts to access child sexual abuse images and videos in the UK. September in particular saw an all-time high in public reports of online child sexual abuse images, with 15,000 reports, an increase of 5,000 on the previous year. The Safer Internet Centre also saw reports of indecent images increase by 50%.
Disinformation and misinformation were also identified as evolving threats. According to research conducted by Ofcom, almost half of UK adults were exposed to disinformation about the coronavirus during the first lockdown, and that exposure continued throughout the pandemic.
Other areas that are set to be regulated by the new legislation include activity related to terrorism and lawful but harmful activity, such as online bullying. According to the government, it is also working with the Law Commission to review whether the online advocacy and promotion of self-harm and suicide should be made illegal, following the death of Molly Russell.
The new proposals set out by the government are wide-ranging, and will cover online safety, cybersecurity, data use and competition, among other areas. However, it has been strongly emphasised that the legislation will avoid a "one size fits all" approach. Instead, the framework will be tiered, with differentiated expectations depending on a company's categorisation.
The majority of companies will be treated as a "Category 2 Service". Companies that deliver high-risk or high-reach services, meanwhile, will be treated as a "Category 1 Service", and will therefore be responsible for complying with more stringent measures.
But who exactly will the new regulatory framework apply to? According to the government, fewer than 3% of businesses. Specifically, the framework will apply to companies that host user-generated content accessible to UK users, and companies that "facilitate public or private online interaction between service users". Search engines will also be covered.
The new legislation, in theory, will lead to more transparency and accountability. The companies to whom the legislation applies will have a duty of care towards their users, and a responsibility to safeguard against illegal activity and content. More specifically, the legislation is tailored to target big tech and hold it to a higher level of accountability than smaller companies. Overseeing the implementation and enforcement of the legislation will be Ofcom, which will ensure compliance with the duty of care.
Of course, the Online Harms laws are going to have a significant and wide-reaching impact. Non-compliance (for example, a refusal to remove illegal or harmful content) will see companies face so-called "mega fines" from Ofcom.
These mega fines will be up to 10% of a company's annual global turnover, or £18m, whichever figure is greater. In cases of non-compliance, Ofcom will also have the power to block services from being accessible in the UK.
Further regulation will be enforced through the new Digital Markets Unit, within the Competition and Markets Authority (CMA). Working in conjunction with Ofcom, and the Information Commissioner’s Office (ICO), the Unit will introduce a new code of conduct. This new code will require more transparency from online platforms, while the new unit will also oversee a pro-competition regime, intended to level the playing field for smaller businesses.
Commenting on the need to diversify choice and the importance of users maintaining control over their data, Business Secretary Alok Sharma, said: “Digital platforms like Google and Facebook make a significant contribution to our economy and play a massive role in our day-to-day lives – whether it’s helping us stay in touch with our loved ones, share creative content or access the latest news. But the dominance of just a few big tech companies is leading to less innovation, higher advertising prices and less choice and control for consumers”.
He added: “Our new, pro-competition regime for digital markets will ensure consumers have choice, and mean smaller firms aren’t pushed out”.
While the government has said that the legislation will lead the way to "a new age of accountability", others are not so sure. Some have argued that the legislation does not go far enough. Writing about "missing areas" in the new law, Chris Elmore MP said that the lack of provisions for criminal sanctions is disappointing. The MP for Ogmore added that the legislation was simply not comprehensive enough, overlooking economic online crimes such as scams and phishing cons.
Of course, there are opponents of the bill who argue that the legislation's framework has serious implications for both freedom of expression and privacy. When it comes to regulating legal but harmful content, there is certainly a risk of infringing on civil rights. This was a concern raised by Ruth Smeeth, former Labour MP and CEO of Index on Censorship, who said that "the concept of legislating for cultural change" is worrying. Instead of "banning language", Smeeth added, a more effective strategy would be to implement educational programmes.
The Open Rights Group has also expressed concern about the regulation of free speech. The organisation stated that requiring content providers to regulate content which is "legal but harmful" will be incredibly difficult, saying that this: "Creates an obligation to measure the risk, likelihood, and results of any given 'harm' within subjective standards to achieve objective legal compliance".
Further to this, the group also outlined its concerns about the government's failure to define harm or risk, and criticised the introduction of a two-tier system. While this system places "greater burdens" on larger social media companies, the group pointed out that many of the "riskier behaviours" are actually found on smaller and emerging services.
Elsewhere, Index on Censorship voiced concerns that in order to ensure compliance with the new legislation, big tech companies will implement more “automatic censoring” of content, and this may lead to mass closure of users’ accounts. This, it says, is dangerous and could lead to the internet no longer being an “open, democratic and diverse” space.
One thing is for sure: the legislation, while extensive, overlooks a significant number of key areas. Despite the delays, the bill appears rushed and lacking in sufficient clarity. It is also unlikely that changes will be implemented any time soon, and it could be 2022 before any movement on this takes place.
If your law firm is based in the UK, then a listing on The Legal Journal could really help your firm to reach new clients that are searching for legal services.