In recent years, internet platforms that host user-generated content (including the major social media sites) have come under widespread scrutiny. Fairly or unfairly, a range of social ills, from cyberbullying to election interference to terrorism, has been linked in some way with these platforms. Of particular concern has been a perception that these sites hide behind their legal categorisation as mere ‘intermediaries’ or ‘distributors’ of content. Unlike publishers (e.g. traditional media outlets) – who are required to exercise editorial control and responsibility over their content – these websites remain one step removed. However, the considerable power and influence that many of these websites undoubtedly wield, and their ever-present role in the daily lives of their users, have led to demands for greater regulation and oversight, in order to force them to do more to curb the perceived harms caused by the content they host.
In the United Kingdom, the Department for Digital, Culture, Media and Sport (“DCMS”) has this week published its ‘Online Harms White Paper’. The paper sets out a framework for tackling online content which is harmful to individual users, particularly children, or which threatens key values within the UK.
The Framework (an overview)
The Government will establish a new statutory duty of care which will require companies to take more responsibility for the safety of their users. All companies that fall within the scope of the framework will need to be able to show that they are fulfilling that duty of care.
Clear and accessible terms and conditions will need to be made available to all users, including children and other vulnerable users. An independent regulator will then assess how effectively these terms are enforced as part of any regulatory action. The regulator will have an array of powers enabling it to take effective enforcement action against companies that breach the new statutory duty of care.
To assist companies in complying with the new legal duty, the regulator will provide codes of practice.
Who will be affected?
The Government is proposing that the framework should apply to companies that allow users to share or discover user-generated content or to interact with each other online. As this would cover a substantial number of companies, the regulator would take a risk-based approach, focusing on companies that pose the greatest risk of harm to users.
The Regulator
The Government is currently considering whether the regulator should be a new or an existing body. In the medium term, the Government anticipates that the regulator will be funded by industry, and it is also exploring options such as fees, charges or levies to put the regulator on a more sustainable footing. The expectation is that, once securely funded, the regulator will be able to fund the production of the codes of practice, the enforcement of the duty of care and the preparation of transparency reports.
Powers of the Regulator
The Government intends that the regulator should have a range of enforcement powers. These would include the power to impose significant fines, which would help ensure that all companies within the scope of the framework comply with their duty of care.
Comment
As these proposals evolve, it will be interesting to see how the broad objectives of the framework are turned into concrete obligations, and how those obligations will align with existing legal frameworks – in particular data protection, consumer protection and intellectual property laws. The UK’s data protection regulator (the Information Commissioner’s Office, “ICO”) has already issued a public statement welcoming the paper, while also making clear that it has been active in this space for some time (in particular through its ongoing investigation into political campaigning and alleged election interference, which has already led to well-publicised enforcement action against Facebook). Overseas, the Irish Data Protection Commission (the lead supervisory authority for many well-known social media sites) is investigating Facebook, Instagram, Twitter and LinkedIn in relation to potential breaches of the GDPR.[1] Clearly, the UK Government will need to think carefully about how a new set of obligations and a new regulator can best work alongside the GDPR and the ICO.
James Clark, Senior Associate, and Christopher Wilkinson, Trainee Solicitor, DLA Piper UK LLP
[1] See the DPC’s 2018 Annual Report: https://www.dataprotection.ie/sites/default/files/uploads/2019-03/DPC%20Annual%20Report%2025%20May%20-%2031%20December%202018.pdf