On 10 July 2018, the Information Commissioner’s Office (ICO) published an update on its investigation into the use of data analytics in political campaigns. The report follows the launch of the ICO’s formal investigation in May 2017 into allegations of ‘invisible processing’ of personal data and the targeting of political adverts during the EU referendum. The investigation remains ongoing, but the Information Commissioner had committed to providing an update to the Department for Digital, Culture, Media and Sport (which is leading an inquiry into ‘fake news’) before the summer, and details are also needed by overseas regulators to allow them to progress their own related investigations.
Key highlights from the report are set out below (and four key lessons for businesses are included further below):
- Facebook: Notice of Intent to issue a monetary penalty of £500,000
Of particular significance, the report reveals that the ICO has issued Facebook with a Notice of Intent to impose a monetary penalty of £500,000 for lack of transparency and security failings in relation to the harvesting of data (in breach of the first and seventh principles of the Data Protection Act 1998 (“DPA”), broadly equivalent to the ‘lawfulness, fairness and transparency’ and ‘integrity and confidentiality’ principles under Article 5 of the GDPR). Representations are due from Facebook later this month, which the ICO will consider before finalising its views. The public interest in this case has led the ICO to publish the Notice of Intent (something it would not normally do).
Facebook was one of the organisations identified by the ICO in its investigation into the use of data analytics for political purposes (along with Cambridge Analytica and AggregateIQ). The background to the investigation has been well rehearsed in the international media, but by way of brief summary: a Facebook-based application (through which the data of approximately 87 million people was ultimately harvested) was developed by researchers who later entered into a contract with SCL Elections Ltd (a company affiliated with Cambridge Analytica). Significantly, the App was designed to capture data relating not just to its users, but also to the friends of those users. It is believed that some of the data (from personality test results) was combined with other data on psychological patterns and then used for the granular targeting of political communications over Facebook. It is also suggested that the researchers shared Facebook data with other third parties.
Whilst the App required users to sign up to terms and conditions that allowed access to their Facebook data and that of their friends, the ICO does not consider this to have constituted sufficiently informed consent: various matters were not made clear enough, including how and where data would be sold to third parties and how the data would be used and processed. The range and scope of the data obtained by the App is also considered by the ICO to have breached Facebook’s platform policy in place at that time. The report outlines the ICO’s concerns that data was accessed from Facebook and used for purposes for which it was not intended and which the individuals involved would not reasonably have expected. Questions have also been raised about the robustness of Facebook’s technical and organisational measures for verifying the terms of service of App developers.
Whilst a £500,000 fine is clearly not a substantial sum for an organisation of Facebook’s size, it is the maximum amount the regulator could fine under the DPA and indicates its view on the seriousness of the breaches involved. Under the GDPR, the ICO now has the power to issue significantly higher fines: at the upper tier, the greater of €20 million or 4% of annual global turnover, which for Facebook would be north of $1 billion.
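The scale of the jump between the two regimes can be illustrated with a short calculation. The sketch below encodes the upper tier of GDPR administrative fines (Article 83(5): the greater of €20 million or 4% of worldwide annual turnover) alongside the DPA 1998 cap; the turnover figure used is purely illustrative and does not come from the ICO’s report.

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines (Article 83(5)):
    the greater of EUR 20 million or 4% of total worldwide annual
    turnover of the preceding financial year."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# DPA 1998 cap, for comparison (GBP).
DPA_1998_MAX_FINE_GBP = 500_000

# Illustrative turnover figure only (an assumption, not from the report):
# a company with EUR 35 billion annual global turnover.
illustrative_turnover_eur = 35_000_000_000.0
print(gdpr_max_fine(illustrative_turnover_eur))  # roughly EUR 1.4 billion

# For smaller undertakings the EUR 20 million floor applies instead,
# e.g. 4% of EUR 100 million is only EUR 4 million.
print(gdpr_max_fine(100_000_000.0))  # EUR 20 million floor
```

Note that the €20 million figure acts as a floor on the upper-tier cap: the 4% turnover limb only bites once 4% of turnover exceeds €20 million.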
- Political parties: Warning letters issued
As part of the investigation, the ICO’s investigation team wrote to UK political parties requesting information about how they obtain and use personal data, and how they comply with data protection laws and guidance. The ICO has concluded that there are risks in relation to their processing activities and has issued 11 warning letters requiring action by the main political parties, together with assessment notices for audits later in the year. Concerns raised by the ICO include: the purchase of marketing lists and lifestyle information without sufficient due diligence on how the data was gathered; a lack of fair processing information; and the use of third-party analytics companies without sufficient checks on whether consents have been correctly obtained.
- Data brokers: Notice of Intent for regulatory action against Emma’s Diary (Lifecycle Marketing (Mother and Baby) Limited) and credit reference company audits
During its investigations, the ICO found that some political parties purchased data from data brokers for election and campaign purposes. The report outlines that the ICO has evidence that consent for such use was not lawfully obtained and that the processing did not comply with the fairness and transparency requirements of the DPA.
The ICO has outstanding enquiries with a number of data brokers and has indicated its intention to take formal action against Emma’s Diary (an organisation established to provide advice and support to new mothers), which now has an opportunity to make representations before the ICO finalises its decision.
What lessons can be drawn from the ICO’s ongoing investigation?
Whilst the ICO’s investigation is ongoing, these latest developments make for instructive (and, in places, sobering) reading for businesses.
- Transparency: an obvious but critically important lesson is the importance of clear and detailed privacy notices. The purposes for which data will be used must be explicit, particularly where uses are unexpected. In respect of disclosures to third parties, those third parties should either be named or, failing that, categorised precisely so that there is no ambiguity for the data subject about where their data will end up. Interestingly, developments overseas such as the California Consumer Privacy Act show that data transparency is becoming a concern for lawmakers globally.
- Alignment with platform terms of service: where data is collected from third-party platforms (including, in particular, social media platforms), and whether or not that data is collected by way of an App, companies must ensure that both their initial collection of the data and its subsequent use comply with the platform’s terms of service or any equivalent set of rules.
- Risks associated with buying data from third-party brokers: whilst a market for personal data will likely always exist, there are definite risks in buying such data from third-party brokers, particularly where that data will be used for marketing, profiling or targeting purposes. Careful due diligence (coupled with protective contractual terms) is key, and particular attention should be paid to any consents purportedly obtained for the benefit of the buyer, as such consents can be difficult to obtain under the high threshold for lawful consent set by the GDPR.
- Intent of the regulator: this is comfortably the largest and highest-profile investigation ever undertaken by the ICO, and one of the most prominent by any data protection authority globally. The ICO’s intention to levy the maximum possible fine (under the old regime) on Facebook is sobering, and the ICO has shown that it is not afraid to use the full range of its powers, including warrant-based powers of inspection and seizure (used in relation to Cambridge Analytica), compulsory audits, enforcement notices and fines. As the importance of data to the UK and global economy grows, the ICO appears keen to show its willingness to flex an increasingly sophisticated enforcement arm to catch those on the wrong side of the law.
A full copy of the update report is available here.
By James Clark (Senior Associate, UK) and Lesley Lazenby (Professional Support Lawyer), DLA Piper UK LLP.