Editor

Author's details

Date registered: June 11, 2013

Latest posts

  1. FRANCE: The French Data Protection Authority (CNIL) Publishes 6-Step Methodology For Compliance With GDPR — March 20, 2017
  2. FRANCE: France’s Highest Administrative Court Requests a Preliminary Ruling from the ECJ on the Right To Be Forgotten — March 13, 2017
  3. FRANCE: New Law Introduces Class Actions for Data Protection Violations — November 30, 2016
  4. FRENCH LAW FOR A DIGITAL REPUBLIC ADOPTED – Part III: Significant Changes are in Store for Online Platforms, Telecom Operators and Online Communication Providers — November 4, 2016
  5. FRENCH LAW FOR A DIGITAL REPUBLIC ADOPTED – PART II: Impending Changes to the French Data Protection Law include Post Mortem Rights and Portability Rights for Consumers — November 2, 2016

Author's posts listings

FRANCE: The French Data Protection Authority (CNIL) Publishes 6-Step Methodology For Compliance With GDPR

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

 

On March 15, 2017, the CNIL published a 6-step methodology for companies that want to prepare for the changes that will apply as from May 25, 2018 under the EU General Data Protection Regulation (“GDPR”).

The abolishment under the GDPR of registrations and filings with data protection authorities will represent a fundamental shift in the data protection compliance framework in France, which has been heavily reliant on declarations to the CNIL and authorizations from the CNIL for certain types of personal data processing. In place of declarations, the CNIL underscores the importance of “accountability” and “transparency”, core principles that underlie the GDPR requirements. These principles necessitate taking privacy risk into account throughout the process of designing a new product or service (privacy by design and by default), implementing proper information governance, as well as adopting internal measures and tools to ensure optimal protection of data subjects.

In order to help organizations get ready for the GDPR, the CNIL has published the following 6-step methodology:

 

Step 1: Appoint a data protection officer (“DPO”) to “pilot” the organization’s GDPR compliance program

Pursuant to Article 37 of the GDPR, appointing a DPO will be required if the organization is a public entity, if the core activities of the organization require the regular and systematic monitoring of data subjects on a large scale, or if such activities consist of the processing of sensitive data on a large scale. The CNIL recommends appointing a DPO before the GDPR applies in May 2018.

Even when a DPO is not required, the CNIL strongly recommends appointing a person responsible for managing GDPR compliance in order to facilitate comprehension of, and compliance with, the GDPR, cooperation with authorities and mitigation of risks of litigation.

Step 1 will be considered completed once the organization has appointed a DPO and provided him/her with the human and financial resources needed to carry out his/her duties.

 

Step 2: Undertake data mapping to measure the impact of the GDPR on existing data processing

Pursuant to Article 30 of the GDPR, controllers and processors will be required to maintain a record of their processing activities. In order to measure the impact of the GDPR on existing data processing and maintain a record, the CNIL advises organizations to identify data processing, the categories of personal data processed, the purposes of each processing, the persons who process the data (including data processors), and data flows, in particular data transfers outside the EU.

To adequately map data, the CNIL recommends asking:

  • Who? (identity of the data controller, the persons in charge of the processing operations and the data processors)
  • What? (categories of data processed, sensitive data)
  • Why? (purposes of the processing)
  • Where? (storage location, data transfers)
  • Until when? (data retention period)
  • How? (security measures in place)

Step 2 will be considered completed once the organization has identified the stakeholders for processing, established a list of all processing by purposes and categories of data processed, and identified the data processors, to whom and where the data is transferred, where the data is stored and for how long it is retained.

 

Step 3: Based on the results of data mapping, identify key compliance actions and prioritize them depending on the risks to individuals

In order to prioritize the tasks to be performed, the CNIL recommends:

  • Ensuring that only data strictly necessary for the purposes is collected and processed;
  • Identifying the legal basis for the processing;
  • Revising privacy notices to make them compliant with the GDPR;
  • Ensuring that data processors know their new obligations and responsibilities and that data processing agreements contain the appropriate provisions in respect of security, confidentiality and protection of personal data;
  • Deciding how data subjects will be able to exercise their rights;
  • Verifying security measures in place.

In addition, the CNIL recommends particular caution when the organization processes data such as sensitive data, criminal records and data regarding minors, when the processing presents certain risks to data subjects (massive surveillance and profiling), or when data is transferred outside the EU.

Step 3 will be considered completed once the organization has implemented the first measures to protect data subjects and has identified high risk processing.

 

Step 4: Conduct a privacy impact assessment for any data processing that presents high privacy risks to data subjects due to the nature or scope of the processing operations

Conducting a privacy impact assessment (“PIA”) is essential to assess the impact of a processing on data subjects’ privacy and to demonstrate that the fundamental principles of the GDPR have been complied with.

The CNIL recommends conducting a PIA before collecting data and starting processing, and any time processing is likely to present high privacy risks to data subjects. A PIA contains a description of the processing and its purposes, an assessment of the necessity and proportionality of the processing, an assessment of the risks to data subjects, and measures contemplated to mitigate the risks and comply with the GDPR.

The CNIL has published guidelines in 3 volumes to help organizations conduct PIAs (see here, here and here).

Step 4 will be considered completed once the organization has implemented measures to respond to the principal risks and threats to data subjects’ privacy.

 

Step 5: Implement internal procedures to ensure a high level of protection for personal data

According to the CNIL, implementing compliant internal procedures implies adopting a privacy by design approach, increasing awareness, facilitating information reporting within the organization, responding to data subject requests, and anticipating data breach incidents.

Step 5 will be considered completed once the organization has adopted good practices in respect of data protection and knows what to do and who to go to in case of incident.

 

Step 6: Document everything to be able to prove compliance with the GDPR

In order to be able to demonstrate compliance, the CNIL recommends that organizations retain documents regarding the processing of personal data, such as: records of processing activities, PIAs and documents regarding data transfers outside the EU; transparency documents such as privacy notices, consent forms, procedures for exercising data subject rights; and agreements defining the roles and responsibilities of each stakeholder, including data processing agreements, internal procedures in case of data breach, and proof of consent when the processing is based on the data subject’s consent.

Step 6 will be considered completed once the organization’s documentation shows that it complies with all the GDPR requirements.

 

The CNIL’s methodology includes several useful tools (template records, guidelines, template contract clauses, etc.) and will be supplemented over time to take into account the WP29’s guidelines and the CNIL’s responses to frequently asked questions.

 

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/france-the-french-data-protection-authority-cnil-publishes-6-step-methodology-for-compliance-with-gdpr/

FRANCE: France’s Highest Administrative Court Requests a Preliminary Ruling from the ECJ on the Right To Be Forgotten

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

 

On February 24, 2017, France’s highest administrative court (the “Conseil d’Etat”) submitted to the European Court of Justice (“ECJ”) a series of questions raising serious issues with regard to the interpretation of the 1995 Data Protection Directive in light of the ECJ’s 2014 ruling in the Google v. Costeja case[1].

 

The Conseil d’Etat had received appeals from four individuals against decisions of the French data protection authority (“CNIL”). In each case, the CNIL rejected the appellant’s complaint seeking an order that Google Inc. remove certain links from the list of results displayed following a search of each appellant’s name. The links direct to content on third party sites relating to the appellants, and specifically:

 

  • a 2011 video, posted anonymously, that explicitly reveals the nature of the relationship that the first appellant was alleged to have had with a person holding a public office, and alleges that such relationship was beneficial for the appellant’s political career;
  • a 2008 press article relating to the suicide of a Church of Scientology member, mentioning that the second appellant was the public relations manager of that Church; the appellant no longer holds that position;
  • various articles dating from 1995 relating to criminal proceedings for illegal political party financing in which the third appellant was charged; the appellant was acquitted in 2010; and
  • articles (date not mentioned) relating to the conviction of the fourth appellant for sexual assault of minors and mentioning intimate details relating to the appellant that were revealed at the trial.

 

Noting that in each case the published information is either sensitive data or concerns offenses and criminal convictions, the Conseil d’Etat questions whether and to what extent the prohibition on processing such data applies to search engine operators, as they are only required to comply with data protection requirements “within the framework of [their] responsibilities, powers and capabilities” (para. 38 of the Google v. Costeja case).

 

The Conseil d’Etat has therefore declined to rule, stating that appellants’ claims raise serious questions of interpretation regarding the implementation of the right to be forgotten, and has deemed it necessary to refer to the ECJ for a preliminary ruling on the following questions:

 

  1. Considering the specific responsibilities, powers and capabilities of search engine operators, does the prohibition on processing sensitive data and data relating to offenses and criminal convictions, subject to certain exceptions, apply to search engine operators as the controller of the processing carried out by the search engine?
  2. If yes:
    1. Does this mean that search engines must systematically delist links to webpages processing sensitive data and/or data relating to offenses and criminal convictions, whenever the relevant individual so requests?
    2. How do the exceptions to the prohibition apply? In particular, can search engines refuse to delist links if they find, for example, that the data subject consented to the processing of their personal data or that the data has been disclosed to the public by the data subject or is necessary for the establishment, exercise or defense of legal claims?
    3. Can search engines refuse to delist links to websites processing such data for journalistic purposes?
  3. If no:
    1. What data protection law requirements must the search engines comply with, considering their specific responsibilities, powers and capabilities?
    2. When search engines find that webpages contain illicit content and their delisting is requested:
      1. Are the search engines required to remove the links to those webpages from the search results?
      2. Or are they required to take this circumstance into account when assessing the delisting request?
      3. Or does this circumstance have no impact on such assessment?
      4. If it does have an impact, how must the lawfulness of a publication be appreciated when the personal data contained in such publication originates from processing that falls outside the territorial scope of the 1995 Directive and Member State laws?
  4. Irrespective of the response to the first question:
    1. Irrespective of the lawfulness of the publication:
      1. If an appellant demonstrates that his/her personal data has become incomplete, inaccurate or outdated, do search engines have to delist the links?
      2. More specifically, if an appellant demonstrates that the information regarding a past judicial procedure no longer reflects his/her current situation, do search engines have to delist links to webpages containing such information?
    2. Does information regarding an individual’s indictment or trial, and the subsequent conviction, constitute data relating to offenses and criminal convictions? More generally, do webpages containing this type of information fall within the scope of these requirements?

 

Almost three years after the Google v. Costeja case, and after many intense debates around its interpretation and implementation, the right to be forgotten returns to its progenitor for much needed clarification.

 

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

[1] Case C-131/12

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/france-frances-highest-administrative-court-requests-a-preliminary-ruling-from-the-ecj-on-the-right-to-be-forgotten/

FRANCE: New Law Introduces Class Actions for Data Protection Violations

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

France’s Law on the “Modernization of the judiciary in the 21st century”, adopted on November 18, 2016, creates a new general framework for class actions in France and a specific class action right for violations of the French data protection law.

Although the introduction of a data protection class action represents a ground-breaking development in French data protection law, the conditions for bringing a class action are restrictive, and the permissible remedies are limited.

After the introduction in 2014 of consumer class actions by the so-called “Hamon law”, and health class actions earlier this year by the so-called “Touraine law”, the new law on the “Modernization of the judiciary in the 21st century” (the “Law”) expands the scope of the class action mechanism to data protection violations (as well as discrimination and environmental law violations).

The Law lays down the legal and procedural framework for all class actions in France (except for consumer class actions, which remain subject to the Hamon law), and creates a new Article 43 ter in the French data protection law, with specific provisions regarding data protection class actions.

Who can file a class action? Data protection class actions may only be brought by:

  • Associations that have been duly registered for at least 5 years and whose statutory purpose is the protection of privacy and personal data;
  • Consumer protection associations recognized at national level and approved in accordance with Article L. 811-1 of the French Consumer Code, when the personal data processing affects consumers; and
  • Trade unions representing employees, civil servants or judges, when the processing affects the interests of those persons.

In what circumstances can a class action be filed? When several individuals who are in a similar situation suffer a loss resulting from a violation of the French data protection law committed by a data controller or a data processor, a class action may be filed before a civil or administrative court having jurisdiction.

The substantive scope of the class action is very broad as it concerns any violation of the French data protection law.

It is also interesting to note that French data protection law places nearly all data protection obligations on the controller; but under the Law, class actions may also be filed against the processor. Direct processor liability is however consistent with Article 28 of the GDPR, which enshrines a principle of data processor liability in specific circumstances.

Finally, the Law is ambiguous as to whether the plaintiff must have received / collected complaints from several victims in order to launch a class action. Indeed, whereas the new Article 43 ter of the data protection law remains silent on this issue, Article 62 of the Law, which applies subject to Article 43 ter, provides that a class action may be exercised “in view of the individual cases presented by the plaintiff”.

For what purpose? Unlike other class actions, data protection class actions can only seek injunctive relief; the class action cannot be used to claim damages. While this restriction could conceivably be explained by the fact that it may be difficult to prove individual damages, it should be noted that Article 80 of the GDPR allows Member States to provide that certain bodies, organizations and associations have the right to exercise a data subject’s rights to an effective judicial remedy, including financial compensation.

The fact that class action litigants cannot claim damages will undoubtedly limit the impact of the Law, although unwelcome publicity and harm to the defendant’s reputation can certainly still ensue from the filing of a class action, not to mention an injunctive order.

How? The action must be filed in accordance with the rules set forth in the French Civil Procedure Code or the French Administrative Justice Code, as applicable. Pursuant to Article 64 of the Law, the plaintiff must, prior to introducing a class action, send a formal notice to the defendant. The class action cannot be filed before the expiration of a four-month period after receipt of the formal notice; if it is filed earlier, the judge may declare the action inadmissible of the judge’s own motion. We note that this notice period is longer than the ones usually given by the French data protection authority (the “CNIL”) when issuing cease and desist orders (see e.g., recent cease and desist orders against companies like Facebook[1], Microsoft[2] or CDiscount[3] granting three months to comply; other orders, such as the one against W.M.G (Gossip app)[4], have given controllers only one month to comply).

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

[1] Decision No. 2016-007 of January 26, 2016

[2] Decision No. 2016-058 of June 30, 2016

[3] Decision No. 2016-083 of September 2016

[4] Decision No. 2016-079 of September 26, 2016

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/france-new-law-introduces-class-actions-for-data-protection-violations/

FRENCH LAW FOR A DIGITAL REPUBLIC ADOPTED – Part III: Significant Changes are in Store for Online Platforms, Telecom Operators and Online Communication Providers

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

As reported earlier here and here, France’s Law for a Digital Republic (“Law”) introduces important amendments to French data protection law. But once implementing decrees are adopted (expected later this year and in March 2017), the Law will also bring significant changes to online platform operators, telecom operators and online communication providers, as described below.

New consent requirement to ensure confidentiality of electronic correspondence

The Law amends the Postal and Electronic Communications Code by requiring telecom operators and online public communication service providers to maintain the confidentiality of user correspondence, which includes: the content of the message, the correspondents’ identity and, where applicable, the subject line and attachments. The automatic analysis of such correspondence for advertising, statistical or service improvement purposes is prohibited, except with the user’s express, specific and time-limited consent. The period of validity of such consent (which cannot be longer than one year) will be specified by an implementing decree expected by the end of 2016.

However, electronic correspondence can still be automatically analyzed without users’ express, specific and time-limited consent whenever the analysis is for purposes of displaying, sorting or dispatching messages, or detecting unsolicited content or computer malware.

Telecom operators and online public communication service providers will be required to inform their employees of the new confidentiality obligations.

New definition of “online platform operators”

The Law introduces in the French Consumer Code a new definition of online platform operators: Any individual or legal entity offering, on a professional basis, whether for free or for consideration, an online public communication service consisting of either (i) ranking or referencing content, goods or services offered or uploaded by third parties, by using computerized algorithms (e.g., online price comparison tools); or (ii) bringing together several parties (intermediation) for the sale of a good, the provision of a service or the exchange or sharing of content, a good or a service (i.e., marketplaces).

Enhanced transparency and fairness obligations vis-à-vis consumers

Under the Law, online platform operators are required to provide fair, clear and transparent information regarding (i) the general terms of use of any intermediation service, (ii) the referencing, ranking and dereferencing criteria for content, goods and services offered or uploaded, (iii) the existence of any contractual relationship, capital ties or direct remuneration for the operator’s benefit that influences the classification or referencing of the content, goods or services offered or uploaded, (iv) any person acting as an advertiser and (v) when consumers are put in contact with professionals or non-professionals, the rights and obligations of each party under civil and tax laws. Implementing decrees are expected by March 2017 to specify these obligations.

In addition, online platform operators whose activity generates connections above a certain threshold (to be defined by implementing decree by March 2017) must establish and make available to consumers good practices guidelines aimed at strengthening the obligations of clarity, transparency and fairness mentioned above.

Marketplaces will be required to provide professionals with a space that allows them to comply with their own information obligations vis-à-vis consumers. The implementing decree specifying requirements for this space is expected in March 2017.

The regulator is empowered to conduct audits of platform operators’ business practices. The regulator will publish the results of these evaluations and a list of platform operators that do not comply with the information obligations.

Finally, websites that collect, moderate or disseminate consumer reviews or opinions will be required to provide a host of new information in a fair, clear and transparent manner regarding the conditions for publishing and processing these reviews or opinions. Here again, an implementing decree will specify the requirements for providing this information.

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/french-law-for-a-digital-republic-adopted-part-iii-significant-changes-are-in-store-for-online-platforms-telecom-operators-and-online-communication-providers/

FRENCH LAW FOR A DIGITAL REPUBLIC ADOPTED – PART II: Impending Changes to the French Data Protection Law include Post Mortem Rights and Portability Rights for Consumers

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

France’s Law for a Digital Republic, under discussion for more than a year, was published on Oct. 7, 2016 and creates significant new obligations for data controllers and online platform operators. As reported here, some key data protection provisions of the Law are immediately effective, whereas other data protection provisions will take effect in 2017 and 2018.

Post mortem rights to control one’s data

The Law creates a new right for each data subject to issue directives relating to the disposition of his or her personal data after death. Those directives may be general or specific or both; general directives can be stored with a third party certified by the CNIL and the CNIL will keep a record of those directives (the publication of the implementing decree relating to the CNIL record is expected in March 2017). Specific directives are stored with the relevant data controller.

As previously reported, other provisions governing post mortem rights could be considered of immediate application:

  1. The data subject can designate a person to exercise his or her rights after death.
  2. Except where the decedent’s directives specifically state otherwise, heirs are entitled to exercise the decedent’s rights for purposes enumerated in the Law, including to ensure that controllers take into account the data subject’s death, close the decedent’s user accounts and stop processing decedent’s personal data.
  3. Online communication service providers must henceforth inform users what is done with their personal data upon death, and must allow users to decide whether their personal data should be transferred upon death.

Data portability enshrined in the French Consumer Code

As from May 25, 2018, consumers will have a right of data portability for all their data. Personal data portability will be determined by the EU General Data Protection Regulation, and portability of all other data will be determined by the French Consumer Code.

Under this new right of consumer data portability, providers of online communication services to the public will be required to offer consumers a free service to recover (i) all files uploaded by the consumer, (ii) all data that result from the use of the consumer’s user account and that can be consulted by the consumer (except data that has been significantly enriched by the provider); and (iii) other data associated with the consumer’s user account (a) that simplifies a change of provider, or access to other services, or (b) where identification of the data takes into account the value of the services, competition between providers, usefulness for the consumer, and the frequency and economic impact of the use of the services.

These provisions will not apply to providers with active user accounts below a certain threshold, which will be determined by decree. A decree will also set forth a list of types of data enrichment that will be presumed insignificant and that consequently will not justify a refusal to “port” data.

Given the CNIL’s expansive interpretation of the definition of personal data, and the vagueness of the new Consumer Code provisions, this new right could prove difficult to apply in practice.

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/french-law-for-a-digital-republic-adopted-part-ii-impending-changes-to-the-french-data-protection-law-include-post-mortem-rights-and-rtbf-for-consumers/

FRENCH LAW FOR A DIGITAL REPUBLIC ADOPTED – PART I: Important Changes to the French Data Protection Law, Including Maximum Administrative Fines of EUR 3 Million

By Carol A.F. Umhoefer (carol.umhoefer@dlapiper.com) and Caroline Chancé (caroline.chance@dlapiper.com)

France’s Law for a Digital Republic (“Law”), adopted earlier this month, creates significant new obligations for data controllers and online services providers – particularly platform operators. Some key data protection provisions of the Law are immediately effective: increased maximum administrative fines; expanded notice obligations for data controllers; and a specific Right To Be Forgotten for minors. Those provisions are described below.

Other provisions, including the creation of a right to direct the use of one’s data after death, enhanced transparency and fairness obligations vis-à-vis consumers and enhanced confidentiality obligations for telecom operators and online public communication services providers, will not be fully effective until the adoption of implementing decrees. A new right of data portability for consumers, similar to the data portability right under the EU General Data Protection Regulation (“GDPR”), will take effect May 25, 2018. Those provisions will be described in other blog posts.

Maximum administrative fine raised from EUR 150,000 to EUR 3M; CNIL enforcement authority reinforced

Under the Law, and effective immediately, the French data protection authority (the “CNIL”), is empowered to order administrative fines of up to EUR 3M. Previously, the maximum fine was EUR 150,000, or EUR 300,000 for a repeat violation. The Law specifies that when determining the amount of a fine the CNIL must take into account several factors, which largely echo those set forth in the GDPR.

In cases of extreme urgency the CNIL is now also entitled to issue a cease and desist order requiring compliance within 24 hours. When the infringing party does not comply, the CNIL may issue a warning, a fine or an injunction. When it is factually impossible for the infringing party to bring the processing into compliance with the law, the CNIL can order a fine without first issuing a cease and desist order (but due process must still be followed).

The CNIL will also be able to conduct inspections on behalf of comparable authorities in non-EU countries that offer an adequate level of protection to personal data. The CNIL must enter into an agreement describing the relations between the authorities.

Expanded notice requirements

Effective immediately, notices to data subjects must specify the period during which personal data will be retained; where this is impossible, the criteria for determining the retention period must be specified.

Notices must also mention the data subject’s right to issue directives for the disposition of personal data after death.

Post mortem rights

The Law creates a new right for each data subject to issue directives relating to the disposition of his or her personal data after death. While some provisions relating to post mortem rights await an implementing decree to take effect, others could be considered of immediate application:

  1. The data subject can designate a person to exercise his or her rights after death.
  2. Except where the decedent’s directives specifically state otherwise, heirs are entitled to exercise the decedent’s rights for purposes enumerated in the Law, including to ensure that controllers take into account the data subject’s death, close the decedent’s user accounts and stop processing decedent’s personal data.
  3. Online communication service providers must inform users what is done with their personal data upon death, and must allow users to decide whether their personal data should be transferred upon death.

Right to be forgotten for minors

In a nod to Recital 65 of the GDPR, the Law provides that persons who were minors at the time their personal data was collected in connection with information society services are entitled to have their personal data erased promptly by the data controller. If the controller shared the data with another controller, the first controller must take reasonable measures (including technical measures) to inform the third party that the data subject has demanded the erasure of all links to the data, or any copy or replication of the data. If the data is not erased or the controller does not respond within one month, the data subject may petition the CNIL, which has 3 weeks to issue its decision. The data controller’s obligation to erase a minor’s personal data is subject to five exceptions, similar to those set forth in the GDPR.

For more information, please contact carol.umhoefer@dlapiper.com or caroline.chance@dlapiper.com

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/french-law-for-a-digital-republic-adopted-part-i-important-changes-to-the-french-data-protection-law-including-maximum-administrative-fines-of-eur-3-million/

EUROPE: The Applicability Of EU Data Protection Laws To Non-EU Businesses

By Carol Umhoefer (Carol.Umhoefer@dlapiper.com) and Caroline Chancé (Caroline.Chance@dlapiper.com).

This article first appeared in E-Commerce Law and Policy – volume 18 issue 03 (March 2016).

On 16 December 2015, the Article 29 Data Protection Working Party (“WP29”) updated their Opinion 8/2010[1] on applicable law in light of the landmark decision Costeja v. Google[2] rendered by the Court of Justice of the European Union (“ECJ”) on 13 May 2014.

In a context where local data protection authorities are increasingly scrutinizing cross-border data processing operations, companies worldwide need to identify whether and which EU data protection law(s) apply to processing of personal data taking place wholly or partially outside the EU.

Yet the extent of the territorial scope of the Directive has always raised many questions. In 2010, the WP29 concluded in their Opinion 8/2010 that Article 4(1)(a) of the Data Protection Directive 95/46/EC[3] (“Directive”), which provides that a Member State’s data protection law shall apply to data processing “carried out in the context of the activities of an establishment of the controller on the territory of the Member State“, suggests a very broad scope of application.

The exact extent of application remained rather unclear despite the WP29’s guidelines until four years later when the question of whether EU data protection laws should apply to a business based and processing personal data outside the EU came up before the ECJ in the so-called “right to be forgotten” case, Costeja v. Google. In its judgement, the ECJ held that Spanish law applied to the personal data processing performed by the search engine operated by Google Inc., a US-based controller, on the ground that it was “inextricably linked to“, and therefore was carried out “in the context of the activities of” Google Spain, whose advertising and commercial activities constituted the “means of rendering the search engine at issue economically profitable“.

The WP29 have recently updated their 2010 opinion to take into account Costeja. According to the WP29, the implications of the judgement are very broad and should certainly not be limited to the question of determining applicable law in relation to the operation of the Google search engine in Spain. And indeed, Costeja confirms the broad territorial application of Article 4(1)(a) of the Directive that was espoused by the WP29 in 2010. In this respect, the WP29 recall that the notion of establishment must itself be interpreted broadly, in line with recital 19 of the Directive, which provides that the notion of “establishment (…) implies the effective and real exercise of activity through stable arrangements”[4], such as subsidiaries or branches. In Costeja, there was no doubt that Google Spain, the Google Inc. subsidiary responsible for promoting in Spain the sale of advertising space generated on the website google.com, fell under that definition. However, it was disputed whether the data processing in question, carried out exclusively by Google Inc. through the operation of Google Search without any intervention on the part of Google Spain, was nevertheless carried out “in the context of the activities of” Google Spain.

The ECJ then introduced a new criterion: the “inextricable link” between the activities of a local establishment and the data processing activities of a non-EU data controller. As underlined by the WP29, the key point is that even if the local establishment is not involved in any direct way in the data processing, the activities of that establishment might still trigger the application of EU data protection laws to the non-EU controller, provided there is an “inextricable link” between the two.

What this “inextricable link” might be raises many questions. The WP29, while insisting on the importance of conducting a case-by-case analysis, consider that, depending on the role played by local establishments, non-EU companies offering free services within the EU, which are then financed by making use of the personal data collected from users, could also be subject to EU data protection laws. The same reasoning would apply, for example, to non-EU companies providing services in exchange for membership fees or subscriptions, where individuals may only access the services by subscribing and providing their personal data to the EU establishments.

The WP29 are careful to say that being part of the same group of companies is not in itself sufficient to establish the existence of an “inextricable link“, and that additional factors are necessary, such as promotion and sale of advertising space or revenue-raising, irrespective of whether such proceeds are used to fund the data processing operations in the EU. But because the examples provided by the WP29 are almost solely based on revenue flow as the source of the “inextricable link“, it is difficult to conceive of what type of multinational will not have such an “inextricable link” between the activities of a subsidiary (let alone a branch) in the EU and a parent company outside the EU. The long arm of the Directive is in effect stretched even further.

Will this criterion still be relevant when the General Data Protection Regulation[5] (“GDPR”) applies, likely by July 2018? Certainly, insofar as article 3(1) provides that the GDPR applies “to the processing of personal data in the context of the activities of an establishment of a controller… in the Union“. But the GDPR goes much further: not only does it enshrine Costeja by specifying that the GDPR applies “regardless of whether the processing takes place in the Union”, it also applies to processing in the context of the activities of an establishment of a processor in the EU, even if the processing occurs outside the EU. Moreover, relying more explicitly on the “effects principle”, article 3(2) of the GDPR further extends the territorial scope of EU data protection law to any data controller based outside the EU that either: (i) offers goods or services to EU residents; or (ii) monitors the behaviour of EU residents.

Another important aspect the WP29 infer from the Costeja decision concerns the applicable law where a business has multiple establishments in the EU, with a designated “EU headquarters”, and this establishment alone carries out the functions of a data controller in relation to the processing operations in question. The WP29 note that, although the Court did not directly address this question, neither did it distinguish its ruling according to whether or not there is an EU establishment acting as a data controller or being otherwise involved in the processing activities. For the WP29, this means that where there is an “inextricable link“, several national laws may apply to the activities of a business having several establishments in different Member States, regardless of whether one of them qualifies as data controller in respect of the processing in question. This position goes beyond the plain meaning of article 4(1)(a) of the Directive, which provides that “when the same controller is established on the territory of several Member States, he must take the necessary measures to ensure that each of these establishments complies with the obligations laid down by the national law applicable”.[6]

In conclusion, although the WP29’s recent update provides some useful illustrations to help businesses determine whether they should comply with EU data protection law, it does not clarify the exact scope of that law. In particular, the WP29’s analysis mostly focuses on websites where data subjects have a connection with one EU establishment, leaving aside other scenarios, such as when data subjects have no connection with any EU establishment. And the question of how companies are to deal with conflicts of laws remains unanswered. The discussions over these questions promise to be challenging, even more so now with the prospect of the application of the GDPR.

For further information, please contact Carol.Umhoefer@dlapiper.com or Caroline.Chance@dlapiper.com.

[1] WP29, Opinion 8/2010 on applicable law, December 16, 2010

[2] Case C-131/12, Google Spain and Google Inc. v. Agencia Espanola de Protección de Datos (AEPD) and Mario Costeja Gonzalez, May 13, 2014

[3] Directive 95/46/EC of the European Parliament and of the Council of October 24, 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data

[4] Recital 19 of the Directive

[5] COM(2012) 11 final, Proposal for a Regulation on the protection of individuals with regard to the processing of personal data and on the free movement of such data

[6] The recitals of the Directive are admittedly puzzling. Recital (18) states that any processing of personal data in the Community must be carried out in accordance with the law of one of the Member States and processing carried out under the responsibility of a controller who is established in a Member State should be governed by the law of that State. But recital (19) provides that if a single controller is established on the territory of several Member States, particularly by means of subsidiaries, he must ensure that each of the establishments fulfils the obligations imposed by the national law applicable to its activities – thereby vitiating the entire concept of separate legal personality, and failing to denote whether those subsidiaries are to be considered controllers or processors.

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/europe-the-applicability-of-eu-data-protection-laws-to-non-eu-businesses/

FRANCE: The CNIL Fines Google €100,000 Over Right To Be Forgotten

The French data protection authority (the “CNIL”) will not settle for a compromise, or so says its recent decision to fine Google Inc. €100,000 for failing to properly implement the so-called “right to be forgotten”.

By Carol Umhoefer (Carol.Umhoefer@dlapiper.com) and Caroline Chancé (Caroline.Chance@dlapiper.com).

Earlier this month, Google announced it was adapting its approach to the right to be forgotten following discussions between the Mountain View, California firm and EU data protection authorities, in particular the CNIL, which in May 2015 issued a cease and desist order against Google Inc. (see previous post here) and rejected its appeal in September 2015 (see previous post here).

Despite reports that some EU data protection authorities saw this as a potentially acceptable solution, on March 10, 2016, the French regulator ordered Google Inc. to pay a €100,000 fine for violation of individuals’ right to object to the processing of their personal data and the right to delete their personal data, in light of the landmark decision of the Court of Justice of the European Union (“ECJ”) in Costeja v. Google[1].

For the CNIL, in order to be compliant with French law, Google Inc. must delist links from all Google Search extensions globally, and unconditionally. Google Inc. argued that this extraterritorial reach of the right to be forgotten is likely to raise conflict of laws issues and impair other States’ sovereignty (see previous post here). In particular, Google expressed concerns that a global delisting would disproportionately undermine the freedom of expression and information. But the CNIL countered that the purpose of its decision is to ensure “effective and complete protection of data subjects“, as required by the ECJ.

A Google spokesman has already confirmed they will appeal the CNIL’s decision[2].

If the CNIL’s decision becomes final, Google will have to further adapt its approach to the right to be forgotten or face up to €300,000 in additional administrative fines.

For further information, please contact Carol.Umhoefer@dlapiper.com or Caroline.Chance@dlapiper.com.

[1] Case C-131/12, Google Spain and Google Inc. v. Agencia Espanola de Protección de Datos (AEPD) and Mario Costeja Gonzalez, May 13, 2014

[2]France fines Google over ‘right to be forgotten’“, Julia Fioretti, Reuters, March 24, 2016 (http://www.reuters.com/article/us-google-france-privacy-idUSKCN0WQ1WX)

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/france-the-cnil-fines-google-e100000-over-right-to-be-forgotten/

RIGHT TO BE FORGOTTEN: Google Adapts Its Approach To The EU Right To Be Forgotten

Will the standoff between Google and the EU data protection authorities over the implementation of the so-called “right to be forgotten” come to an end? Almost a year after the CNIL issued a cease and desist order against Google, the search engine announced that, starting this week, it will expand the right to be forgotten to all Google domains, based on geolocation.

By Carol Umhoefer (Carol.Umhoefer@dlapiper.com) & Caroline Chancé (Caroline.Chance@dlapiper.com).

On March 4, 2016, Google announced that it will use geolocation signals (like IP addresses) to restrict access to delisted URLs on all Google search engine domains, including google.com, when accessed from the country of the person requesting the removal. This new approach will be applied prospectively but also “retrospectively”, to all previous delistings by Google under the ECJ’s decision in Costeja v. Google[1].

What does this change? Until now, Google delisted search results from all EU versions of the Google search engine, such as google.fr, google.co.uk or google.de, as well as from the Andorran, Icelandic, Liechtenstein, Norwegian and Swiss extensions, regardless of the country of origin of the request. This meant that delisted results were no longer accessible to Internet users using those extensions, but were still available on other versions of Google, such as google.com, google.ca or google.co.jp.

The EU data protection authorities did not consider Google’s approach to be compliant. In the view of the French data protection authority, the CNIL, the various geographic extensions are merely different means of accessing the same processing. Therefore, if a search engine agrees to delist a result, it must do so on all extensions. The CNIL’s reasoning is that to do otherwise deprives the right to be forgotten of its effectiveness. In fact, the CNIL issued a cease and desist order to Google, Inc. in May 2015, ordering it to apply delisting across the entirety of Google’s indexing services, and thus all extensions of the search engine. Google appealed to no avail (see previous posts here and here).

Google has now proposed, in addition to its existing practice, to delist results from all extensions, but only for persons searching in the specific country where the delisting request was made. This means that users in other EU countries will still be able to find those results and the search engine will still be processing the data of the person requesting the delisting, even though the negative consequences will obviously be mitigated as people in the same country won’t have access to the delisted links, whatever extension they use.
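The layered delisting logic described in the two approaches above can be sketched as follows. This is an illustrative assumption only: the function name, data structures, extension list and country codes are hypothetical, not Google's actual implementation.

```python
# Hypothetical sketch of the two delisting rules discussed above.
# `delistings` maps a delisted URL to the ISO country code of the
# country from which the removal request was made.

# Extensions on which results were hidden under the original approach
# (EU versions plus the Andorran, Icelandic, Liechtenstein, Norwegian
# and Swiss extensions) -- an illustrative, non-exhaustive set.
EU_EEA_EXTENSIONS = {"google.fr", "google.co.uk", "google.de",
                     "google.ad", "google.is", "google.li",
                     "google.no", "google.ch"}

def is_visible(url, extension, user_country, delistings):
    """Return True if `url` may appear in results served on `extension`
    to a user geolocated (e.g. by IP address) in `user_country`."""
    if url not in delistings:
        return True
    # Original approach: hide on all EU/EEA-plus extensions,
    # whatever the user's location.
    if extension in EU_EEA_EXTENSIONS:
        return False
    # March 2016 approach: additionally hide on *every* extension when
    # the user is geolocated in the requester's country.
    if user_country == delistings[url]:
        return False
    return True
```

Under this sketch, a delisted link disappears on every extension for users located in the requester's country, but remains visible on non-EU extensions accessed from elsewhere, which is exactly why users in other EU countries can still find the results.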

Will this new approach satisfy the EU data protection authorities? The CNIL has not yet issued its position. Nevertheless, filtering may be an acceptable (or possibly interim) compromise, particularly if applied to the entire EU, as opposed to limiting it to the country where the request was made. People in other EU countries presumably have a lesser interest in finding information regarding the person who made the delisting request. Moreover, if results are completely delisted in the country where the request was made, completely delisting in the EU should not be a problem, either technically or legally. As for the rest of the world, the right to be forgotten could still conflict with other jurisdictions’ laws.

It will therefore be interesting to see whether EU regulators will insist that links be completely delisted for anyone worldwide, as the CNIL first requested in its formal notice, essentially putting search engines in a position where they must either risk financial sanctions in the EU or violate other jurisdictions’ freedom of speech principles (see previous post here).

In any case, the right to be forgotten will not be forgotten, and in fact has been taken up outside the EU. For example, it has been reported[2] that a Japanese court recently ordered Google to delete from its search engine news reports about a Japanese man convicted of a sex offense involving minors, after he invoked his right to be forgotten.

For further information, please contact Carol.Umhoefer@dlapiper.com or Caroline.Chance@dlapiper.com.

[1] Case C-131/12, Google Spain and Google Inc. v. Agencia Espanola de Protección de Datos (AEPD) and Mario Costeja Gonzalez, May 13, 2014
[2] Justin McCurry, “Japan recognises ‘right to be forgotten’ of man convicted of child sex offences”, The Guardian, March 1, 2016 (http://www.theguardian.com/technology/2016/mar/01/japan-recognises-right-to-be-forgotten-of-man-convicted-of-child-sex-offences)

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/right-to-be-forgotten-google-adapts-its-approach-to-the-eu-right-to-be-forgotten/

POLAND: A New Law Expanding Police and Secret Services Surveillance Powers Comes Into Force in Poland

As of 7 February 2016, access to electronic communications and digital data is easier for state authorities, due to an amendment that expands surveillance powers and restricts citizens’ rights to privacy.

By Paweł Tobiczyk

The Constitutional Tribunal’s judgment

The amendment of the Police Act and of legislation governing other secret services aims to implement the Polish Constitutional Tribunal’s judgment dated 6 August 2014, which found that law enforcement surveillance operations lacked proper oversight.

Under the previous legislation, law enforcement was entitled to collect billing data from telecommunications companies directly, immediately and without limits. The Tribunal ordered the implementation of legislation overseeing the disclosure of such data, and the immediate destruction of surveillance material subject to professional secrecy. The Tribunal also considered that a monitored person must be informed of the activities relating to him or her once monitoring has finished, and held that the maximum duration of surveillance should be precisely defined in the law.

New police and secret services powers

The amendment brings important changes to the rules for accessing data and conducting surveillance of citizens’ activities, especially in relation to electronic communications and digital data.

  • Permissible surveillance techniques are much broader than previously, and notably include previewing and recording audio and video on premises, in transport and even in areas that are not public; controlling and storing the content of personal correspondence (including electronic); obtaining data from information media, telecommunications terminal equipment, and IT and telecommunications systems.
  • Surveillance generally does not require obtaining a court order, and may last up to 18 months, and in some cases indefinitely.
  • The court is entitled to oversee surveillance, but only ex post: police and secret services are obliged to provide the court twice a year with reports regarding the activities they have pursued concerning accessing personal data and confidential information of citizens.
  • Surveillance is allowed not only in case of reasonable suspicions that somebody committed an offence but also in order to prevent such situations.

Access to ‘Internet data’

The prerogatives of the police force were extended to so-called “Internet data”, which is not clearly defined by the Polish Act on Rendering Services by Electronic Means. Under this Act all data necessary for the provision of services by electronic means can be treated as Internet data, and potentially the definition can cover the content of private messages. The amendment raises serious doubts:

  • Internet data can be processed without consent of data subjects and the prior permission of a court.
  • Access to Internet data does not have to relate to a particular criminal proceeding: police and other services are able to monitor users “preventively” by checking which websites they visit and what activities they conduct, and then deciding whether an offence was committed.

Given the breadth of data that can be accessed and the wide range of situations in which surveillance is allowed, it is possible to reconstruct the activities of internet users with great precision, including activities relating to their private life, which may infringe the right to privacy.

Incompatibility with the Constitutional Tribunal ruling

According to the majority of experts, and the Constitutional Tribunal itself, the amendment does not correctly implement the Tribunal’s judgment and is inconsistent with the law currently in force. In particular, the judgment requires that:

  •  Persons who were under surveillance must be informed that their data is collected and processed.
  • The law must ensure independent oversight of each case of telecommunications data disclosure.
  • If telecommunications data of persons subject to professional secrecy (e.g., attorneys or journalists) is accessed, an independent authority (such as a court) must approve police or secret services collection of such data.
  • The conditions under which access to telecommunications data is permitted must be limited (such access should be allowed only where other methods have proven ineffective).

As a consequence, it is likely that the amendment is contrary to national and EU law and a challenge could be mounted.

For more information, please contact Paweł Tobiczyk (pawel.tobiczyk@dlapiper.com).

Permanent link to this article: http://blogs.dlapiper.com/privacymatters/poland-a-new-law-expanding-police-and-secret-services-surveillance-powers-comes-into-force-in-poland/
