
Google’s appeal of the €50 million GDPR penalty: what is at stake? (part 3)

6 March 2019

Recently, we provided a summary and initial analysis of the first major fine under the GDPR, issued by the French data protection authority (the CNIL). In this series of three blogs, we provide further analysis of this landmark decision and its implications. In the first part, we examined the grounds for the penalty and Google’s decision to appeal. In the second part, we considered potential future GDPR fines and drew a link between privacy and competition considerations. In this third and final part, we provide practical advice to improve compliance.

Practical advice to improve GDPR compliance

Whatever the eventual outcome of the Google case, providing full transparency and obtaining truly informed consent will remain a continuous challenge. Describing often highly complex technical processing operations and data flows in a way that is easy to understand, accurate, complete, and up to date, and doing so at the right moment and in the right manner, is no easy task. However, we can draw some specific points of advice from the CNIL’s explanations of Google’s shortcomings, which will be relevant to others as well.

Integrate privacy settings and information within initial setup procedures

The privacy settings, information, and consent process for specific processing purposes and operations should be part of the initial setup or registration process for digital services. This is because the GDPR requires information to be provided at the time that personal data are obtained. For valid informed consent, providers are also required to present information with the request for consent and not after. A substantial challenge is doing this in a user-friendly way, without overwhelming users with information or asking so much of them that they simply click through without reading.

  • The setup process itself should include sufficient information and explanations. Don’t merely refer to other documents where further information can be found. The challenge is providing information about highly complex processing activities in such a way that users, even those without technical knowledge, can understand the impact on their privacy. This is very difficult, we know, but it is crucial to make a genuine best effort.
  • If you want to rely on consent, don’t pre-check boxes or group many purposes and processing activities together (see the sketch after this list). This will increase the risk that consent will not be deemed valid. The bar for consent under the GDPR is high, particularly as interpreted by the CNIL. Any box that is checked in advance is likely to raise concerns that consent is sought in an invalid way. Relying on a single consent action for various processing activities and purposes at once is also risky, because consent is required to be “specific”. To avoid having to bombard users with copious amounts of (not pre-checked) checkboxes, pop-ups, or other mechanisms to obtain informed, specific, unambiguous, freely given, affirmative (i.e. opt-in) consent, which is a high bar indeed, it may be advisable to consider other legal grounds where possible, e.g. performance of a contract or legitimate interest (although it should be noted that those grounds also have their pitfalls).
  • Give users the ability to configure privacy settings after the initial setup process as well. Privacy settings and information should not be offered only at the initial setup stage: it is important that users can review and adjust their settings at any time, as their preferences may change over time.
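
To make the above concrete, here is a minimal sketch, in TypeScript, of how per-purpose consent could be captured during account setup: one record per purpose, never pre-checked, and only updated when the user actively opts in. The purpose names and the ConsentRecord shape are purely illustrative assumptions, not a description of Google’s or any other provider’s implementation.

```typescript
// Minimal sketch of per-purpose consent capture during account setup.
// Purpose identifiers and the ConsentRecord shape are illustrative only.

type ConsentPurpose =
  | "ad_personalisation"
  | "content_personalisation"
  | "analytics";

interface ConsentRecord {
  purpose: ConsentPurpose;
  granted: boolean;       // never pre-checked: defaults to false
  timestamp: string;      // when the user made this choice (ISO 8601)
  noticeVersion: string;  // which version of the privacy notice was shown
}

// One record per purpose keeps consent "specific" rather than bundled.
function initialConsentState(noticeVersion: string): ConsentRecord[] {
  const purposes: ConsentPurpose[] = [
    "ad_personalisation",
    "content_personalisation",
    "analytics",
  ];
  return purposes.map((purpose) => ({
    purpose,
    granted: false, // opt-in only: the user must take an affirmative action
    timestamp: new Date().toISOString(),
    noticeVersion,
  }));
}

// Called only when the user actively ticks the box for a single purpose.
function grantConsent(
  records: ConsentRecord[],
  purpose: ConsentPurpose
): ConsentRecord[] {
  return records.map((r) =>
    r.purpose === purpose
      ? { ...r, granted: true, timestamp: new Date().toISOString() }
      : r
  );
}
```

Storing the notice version alongside each choice also makes it easier to demonstrate, later on, exactly what information the user saw when consenting.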

Provide information in layers, but minimise steps and choose topics and highlights carefully

Providing information in several layers, from a very concise, high-level overview, to more detailed information provided after clicking “read more”, is considered a best practice. This approach tries to meet the difficult challenge of providing information that doesn’t require too much time and effort from the user to access and understand, while also providing information that is sufficiently accurate and complete. Particular care should be taken, however, to minimise the number of clicks or actions required. If it takes too many clicks to get to the detailed information, or if you group information under the wrong topic, you will risk non-compliance and fines.

Note: this point will be particularly interesting to follow in Google’s appeal. In the process of creating a new Google Account, it appears that far fewer steps are needed than stated by the CNIL to arrive at ad personalisation and other settings, at least via the web pages for services like Gmail or YouTube. Looking at this case from the outside, it is unclear whether this is due to a potential error by the CNIL, to changes made by Google after the fact, or to the Google Account creation process being different in the specific context of setting up a new Android phone. It is also interesting that the CNIL’s criticism of Google’s “ergonomic choices” to provide more information upon clicking “read more” does not appear to fully acknowledge that such layering is generally considered a best practice, including by the European Data Protection Supervisor.
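
To illustrate the layering idea, the sketch below (again in TypeScript, with purely illustrative topic names and wording) models a notice as a set of topics, each with a concise first layer and a detailed second layer that is exactly one click away.

```typescript
// Sketch of a layered privacy notice: one concise summary per topic,
// with the full detail one click away. All content shown is illustrative.

interface NoticeLayer {
  topic: string;   // e.g. "Ad personalisation"
  summary: string; // first layer: a few plain-language sentences
  detail: string;  // second layer: shown after the user clicks "read more"
}

const noticeLayers: NoticeLayer[] = [
  {
    topic: "Ad personalisation",
    summary: "We use your activity to personalise the ads you see. You can turn this off.",
    detail: "Full explanation of which data are used, how long they are kept, and how to adjust the setting...",
  },
  {
    topic: "Data sharing",
    summary: "Some data are shared with the partners listed below.",
    detail: "Complete list of recipients and the purposes for which they receive data...",
  },
];

// Rendering helper: the detailed layer should never be more than one
// interaction away from the summary that introduces it.
function render(layer: NoticeLayer, expanded: boolean): string {
  return expanded
    ? `${layer.summary}\n\n${layer.detail}`
    : `${layer.summary} [read more]`;
}
```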

Include videos to make privacy information more accessible and easier to understand, and provide guides on how these settings can be used

Videos can be a good medium to make information more easily accessible and understandable than long explanatory texts and could therefore help you meet a higher standard of transparency and informed consent. It may be relevant to note, however, that Google is already doing this and still received the fine from the CNIL.

Perform a data protection impact assessment (DPIA) for your services to help determine which information you should provide and how

As the Google case shows, supervisory authorities may set the bar for consent and transparency higher when services include processing activities that have a higher potential impact on users. In other words, if you process a larger amount of data and/or the data categories which you process are more sensitive in nature, your privacy notices and consent procedures may be held to a higher standard. Performing a DPIA, even when it may not be mandatory under article 35 of the GDPR, can help you better understand the impact and risks of your services to users’ privacy and help you determine which information you should provide them and how.

Take particular care concerning personalised advertising, profiling, and data sharing

  • The information that is usually provided today about personalised advertising, and about how cookies, other device identifiers, and technologies such as programmatic advertising or real-time bidding (RTB) work and affect user privacy, may be insufficient to meet GDPR standards. (An interesting report on this can be found here.)
  • Creating profiles to inform (automatic) decisions about individuals, including which ads or content to show them, can trigger important privacy concerns and may raise the bar for transparency and consent. It is important to provide sufficient information and choices about whether or not personalisation will occur, which data categories may be included in the profile, after which timeframe data will be deleted from the profile, whether other parties will receive profile data, and whether the profile data may result in different treatment by others, even if they don’t receive the profile data itself. It is also important to be transparent about the underlying logic that is used to create profiles and to explain why certain characteristics (inferred or otherwise) will result in being shown certain types of content or advertisements. The information typically provided about this, e.g. when clicking “Why am I seeing this ad?”, currently appears to be highly generic in many cases, which may raise the suspicion that the explanation is incomplete or inaccurate and that far more is going on beneath the surface. It is also advisable to make clear whether the profiles used for content and/or ad personalisation will be deleted when personalisation is switched off. (At present, Google still does not make this clear.)
  • Sharing personal data with others is obviously a privacy concern. It is particularly important to be clear and transparent about which parties may receive the data, as well as what they may do with the data and why. Whenever possible, providers should give a complete list of organisations that will receive or obtain access to personal data and for what purposes, particularly if consent is used as the legal basis for the transfer (see the sketch after this list).
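
As a purely illustrative sketch of that last point: a structured, machine-readable list of recipients, such as the one below in TypeScript, could back a user-facing “who receives my data and why” page. The recipient name, data categories, and legal bases shown are fictitious assumptions, not drawn from any real agreement.

```typescript
// Illustrative record of data recipients that could back a user-facing
// "who receives my data and why" overview. All values are fictitious.

interface DataRecipient {
  name: string;             // the organisation receiving the data
  purposes: string[];       // why they receive it
  dataCategories: string[]; // which categories of personal data they receive
  legalBasis: "consent" | "contract" | "legitimate_interest";
}

const recipients: DataRecipient[] = [
  {
    name: "ExampleAds B.V.", // fictitious recipient
    purposes: ["ad personalisation", "ad measurement"],
    dataCategories: ["device identifiers", "inferred interests"],
    legalBasis: "consent",
  },
];

// A plain-language line per recipient: who gets what, and on which basis.
function describe(r: DataRecipient): string {
  return `${r.name} receives ${r.dataCategories.join(", ")} for ${r.purposes.join(", ")} (legal basis: ${r.legalBasis}).`;
}
```

Keeping this list in one structured place also makes it easier to keep the corresponding privacy notice complete and up to date as recipients change.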

Conclusion and further developments

The CNIL’s imposition of the €50 million fine on Google is certainly a landmark event in the enforcement of the GDPR. It will be very interesting to see how the Conseil d’Etat, the French administrative court that is to hear Google’s appeal, will rule on the matter. The Conseil d’Etat may in turn refer certain questions to the Court of Justice of the European Union (CJEU), and such a referral does not seem unlikely: the clash of opinions between the CNIL and Google about two fundamental GDPR concepts, transparency and consent, appears to provide an excellent opportunity to seek guidance from the highest court in the EU.

While the final outcome of the case is not easy to predict, it appears likely that at least parts of the CNIL’s decision will be upheld. On the one hand, it is clear that Google has indeed made significant efforts to improve the privacy information and choices provided to its users, as the CNIL has also acknowledged. On the other hand, even more transparency and better consent procedures may be expected under the GDPR, particularly from a tech giant like Google, and particularly in relation to personalisation of ads and content, profiling, and combining or sharing data across different services or companies. One thing is certain: there are interesting times ahead.