pages tagged dataProtection (hroy.eu, ikiwiki; last updated 2023-10-30)

The EU General Data Protection Regulation explained by Americans
https://hroy.eu/posts/gdprExplainedByUS/ (published 2020-06-22)
<hr><br><p>Bashing the European Union’s General Data Protection Regulation
(GDPR) seems to have become one of American activists’ favourite hobbies
in the tech field. Some criticism is entirely justified. But many claims
that the GDPR is “counterproductive” or “misses the point” are based on
misconceptions, rather than an accurate understanding of European data
protection laws.</p>
<p>As a result, several US privacy advocates have suggested
alternative principles or rules… many of which have already been part of
EU data protection law since 1995.</p>
<p>So, <a
href="https://twitter.com/hugoroyd/status/1246024493144911873">as
promised</a>, here is:</p>
<p><strong>The GDPR as accidentally explained by people in the US who
criticize the GDPR for its pitfalls, while calling for what’s actually
in the GDPR</strong></p>
<p>If you have other examples to illustrate this, let me know so I can
add them to this post. I may update this post from time to time, so <a
href="https://hroy.eu/tags/gdprExplainedByUS/index.atom">subscribe to the feed</a> to
get notified!</p>
<p><em>A short note:</em> My intention with this post is to help you, my
reader from the US or elsewhere, understand better the GDPR and what’s
in it. I have great respect for many of the people mentioned below (some
of whom I consider or have considered personal heroes) and, as the
saying goes in French, <em>qui aime bien châtie bien</em>.</p>
<h2 id="update2021-09">Summary</h2>
<p>Five things that US commentators call for in a “good” privacy law to
address supposed pitfalls of the GDPR, all of which are actually
addressed in the GDPR:</p>
<ol type="1">
<li><p><strong>Snowden/Tim Wu</strong>: Good laws must start from
regulating data collection, not just data use.</p>
<p>The GDPR regulates data collection from the start, along with every
other processing operation on personal data (Article 4(2)).</p></li>
<li><p><strong>Stallman</strong>: Good laws must ensure that systems are
designed not to collect data that they don’t need.</p>
<p>The GDPR imposes “data protection by design” and “by default”
requirements, including minimising the collection of data (Article 25),
and encourages developers and manufacturers to implement these in their
products (Recital 78).</p></li>
<li><p><strong>Cegłowski</strong>: Good laws must not focus on consent
as a silver bullet, strong legal limits are needed.</p>
<p>In the GDPR, “consent” is only one of six bases that can provide a
lawful ground for processing data (Article 6). The GDPR also sets out
nine general principles imposing strong limits, and dozens of compliance
obligations (Articles 5 to 50).</p></li>
<li><p><strong>Stallman</strong>: Good laws must ensure it is not easy
to trick users into giving some kind of broad consent for any
purpose.</p>
<p>The GDPR prohibits broad, meaningless consent (Articles 4, 6, 7) and
requires any purpose to pass lawfulness and compatibility tests
(Articles 5-6).</p></li>
<li><p><strong>Tufekci</strong>: Good laws must not be limited to
individual regulation, but they must include a collective approach.</p>
<p>The GDPR requires the collective risks of processing personal data to
be taken into account (Articles 15, 20, 24, 25, 32-36) and allows
collective actions and enforcement (Article 80).</p></li>
</ol>
<p>Read on for actual quotes and details.</p>
<h2 id="collection">1. The problem starts not with “data use” but with
“data collection”</h2>
<h3 id="snowdenwu">Edward Snowden and Tim Wu (November 2019)</h3>
<blockquote>
<p>Snowden also directed some criticism at data privacy authorities that
have tried to step up regulation on companies over how they handle user
data. He said the EU’s General Data Protection Regulation […] “misplaces
the problem.”</p>
<p>“The problem isn’t data protection, the problem is data collection,”
Snowden said. <a
href="https://www.cnbc.com/2019/11/04/edward-snowden-warns-about-data-collection-surveillance-at-web-summit.html">Source</a></p>
</blockquote>
<blockquote>
<p><em>Edward Snowden said this at the Web Summit: “I think GDPR is not
the solution, but the problem is with data collection not data use. It
gives a false sensation of reassurance.” What are your thoughts on
this?</em></p>
<p>[Tim Wu:] I think he has a point…that’s what my criticism of GDPR is.
It doesn’t actually stop anyone from doing anything. Collect all you
want…and I think that’s where the problem starts. I think he’s onto
something. <a
href="https://gdpr.report/news/2019/11/18/privsecny-tim-wu-on-gdpr-and-data-privacy-practices-in-the-us/">Source</a></p>
</blockquote>
<p>Snowden and Wu argue that regulations on data use are not sufficient
to protect people. For them, a good regulation should start with data
collection.</p>
<p>That is why, since 1995<a href="https://hroy.eu/tags/dataProtection/#fn1" class="footnote-ref"
id="fnref1" role="doc-noteref"><sup>1</sup></a>, EU data protection law
regulates not only data use, but also the collection of personal
data.</p>
<p>More specifically, the GDPR covers the processing of personal data.
Processing is defined in the GDPR as “<em>any operation</em>” performed
on personal data. <a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1489-1-1">Article
4(2) of the GDPR</a> includes data “<em>collection</em>” explicitly.</p>
<p><a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1797-1-1">Article
5(1)</a> sets the principle of “<em>data minimisation</em>”, and also
provides that personal data must be “<em>collected for specified,
explicit and legitimate purposes</em>.”</p>
<p>If personal data is collected in breach of these rules, the company
responsible for the infringement may be fined up to 4 % of its global
annual turnover (or EUR 20,000,000 if higher). Authorities may also
order the company to destroy the data collected in breach, regardless of
whether the data was ever used.</p>
<h3 id="richard-stallman-april-2018-or-december-2019">Richard Stallman
(April 2018 or December 2019)</h3>
<blockquote>
<p>There are so many ways to use data to hurt people that the only safe
database is the one that was never collected. Thus, instead of the EU’s
approach (in the GDPR) of mainly regulating how personal data may be
used, I propose a law to stop systems from collecting personal data.</p>
<p>The robust way to do that, the way that can’t be set aside at the
whim of a government, is to require systems to be built so as not to
collect data about persons. The basic principle is that a system must be
designed not to collect certain data, if its basic function can be
carried out without that data. <a
href="https://stallman.org/articles/real-privacy-laws.html">Source</a></p>
</blockquote>
<p>Stallman argues that laws must prohibit data collection if it is not
necessary or not justified, and that systems must be designed not to
collect certain data.</p>
<p>That is why, since 1995, EU data protection law regulates not only
data use, but also the collection of personal data (see <a
href="https://hroy.eu/tags/dataProtection/#snowdenwu">above</a>).</p>
<p>Moreover, since 2018, the GDPR has extended these rules with the
principles of data protection “by design” and “by default”.</p>
<p><a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e3063-1-1">Article
25</a> provides, specifically, that systems must be designed to
implement data minimisation effectively. In addition, technical and
organisational measures must by default ensure that “<em>only personal
data which are necessary for each specific purpose of the processing are
processed. That obligation applies to the amount of personal data
collected […].</em>”</p>
<p>Moreover, Recital 78 provides that: “<em>When developing, designing,
selecting and using applications, services and products that are based
on the processing of personal data or process personal data to fulfil
their task, producers of the products, services and applications should
be encouraged to take into account the right to data protection when
developing and designing such products, services and applications and,
with due regard to the state of the art, to make sure that controllers
and processors are able to fulfil their data protection
obligations.</em>”</p>
<h2 id="consent">2. Individuals’ consent is not the right approach for
privacy</h2>
<h3 id="maciej">Maciej Cegłowski (April 2020)</h3>
<blockquote>
<p>The European approach to privacy legislation has been to add layers
of complexity, based on a kabuki dance of individual consent, where all
that is needed are some strong legal limits on what data can be
collected and how long it can be stored. <a
href="https://twitter.com/Pinboard/status/1245699326522712064">Source</a></p>
</blockquote>
<p>Cegłowski argues that basing privacy legislation on individual
consent is not the right approach and, instead, regulations should
provide strong legal limits on data collection and data retention.</p>
<ul>
<li><p>That is why, since 1995, individual consent has been only
<em>one</em> of six legal bases that allow lawful collection of personal
data. <a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1888-1-1">Article
6</a> of the GDPR requires at least one of these six legal bases to be
applicable. In many circumstances, “consent” is not considered an
adequate basis (e.g. in employee-employer relationships).</p>
<p><span id="consent-conditions">Even where consent may be considered
adequate, it must fulfil strong conditions: to be a “<em>freely given,
specific, informed and unambiguous indication</em>” of agreement
expressed “<em>by a statement or by a clear affirmative action.</em>”
(<a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1489-1-1">Article
4(11)</a>)</span></p>
<p>If you thought that checking a box to “read and agree” to the terms
of service was sufficient to obtain consent to anything, or that merely
browsing a website meant accepting cookies, you have been misled by the
kabuki dance of people who wish the GDPR were centred around weak
individual consent. In spite of the GDPR strengthening the conditions
for consent, the online ad and tracking industry is still trying, with
its complex cookie banners and settings!</p>
<p>And, where sensitive data is concerned:</p>
<ul>
<li>collection is prohibited as a general rule,</li>
<li>unless explicit consent has been obtained and no EU or national laws
rule out consent<a href="https://hroy.eu/tags/dataProtection/#fn2" class="footnote-ref" id="fnref2"
role="doc-noteref"><sup>2</sup></a>, or</li>
<li>unless one of the nine other exemptions listed in Article 9(2)
applies (many of which require an EU or national law).</li>
</ul></li>
<li><p>Additionally, the GDPR provides the principle of “<em>storage
limitation</em>” in <a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1797-1-1">Article
5(1)(e)</a>.</p>
<p>Pursuant to this principle, personal data must be “<em>kept in a form
which permits identification of data subjects for no longer than is
necessary for the purposes for which the personal data are
processed</em>”, i.e. when data is no longer necessary, it must be
destroyed or anonymised.</p>
<p>The GDPR allows, however, for retaining data longer, in particular
for archival, research or statistical purposes, subject to certain
conditions (see Article 5 and Article 89, among others).</p></li>
<li><p>For the argument on data <a href="https://hroy.eu/tags/dataProtection/#collection">collection</a>,
see above.</p></li>
</ul>
<p><small>To be entirely fair and exhaustive: specific regulations may
provide otherwise. The GDPR, as its title suggests, is a “general” body
of law, and in certain circumstances specific rules may depart from the
general principles stated above. For instance, since 2009, the “EU
cookie directive” (which modifies the “ePrivacy Directive” of 2002)
requires consent as the <em>only</em> available basis for storing
information or identifiers on a user’s device, or for accessing such
stored information. There are, however, some exceptions, e.g. where
strictly necessary to provide services expressly requested by the
user.</small></p>
<h3 id="richard-stallman-april-2018-or-december-2019-1">Richard Stallman
(April 2018 or December 2019)</h3>
<blockquote>
<p>The EU’s GDPR regulation is well-meant, but does not go very far. It
will not deliver much privacy because its rules are too lax. They permit
collecting any data if it is somehow useful to the system, and it is
easy to come up with a way to make any particular data useful for
something.</p>
<p>The GDPR makes much of requiring users (in some cases) to give
consent for collection of their data, but that doesn’t do much good.
System designers have become expert at manufacturing consent (to
repurpose Chomsky’s phrase). Most users consent to a site’s terms
without reading them; a company that required users to trade their
first-born child got consent from plenty of users. Then again, when a
system is crucial for modern life, like buses and trains, users ignore
the terms because refusal of consent is too painful to consider.</p>
<p>To restore privacy, we must stop surveillance before it even asks for
consent. <a
href="https://stallman.org/articles/real-privacy-laws.html">Source</a></p>
</blockquote>
<p>Stallman argues that it is too easy to trick users into consenting to
the collection of their data, and that it is too easy to claim that data
is “somehow useful”. Instead, we should stop surveillance before it even
asks for consent.</p>
<p>That is why, under EU data protection laws:</p>
<ul>
<li><p>Where individual consent is an adequate legal basis (which is not
always the case), the GDPR puts strong legal conditions on what valid
consent requires (see <a href="https://hroy.eu/tags/dataProtection/#consent-conditions">above</a>), so it
cannot be considered “easy” to trick people into giving
consent.</p></li>
<li><p>Where data is collected and used in connection with a contract
(e.g. terms and conditions of an online service), only data that is
“<em>necessary</em>” may be lawfully processed.</p>
<p><a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1888-1-1">Article
6</a> provides that processing in that context must be “<em>necessary
for the performance of a contract to which the data subject is party or
in order to take steps at the request of the data subject prior to
entering into a contract</em>” — it is, therefore, not sufficient to
consider that the data is “somehow useful”.</p></li>
<li><p>In any event, personal data must be processed for
“<em>legitimate</em>” and explicit purposes. Surveillance that is not
legitimate must be stopped, even where consent has been obtained or
where there is a contract. The real question, therefore, is what sort of
surveillance can be considered legitimate or not. This is not only a
legal, but also a political and social question, and your views may
vary… Data protection law does not exist in a vacuum.</p></li>
</ul>
<p>On a related note, Stallman seems to be conflating several issues
here.</p>
<p>It is indeed a problem that most users consent to a site’s terms
without reading them. I know this problem well, having started
<a href="https://hroy.eu//tosdr.org">Terms of Service; Didn’t Read</a>. Sure, terms of
service may contain <a
href="http://www.huffingtonpost.com/2010/04/17/gamestation-grabs-souls-o_n_541549.html">silly
things</a>, and surveillance and data rights issues are just <a
href="https://hroy.eu//tosdr.org/topics.html#topics">one of the many issues with
this</a>, but the origin of the problem lies elsewhere.</p>
<p>Another confusion that Stallman seems to make here relates to consent
and contracts. Data protection law and contract law are two separate
bodies of laws with their own rules. In some situations, these rules
stack up and interact with each other.</p>
<p>However, consenting or agreeing to a contract (e.g. accepting the
terms of an online service) only implies that data strictly necessary to
perform the agreement (e.g. providing the online service) may be
processed. Agreeing to a contract does not equate to giving “consent” to
any processing purpose in the meaning of “consent” under <a
href="https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679#d1e1489-1-1">Article
4(11)</a> of the GDPR. Contracts and consent are two separate legal
bases under the GDPR (see <a href="https://hroy.eu/tags/dataProtection/#maciej">above</a>).</p>
<h2
id="data-cannot-be-regulated-at-the-individual-level-a-collective-response-is-necessary">3.
Data cannot be regulated at the individual level; a collective response
is necessary</h2>
<h3 id="update2020-09">Zeynep Tufekci (January 2018 and August
2020)</h3>
<blockquote>
<p>Data privacy is not something that can be effectively regulated at
the individual level because it is something akin to air pollution, a
public good that requires a collective response. That’s why GDPR in
Europe doesn’t work. <a
href="https://twitter.com/zeynep/status/1298972608935927808">Source</a></p>
<p>Data/tech has to be examined and regulated at the society level as a
collective problem; not one merely of individual consent (though that
also matters). <a
href="https://twitter.com/zeynep/status/1299077535477772289">Source</a></p>
</blockquote>
<p>Tufekci argues that the political issues arising out of our
data-driven age will not be solved by “regulating at the individual
level” but “requires a collective response.” The reference to air
pollution seems to point to privacy as more of an <a
href="https://hroy.eu/posts/moglen_privacy_ecological/"
title="A metaphor of privacy that I have long agreed with">ecological
issue</a> that cannot be managed “person-by-person through a system of
individualized informed consent” (<a
href="https://twitter.com/zeynep/status/1298972608935927808/photo/1">quoting</a>
Tufekci from <a
href="https://www.nytimes.com/2018/01/30/opinion/strava-privacy.html">this
NYT opinion</a>).</p>
<p>That is why the GDPR does <em>not</em> follow a one-size-fits-all
approach based on individual consent (see <a href="https://hroy.eu/tags/dataProtection/#consent">above</a>),
and that is also why the GDPR introduced (i) risk-based analyses and
(ii) collective enforcement of rights into EU data protection law.</p>
<ul>
<li><p>The GDPR creates several instances in which the risks to natural
persons must be taken into account when processing data, regardless of
whether or not these natural persons have given consent, and regardless
even of whether the data relates to them. A third-party individual whose
rights and freedoms are impacted by the processing of data must also be
taken into account. This is as wide as you can get in terms of the
<em>personal</em> scope of human rights: this is not an individual-based
regulation.</p>
<p>Examples of this can be found <em>inter alia</em>:</p>
<ul>
<li>In Articles 35 and 36, which impose obligations to conduct impact
assessments or to consult data protection authorities prior to launching
high-risk data processing (something that data protection authorities
regularly check when launching investigations)</li>
<li>In Article 25, which defines the obligations of data protection by
design and by default</li>
<li>In Articles 33 and 34, which deal with obligations arising where a
data breach happens</li>
<li>In Articles 24 and 32, which define the general responsibility of a
data controller and their security obligations</li>
<li>Even in Articles 15 and 20, where the rights and freedoms of
individuals must be taken into account to limit the data subjects’
personal rights to obtain data about themselves.</li>
</ul></li>
<li><p>The GDPR also introduced into data protection law the collective
representation of individuals, and the collective enforcement of data
protection rights, as permitted by Article 80. However, the exact extent
to which collective responses are possible, through complaints or
through the courts, depends on implementation in national laws.</p></li>
</ul>
<hr />
<p>If you have other examples to illustrate this, or questions or
comments on the above, let me know!</p>
<h2 id="see-also">See also</h2>
<p>Gabriela Zanfir-Fortuna, <a
href="https://medium.com/@gzf/10-reasons-why-the-gdpr-is-the-opposite-of-a-notice-and-consent-type-of-law-ba9dd895a0f1">10
reasons why the GDPR is the opposite of a ‘notice and consent’ type of
law</a>, March 2019</p>
<h2 id="post-scriptum">Post-Scriptum</h2>
<p>The goal of this post is neither to contribute to some anti-American
sentiment, nor to claim that the GDPR is perfect, or that European laws
are generally better than US laws. I do not think that’s true. Time will
tell how effective the GDPR is going to be. Two years is a short time to
evaluate that. Even then, there is a larger context not directly related
to the GDPR as such: enforcement actions are usually slow; Europe still
lacks a culture of litigation for civil rights, and the powerful
non-profit organisations to drive it that match those in the US. US
class action lawsuits also remain far removed from Europe’s judicial
systems (where they are not seen as scarecrows).</p>
<p>Nevertheless, we should acknowledge the fact that EU law has got many
of the foundational principles around data protection right. As almost
all of the examples above show: while the intention was to criticise the
GDPR, the authors actually call for the very same principles that the
GDPR, and the 1995 Directive before it, set forth.</p>
<p>So why has the USA not enacted equivalent federal general data
protection legislation? There are, already, strong protections for the
privacy of Americans in the US Constitution. And access to electronic
communications content and data by US authorities has received increased
protection from US courts, in particular the US Supreme Court in the
recent <em>Carpenter</em> case. Some of these safeguards were ahead of
their time, while some are reminiscent of the EU top court’s own case
law. There is, however, still no GDPR-equivalent data protection law at
the federal level in the US, although it seems that, with the CCPA (and
maybe others), some states like California are pushing in this
direction.</p>
<p>In 1973, a US official committee submitted the “<a
href="https://aspe.hhs.gov/report/records-computers-and-rights-citizens">Records,
computers and the rights of citizens</a>” report. The title of this
report is almost identical to the French data protection law of 1978 (on
computing, records and freedoms). What also strikes me is that the
recommendations of this report share strong similarities with the GDPR
(see the list of data subject rights, the main principles, and the
obligations on data controllers in the “Summary and
Recommendations”).</p>
<p>So, what happened? How’s <a
href="https://twitter.com/WolfieChristl/status/1274081674360508418">this</a>
permitted?</p>
<hr />
<p><strong>Thanks</strong> to <a href="https://lori.is">Lori
Roussey</a>, Alexandre Rogers, <a href="https://michae.lv/">Michael
Veale</a><!--[Your name/link here :-)]--> for reviewing drafts of
this.</p>
<aside id="footnotes" class="footnotes footnotes-end-of-document"
role="doc-endnotes">
<hr />
<ol>
<li id="fn1"><p>The GDPR, adopted in 2016, reuses and extends most of
the basic definitions and principles of the 1995 Directive. For those
curious, the difference between an EU Directive and an EU Regulation is
that a Directive is a law that gives EU Member States a goal to achieve,
a direction, which implies that Member States need to implement measures
in their national laws; while a Regulation (like the GDPR) is directly
applicable in all EU territories, and thus contributes to achieving
greater harmonisation of laws in Europe.<a
href="https://hroy.eu/tags/dataProtection/#fnref1" class="footnote-back" role="doc-backlink">↩︎</a></p></li>
<li id="fn2"><p>For instance, in France, it is prohibited for a company
to collect DNA from people. Consent cannot override this prohibition.
Derogations are available for medical research, etc. See the <a
href="https://www.cnil.fr/fr/les-donnees-genetiques-premier-titre-de-la-nouvelle-collection-point-cnil"
hreflang="fr">French data protection authority’s book on this</a>.<a
href="https://hroy.eu/tags/dataProtection/#fnref2" class="footnote-back" role="doc-backlink">↩︎</a></p></li>
</ol>
</aside>
https://hroy.eu/posts/whatsGoingOnWithDataP/

Privacy notices under CC-0
https://twitter.com/neil_neilzone/status/543036681759514624
published 2014-12-11, updated 2023-10-30
<p>The <a href="https://hroy.eu/tags/EUCJ/">EUCJ</a> has just published <a
href="http://curia.europa.eu/juris/documents.jsf?num=C-212/13">another
decision</a> regarding <span class="selflink">data protection</span>
that got me puzzled (but I’m not the only one!).</p>
<p>This one is primarily concerned with the interpretation of exceptions
to the 1995 directive, but it also has interesting things to say
regarding the infamous <a href="https://hroy.eu/tags/rightToBeForgotten/">so-called right
to be forgotten</a> decision where
<a href="https://hroy.eu/posts/rtbf-what-cjue-got-wrong/">legitimate interests in
personal data processing</a> were involved.</p>
<p>The facts are simple: someone puts a camera to monitor the entrance
of his house. One day, people break in, but they are later identified
thanks to the camera. Then, these suspects challenge the legality of the
camera system on the grounds that they were not notified of the
processing of their personal data.</p>
<p>Article 3 of the 1995 directive provides:</p>
<blockquote>
<p>2. This Directive shall not apply to the processing of personal data:
[…]</p>
<p>– by a natural person in the course of a purely
<strong>personal</strong> or <strong>household activity</strong>.</p>
</blockquote>
<p>But for the Court, (emphasis is mine)</p>
<blockquote>
<p>33 To the extent that video surveillance such as that at issue in
the main proceedings <em>covers, even partially, a public space and is
accordingly directed outwards from the private setting</em> of the
person processing the data in that manner, it cannot be regarded as an
activity which is a purely ‘personal or household’ activity</p>
</blockquote>
<p>This is strange reasoning in my opinion, as it seems to make no
distinction between <em>purely personal activities</em> and <em>purely
household activities</em>: they are now combined under the criterion of
the “private setting.”</p>
<hr />
<p>So here’s how this applies to us: thanks to <a
href="http://neilzone.co.uk/">Neil</a>, we already have a solution!</p>
<div class="embedexternal">
<blockquote class="twitter-tweet" lang="en">
<p>
Following today’s CJEU ruling, I have launched new data protection
compliance stickers for everyone with a smartphone
<a href="http://t.co/WHa72i05iM">pic.twitter.com/WHa72i05iM</a>
</p>
— Neil Brown (<span class="citation"
data-cites="neil_neilzone">@neil_neilzone</span>)
<a href="https://twitter.com/neil_neilzone/status/543013244894711808">December
11, 2014</a>
</blockquote>
<script async src="https://hroy.eu//platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet" lang="en">
<p>
Don’t want to be filmed or photographed? Get my new, trendy “right to
object” jumper-based notification:
<a href="http://t.co/sCwmsXtgGd">pic.twitter.com/sCwmsXtgGd</a>
</p>
— Neil Brown (<span class="citation"
data-cites="neil_neilzone">@neil_neilzone</span>)
<a href="https://twitter.com/neil_neilzone/status/543035098732711936">December
11, 2014</a>
</blockquote>
<script async src="https://hroy.eu//platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<h2 id="how-does-this-relate-to-the-so-called-right-to-be-forgotten">How
does this relate to the so-called right to be forgotten?</h2>
<p>The Court notes that:</p>
<blockquote>
<p>34 At the same time, the application of Directive 95/46 makes it
possible, where appropriate, to take into account — in accordance, in
particular, with Articles 7(f), 11(2), and 13(1)(d) and (g) of that
directive — <em>legitimate interests pursued by the controller, such as
the protection of the property, health and life of his family and
himself</em>, as in the case in the main proceedings.</p>
</blockquote>
<p>I wish the Court had followed the same approach in the so-called
Right to be forgotten decision. But instead, the
<a href="https://hroy.eu/posts/rtbf-what-cjue-got-wrong/">legitimate interest of
the public to access published information was not taken into
account</a>.</p>
Some comments on the EU’s draft Privacy Icons
https://hroy.eu/posts/encryptionEuDataIcons/
published 2014-11-12, updated 2023-10-30
<p>The European Union is currently reviewing the regulatory framework of
personal data protection. In the current draft, a standardised icon set
would be mandatory in some circumstances.</p>
<p>I’m not convinced this is the best implementation, and there’s even
one icon in the set that I’m really concerned about: “Encryption”. This
proposal could undermine years of activism in favour of better
encryption for users.</p>
<hr />
<p>As I’ve been working on Terms of Service; Didn’t Read for a couple of
years now, I have some experience of, and ideas about, how this sort of
thing might work and how it compares to existing projects, especially in
the field of “Privacy Icons”, where several projects coexist and keep
attracting attention (including, it seems, from European
legislators).</p>
<p>First, some context for those who haven’t followed (feel free to skip
to the second part if you’ve followed personal data regulations updates
in the EU). In January 2012, the European Commission <a
href="http://europa.eu/rapid/press-release_IP-12-46_en.htm?locale=en">announced</a>
a plan to revise data protection laws in the European Union with a <a
href="https://en.wikipedia.org/wiki/General_Data_Protection_Regulation">draft
regulation</a>. Currently, most of the European Union’s laws on the
protection of personal data come from a <a
href="https://en.wikipedia.org/wiki/Data_Protection_Directive">1995
European Union directive</a>. (Unlike a directive, an <em>EU
regulation</em> is a law that applies EU-wide without the need for each
state to make its own internal legal implementation.)</p>
<p>So, this directive is going to be 20 years old soon. It’s quite
extraordinary that, even now, it does not seem too far off. The
intentions are good, and it’s a great thing that legislators foresaw the
need to enhance people’s privacy back then (France and Germany already
had laws for that by the end of the 1970s). But today, all this is in
the middle of <a
href="http://www.janalbrecht.eu/themen/datenschutz-und-netzpolitik/lobbyism-and-the-eu-data-protection-reform.html">a
huge battle</a>.</p>
<p>After several steps through the European Union’s lawmaking process,
the regulation is now in a <a href="https://hroy.eu/tags/dataProtection/DPRConsolidated.pdf">consolidated
draft</a>.</p>
<p>I want to focus on the draft article 13a (in Chapter Ⅲ, Section 1:
Transparency and modalities) which provides:</p>
<blockquote>
<ol type="1">
<li><p>Where personal data relating to a data subject are collected, the
controller shall provide the data subject with the following particulars
before providing information pursuant to Article 14:</p>
<ol type="a">
<li>whether personal data are collected beyond the minimum necessary for
each specific purpose of the processing;</li>
<li>whether personal data are retained beyond the minimum necessary for
each specific purpose of the processing;</li>
<li>whether personal data are processed for purposes other than the
purposes for which they were collected;</li>
<li>whether personal data are disseminated to commercial third
parties;</li>
<li>whether personal data are sold or rented out;</li>
<li>whether personal data are retained in encrypted form.</li>
</ol></li>
<li><p>The particulars referred to in paragraph 1 shall be presented
pursuant to Annex X in an aligned tabular format, using text and
symbols, in the following three columns:</p>
<ol type="a">
<li>the first column depicts graphical forms symbolising those
particulars;</li>
<li>the second column contains essential information describing those
particulars;</li>
<li>the third column depicts graphical forms indicating whether a
specific particular is met.</li>
</ol></li>
<li><p>The information referred to in paragraphs 1 and 2 shall be
presented in an easily visible and clearly legible way and shall appear
in a language easily understood by the consumers of the Member States to
whom the information is provided. Where the particulars are presented
electronically, they shall be machine readable.</p></li>
<li><p>Additional particulars shall not be provided. Detailed
explanations or further remarks regarding the particulars referred to in
paragraph 1 may be provided together with the other information
requirements pursuant to Article 14.</p></li>
<li><p>The Commission shall be empowered to adopt, after requesting an
opinion of the European Data Protection Board, delegated acts in
accordance with Article 86 for the purpose of further specifying the
particulars referred to in paragraph 1 and their presentation as
referred to in paragraph 2 and in Annex 1.</p></li>
</ol>
</blockquote>
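<p>To make paragraph 2 concrete, here is a minimal Python sketch of
such a three-column presentation. The row wording paraphrases the
particulars of paragraph 1, but the icon placeholders, the yes/no
marks and the JSON form are purely hypothetical: the actual design is
left to Annex X and to future delegated acts.</p>

```python
import json

# Hypothetical wording, paraphrasing the particulars of draft
# Article 13a(1)(a)-(f); the official text and icons would come
# from Annex X, not from this sketch.
PARTICULARS = [
    "Personal data are collected beyond the minimum necessary",
    "Personal data are retained beyond the minimum necessary",
    "Personal data are processed for other purposes",
    "Personal data are disseminated to commercial third parties",
    "Personal data are sold or rented out",
    "Personal data are retained in encrypted form",
]

def render_table(met, icon="[icon]"):
    """Three columns per Article 13a(2): a symbol, the essential
    information, and a mark saying whether the particular is met."""
    return "\n".join(
        f"{icon}  {text}  {'YES' if ok else 'NO'}"
        for text, ok in zip(PARTICULARS, met)
    )

def machine_readable(met):
    """Paragraph 3 requires a machine-readable form when presented
    electronically; a JSON mapping is one obvious candidate."""
    return json.dumps(dict(zip("abcdef", met)))

print(render_table([False, False, False, False, False, True]))
```

<p>Nothing in the draft prescribes this layout or JSON in particular;
the point is only that the mandated table maps naturally onto a small
data structure.</p>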
<h2 id="why-the-encryption-icon-is-a-bad-idea">Why the “Encryption” icon
is a bad idea</h2>
<p><strong>TL;DR</strong> Storing sensitive data in data centers
without encrypting it first is simply negligence and should not be
allowed. There’s no need for an icon that a large majority of users
will probably not understand.</p>
<hr />
<p>In the draft proposal, when personal data are collected, the data
subject should receive information in the form of standardised icons.
One of the proposed icons concerns encryption:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/iconEncrypt.png" alt="Everything is Safe!" />
<figcaption aria-hidden="true">Everything is Safe!</figcaption>
</figure>
<p>If the data is stored encrypted, then the data controller can display
a huge green mark next to the icon. <em>All is fine!</em></p>
<p>Except that it’s not. I can easily see how this could get very,
very confusing. It is very easy to claim that something “is encrypted”
and that therefore <em>everything’s good.</em> I’ve heard this argument
several times from Google employees: <em>Google stores the data in
encrypted form, so don’t worry</em>. But when Google accesses the data
to process it, the data is decrypted.</p>
<p>Let’s put this in context.</p>
<p>Following Edward Snowden’s revelations, it is very clear that
encryption is one part of the solution against the intrusion into our
lives pursued by the NSA and other state agencies around the world.
Thus, it is crucial that users understand that <strong>there are ways
to protect their communications against the intrusion of the
State</strong>, and also against companies or criminals. This is why
initiatives such as Cryptoparties and Privacy Cafés, where people help
each other understand and use encryption tools, are so important!</p>
<p>But encryption does not always mean the same thing in every
context. It takes some basic technical understanding to grasp when
encryption is simply a good security practice against criminals, and
when it is actually a much more powerful tool.</p>
<p>For instance, when I send sensitive information over the web (like
a financial transaction, or my username and password), it is very
important that the connection is encrypted (e.g. using HTTPS);
otherwise, it would not be difficult to intercept that sensitive
information. Enabling encryption for that kind of traffic should
simply be mandatory.</p>
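<p>For the programmers among you, here is what that transport-layer
protection looks like in practice: a minimal sketch using only
Python’s standard library. It illustrates client-side TLS defaults and
the contrast with cleartext HTTP; the hostname and credentials are of
course made up.</p>

```python
import ssl

# A default TLS context verifies the server's certificate and checks
# that it matches the hostname -- this is what makes an HTTPS
# connection trustworthy rather than merely scrambled.
ctx = ssl.create_default_context()
print(ctx.check_hostname)                    # True
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True

# A plain HTTP request, by contrast, travels as readable bytes that
# any on-path observer can simply log (hypothetical example):
request = (b"POST /login HTTP/1.1\r\nHost: example.org\r\n\r\n"
           b"user=me&password=hunter2")
print(b"hunter2" in request)                 # True: password in the clear
```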
<p>It’s a good idea to impose security obligations on the storage of
personal data. But I fail to see how showing users an icon about data
being stored in encrypted form will do any good. Worse, it might even
confuse people about what encryption really means in which context,
making it even harder to explain why encryption matters and why tools
such as GnuPG should be improved in usability.</p>
<h2 id="is-this-standardised-icon-set-really-good-anyway">Is this
standardised icon set really good anyway?</h2>
<p>Raising awareness about privacy rights online is important. This is
what I have been doing with <a href="https://tosdr.org">Terms of
Service; Didn’t Read</a> for about two years now. I’ve seen several
variations of the Privacy Icons idea, and this implementation as
suggested by the EU draft regulation shows that getting it right is not
easy.</p>
<p>The <a href="https://hroy.eu/tags/dataProtection/DPRConsolidated.pdf">consolidated draft</a> has an annex
showing how the icons could be:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/dataCollect.png" alt="No unnecessary data collection" />
<figcaption aria-hidden="true">No unnecessary data
collection</figcaption>
</figure>
<p>Depending on whether that’s the case, the data controller would have
to display a green or a red mark next to this icon:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/goodOrBad.png" alt="Good or Bad?" />
<figcaption aria-hidden="true">Good or Bad?</figcaption>
</figure>
<p>In <a href="https://tosdr.org">ToS;DR</a>, we also use this
approach: for each point, there’s an iconic indication of whether it
is a good or a bad thing. However, we allow for more variations:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/ToSDRTitles.png" alt="Good points, and bad points" />
<figcaption aria-hidden="true">Good points, and bad points</figcaption>
</figure>
<figure>
<img src="https://hroy.eu/tags/dataProtection/thumbsDown.png" alt="… and blockers" />
<figcaption aria-hidden="true">… and blockers</figcaption>
</figure>
<p>But the major problem I have with “Privacy Icons” is that they are
too <strong>difficult to grasp</strong>. If you remove the text beside
the icon, you realise that the icon alone is <strong>far from
self-explanatory</strong>. This only gets more <em>complex</em> as the
number of icons grows.</p>
<p>These icons are not universally understood. Here’s how <em>the same
concept</em> is rendered differently by different Privacy icons
sets:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/dataProcessing.png" alt="EU draft" />
<figcaption aria-hidden="true">EU draft</figcaption>
</figure>
<hr />
<figure>
<img src="https://hroy.eu/tags/dataProtection/dataForPurpose.png"
alt="Mozilla’s Alpha version of Privacy Icons" />
<figcaption aria-hidden="true">Mozilla’s Alpha version of Privacy
Icons</figcaption>
</figure>
<figure>
<img src="https://hroy.eu/tags/dataProtection/dataNoPurpose.png"
alt="Mozilla’s Alpha version of Privacy Icons" />
<figcaption aria-hidden="true">Mozilla’s Alpha version of Privacy
Icons</figcaption>
</figure>
<hr />
<figure>
<img src="https://hroy.eu/tags/dataProtection/DisconnectIcons.png" alt="Disconnect.me icons" />
<figcaption aria-hidden="true">Disconnect.me icons</figcaption>
</figure>
<p>Compare these with how a similar point would be addressed in <a
href="https://tosdr.org">ToS;DR</a>:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/ToSDRTitle.png" alt="The summary version" />
<figcaption aria-hidden="true">The summary version</figcaption>
</figure>
<p>which can be expanded with a plain-English paragraph and links to
contextualise if the user wants more information:</p>
<figure>
<img src="https://hroy.eu/tags/dataProtection/ToSDRParagraph.png" alt="The plain-English version" />
<figcaption aria-hidden="true">The plain-English version</figcaption>
</figure>
<p>There is probably a way to learn from these different approaches
and build an implementation that gets it right for users.</p>
<p>The EU has already achieved something similar with the <a
href="https://en.wikipedia.org/wiki/European_Union_energy_label">energy
efficiency labels</a>. (They were actually a source of inspiration for
ToS;DR’s <a href="https://tosdr.org/classification.html">classes</a>.)</p>
<p>Let’s hope the next proposal gets it right with an icon system that
is easier to understand and which gets rid of the confusing bits.</p>