pages tagged privacy (hroy.eu, ikiwiki)

<p><strong><a href="https://hroy.eu/posts/donneesSurLeNetTousSuspects/">Données sur le net : tous suspects</a></strong> (“Data on the net: all suspects”), Les Exégètes Amateurs, 2017-06-17</p>
On the occasion of the legal actions brought in defence of the rights to respect for private life and to the protection of personal data, I was able to contribute to the drafting of this op-ed. Written in reaction to the French authorities’ stance towards the European courts, the text also briefly sketches my critique of the society of generalised suspicion, one of the main reasons for my involvement with “les Exégètes”.<hr><br><p><strong><a
href="https://exegetes.eu.org/posts/donnees-sur-le-net-tous-suspects/">Op-ed
published</a> in Libération, in print and <a
href="http://www.liberation.fr/debats/2017/06/18/donnees-sur-le-net-tous-suspects_1577671">online</a>.</strong></p>
<p>Placing the entire population under preventive surveillance is not
acceptable in a democratic society. That is the conclusion reached by
the Court of Justice of the European Union in two judgments
(<em>Digital Rights</em> in 2014 and <em>Tele2</em> in 2016) concerning
“<em>connection data</em>”. For the European court, these digital traces
recorded in the wake of everyone’s electronic communications
(geolocation, phone calls, internet connections, etc.) reveal precise
and sensitive information about people’s lives. Their retention
therefore cannot be generalised and systematic; it must instead be
framed and limited in order to guarantee the fundamental right to
respect for private life.</p>
<p><strong><a href="https://hroy.eu/posts/UserDataManifesto2dot0/">User Data Manifesto 2.0</a></strong> (2015-08-29)</p>
<p><a href="https://xkcd.com/802/">Online Communities 2, by XKCD</a>,
used under a <a href="https://xkcd.com/license.html">Creative Commons
non-commercial licence</a></p>
<p>This morning, we are officially publishing the <a
href="https://userdatamanifesto.org/2.0/">User Data Manifesto
2.0</a>.</p>
<p>Today, most user data is no longer stored on the user’s own hard
drive, but online, on a service provider’s server somewhere in a data
center.</p>
<p>While most computing used to happen on local machines, in recent
years a new kind of “computing” has emerged in daily use. Marketers have
called this “Cloud computing”—but make no mistake: there is no cloud,
it’s just someone else’s computer.</p>
<aside class="sidenote right">
<p>Bruce Schneier. <a
href="https://www.schneier.com/blog/archives/2013/10/the_battle_for_1.html">The
Battle for Power on the Internet (extract)</a>:</p>
<blockquote>
<p>I have previously characterized this model of computing as “feudal.”
Users pledge their allegiance to more powerful companies who, in turn,
promise to protect them from both sysadmin duties and security threats.
It’s a metaphor that’s rich in history and in fiction, and a model
that’s increasingly permeating computing today.</p>
</blockquote>
<blockquote>
<p>Medieval feudalism evolved into a more balanced relationship in which
lords had responsibilities as well as rights. Today’s Internet feudalism
is both ad-hoc and one-sided. Those in power have a lot of rights, but
increasingly few responsibilities or limits. <a
href="http://udm.branchable.com/tags/Internet_Feudalism/">Read
More</a></p>
</blockquote>
</aside>
<p><a href="https://hroy.eu/posts/UserDataManifesto2dot0/noCloudFSFE.png"><img src="https://hroy.eu/posts/UserDataManifesto2dot0/noCloudFSFE.png" width="681" height="682" alt="There’s no Cloud!" class="img" /></a></p>
<p>Most popular online services nowadays are gratis, but that does not
mean they come without cost. Instead of paying with money, people are
paying with allegiance to service providers. In the land of “Minitel 2.0”,
<a href="http://udm.branchable.com/tags/Internet_Feudalism">Google and
Facebook are like feudal lords of the Internet</a> and we are their mere
subjects. The exploitation of user data and of personally identifiable
information leads to numerous privacy invasions, some of which were
only revealed thanks to Edward Snowden’s leaks from the NSA.</p>
<table class="img">
<caption>
Online Communities 2, XKCD
</caption>
<tr>
<td>
<a href="http://xkcd.com/802_large/"><img src="https://hroy.eu/posts/UserDataManifesto2dot0/xkcdMap.png" width="740" height="860" alt="Map of Online Communities by XKCD" class="img" /></a>
</td>
</tr>
</table>
<p>If you’re looking to protect your privacy, or if you want to know how
your rights can be affected when using these online services, you
usually don’t have many options: you can try <a
href="http://www.michaelzimmer.org/2012/05/07/how-to-adjust-your-facebook-privacy-settings-2012/">adjusting</a>
<a
href="http://personalweb.about.com/od/makefriendsonfacebook/a/faceprivsetting.htm">increasingly
complex</a> <a
href="http://www.wired.com/2013/08/facebook-privacy-settings/">privacy
settings</a>, or you can be a full-time lawyer and read the lengthy <a
href="https://tosdr.org">terms of service</a>.</p>
<hr />
<p>The User Data Manifesto aims at defining the basic rights that users
should have on their own data when using online services. Recognising
these rights is an important first step towards a free society in the
digital age, along with <a href="https://fsfe.org">Free
Software</a>.</p>
<p>Indeed, users should have:</p>
<ol type="1">
<li><p>Control over user data access</p>
<p>User data should be under the ultimate control of the user. Users
should be able to decide whom to grant direct access to their data and
with which permissions and licenses such access should be granted.</p>
<p>Data generated or associated with user data (e.g. metadata) should
also be made available to that user and put under their control just
like the user data itself.</p></li>
<li><p>Knowledge of how user data is stored</p>
<p>When user data is uploaded to a specific service provider, users
should be informed about where that provider stores the data, for how
long, in which jurisdiction the provider operates, and which laws
apply.</p>
<p>This point is not relevant when users are able to store their own
data on devices in their vicinity and under their direct control
(e.g. servers) or when they rely on systems without centralised control
(e.g. peer-to-peer).</p></li>
<li><p>Freedom to choose a platform</p>
<p>Users should always be able to extract their data from the service at
any time without experiencing any vendor lock-in.</p></li>
</ol>
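As a toy illustration of points 1 and 3 (user control over data and metadata, and lock-in-free export), here is a minimal Python sketch; the function and field names are entirely hypothetical, not part of the manifesto:

```python
import json

def export_user_data(records, metadata):
    """Bundle user data and its associated metadata into an open,
    portable format.

    `records` and `metadata` are hypothetical stand-ins for whatever a
    service actually stores about (and derives from) a user.
    """
    bundle = {
        "format": "json",     # an open format: no vendor lock-in (point 3)
        "data": records,      # the user data itself (point 1)
        "metadata": metadata, # derived data is returned too (point 1)
    }
    return json.dumps(bundle, indent=2)
```

The point of the sketch is simply that an export respecting the manifesto must hand back derived metadata alongside the data itself, in a format any other platform can read.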
<p><a href="http://udm.branchable.com">Read the full text of the
manifesto here</a></p>
<hr />
<p>I’m very happy to announce today, together with Frank and Jan, the
<a href="https://userdatamanifesto.org/2.0/">release of version 2.0 of
the manifesto</a> during the <a
href="https://owncloud.org/conf/">ownCloud conf keynote</a>.</p>
<p>Organisations and activists defending digital rights are joining this
effort to support online services that respect users’ rights. I am proud
to be part of it, and I hope it is the start of a constructive debate
and a humble contribution to our society.</p>
<p>I look forward to your feedback on the manifesto, which I view as a
starting point rather than an end in itself.</p>
<hr />
<p>Thanks to ownCloud, Inc for inviting me over to the ownCloud conf in
Berlin.</p>
<p><strong><a href="https://hroy.eu/posts/overviewChallengeAgainstWebsiteBlocking/">Overview of FDN &amp; La Quadrature’s challenge against Website Blocking</a></strong> (2015-04-16)</p>
<p>This month, French Data Network and La Quadrature du Net filed a
lawsuit before the Conseil d’État, one of France’s supreme courts,
against the French government over website blocking.</p>
<p>This is part 2 of our series <a
href="https://hroy.eu/tags/FdnContreGouvernement/">“French Data Network <em>versus</em>
the French Government”</a>.</p>
<p><strong><a href="https://hroy.eu/posts/overviewChallengeAgainstDataRetention/">Overview of FDN &amp; La Quadrature’s challenge against Data Retention</a></strong> (2015-04-01)</p>
<p><a href="https://hroy.eu/posts/startingAgainstDataRetention/">Last month</a>,
French Data Network and La Quadrature du Net filed a lawsuit before the
Conseil d’État, one of France’s supreme courts, against the French
government. Our objective is simple: we want to take down French data
retention laws.</p>
<h2 id="who">Who?</h2>
<ul>
<li><p>the <a href="http://fdn.fr">French Data Network (FDN)</a>, the
oldest French internet access provider, and a nonprofit organisation
promoting the Internet and spreading knowledge on how it works.</p></li>
<li><p>the <a href="http://ffdn.org">Fédération FDN</a>, a federation of
ISPs very much like FDN (FDN is one of its founding members), created to
spread and distribute efforts across geographical locations in pursuit
of the same goal.</p></li>
<li><p><a href="https://laquadrature.net">La Quadrature du Net</a>, an
organisation of activists (which used to be an unorganisation ;-))
defending our rights in the digital age. Maybe you know them for their
successful campaigns against <a
href="http://www.laquadrature.net/en/acta">ACTA</a>.</p></li>
</ul>
<h2 id="how">How?</h2>
<p>On December 24, the government issued a <em><a
href="http://en.wikipedia.org/wiki/Decree#France">décret</a></em>, an
order by the executive branch enabling the application of a law passed
by Parliament. Décrets can be challenged directly before the Conseil
d’État within two months of their publication. This is the procedure
we’re in.</p>
<p>Formally, our target is a décret implementing the 2013 law that sets
the strategy for military operations and prerogatives for the near
future (the “LPM” law). Specifically, article 20 of this law created new
ways for the state to access data retained by telcos and ISPs.</p>
<p>For us, this was just a legal opportunity to seize in order to bring
our arguments in front of a judge, against the concept of general data
retention, i.e. keeping metadata and records on communications of the
whole population.</p>
<p>In the aftermath of the <a href="https://hroy.eu/tags/EUCJ/">European Union Court of
Justice</a>’s landmark decision in <em>Digital Rights Ireland</em>
(April 8, 2014; C‑293/12 &amp; C‑594/12), data retention laws in Europe
are being struck down, almost automatically, one by one (most recently
in the Netherlands, see the preliminary injunction by the Hague court of
March 11, 2015). Almost automatically indeed, because in matters of
European Union law, national judges have to apply EU principles and case
law directly.</p>
<p>So this is what we’re trying to do in France, albeit with one
difference. Unlike other data retention laws in Europe, French law
predates the 2006 EU data retention directive, so our task looks a bit
more difficult.</p>
<h2 id="what">What?</h2>
<p>Anyway, here comes an overview of our main arguments:</p>
<ul>
<li>the décret tries to fix the law, because the law did not correctly
define its own scope (the definition of the type of data subject to the
law). But that is something the government is not allowed to do: the
scope of a law is a prerogative of the legislative power, not the
executive’s.</li>
<li>the décret was supposed to organise the administrative control
defined in the law, but it doesn’t. Thus, the government did not fulfil
the obligations the law created.</li>
</ul>
<p>And, of course, the main argument (part 4.1 of our legal
writing):</p>
<ul>
<li>This is a matter of European Union law. As the 2002 directive (the
so-called ePrivacy directive) says in its article 15, data retention
measures must comply with EU law principles.</li>
<li>Thus, the EUCJ <em>Digital Rights Ireland</em> decision is directly
applicable to French laws on data retention.</li>
<li>As a consequence, the judge must find that data retention, as set
out in French law, is clearly contrary to our fundamental rights to free
speech and to respect for private life! The government cannot legally
mandate telcos and ISPs to keep metadata and records on the
communications of the whole population (and for <em>a whole year</em> at
least)!</li>
</ul>
<p>If you’re interested, you can <a
href="http://www.fdn.fr/2014-1576/recours.pdf">read the whole thing</a>
(in French).</p>
<h3 id="what-next">What next?</h3>
<p>I’ll keep you posted on the blog about the
<a href="https://hroy.eu/tags/d%C3%A9cretLPM/">procedure</a>. It should take at least a year,
if nothing unexpected happens (and it can be significantly longer
depending on preliminary references and ancillary procedures…).</p>
<p>But as you may know, the government is currently trying to pass a new
law giving the state extremely broad surveillance powers, including new
ways to access our communications and our data, all without effective
judicial oversight.</p>
<p>Our legal challenge has thus taken on a new dimension: it is now a
challenge against the <a
href="http://www.nytimes.com/2015/04/01/opinion/the-french-surveillance-state.html">French
surveillance state</a>.</p>
<hr />
<p><em>Related:</em> <a
href="https://www.laquadrature.net/en/in-france-la-quadrature-du-net-brings-legal-challenge-against-mass-surveillance">La
Quadrature’s press release</a></p>
<p><strong><a href="https://hroy.eu/posts/startingAgainstDataRetention/">Starting against Data Retention in France</a></strong> (2015-02-22)</p>
<p>If you’ve been wondering why I haven’t blogged lately, or why I
haven’t replied to your email yet, it’s because I have been quite busy
since the start of the year.</p>
<p>Besides starting a six-month internship at a law firm in Paris (the
last one required by the Bar school, at last!), I also joined French
Data Network, La Quadrature du Net and the Federation of Do-It-Yourself
Internet access/service providers in a lawsuit against the French
government over data retention.</p>
<p>This is just the beginning, but I’m quite thrilled about it
already.</p>
<p>If you read French, Benjamin Bayart will give you a good idea of what
it’s about <a
href="http://blog.fdn.fr/?post/2015/02/18/Recours-contre-le-decret-2014-1576">on
FDN’s blog</a>.</p>
<p>/me, now catching up on email of the week.</p>
<p><strong><a href="https://hroy.eu/posts/encryptionEuDataIcons/">Some comments on the EU’s draft Privacy Icons</a></strong> (2014-11-12)</p>
<p>The European Union is currently reviewing the regulatory framework of
personal data protection. In the current draft, a standardised icon set
would be mandatory in some circumstances.</p>
<p>I’m not convinced this is the best implementation, and there’s even
one icon in the set that I’m really concerned about: “Encryption”. This
proposal could undermine years of activism in favour of better
encryption for users.</p>
<hr />
<p>As I’ve been working on Terms of Service; Didn’t Read for a couple of
years now, I have some experience of, and ideas about, how this sort of
thing might work and how it compares to existing projects, especially in
the field of “Privacy Icons”, where several projects coexist and keep
attracting attention (including, it seems, from European
legislators).</p>
<p>First, some context for those who haven’t followed (feel free to skip
to the second part if you’ve followed personal data regulations updates
in the EU). In January 2012, the European Commission <a
href="http://europa.eu/rapid/press-release_IP-12-46_en.htm?locale=en">announced</a>
a plan to revise data protection laws in the European Union with a <a
href="https://en.wikipedia.org/wiki/General_Data_Protection_Regulation">draft
regulation</a>. Currently, most of the European Union’s laws on the
protection of personal data come from a <a
href="https://en.wikipedia.org/wiki/Data_Protection_Directive">1995
European Union directive</a>. (Unlike a directive, an <em>EU
regulation</em> is law that applies EU-wide without each member state
needing to transpose it into its own national law.)</p>
<p>That directive will soon be 20 years old. It is quite remarkable that
even now it does not seem too far off the mark. The intentions were
good, and it is a great thing that legislators foresaw, back then, the
need to protect people’s privacy (France and Germany already had such
laws by the end of the 1970s). But today, all this is in the middle of
<a
href="http://www.janalbrecht.eu/themen/datenschutz-und-netzpolitik/lobbyism-and-the-eu-data-protection-reform.html">a
huge battle</a>.</p>
<p>After several steps through the European Union’s lawmaking process,
the regulation is now in a <a href="https://hroy.eu/tags/privacy/DPRConsolidated.pdf">consolidated
draft</a>.</p>
<p>I want to focus on the draft article 13a (in Chapter Ⅲ, Section 1:
Transparency and modalities) which provides:</p>
<blockquote>
<ol type="1">
<li><p>Where personal data relating to a data subject are collected, the
controller shall provide the data subject with the following particulars
before providing information pursuant to Article 14:</p>
<ol type="a">
<li>whether personal data are collected beyond the minimum necessary for
each specific purpose of the processing;</li>
<li>whether personal data are retained beyond the minimum necessary for
each specific purpose of the processing;</li>
<li>whether personal data are processed for purposes other than the
purposes for which they were collected;</li>
<li>whether personal data are disseminated to commercial third
parties;</li>
<li>whether personal data are sold or rented out;</li>
<li>whether personal data are retained in encrypted form.</li>
</ol></li>
<li><p>The particulars referred to in paragraph 1 shall be presented
pursuant to Annex X in an aligned tabular format, using text and
symbols, in the following three columns:</p>
<ol type="a">
<li>the first column depicts graphical forms symbolising those
particulars;</li>
<li>the second column contains essential information describing those
particulars;</li>
<li>the third column depicts graphical forms indicating whether a
specific particular is met.</li>
</ol></li>
<li><p>The information referred to in paragraphs 1 and 2 shall be
presented in an easily visible and clearly legible way and shall appear
in a language easily understood by the consumers of the Member States to
whom the information is provided. Where the particulars are presented
electronically, they shall be machine readable.</p></li>
<li><p>Additional particulars shall not be provided. Detailed
explanations or further remarks regarding the particulars referred to in
paragraph 1 may be provided together with the other information
requirements pursuant to Article 14.</p></li>
<li><p>The Commission shall be empowered to adopt, after requesting an
opinion of the European Data Protection Board, delegated acts in
accordance with Article 86 for the purpose of further specifying the
particulars referred to in paragraph 1 and their presentation as
referred to in paragraph 2 and in Annex 1.</p></li>
</ol>
</blockquote>
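Paragraph 3 requires the particulars to be machine readable when presented electronically, but the draft specifies no schema. As a hedged sketch only (the key names and JSON shape below are my own illustration, not anything in the draft), the six yes/no particulars of paragraph 1 could be serialised like this:

```python
import json

# The six particulars of draft article 13a(1), keyed for machine
# readability. These identifiers are hypothetical.
PARTICULARS = [
    "collected_beyond_minimum",                   # 1(a)
    "retained_beyond_minimum",                    # 1(b)
    "processed_for_other_purposes",               # 1(c)
    "disseminated_to_commercial_third_parties",   # 1(d)
    "sold_or_rented",                             # 1(e)
    "retained_encrypted",                         # 1(f)
]

def particulars_document(answers):
    """Serialise a yes/no answer for every particular as JSON."""
    if set(answers) != set(PARTICULARS):
        raise ValueError("every particular needs a yes/no answer")
    return json.dumps({k: answers[k] for k in PARTICULARS}, indent=2)
```

A data controller would fill in one boolean per particular; the tabular icon display of paragraph 2 could then be generated from the same document.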
<h2 id="why-the-encryption-icon-is-a-bad-idea">Why the “Encryption” icon
is a bad idea</h2>
<p><strong>TL;DR</strong> Storing sensitive data in data centers without
encrypting it first is just negligence and should not be allowed. There
is no need for an icon that a large majority of users will probably not
really understand.</p>
<hr />
<p>In the draft proposal, when personal data is collected, the person
who is the subject of that data should get information in the form of
standardised icons. One of the icons proposed concerns encryption:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/iconEncrypt.png" alt="Everything is Safe!" />
<figcaption aria-hidden="true">Everything is Safe!</figcaption>
</figure>
<p>If the data is stored encrypted, then the data controller can display
a huge green mark next to the icon. <em>All is fine!</em></p>
<p>Except that it’s not. I can see how this could get very, very
confusing. It is very easy to claim that something “is encrypted” and
that thus <em>everything’s good</em>. I’ve heard this argument several
times from Google employees: <em>Google stores the data in encrypted
form, so don’t worry</em>. But still, whenever Google accesses the data
to process it, Google decrypts it.</p>
<p>Let’s put this in context.</p>
<p>Following Edward Snowden’s revelations, it is very clear that
encryption is one part of the solution against the intrusion in our
lives that the NSA and other State agencies in the world are pursuing.
Thus, it is crucial that users understand that <strong>there are ways to
protect their communications against the intrusion of the
State</strong>, and also from companies or criminals. This is why
initiatives such as Cryptoparties and Privacy Cafés, where people help
each other understand and use encryption techniques, are so
important!</p>
<p>But encryption does not always mean the same thing in all contexts.
It requires basic technological understanding to grasp when encryption
is simply a security good practice against criminals, and when
encryption is actually a much more powerful tool.</p>
<p>For instance, when I send sensitive information over the web (like a
financial transaction, or my username and password), it is very
important that the connection is encrypted (e.g. using HTTPS);
otherwise, it would not be difficult to intercept that sensitive
information. Enabling encryption for that kind of traffic should simply
be mandatory.</p>
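To make the contextual difference concrete, here is a deliberately toy Python sketch ("encryption at rest" where the provider, not the user, holds the key; the XOR "cipher" and all names are illustrative, not real cryptography): the stored bytes look scrambled, yet the provider can read them at any time.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher': for illustration only, NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The provider, not the user, generates and keeps the key
# (a fixed toy key here so the example is deterministic).
provider_key = bytes(range(1, 17))

plaintext = b"sensitive user data"
stored = xor_cipher(plaintext, provider_key)  # "retained in encrypted form"

assert stored != plaintext  # on disk it looks protected...
assert xor_cipher(stored, provider_key) == plaintext
# ...but the provider can decrypt at will whenever it processes the data,
# so a green "encrypted" mark says nothing about who can read the data.
```

This is exactly the situation the green icon would bless: the particular "retained in encrypted form" is technically true, while the user's data remains fully readable by the service.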
<p>It’s a good idea to impose security obligations over storing personal
data. But I fail to see how showing an icon to users about storing data
in encrypted form will do any good. Worse, it might even confuse people
about what encryption really means in which context, thus making it even
harder to explain why encryption is important and why tools such as
GnuPG should be improved in usability.</p>
<h2 id="is-this-standardised-icon-set-really-good-anyway">Is this
standardised icon set really good anyway?</h2>
<p>Raising awareness about privacy rights online is important. This is
what I have been doing with <a href="https://tosdr.org">Terms of
Service; Didn’t Read</a> for about two years now. I’ve seen several
variations of the Privacy Icons idea, and this implementation as
suggested by the EU draft regulation shows that getting it right is not
easy.</p>
<p>The <a href="https://hroy.eu/tags/privacy/DPRConsolidated.pdf">consolidated draft</a> has an annex
showing what the icons could look like:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/dataCollect.png" alt="No unnecessary data collection" />
<figcaption aria-hidden="true">No unnecessary data
collection</figcaption>
</figure>
<p>Depending on whether that’s the case, the data controller would have
to display a green or a red mark next to this icon:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/goodOrBad.png" alt="Good or Bad?" />
<figcaption aria-hidden="true">Good or Bad?</figcaption>
</figure>
<p>In <a href="https://tosdr.org">ToS;DR</a>, we also use this approach:
for each point, there’s an iconic indication whether this is a good or a
bad thing. Only, we allow for more variations:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/ToSDRTitles.png" alt="Good points, and bad points" />
<figcaption aria-hidden="true">Good points, and bad points</figcaption>
</figure>
<figure>
<img src="https://hroy.eu/tags/privacy/thumbsDown.png" alt="… and blockers" />
<figcaption aria-hidden="true">… and blockers</figcaption>
</figure>
<p>But the major problem I have with “Privacy Icons” is that they are
too <strong>difficult to grasp</strong>. If you remove the text beside
the icon, you realise that the icon itself is <strong>far from
self-explanatory</strong>. This only gets more <em>complex</em> as the
number of icons grows.</p>
<p>These icons are not universally understood. Here’s how <em>the same
concept</em> is rendered differently by different Privacy icons
sets:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/dataProcessing.png" alt="EU draft" />
<figcaption aria-hidden="true">EU draft</figcaption>
</figure>
<hr />
<figure>
<img src="https://hroy.eu/tags/privacy/dataForPurpose.png"
alt="Mozilla’s Alpha version of Privacy Icons" />
<figcaption aria-hidden="true">Mozilla’s Alpha version of Privacy
Icons</figcaption>
</figure>
<figure>
<img src="https://hroy.eu/tags/privacy/dataNoPurpose.png"
alt="Mozilla’s Alpha version of Privacy Icons" />
<figcaption aria-hidden="true">Mozilla’s Alpha version of Privacy
Icons</figcaption>
</figure>
<hr />
<figure>
<img src="https://hroy.eu/tags/privacy/DisconnectIcons.png" alt="Disconnect.me icons" />
<figcaption aria-hidden="true">Disconnect.me icons</figcaption>
</figure>
<p>Compare these with how a similar point would be addressed in <a
href="https://tosdr.org">ToS;DR</a>:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/ToSDRTitle.png" alt="The summary version" />
<figcaption aria-hidden="true">The summary version</figcaption>
</figure>
<p>which can be expanded with a plain-English paragraph and links to
contextualise if the user wants more information:</p>
<figure>
<img src="https://hroy.eu/tags/privacy/ToSDRParagraph.png" alt="The plain english version" />
<figcaption aria-hidden="true">The plain english version</figcaption>
</figure>
<p>There’s probably a way somewhere to learn from these different
approaches and make an implementation that gets it right for users.</p>
<p>The EU already made such a thing possible with the <a
href="https://en.wikipedia.org/wiki/European_Union_energy_label">energy
efficiency labels</a>. (They actually were a source of inspiration for
ToS;DR <a href="https://tosdr.org/classification.html">classes</a>.)</p>
<p>Let’s hope the next proposal gets it right with an icon system that
is easier to understand and which gets rid of the confusing bits.</p>
<p><strong><a href="https://hroy.eu/posts/rtbf-what-cjue-got-wrong/">Right to be forgotten — When the EUCJ forgot our freedom of expression</a></strong> (2014-09-04)</p>
<p>It’s been a few months now since the controversial EUCJ <em>Google
Spain v. González</em> (C‑131/12) decision was published. And I’m too
busy, lagging behind: my draft (in French) on why I strongly disagree
with this decision is still in the making. But it will eventually come.
Meanwhile I’ve had some interesting discussions, for instance with Neil
Brown. I’m still waiting for Neil to set up his Known profile online
somewhere so we can copy/paste our discussion there. Just now, Reuben
Binns <a
href="http://www.reubenbinns.com/blog/why-i-trust-wikipedia-with-privacy-censorship-and-the-right-to-be-forgotten/">sent
me</a> a paper pointing out that, yes, the <a
href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2491486">EUCJ
decision overlooked the balance needed</a> to protect our right to
freedom of expression. <small>By the way, Reuben has also written an
interesting piece on <a
href="http://www.reubenbinns.com/blog/why-i-trust-wikipedia-with-privacy-censorship-and-the-right-to-be-forgotten/">how
Wikipedia deals with personal subjective rights</a> – I think you should
read it, because Wikipedia is a very good illustration of how to do this
right, and thus also an incredibly strong illustration of how the
EUCJ’s so-called “right to be forgotten” (RTBF for short) is
wrong.</small></p>
<p>So, roughly and quickly, I’d like to point out a few flaws that I
think are very worrying considering the wider context; namely, the
European Union Court of Justice getting more powerful as a court dealing
with fundamental rights (in addition to the European Court of Human
Rights).</p>
<h3 id="what-does-privacy-mean-anyway">What does privacy mean
anyway?</h3>
<p>You may disagree but I think there’s no such thing as a personal,
subjective right to “privacy”. A right to privacy is not the same thing
in my opinion as a right to the “respect of your private life”. There is
an important distinction to make. Maybe.</p>
<p>Privacy is <a href="https://hroy.eu/posts/moglen_privacy_ecological/">an
ecological thing</a> as Moglen says, it’s not an individual thing.
Privacy is often understood only in a given context: a technological
context and a social as well as cultural context. We have different
privacy expectations and understanding depending on who we communicate
with, what we communicate about, where we communicate, by which means we
communicate and based on the cultural background of the communicating
parties. Note that “communicate” needs to be understood broadly and may
not be the right word.</p>
<p>Privacy and the right to the respect of private life are intertwined,
but not the same thing.</p>
<p>One of the most interesting researchers working on explaining privacy
is danah boyd. She recently published a piece, <a
href="https://medium.com/message/what-is-privacy-5ed72c66aa86"><em>What
Is Privacy?</em></a> (you should read <a
href="https://medium.com/message/what-is-privacy-5ed72c66aa86">the
entire piece, it’s not long</a>), in which she wrote:</p>
<blockquote>
<p>The notion of private is also a <strong>social convention</strong>,
but privacy <strong>isn’t a state</strong> of a particular set of data.
It’s a practice and <strong>a process</strong>, an idealized state of
being, to be <strong>actively negotiated</strong> in an effort to have
agency.</p>
<p>[…]</p>
<p>While learning to read social contexts is hard, it’s especially hard
online, where the contexts seem to be constantly destabilized by new
technological interventions. As such, context becomes visible and
significant in the effort to achieve privacy. Achieving privacy requires
a whole slew of skills, not just in the technological sense, but in the
social sense. Knowing how to read people, how to navigate interpersonal
conflict, how to make trust stick. This is far more
<strong>complex</strong> than people realize, and yet <strong>we do this
every day</strong> in our efforts to control the social situations
around us.</p>
</blockquote>
<p>The core of the point is: privacy is not an individual’s subjective
legal right. It is a fragile but necessary social process. And we should
be wary of intrusions by courts or governments into this social
process.</p>
<p>In the EUCJ’s RTBF decision, the court does not give enough weight to
the right of the public to access lawfully published information that
can be of public interest. This is very worrisome because that right is
substantially a consequence of our right to free speech.</p>
<p>The rationale of the EUCJ’s analysis, however, is unclear. To ground
its arguments in fundamental rights, the EUCJ relies on <a
href="https://en.wikisource.org/wiki/Charter_of_Fundamental_Rights_of_the_European_Union#Article_7_.E2.80.93_Respect_for_private_and_family_life">article
7 of the EU Charter</a>. This article is not a right to privacy;
otherwise it would say just that: “a right to privacy”. Instead, it is a
right to the “respect for private and family life”, and that’s not the
same thing.</p>
<p>On the one hand, the right to respect for private life is well
established as a person’s subjective right. In France, for instance, it
used to fall under general tort law (art. 1382) but has since been given
its own footing in <a
href="http://www.legifrance.gouv.fr/UnArticleDeCode.do?code=CCIVILL0.rcv&art=9">article
9 of the <em>code civil</em></a> and under the Declaration of human
rights of 1789.</p>
<p>One important condition for such a right in a civil context is the
need to demonstrate <em>préjudice</em>, i.e. that harm has been done to
the person by way of an infringement of their private life.</p>
<p>On the other hand, as already pointed out, privacy is a process. And
as you know if you’ve read the ECJ decision, there is no <em>need to
demonstrate prejudice</em> in order for the RTBF to apply.</p>
<p>The legal basis is thus not clear. Is this new so-called “right to be
forgotten” based on the right to respect for private life (in which case
prejudice needs to be demonstrated), or is it based on another part of
the EU Charter, the one that recognises personal data protection? Well,
if it’s the latter, then I think we should question the balance that the
ECJ strikes with the RTBF. Should the RTBF be that powerful against the
freedom to access lawful information that has not been demonstrated to
cause any harm?</p>
<h3 id="the-eucj-new-general-rule-harms-freedom-of-expression">The EUCJ
new “general rule” harms freedom of expression</h3>
<p>The personal data protection directive says in article 7:</p>
<blockquote>
<p>’Member States shall provide that personal data may be processed only
if:</p>
<p>…</p>
<ol start="6" type="a">
<li>processing is necessary for the purposes of the legitimate interests
pursued by the controller or by the third party or parties to whom the
data are disclosed, except where such interests are overridden by the
interests [or] fundamental rights and freedoms of the data subject which
require protection under Article 1(1).’</li>
</ol>
</blockquote>
<p>In the case where the service in question is accessible to and used
by a large majority of the population, we are talking about the
<em>legitimate interests of the public</em>. Surely, the right to access
lawfully published information is <em>a priori</em> a legitimate
interest of the public. Otherwise, what good is a right to freedom of
expression if nobody else has the right to hear you, and someone can
block access to your article whenever they feel like it?</p>
<p>Now, let’s have another read at the article above (article 7). It is
clear that the general rule is that processing of personal data is
allowed when the right of the public to freedom of expression is at
stake, <strong>except</strong> where the data subject’s fundamental
rights should override them.</p>
<p>But as already pointed out, there is a confusion between fundamental
rights, and thus the whole balancing analysis breaks down, to the
detriment of the public’s right to access lawfully published
information.</p>
<p>In the decision, the Court indeed invents a new “general rule”:</p>
<blockquote>
<p>The rights to privacy of the data subject override “as a rule, not
only the economic interest of the operator of the search engine
<strong>but also the interest of the general public in finding that
information</strong> upon a search relating to the data subject’s name.”
(¶ <a href="https://hroy.eu/law/eucj/C-131_12/##97.">97</a>)</p>
</blockquote>
<p>It is clear now that there’s a problem. The rule and the exception
have been exchanged.</p>
<hr />
<p>An interesting fact: I just learned that the Spanish plaintiff, M.
González, is a lawyer… To me, this whole case and the decision
illustrate what goes wrong when we try to have courts solve problems
that are best solved freely by our social processes. Solving privacy
with this kind of ruling does us no favour.</p>
<p>The real privacy issues today come from massive surveillance by the
NSA and other mass-surveillance State agencies around the world. They
also come from surveillance operated by companies.</p>
<p>Search engines giving access to lawfully published information are
not the real privacy issue! The RTBF is the wrong fight, and it’s
actually wasting our time; time that would be better spent fighting the
real issue of massive surveillance, which does much more harm to our
right to a private life outside the reach of the State’s agents.</p>
<p>Finally, the ultimate irony of the decision is that Google and the
like are the ones who have to act on individuals’ requests to be deleted
from search-engine results relating to their names, thus giving the
role of defining privacy to… Google. <a
href="http://www.laquadrature.net/en/the-right-to-be-forgotten-dont-forget-the-rule-of-law">Well
done for the rule of law</a>.</p>
<p>We should demand that the European Commission not pursue this RTBF
nonsense, but instead focus on the real issues affecting our privacy and
our autonomy.</p>
What’s in the ECJ’s C-131/12 decision? https://hroy.eu/posts/whats-in-C-131_12/2023-09-18T15:16:31Z2014-06-15T13:35:23Z
Reading the European Court of Justice decision C-131/12 and what’s in its interpretation by Google.<hr><br><p>One month ago, the <a
href="https://en.wikipedia.org/wiki/European_Court_of_Justice">European
Union’s highest court</a> published its “right to be forgotten” judgment
<span
class="createlink"><a href="https://hroy.eu/cgi-bin/ikiwiki.cgi?do=create&from=posts%2Fwhats-in-C-131_12&page=%2Flaw%2Feucj%2FC-131_12" rel="nofollow">?</a>C-131/12</span>
against Google. While it seems that European data protection authorities
welcomed the decision, there’s also a lot of criticism about the last
part on the right to be forgotten.</p>
<p>I’ve never been a big fan of the so-called
“<a href="https://hroy.eu/tags/rightToBeForgotten/">right to be forgotten</a>”. The first
time I heard about it was in 2009 when the French digital economy
minister launched this debate. Already then, I remember that a student
asked her during a conference, <a
href="https://twitter.com/nk_m/status/5645297133">how to erase their
name from Google</a>. (While at the same time the discussion was about
students publishing stuff on Facebook they might regret later…)</p>
<p>But is that really what it is? What’s in this decision, what does it
say compared to the <a
href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML">95/46/EC</a>
directive?</p>
<p>Let’s start by leaving out the obvious: no, this decision does not
lay the basis for a censorship machine <a
href="http://www.bbc.com/news/technology-27423527">for politicians and
paedophiles</a>. <a
href="https://hroy.eu/law/eucj/C-131_12/##81.">Paragraph 81</a> clearly
insists on the balancing analysis that prevents it:</p>
<blockquote>
<p><a href="https://hroy.eu/law/eucj/C-131_12/##81.">81</a>. […] Whilst
it is true that the data subject’s rights protected by those articles
also override, as a general rule, that interest of internet users, that
balance may however depend, in specific cases, on the nature of the
information in question and its sensitivity for the data subject’s
private life and <strong>on the interest of the public in having that
information,</strong> an interest which may vary, in particular,
according to the role played by the data subject in public life.</p>
</blockquote>
<p>Second, the “right to be forgotten” is an unfortunate
characterisation. Let’s recall that article 9 of the <a
href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:HTML">directive</a>
provides exemptions for freedom of expression:</p>
<blockquote>
<p>‘Member States shall provide for exemptions or derogations from the
provisions of this Chapter, Chapter IV and Chapter VI for the processing
of personal data carried out solely for journalistic purposes or the
purpose of artistic or literary expression only if they are necessary to
reconcile the right to privacy with the rules governing freedom of
expression.’</p>
</blockquote>
<p>However, it’s unfortunate that it remains unclear to what extent
this exemption could apply to new ways of organising information that
do not strictly fall within the scope of “journalistic [or] artistic or
literary expression”.</p>
<p>Now that we can see this decision is not as bad as some would like to
depict it, let’s dive a little into the details of this “right to
remove personal data that are no-longer-relevant for the purposes for
which they were processed” by search engines (that’s <a
href="https://support.google.com/legal/contact/lr_eudpa?product=websearch&hl=en">how
Google interprets the decision</a>).</p>
<ul>
<li><p>What was the question the Court had to answer? (see paragraph <a
href="https://hroy.eu/law/eucj/C-131_12/##89.">89</a>).</p>
<p>The question was whether the “data subject” (i.e. the person to whom
the personal data at stake relates) can ask the search engine to
<strong><em>alter the search results for their name</em></strong> in
order to remove true and lawfully published information.</p>
<p>By the directive,</p>
<blockquote>
<ol type="a">
<li>‘personal data’ shall mean any information relating to an identified
or identifiable natural person (‘data subject’); an identifiable person
is one who can be identified, directly or indirectly, in particular by
reference to an identification number or to one or more factors specific
to his physical, physiological, mental, economic, cultural or social
identity;</li>
</ol>
</blockquote>
<p><abbr title="Too Long; Didn’t Read">TL;DR</abbr>: the answer is Yes,
in most cases.</p></li>
<li><p>The Court <a
href="https://hroy.eu/law/eucj/C-131_12/##92.">starts</a> by repeating
that the data subject has the right to obtain rectification, erasure or
blocking of data when the <em>processing</em> does not comply with the
directive (see the directive, <a
href="https://hroy.eu/law/eucj/C-131_12/##10.">article 12(b) on “Right
to access”</a>).</p>
<p>The processing’s non-compliance with the directive can result from
different situations (see <a
href="https://hroy.eu/law/eucj/C-131_12/##7.">article 6(c) to 6(e) on
“data quality”</a>):</p>
<ul>
<li>when the data being <em>further processed</em> is inadequate,
irrelevant or excessive in relation to the ‘specified, explicit and
legitimate’ purposes for which personal data was
collected/processed</li>
<li>when the data is not accurate and, <em>where necessary</em> not kept
up to date</li>
<li>when the data permits identification of the data subject for
<em>longer than necessary</em></li>
</ul>
<!-- It is the responsibility of the data controller (here, Google) to comply with the directive and to give the data subject to right to obtain rectification, erasure or blocking.-->
<p>The Court <a href="https://hroy.eu/law/eucj/C-131_12/##92.">also
repeats</a> that in the course of time, lawful processing of data may
become incompatible with the directive:</p>
<blockquote>
<p>where those data are no longer necessary in the light of the purposes
for which they were collected or processed. That is so in particular
where they appear to be <strong>inadequate, irrelevant or no longer
relevant, or excessive in relation to those purposes and in the light of
the time that has elapsed.</strong> (¶ <a
href="https://hroy.eu/law/eucj/C-131_12/##93.">93</a>)</p>
</blockquote>
<p>(It should be noted nevertheless that the directive provides an
exception “for historical, statistical or scientific” use and
purposes.)</p></li>
<li><p>It seems that most analyses of the decision have focused on the
parts I have just highlighted. However, I think the analysis made under
article 14 is more worrisome. Indeed, it’s important to note that the
Court gives a very, very strong preference to data subjects’ rights
<em>even when there’s no prejudice</em>:</p>
<ul>
<li><p>data subjects can object at any time to their personal data being
processed because of their particular situation (see <a
href="https://hroy.eu/law/eucj/C-131_12/##11.">article 14 on “Right to
object”</a>). The Court of Justice finds here that it is not necessary
for the data subject to demonstrate that the <a
href="https://hroy.eu/law/eucj/C-131_12/##list+of+results+causes+prejudice">data
processing causes prejudice</a>.</p></li>
<li><p>the data processing can also be found non-compliant when the
data controller’s legitimate interests <strong>or</strong> the rights of
the public are overridden by the data subject’s interests [or]
fundamental rights and freedoms (see <a
href="https://hroy.eu/law/eucj/C-131_12/##8.">article 7 in “Criteria for
making data processing legitimate”</a>).</p>
<p>The rights to privacy of the data subject override “as a rule, not
only the economic interest of the operator of the search engine
<strong>but also the interest of the general public in finding that
information</strong> upon a search relating to the data subject’s name.”
(¶ <a href="https://hroy.eu/law/eucj/C-131_12/##97.">97</a>) What seemed
an exception under article 7 of the directive now becomes a general rule
in the circumstances of the case. So the public’s right to access public
information now becomes the exception:</p>
<blockquote>
<p>However, that would not be the case if it appeared, for particular
reasons, such as the role played by the data subject in public life,
that the interference with his fundamental rights is justified by the
preponderant interest of the general public in having, on account of
inclusion in the list of results, access to the information in
question.</p>
</blockquote>
<p>The justification for this can be found in paragraphs <a
href="https://hroy.eu/law/eucj/C-131_12/##80.">80</a> and <a
href="https://hroy.eu/law/eucj/C-131_12/##81.">81</a>.</p>
<blockquote>
<p><a href="https://hroy.eu/law/eucj/C-131_12/##80.">80</a>. It must be
pointed out at the outset that, as has been found in paragraphs 36 to 38
of the present judgment, processing of personal data, such as that at
issue in the main proceedings, carried out by the operator of a search
engine is liable to affect significantly the fundamental rights to
privacy and to the protection of personal data when the search by means
of that engine is carried out on the basis of an individual’s name,
since that processing enables any internet user to obtain through the
list of results a structured overview of the information relating to
that individual that can be found on the internet — information which
potentially concerns a vast number of aspects of his private life and
which, without the search engine, could not have been interconnected or
could have been only with great difficulty — and thereby to establish a
more or less detailed profile of him. Furthermore, the effect of the
interference with those rights of the data subject is heightened on
account of the important role played by the internet and search engines
in modern society, which render the information contained in such a list
of results ubiquitous (see, to this effect, Joined Cases C-509/09 and
C-161/10 eDate Advertising and Others EU:C:2011:685, paragraph 45).</p>
<p><a href="https://hroy.eu/law/eucj/C-131_12/##81.">81</a>. In the
light of the potential seriousness of that interference, it is clear
that it cannot be justified by merely the economic interest which the
operator of such an engine has in that processing. However, inasmuch as
the removal of links from the list of results could, depending on the
information at issue, have effects upon the legitimate interest of
internet users potentially interested in having access to that
information, in situations such as that at issue in the main proceedings
a fair balance should be sought in particular between that interest and
the data subject’s fundamental rights under Articles 7 and 8 of the
Charter. Whilst it is true that the data subject’s rights protected by
those articles also override, as a general rule, that interest of
internet users, that balance may however depend, in specific cases, on
the nature of the information in question and its sensitivity for the
data subject’s private life and on the interest of the public in having
that information, an interest which may vary, in particular, according
to the role played by the data subject in public life.</p>
</blockquote></li>
</ul></li>
</ul>
<hr />
<p>This is why the Court concludes that:</p>
<blockquote>
<p>since in the case in point there do not appear to be particular
reasons substantiating a preponderant interest of the public in having,
in the context of such a search, access to that information, a matter
which is, however, for the referring court to establish, the data
subject may, by virtue of Article 12(b) and subparagraph (a) of the
first paragraph of Article 14 of Directive 95/46, require those links to
be removed from the list of results.</p>
</blockquote>
<blockquote>
<p><a href="https://hroy.eu/law/eucj/C-131_12/##99.">99</a>. It follows
from the foregoing considerations that the answer to Question 3 is that
Article 12(b) and subparagraph (a) of the first paragraph of Article 14
of Directive 95/46 are to be interpreted as meaning that,</p>
</blockquote>
<blockquote>
<p>when appraising the conditions for the application of those
provisions, it should inter alia be examined</p>
<ul>
<li>whether the data subject has a right that the information in
question relating to him personally should, at this point in time, no
longer be linked to his name by a list of results displayed following a
search made on the basis of his name,
<ul>
<li>without it being necessary in order to find such a right that the
inclusion of the information in question in that list causes prejudice
to the data subject.</li>
</ul></li>
<li>As the data subject may, in the light of his fundamental rights
under Articles 7 and 8 of the Charter, request that the information in
question no longer be made available to the general public on account of
its inclusion in such a list of results, those rights override, as a
rule, not only the economic interest of the operator of the search
engine but also the interest of the general public in having access to
that information upon a search relating to the data subject’s name.
<ul>
<li>However, that would not be the case if it appeared, for particular
reasons, such as the role played by the data subject in public life,
that the interference with his fundamental rights is justified by the
preponderant interest of the general public in having, on account of its
inclusion in the list of results, access to the information in
question.</li>
</ul></li>
</ul>
</blockquote>
<hr />
<p>In the end, the Court of Justice gives a lot more strength to data
subjects’ rights to oppose the processing of personal data when it
appears “no longer relevant” in light of the purposes (of the search
engine) and of the time elapsed, save for narrow exceptions
(scientific, historical or statistical data processing; journalistic or
artistic purposes and freedom of expression). In addition, the Court
makes it a general rule that data subjects’ rights to oppose data
processing override the public’s right to access public information on
the grounds of the right to private life (without any need to
demonstrate prejudice), unless particular reasons otherwise, justified
by the preponderant interest of the general public, can be
demonstrated.</p>
<p>This will undeniably have practical implications for search engines
and other data controllers, not just Google. It might be opportune to
consider whether there is <a
href="http://www.husovec.eu/2014/05/should-we-centralize-right-to-be.html">a
new role to play here for data protection authorities</a>, in order to
avoid private actors deciding for the public what does or does not
constitute a preponderant general public interest.</p>
<p>However, it still seems to me that this “general rule” is a bold
interpretation made by the Court of Justice.</p>
<p>I find it much less balanced than other remedies based on the right
to private life (such as France’s <a
href="http://www.legifrance.gouv.fr/UnArticleDeCode.do?code=CCIVILL0.rcv&art=9">article
9 of the civil code</a>) where prejudice must be demonstrated. Was all
that really necessary?</p>
https://hroy.eu/posts/gmail-most-email/
<a
href="http://creativecommons.org/publicdomain/zero/1.0/">CC0-1.0</a>
2023-10-30T19:54:26Z2014-05-18T12:25:15Z
<div class="inreplyto p-in-reply-to h-cite">
<span class="replyto arrow"></span><span
class="p-author h-card"><a class="u-url" href="http://mako.cc/"><img class="u-photo" src="https://hroy.eu//0.gravatar.com/avatar/c765934363224852356e0d9a992b3a23?s=80&d=http%3A%2F%2F0.gravatar.com%2Favatar%2Fad516503a11cd5ca435acc9bb6523536%3Fs%3D80&r=G" alt="" /><span
class="p-name">Benjamin Mako
Hill</span></a></span><a class="profile url" href="http://mako.cc/">http://mako.cc/</a><span
class="dt-published"><time datetime="2014-05-11T19:11:02+00:00">May 11,
2014</time></span><span
class="p-content"><a class="u-url" rel="in-reply-to" title="In reply to: Google Has Most of My Email Because It Has All of Yours" href="http://mako.cc/copyrighteous/google-has-most-of-my-email-because-it-has-all-of-yours">Google
Has Most of My Email Because It Has All of Yours</a></span>
</div>
<p>I wanted to find out too, how much of my email is in the hands of
Google.</p>
<h3 id="whole-email-archive">Whole email archive</h3>
<p>I have archived all my email since 2007, though not very carefully.
Anything before 2007 is lost: I was using email through OVH’s POP server
back then and didn’t really care about archiving.</p>
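<p>For anyone who wants to reproduce this kind of count, here is a rough
sketch. It assumes a local Maildir archive, and it uses the heuristic of
looking for a Google host in each message’s <code>Received:</code> path
(or <code>From:</code> address); both the path and the heuristic are my
assumptions for illustration, not the exact method behind the graphs
below.</p>

```python
import mailbox
import re
from collections import Counter
from email.utils import parsedate_to_datetime

# Heuristic: a message "went through Google" if a Google host appears
# in its Received: path or in its From: address.
GOOGLE_RE = re.compile(r"\.google(?:mail)?\.com", re.IGNORECASE)

def gmail_proportion_by_week(maildir_path):
    """Return {ISO week: fraction of that week's mail that went through Google}."""
    total, via_google = Counter(), Counter()
    for msg in mailbox.Maildir(maildir_path):
        try:
            week = parsedate_to_datetime(msg["Date"]).strftime("%G-W%V")
        except (TypeError, ValueError):
            continue  # skip messages with a missing or malformed Date header
        total[week] += 1
        headers = " ".join(msg.get_all("Received") or []) + " " + (msg.get("From") or "")
        if GOOGLE_RE.search(headers):
            via_google[week] += 1
    return {week: via_google[week] / total[week] for week in total}
```

<p>The weekly totals and the Google subset from the same loop are what
the red and blue series in the graphs below correspond to.</p>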
<table class="img">
<caption>
Emails I get each week (in red) including those going through Google (in
blue)
</caption>
<tr>
<td>
<a href="https://hroy.eu/posts/gmail-most-email/all/emails_gmail_over_time.pdf"><img src="https://hroy.eu/posts/gmail-most-email/all/emails_gmail_over_time.png" width="720" height="432" alt="R graph" title="Emails over time" class="img" id="allemail" /></a>
</td>
</tr>
</table>
<table class="img">
<caption>
Proportion of email going through Google for all emails (top) or for
emails I replied to (bottom)
</caption>
<tr>
<td>
<a href="https://hroy.eu/posts/gmail-most-email/all/emails_gmail_prop_over_time.pdf"><img src="https://hroy.eu/posts/gmail-most-email/all/emails_gmail_prop_over_time.png" width="720" height="576" alt="R graph" title="Gmail over time" class="img" id="allgmailproportion" /></a>
</td>
</tr>
</table>
<p>What’s interesting is that the proportion of Gmail was highest when I
received the most email.</p>
<p>The Gmail proportion might seem particularly low here because most of
my email involves people active with the Free Software Foundation
Europe. Even so, it’s still much <a
href="http://blogs.fsfe.org/gerloff/2014/05/13/were-all-gmail-users-now/">higher
than Karsten’s</a>.</p>
<h3 id="personal-email-since-2011">Personal email since 2011</h3>
<table class="img">
<caption>
Personal emails I get each week (in red) including those going through
Google (in blue)
</caption>
<tr>
<td>
<a href="https://hroy.eu/posts/gmail-most-email/personal-since2011/emails_gmail_over_time.pdf"><img src="https://hroy.eu/posts/gmail-most-email/personal-since2011/emails_gmail_over_time.png" width="720" height="432" alt="R graph" title="Personal emails over time" class="img" id="personalemail" /></a>
</td>
</tr>
</table>
<table class="img">
<caption>
Proportion of personal email going through Google for all emails (top)
or for emails I replied to (bottom)
</caption>
<tr>
<td>
<a href="https://hroy.eu/posts/gmail-most-email/personal-since2011/emails_gmail_prop_over_time.pdf"><img src="https://hroy.eu/posts/gmail-most-email/personal-since2011/emails_gmail_prop_over_time.png" width="720" height="576" alt="R graph" title="Personal gmail over time" class="img" id="personalgmailproportion" /></a>
</td>
</tr>
</table>
https://hroy.eu/notes/avoid_ghostery-proprietary/
<a href="http://creativecommons.org/publicdomain/zero/1.0/">CC0-1.0</a> except for the XKCD graphic
2014-10-21T16:04:14Z2014-04-25T16:06:09Z
<p>I was reading an article by <a class="toggle" href="https://hroy.eu/tags/privacy/#notes-avoid-ghostery-proprietary.lcranor">Lorrie
Cranor</a> in the MIT Technology Review on <a href="http://www.technologyreview.com/view/526421/self-defense/">how it’s difficult even
for her to protect her privacy online</a>.</p>
<div class="toggleable" id="notes-avoid-ghostery-proprietary.lcranor"></div>
<p>I appreciate Lorrie Cranor’s work on privacy at Carnegie Mellon
University. I have extensively cited her study of the length of privacy
policies when I introduced <a href="http://tosdr.org">ToS;DR</a>.</p>
<div class="toggleableend"></div>
<p>However, in this article I was disappointed to see Ghostery
mentioned. Ghostery is a browser extension supposed to help users
against tracking and surveillance on the web. The main problem is that
Ghostery is not released as Free Software (a.k.a. Open Source; both
terms designate the same set of programs).</p>
<p><a class="toggle" href="https://hroy.eu/tags/privacy/#notes-avoid-ghostery-proprietary.ghosterycode">Earlier on Twitter</a> I
quickly posted my frustration about this. <strong>People who promote web
privacy should stop promoting Ghostery</strong>, as it’s proprietary.
What’s their business model exactly? ;-)</p>
<div class="toggleable" id="notes-avoid-ghostery-proprietary.ghosterycode"></div>
<p>In <a href="https://twitter.com/hugoroyd/status/459618738091085824">my earlier
tweet</a> I
wrongly stated that the source code was not disclosed. In fact, some
code is disclosed (I suppose it’s entirely readable, neither obfuscated
nor minified). But as you’ll notice, the licence is “All rights
reserved”, so basically users have no rights.</p>
<div class="toggleableend"></div>
<p>Ghostery has been playing on the ambiguity for too long. This
hypocrisy must stop. <a class="toggle" href="https://hroy.eu/tags/privacy/#notes-avoid-ghostery-proprietary.ghosttweet">See these
tweets from years ago…</a></p>
<div class="toggleable" id="notes-avoid-ghostery-proprietary.ghosttweet"></div>
<blockquote class="twitter-tweet" lang="fr"><p><a href="https://twitter.com/accessjames">@accessjames</a> <a href="https://twitter.com/phisab">@phisab</a> We are currently working on making it open source, it's an ongoing project. Ghostery blocks, no need for opting out.</p>— Ghostery (@Ghostery) <a href="https://twitter.com/Ghostery/statuses/290834903745040384">14 Janvier 2013</a></blockquote>
<p><script async src="http://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<blockquote class="twitter-tweet" lang="en"><p>this is good news :) RT <a href="https://twitter.com/Ghostery">@Ghostery</a>: Currently, you can access Ghostery's code if you unpack the ext. We are still looking to open source, too</p>— Jeekajoo (@jeekajoo) <a href="https://twitter.com/jeekajoo/statuses/339332265215655936">May 28, 2013</a></blockquote>
<p><script async src="http://platform.twitter.com/widgets.js" charset="utf-8"></script></p>
<div class="toggleableend"></div>