Tag Archives: privacy

Finding a solution to Google’s dilemma on the “right to be forgotten”, after the “political” ECJ decision.

The decision of the European Court of Justice on the Google case has re-opened the debate on the importance of remembering and forgetting in the digital age. Legal scholars, columnists and experts have either agreed with the position of the court on the right to be forgotten or, on the contrary, criticised the decision as an attempt to limit the freedom of expression.

Now the dust is settling, and the first “transparency report” published by Google shows a limited effect of removals on freedom of expression, although the report presents only limited aggregated data.

For this reason, the time has come to assess the long-term effect of the decision.

From this perspective, the consequences should not be overestimated. This is not a decision on the right to be forgotten as such, since the news remains available in newspaper archives; it concerns worldwide access to online information via search engines.

Nor is it a decision against freedom of expression, since the court explicitly required a balancing test between individual rights and access to information.

Nevertheless, it is a controversial decision. It turns every search engine into a judge that must decide when freedom of expression prevails and in which cases “the publicity goes to unreasonable lengths in revealing facts about one who has resumed the private, lawful and unexciting life led by the great bulk of the community”, as stated in the Restatement (Second) of Torts in the US.

The critical aspect is not the private nature of the company that performs the balancing test. In a number of legal systems across Europe, the same balancing test is used by media companies in cases concerning privacy, the right to be forgotten or defamation. In those cases, however, the test is performed by journalists, who take responsibility for checking the facts they publish and have the professional skills to carry it out.

By contrast, Google, like any other search engine, neither investigates nor checks the facts, and it lacks the professional expertise of a media company.

For this reason, I consider this mainly a “political” decision, in the sense that it pertains to citizens (from the Greek polítes, “citizens”). Remembering and forgetting are fundamental aspects of our individual and social life, and the balance between them has a substantial impact on our digital society (Mayer-Schönberger 2011).

In spite of that, the decision has pointed out the direction but has not built the path.

The direction is represented by the strong support for data subjects’ rights (“the data subject’s rights protected by those articles [7 and 8, Charter of Fundamental Rights of the European Union] also override, as a general rule, that interest of internet users [in having access to information]”) and, more specifically, by the support for the right to erasure of personal information that has not been “fairly and lawfully” processed. This is not a new right, as some commentators have suggested, but an existing one, already recognized by both European law and national courts in Europe.

Even though the direction has been defined, the technical solution provided by the court (the “path”) is still inadequate. The reason for this lies in the fundamental inadequacy of the existing legal framework, which was drafted in the 1990s and now has to address the issues arising from a completely different digital environment.

From this perspective, the decision puts the trade-off between remembering and forgetting at the centre of the debate and will (hopefully) prompt a reconsideration of Article 17 of the EU Proposal for a General Data Protection Regulation. This is the “political” value of the decision.

In the light of the above, the future EU regulation should consider the peculiar nature of search engines as data controllers. It should introduce an ad hoc legal provision that excludes direct enforcement of the right to erasure by data controllers and requires a complaint directed to a court or data protection authority (DPA). This would prevent search engines from playing the (improper) role of judges in these cases.

At the same time, this provision should also require data controllers to temporarily remove the links in dispute when they receive a motivated request from a data subject. This “freeze” of the link would be maintained for a short period (e.g. 20-30 days). If the data subject does not take legal action within this time, the link would be reactivated, and no further legal action could be brought for the same link, except where the surrounding circumstances change.
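The notice-and-freeze procedure outlined above can be sketched as a simple state machine. The following is a minimal illustration, assuming a 30-day window; all names (`DisputedLink`, `tick`) are hypothetical and not part of any existing system:

```python
from datetime import datetime, timedelta

FREEZE_WINDOW = timedelta(days=30)  # the proposed 20-30 day window

class DisputedLink:
    """A search-result link temporarily removed after a data subject's request."""

    def __init__(self, url: str, request_date: datetime):
        self.url = url
        self.request_date = request_date
        self.state = "frozen"  # removed from results pending legal action

    def legal_action_filed(self) -> None:
        # A court or DPA now decides; the link stays out of the results.
        self.state = "pending_decision"

    def tick(self, today: datetime) -> str:
        # If the window lapses without a complaint to a court or DPA, the
        # link is reactivated; no further action may be brought for the
        # same link unless the surrounding circumstances change.
        if self.state == "frozen" and today > self.request_date + FREEZE_WINDOW:
            self.state = "reactivated"
        return self.state
```

Under this sketch, a motivated request freezes the link immediately, the substantive balancing decision is deferred to a court or DPA, and inaction by the data subject restores the status quo.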

The added value of this approach is that it combines a short temporary restriction on access to information with a model in which the decision is taken by a court or a DPA, not by a private entity.

That said, some aspects still need to be investigated and improved further. They concern the process described above and the related need to track requests. Nevertheless, this seems an easy-to-solve problem, considering that the solution would be implemented by the major IT companies.


A few notes about the Google case and the right to be forgotten

The decision of the Court of Justice of the European Union reopens the debate on the right to be forgotten (see Mantelero, 2013).

The Court has affirmed:

“As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public on account of its inclusion in such a list of results, those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question.”

The most controversial aspect of the decision is the evaluation of the opposing interests (right to be forgotten vs freedom of expression) (Zittrain, 2014).

The Court suggests that a “supervisory authority or judicial authority” may order search engines “to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages published by third parties containing information relating to that person”. Nevertheless, the provisions of Directive 95/46/EC do not exclude that the request be made directly by the data subject to data controllers (i.e. search engines). In this case, to avoid lawsuits and claims for damages, search engines would have to promptly balance the person’s interest in his or her privacy against the public’s interest in being informed; but this kind of test should be made by judicial authorities or DPAs, not by a private company.

In the past, DPAs have ordered media outlets to modify their robots.txt files so that specific content is not indexed by search engine crawlers. In those cases, a prior balancing test can also be conducted by the publishers, who have the professional skills and the duty to check the newsworthiness of the news. For this reason, publishers are in a better position than search engines to balance the opposing interests.
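The robots.txt mechanism mentioned above is part of the Robots Exclusion Protocol: a publisher can instruct compliant crawlers not to visit a given page by adding rules to the robots.txt file at the site root. A minimal illustration (the site and paths are invented for the example):

```
# https://example-newspaper.com/robots.txt
# Keep all crawlers away from one archived article
User-agent: *
Disallow: /archive/2008/disputed-article.html

# Or exclude only Google's crawler from a whole archive section
User-agent: Googlebot
Disallow: /archive/2008/
```

Note that Disallow only prevents crawling; removing a page that is already indexed typically also requires a noindex meta tag on the page or a removal request through the search engine’s own tools.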

In any case, the positive aspect of this decision is that it encourages a positive reconsideration of Article 17 of the EU Proposal for a General Data Protection Regulation, which is clearer than the scenario depicted by this decision. This provision admits a specific exception for freedom of expression and recognizes the role played by courts and regulatory authorities in deciding which data must be erased. Finally, it empowers the Commission to define detailed procedures and solutions for deleting personal information.

Giving a “political” interpretation to the decision, it can be read as an anticipation of the provisions of the EU Proposal, albeit made in a way that should induce lobbies to reconsider their opposition to the “right to erasure” defined in the Proposal.

U.S. Concern about the European Right to Be Forgotten and Free Speech: Much Ado About Nothing?

Contents: 1. Introduction. – 2. The right to be forgotten in Europe. – 3. The right to be forgotten in the U.S. – 4. The EU Proposal for a General Data Protection Regulation. – 5. False perspectives and real problems.

1. – After the official presentation of the EU Proposal for a Regulation on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), various opinions of lawyers and legal scholars were published in U.S. newspapers, legal blogs and law reviews. Many of these do not consider the entire framework of the new provisions and seem to use the rhetorical figure of synecdoche, taking a part (the right to be forgotten) for the whole.

Read the full paper here


India: new rules on data protection and effects on outsourcing processes

New paper published:

La nuova normativa indiana in materia di data protection: la protezione dei dati declinata in maniera funzionale all’outsourcing, in Contratto e Impresa Europa, 2011, pp. 708 ss.

keywords: comparative law – data protection – India

abstract: the paper analyzes the recent Indian regulation on data protection and highlights the influence that industrial policy considerations have had in defining the new rules. The author examines the approach adopted by Indian law and possible conflicts with EU data protection law.

EU “cookie law”: new rules and technical solutions concerning cookies and other devices used to profile internet users

New paper published:

Observatory on ICT Law: new rules and technical solutions concerning cookies and other device to profile internet users, in Contratto e Impresa Europa, 2011, pp. 787 ss.

keywords: European Union law – data protection – profiling

abstract: taking into consideration recent privacy-enhancing technology (the Do Not Track technology), the paper analyzes the EU legislation concerning cookies and other devices used to profile internet users (Directive 2009/136/EC).

Complaints against Facebook in Europe

An Austrian law student has filed several complaints against Facebook concerning the privacy policies of the well-known social network. The complaints cover different topics: shadow profiles, tagging, synchronizing, deleted postings, postings on other users’ pages, messages, privacy policy and consent, face recognition, access requests, deleted tags, data security, applications, deleted friends, excessive processing of data, opt-out, the Like button, obligations as processor, picture privacy settings, deleted pictures, groups, and new policies (the complaints are available here; see also this video).

Considering provision 18 of Facebook’s terms of service (“If you are a resident of or have your principal place of business in the US or Canada, this Statement is an agreement between you and Facebook, Inc. Otherwise, this Statement is an agreement between you and Facebook Ireland Limited. References to “us,” “we,” and “our” mean either Facebook, Inc. or Facebook Ireland Limited, as appropriate”), the complaints have been submitted to the Irish Data Protection Authority.

The DPA will probably decide the case this week.

The Parliamentary Assembly of the Council of Europe has adopted a resolution on the protection of privacy and personal data on the Internet

The Parliamentary Assembly of the Council of Europe has adopted Resolution 1843 (2011) on the protection of privacy and personal data on the Internet and online media. In this resolution the Assembly expresses its concern about the huge amount of personal information processed by an ever-growing number of private and public bodies globally, especially online. This situation, combined with the increasing number of data breaches, has led the Assembly to be “alarmed by such developments challenging the right to privacy and data protection”.

In response to this situation, the Assembly stated that “cyberspace must not be regarded as a space where law, in particular human rights, does not apply” and suggested that technological solutions, as well as voluntary self-regulation, may reduce the risk of interference with privacy and the harmful processing of personal data, but that only “specific legislation and effective enforcement” can adequately protect the right to privacy and personal data.

In setting out its considerations, the Assembly underlines the legal basis of data protection, explicitly recalling Article 17 of the International Covenant on Civil and Political Rights and Article 8 of the European Convention on Human Rights. The resolution also makes reference to previous resolutions, such as Resolution 428 (1970) on mass communication media and human rights (which affirms that “Where regional, national or international computer-data banks are instituted the individual must not become completely exposed and transparent by the accumulation of information referring even to his private life. Data banks should be restricted to the necessary minimum of information required for the purposes of taxation, pension schemes, social security schemes and similar matters”) and, more relevantly, Convention No. 108.

The Resolution also defines specific guidelines concerning the consent of the data subject and the use of cookies.

With regard to the first aspect, the Resolution establishes that “personal data may not be used by others, unless the person has given his or her prior consent which requires an expression of consent in full knowledge about such use, namely the manifestation of a free, specific and informed will, and excludes an automatic or tacit usage”.

On the second aspect, the Assembly relates the use of cookies, or similar devices, to the secrecy of correspondence and affirms that “personal ICT systems, as well as ICT-based communications, may not be accessed or manipulated if such action violates privacy or the secrecy of correspondence; access and manipulation through “cookies” or other unauthorized automated devices violate privacy, in particular where such automated access or manipulation serves other, especially commercial, interests”.