Category Archives: internet

Finding a solution to Google’s dilemma on the “right to be forgotten” after the “political” ECJ decision.

The decision of the European Court of Justice on the Google case has re-opened the debate on the importance of remembering and forgetting in the digital age. Legal scholars, columnists and experts have either agreed with the position of the court on the right to be forgotten or, on the contrary, criticised the decision as an attempt to limit the freedom of expression.

Now, the dust is settling and the first “transparency report” published by Google shows that removals have had a limited effect on freedom of expression, although the report presents only limited aggregated data.

For this reason, the time has come to assess the long-term effect of the decision.

From this perspective, the consequences should not be overestimated. This is not a decision on the right to be forgotten, since the news is still available in newspaper archives. It concerns the worldwide access via search engines to online information.

Nor is it a decision against the freedom of expression, since the court explicitly required a balancing test between individual rights and access to information.

Nevertheless, it is a controversial decision. It transforms each search engine into a judge that must decide when freedom of expression prevails and in which cases “the publicity goes to unreasonable lengths in revealing facts about one who has resumed the private, lawful and unexciting life led by the great bulk of the community”, as stated in the Restatement (Second) of Torts in the US.

The critical aspect is not the private nature of the company that performs the balancing test. In a number of legal systems across Europe, the same balancing test is used by media companies in cases regarding privacy, the right to be forgotten or defamation. However, in those cases, the test is carried out by journalists, who take responsibility for checking the facts they publish and have the professional skills to perform it.

By contrast, Google, like any other search engine, neither investigates nor checks the facts, and lacks the professional expertise of a media company.

For this reason, I consider this mainly a “political” decision, in the sense that it pertains to citizens (from the Greek polítes, “citizens”). Remembering and forgetting are fundamental aspects of our individual and social life, and the balance between remembering and forgetting has a substantial impact on our digital society (Mayer-Schönberger, V. 2011).

In spite of that, the decision has pointed out the direction but has not built the path.

The direction is represented by the strong support for data subjects’ rights (“the data subject’s rights protected by those articles [7 and 8, Charter of Fundamental Rights of the European Union] also override, as a general rule, that interest of internet users [in having access to information]”) and, more specifically, by the support for the right to erasure of personal information that has not been “fairly and lawfully” processed. This is not a new right, as some commentators have suggested, but an existing right, recognized both by European law and by national courts in Europe.

Even though the direction has been defined, the technical solution provided by the court (the “path”) is still inadequate. It should be noted that the reason for this lies in the fundamental inadequacy of the existing legal framework, which was written in the 1990s and must now address issues arising from a completely different digital environment.

From this perspective, the decision puts the trade-off between remembering and forgetting at the centre of the debate and (hopefully) prompts a reconsideration of Article 17 of the EU Proposal for a General Data Protection Regulation. This is the “political” value of the decision.

In the light of the above, the future EU regulation should consider the peculiar nature of search engines as data controllers. It should introduce an ad hoc legal provision that excludes the direct enforcement of the right to erasure by data controllers and requires that complaints be directed to a court or data protection authority (DPA). This would prevent search engines from playing the (improper) role of judges in these cases.

At the same time, this provision should also require data controllers to temporarily remove the disputed links when they receive a motivated request from a data subject. This “freeze” of the link would be maintained for a short period (e.g. 20-30 days). If the data subject does not take legal action within this time, the link would be reactivated and no further legal action could be brought over the same link, except where the surrounding circumstances change.
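Purely as an illustration, the freeze-and-reactivate mechanism proposed above can be sketched as a simple state transition. All names, the 30-day window and the states are assumptions for the sketch, not part of any existing system or legal text:

```python
from dataclasses import dataclass
from datetime import date, timedelta

FREEZE_DAYS = 30  # assumed length of the temporary "freeze" window


@dataclass
class LinkRequest:
    """Tracks a data subject's removal request for one search-result link."""
    url: str
    received: date
    status: str = "frozen"   # frozen -> reactivated | removed
    legal_action: bool = False

    def deadline(self) -> date:
        """Last day on which the data subject can still escalate."""
        return self.received + timedelta(days=FREEZE_DAYS)

    def update(self, today: date, court_ordered_removal: bool = False) -> str:
        if court_ordered_removal:
            # A court or DPA upheld the request: link stays removed.
            self.status = "removed"
        elif not self.legal_action and today > self.deadline():
            # No legal action within the window: link is restored.
            self.status = "reactivated"
        return self.status
```

Under this sketch the link is hidden immediately on receipt of a motivated request, restored automatically if the data subject does not escalate to a court or DPA within the window, and removed permanently only after an authority's decision.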

The added value of this approach is that it combines a short temporary restriction on access to information with a model in which the decision is adopted by a court or DPA, not by a private entity.

Of course, some aspects still need to be investigated and improved, notably the process described above and the related need to track requests. Nevertheless, this seems a tractable problem, given that the solution would be implemented by major IT companies.


A few notes about the Google case and the right to be forgotten

The decision of the Court of Justice of the European Union reopens the debate on the right to be forgotten (see Mantelero, 2013).

The Court has affirmed:

“As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public on account of its inclusion in such a list of results, those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question.”

The most controversial aspect of the decision is the evaluation of the opposing interests (right to be forgotten vs freedom of expression). (Zittrain, 2014)

The Court suggests that a “supervisory authority or judicial authority” may order search engines “to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages published by third parties containing information relating to that person”. Nevertheless, the provisions of Directive 95/46/EC do not exclude that the request be made by the data subject directly to data controllers (i.e. search engines). In this case, to avoid lawsuits and claims for damages, search engines would have to promptly perform a balancing test between the person’s interest in privacy and the public’s interest in being informed; yet this kind of test should be made by judicial authorities or DPAs, not by a private company.

In the past, DPAs have ordered media outlets to modify the robots.txt file so that specific content would not be indexed by search engine crawlers. In those cases, a prior balancing test may also be conducted by publishers, who have the professional skills and the duty to check the newsworthiness of the news. For this reason, publishers are in a better position than search engines to balance the opposing interests.
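By way of illustration, such de-indexing orders can be implemented with standard robots.txt directives, which tell crawlers not to fetch the listed paths. The crawler name and article path below are hypothetical:

```
# Block Google's crawler from a specific archived article
User-agent: Googlebot
Disallow: /archive/2005/old-article.html

# Block all other crawlers from the same page
User-agent: *
Disallow: /archive/2005/old-article.html
```

Note that a Disallow rule only prevents crawling; publishers who also want an already-indexed page dropped from results typically combine it with the search engine's removal procedures.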

In any event, the positive aspect of this decision is that it prompts a favourable reconsideration of Article 17 of the EU Proposal for a General Data Protection Regulation, which is clearer than the scenario depicted by this decision. This provision admits a specific exception for freedom of expression and recognizes the role played by courts and regulatory authorities in deciding which data must be erased. Finally, it empowers the Commission to define detailed procedures and solutions for deleting personal information.

Giving a “political” interpretation to the decision, it seems to anticipate the provisions of the EU Proposal, though in a way that should induce lobbies to reconsider their opposition to the “right of erasure” defined in the Proposal.

Competitive value of data protection: the impact of data protection regulation on online behaviour

Abstract

  • The increasing demand from individuals to have their privacy respected or to take decisions about the management of their information assumes a significant role in business activities and it becomes an important element for building public trust in service providers.

  • In this scenario, keeping the focus of data protection only on the individual and his or her decisions is no longer adequate. If legislators consider data protection as a fundamental right, it is necessary to reinforce its protection in order to make it effective and not conditioned by the asymmetries which characterize the relationship between data subject and data controllers.

  • This aim is implemented by the EU proposal by means of three different instruments: data protection impact assessment, privacy by design/by default solutions, and the data minimization principle.

  • The competitive value of data protection can be assured and enhanced only if the user’s self-determination over personal data is guaranteed. From this point of view, countering the phenomena of data lock-in and ‘social’ lock-in is fundamental in order to offer privacy-oriented and trustworthy services, which increase user propensity to share data and stimulate the digital economy and fair competition.

International Data Privacy Law (2013), Oxford University Press

http://idpl.oxfordjournals.org/content/early/2013/07/14/idpl.ipt016.short

[electronic pre-print version]

Data Protection in a Global World

The initial approach to data protection was local: different countries adopted specific legislative measures, in some cases concerning only particular sectors characterized by a high need for data protection. This approach is no longer adequate in a world where data flows across national boundaries many times a second, and in the context of big data, where it is not possible to define in advance which kind of information is relevant and sensitive.

Read the full article here (pdf format)

EU data protection: about a recent post on “Info/Law”

Yakowitz’s post reminds me of synecdoche, the rhetorical figure in which a part is used for the whole; in the same way, Yakowitz considers only the right to be forgotten and forgets the other ninety articles of the EU proposal for a general regulation on data protection. She also forgets the entire EU legal framework on the protection of individuals (Treaty on the Functioning of the European Union) and the historical evolution of data protection over the last thirty years.

There is no justification for declaring that “the European Commission has decided it’s a good time to retard the Internet some more”. This is not the place to compare US and EU legislation, but if we intend to do so, a number of US political and legislative initiatives with a negative impact on the internet should also be noted… here on Info/Law.


Facebook: report of data protection audit of Facebook Ireland published

In the report of an audit carried out between October and December 2011, the Irish Data Protection Commissioner affirmed that Facebook acts as a controller in processing its users’ data. The DPC identified two critical areas: the appropriateness of the controls and related information provided by Facebook to users who share their information with other users, and the use of personal information to target advertising at users.

The Irish DPC made various recommendations for “best practice” improvements and planned a formal review of progress in July 2012.

see also: Complaints against Facebook in Europe

The Parliamentary Assembly of the Council of Europe has adopted a resolution on the protection of privacy and personal data on the Internet

The Parliamentary Assembly of the Council of Europe has adopted Resolution 1843 (2011) on the protection of privacy and personal data on the Internet and online media. In this resolution the Assembly expresses its concern about the huge amount of personal information processed by an ever-growing number of private and public bodies globally, especially online. This situation, combined with the increasing number of data breaches, has led the Assembly to be “alarmed by such developments challenging the right to privacy and data protection”.

In response to this situation the Assembly stated that “cyberspace must not be regarded as a space where law, in particular human rights, does not apply” and suggested that, while technological solutions and voluntary self-regulation may reduce the risk of interference with privacy and the harmful processing of personal data, only “specific legislation and effective enforcement” can adequately protect the right to privacy and personal data.

In setting out its considerations, the Assembly underlines the legal basis of data protection, explicitly recalling Article 17 of the International Covenant on Civil and Political Rights and Article 8 of the European Convention on Human Rights. The resolution also makes reference to previous resolutions, such as Resolution 428 (1970) on mass communication media and human rights (which affirms that “Where regional, national or international computer-data banks are instituted the individual must not become completely exposed and transparent by the accumulation of information referring even to his private life. Data banks should be restricted to the necessary minimum of information required for the purposes of taxation, pension schemes, social security schemes and similar matters”), and the more relevant Convention No. 108.

The Resolution also defines specific guidelines concerning the consent of the data subject and the use of cookies.

With regard to the first aspect, the Resolution establishes that “personal data may not be used by others, unless the person has given his or her prior consent which requires an expression of consent in full knowledge about such use, namely the manifestation of a free, specific and informed will, and excludes an automatic or tacit usage”.

On the second aspect, the Assembly relates the use of cookies, or similar devices, to the secrecy of correspondence and affirms that “personal ICT systems, as well as ICT-based communications, may not be accessed or manipulated if such action violates privacy or the secrecy of correspondence; access and manipulation through “cookies” or other unauthorized automated devices violate privacy, in particular where such automated access or manipulation serves other, especially commercial, interests”.