Moderator: Oonagh Fitzgerald, Centre for International Governance Innovation
The protection of the “digital individual” in human rights as well as in other areas of international law is a topic of great concern and some urgency. The rights advocates and the technology specialist on this panel set out the practical legal and technological questions. The principal focus of the discussion was what challenges exist in offering that protection, but panelist Eileen Donahoe established the premise expressed by the United Nations Human Rights Council in its 2016 resolution on human rights on the internet: in principle, human rights must be protected in the online world just as in the offline world. The challenge is how to do it, and the investigation is underway.
Donahoe presented the questions, along with a clear statement of the challenges, of how to make the protection of human rights effective in the online realm. She suggested that technology challenges the international law framework in four ways: 1) Metaphysical: Is digital identity core to human identity? Not all of the world is connected, but where it is, this is an issue. International human rights law and international humanitarian law both hold that human rights inhere in the human person, so how do our digital footprints count? 2) Conceptual: digital technology operates across borders and is not localized, whereas sovereign states have territorial boundaries. 3) Normative: the digitization of everything challenges the right to privacy that underpins all fundamental freedoms: if everything you do is tracked and monitored, you feel less free to act and speak, and that freedom is crucial to human rights. 4) Practical: the modus operandi of most of the internet is to collect data and monetize users; governments use the same data for surveillance and police work. Advertising supports much of our internet access, and to some extent this may be deemed acceptable (this panel did not cover the role of “liking” products or similar effects).
Emma Llanso used examples from Google to show how algorithms and algorithmic products work and how they impact lives and rights. She played Google's own clip of how search works. Google spokesman Matt Cutts explained web crawlers, or “spiders,” and how they follow the links on a page, then the links on the pages those lead to, and so on. It is important for the casual user to understand that Google searches not the web itself but Google's index of the web, finding every page that contains the search terms. The search asks how often and where the search terms or their synonyms appear, and also whether the page is of high or low quality. This results in page ranking: Google rates a page by how many outside links point to it and how important those linking pages are, giving each page a score. Cutts, somewhat simplistically, calls the result fair and objective without much detail; needless to say there is more to it, but this is the public-facing presentation, and it was informative enough to raise several questions.
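The link-counting idea Cutts describes is essentially the original PageRank formula. A minimal sketch, assuming a tiny invented link graph (the page names and `page_rank` function below are illustrative, not Google's actual code):

```python
# A minimal sketch of link-based page scoring (the basic PageRank
# iteration); the pages and link graph here are invented examples.

DAMPING = 0.85  # probability of following a link vs. jumping to a random page

def page_rank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small base score...
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        # ...and passes the rest of its score along its outgoing links
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = DAMPING * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: both A and C link to B,
# so B accumulates the most link weight.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
scores = page_rank(graph)
print(max(scores, key=scores.get))  # prints "B"
```

The intuition matches the clip: a page's score depends not just on how many pages link to it, but on how highly scored those linking pages are, which is why the calculation must be iterated rather than counted once.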
First, Llanso pointed out the conflict of rights in this sphere by citing the now-famous Google case involving Spain and a user's desire to “be forgotten.” The European Court of Justice decision could be a problem for free-information advocates, many of whom do not favor removing information; here, that interest clashes with a right to privacy. Although there is a difference between removing information and telling the search engine to obscure access to outdated information, she asked whether this remedy was well tailored to what the European Union is trying to achieve. Governments could use this approach to obscure information in ways that damage transparency, so in that context it might set a bad precedent.
Llanso and Fitzgerald discussed issues beyond even European data protection law by bringing in a case to be heard this spring in which the French data protection authority asserts that French law should apply to searches about French citizens regardless of who is doing the searching and from where. While Van Valkenburgh later pointed out that virtual private networks can get around geographic IP identification, the discussion turned to Donahoe's suggestion that a transborder due process model might create a standard that works across jurisdictional lines. International law must acknowledge the European bias toward privacy protection and the American bias toward free expression; when rights conflict, jurisdiction to enforce can be distinguished from jurisdiction to protect your own people. It may require a principle of “do no harm” to other jurisdictions' protections: extraterritorial effects cannot come at the expense of the credible concerns of other jurisdictions.
The panel moved from the practical considerations to the conceptual and metaphysical with a video from the audience-targeting firm Cambridge Analytica on the use of psychographics. This involves three methodologies: 1) behavioral science; 2) big data on demographics and personality; and 3) gathering this data and applying it to target voters with psychographic and psychoanalytic methods, such as the array of personality traits known by the acronym OCEAN (openness, conscientiousness, extraversion, agreeableness, neuroticism). These were used to survey Americans and to tailor a nuanced message to elements of the individual voter's personality. Are they fear-based? Do they adhere to tradition and family? (Notably, a picture of a woman was used in a graphic on the video under the label of neuroticism.) Van Valkenburgh pointed out that while we do let other institutions credential us and keep our driving and tax records, we share information in context, and stripping that context places very personal data alongside more general information. Centralized information control by governments poses dangers, and yet so do fragmentation, specialization, and even crowd-sourcing.
Finally, Van Valkenburgh presented the phenomenon of the blockchain: the idea that a distributed network may bring the user to a place in cyberspace where digital assets can be accessed peer to peer, without intermediaries. He demonstrated a way to “bank” without financial intermediaries or a central authority. A token, similar to digital currency, could represent an individual's right to vote in an election, for example. There is no corporation or government; there are just cyberspace and code rules, not jurisdictional rules. Donahoe mentioned the Internet and Jurisdiction Network, which is trying to develop a due process model that people around the world will agree to for content takedowns, domain name seizures, and data requests from law enforcement. Llanso indicated that law enforcement has a big incentive to work out judicial cooperation for requests from outside, since the U.S. probable-cause standard for warrants, for example, will not yield to a lower standard.
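The “code rules” Van Valkenburgh described rest on a simple mechanism: each block of records carries the cryptographic hash of the block before it, so any edit to history is detectable. A minimal sketch, assuming invented “vote token” records and omitting real-world concerns such as consensus and digital signatures:

```python
# A minimal sketch of the hash-chaining that makes a blockchain
# tamper-evident; the "vote token" records are invented examples.
import hashlib
import json

def block_hash(block):
    """Deterministic hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def chain_is_valid(chain):
    """Each block must reference the hash of its predecessor."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
add_block(ledger, {"token": "vote-001", "holder": "alice"})
add_block(ledger, {"token": "vote-001", "holder": "bob"})  # transfer
print(chain_is_valid(ledger))   # True
ledger[0]["record"]["holder"] = "mallory"  # tamper with history
print(chain_is_valid(ledger))   # False: the chain detects the edit
```

Because altering any past record changes its hash and breaks every later link, no central authority is needed to certify the ledger's integrity; that is the sense in which code, rather than a corporation or government, enforces the rules.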
There has been a focus on Google, Facebook and Twitter regarding hate speech: the U.S. protects it, but many jurisdictions do not. Can we overcome potential conflicts and create global codes through a convention? In the pre-digital era, each jurisdiction could interpret rights for its own culture, a reality check on even the most universal rights. We might agree to protect voting with distributed ledgers that resolve domain names or replace ICANN, taking corporations out of that role. As Van Valkenburgh pointed out, a decentralized network cannot really be taken down, and even that could be bad for privacy yet good for free speech. A future panel might consider the notion of embodiment and how our freedom is extended, and yet put at risk, by our tools of self-extension.
Marylin J. Raisch is the Associate Director for Research & Collection Development and Adjunct Professor of Law at Georgetown Law