Opening Remarks
Urs Gasser
Let me introduce some of the core themes of this symposium by reporting on an April 2011 decision of the Federal Administrative Court of Switzerland – my home country. In that ruling, the Swiss Federal Administrative Court required Google Switzerland to take extra measures in the context of its StreetView service in order to comply with Swiss data protection law. The court required that Google not only use automatic blurring technology to obscure people’s faces and the number plates of vehicles, but also manually blur other identifying features, such as skin color and clothing, of people photographed in front of “sensitive establishments,” such as women’s shelters, retirement homes, prisons, schools, courts, and hospitals. In addition, Google is not allowed to provide views into gardens, backyards, and the like that a “normal pedestrian” could not see when walking by.
The ruling – here used only as a placeholder for the broader phenomenon – nicely illustrates some of the core themes and topics that this conference seeks to address.
First, the ruling illustrates the complexity of the phenomenon under investigation: the delineation of the private and the public in the digitally networked age. The StreetView case not only serves as a proxy for the ways in which digital technologies shift the boundaries between private and public. It also demonstrates the need for a much more nuanced, granular, and context-sensitive definition of what privacy means – including privacy in public spaces such as streets – or libraries, to refer to David Weinberger’s excellent blog post.
Second, the Swiss StreetView ruling – which Google has appealed – reminds us of the power of, and responsibility for, design choices. The power of design choices in this case obviously concerns the ways in which StreetView is built, from the design of the underlying code to the actual pictures taken. A number of “power effects” can be isolated:
• Design choices can have enabling effects: In the case of StreetView Switzerland, over a thousand companies, small and large, offer services built on top of StreetView.
• Design choices may have leveling effects: StreetView, for instance, allows users with disabilities to explore cities in ways that were previously available only to people without disabilities.
• Design choices of course also constrain what we can do with technology, how we use it, and for what purposes. From a cyberlaw perspective, this constraints perspective has received much attention; Professor Lessig popularized it with the equation “code is law.” But as with law, it would be a mistake, in my view, to put too much emphasis on design as a constraint on behavior. It’s only one facet, and arguably not the most important one.
• Finally, design choices can produce unintended consequences or spillover effects. Again, StreetView is a good example: the privacy of people pictured by Google’s cameras can in fact be compromised.
One of the interesting questions is what corrective mechanisms are available to balance this power of design choices, both ex ante (e.g., before the launch of a new app) and ex post. The question of designers’ accountability seems largely unresolved.
Third, the StreetView ruling raises the question of what role social norms play vis-à-vis technological innovation. How do we think about and evaluate design that is bolstered by social norms? In the case of StreetView, for example, a significant percentage of Swiss residents use the service and seem to support the underlying choices that have been made. And then, perhaps even more importantly, how well equipped are we to respond to instances where certain design choices clash with social norms? Are public apologies à la Facebook earlier this week the appropriate response, or do lawsuits à la Google Buzz do the trick? Or can we envision more productive, discursive responses?
Fourth observation: the StreetView ruling is symptomatic of the types of questions the legal system has to cope with when operating in a quicksilver tech environment. Technological advancements, commercial practices, and shifts in user behavior put pressure on traditional legal concepts and definitions. What, for example, does “personal information” mean?
The StreetView ruling also nicely illustrates the typical response pattern of the legal system: it seeks to subsume new phenomena under old rules, and only over time does it respond with innovation within the legal system – typically the enactment of new rules or the application of new doctrines (see Grokster).
Much more could be said about the law. Let me just add one particular aspect: the StreetView ruling is also interesting in that it raises the question of the legal system’s reasonable expectations vis-à-vis certain design choices and their limitations. In this particular case, the law expects perfection: the court did not consider it sufficient that Google’s blurring technology catches 99% of the faces of the individuals pictured.
One could distill several more dimensions and themes from the StreetView ruling for this conference, including economic considerations, which obviously play a key role across the board. Instead, let me conclude by sharing the following thought: In an environment where the lines between private and public spaces are blurring, where technology is developing rapidly, where user behavior is in constant flux, where complex feedback loops exist among technology, law, economics, and behavior, and where norms become increasingly contextual, fragmented, and ad hoc, our responsibility and challenge as designers – including law- and policy-makers as well as other professionals – should include the creation of advanced spaces for negotiation and conversation about privacy and its boundaries, the exploration of new types of interfaces among spheres and layers, and the creation of hybrid private/public spaces.
I hope today’s conference is a step in this direction.