Wednesday, November 12, 2014

Augmented Reality and Privacy Concerns: What Will the Law Have to Say?



 Written by Dan Cyr

With augmented reality becoming ever more “real” as the years pass, what effect will wearable glasses and other devices have on our human rights and the laws that protect us? When it comes to augmented reality, there are “inputs,” which capture and record the environment, and “outputs,” which overlay information onto the actual environment in focus. In general, augmented reality senses properties of the real world, processes data in real time, recognizes real-world objects, and outputs information to the user. How will this technology affect our privacy and sense of security?
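To make that pipeline a bit more concrete, here is a minimal, purely illustrative sketch of the sense → process → recognize → output loop in Python. The function names (sense_frame, recognize_objects, render_overlay) are stand-ins invented for this post, not any particular AR platform's API; a real system would sit on top of camera, computer-vision, and rendering libraries.

# Illustrative sketch only: one pass of a simplified AR pipeline.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str          # recognized real-world object, e.g. "storefront"
    confidence: float   # recognizer's confidence in [0, 1]

def sense_frame() -> bytes:
    """Stand-in for the 'input' stage: capture a frame from a camera."""
    return b"raw-frame-bytes"

def recognize_objects(frame: bytes) -> List[Detection]:
    """Stand-in for real-time processing: detect objects in the frame."""
    return [Detection(label="storefront", confidence=0.91)]

def render_overlay(detections: List[Detection]) -> None:
    """Stand-in for the 'output' stage: overlay information for the user."""
    for d in detections:
        print(f"overlay: {d.label} ({d.confidence:.0%})")

if __name__ == "__main__":
    # sense -> process/recognize -> output
    render_overlay(recognize_objects(sense_frame()))

Every stage in that loop touches the legal questions below: the capture stage implicates recording and surveillance law, and the overlay stage implicates liability for what the user is shown.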
Upon further review of the legal concerns and our rights here in the States, I came across a research paper by the UW Tech Policy Lab and Computer Science team at the University of Washington[1]. When it comes to the collection, or “input,” of data by an augmented reality device, the paper identifies the following issues:
·       Today the courts treat nearly any expectation of privacy in public as unreasonable. But technologies such as GPS and drones that are capable of widespread or constant surveillance at low cost are testing the limits of this doctrine. AR will put additional pressure on this cracking edifice because it has the potential to record persistently, source and present related information from various sources to users, and blend seamlessly into the environment.
·       American constitutional law also assumes no reasonable expectation of privacy in information conveyed to a third party. AR has the potential to convey one’s entire stream of observation to a company for analysis and storage, with unclear constitutional import. Design choices about whether to store data locally or in the cloud (or to provide users with a choice) directly affect the level of legal privacy protections afforded that information vis-à-vis the user.
·        Historically, free speech interests have involved the right to express oneself in various media. AR tests the limits of a burgeoning free speech right, recognized by a handful of courts, to photograph public officials or matters of public interest.
·       AR complicates intellectual property law by gathering and potentially transforming copyrighted or trademarked material that appears in the real world. For example, recording copyrighted material likely constitutes copying, for purposes of copyright infringement, at the moment of capture—as well as when copies are saved to external (temporary or permanent) storage. Of course, the usual defenses to infringement (e.g., fair use) apply in these scenarios, but the potentially pervasive and persistent sensing of copyrighted material by AR technologies, combined with manipulation or output issues, raises difficult new questions about how existing intellectual property law will apply to new situations made possible by AR.
·       The form factor of recording equipment has an effect on rulings in the legal landscape. In areas where there is a reasonable expectation of privacy, the presence of obvious recording equipment—like a shouldercam—is considered to serve as a cue that recording may be taking place. While early AR rigs— such as those worn by Steve Mann or Thad Starner— were fairly obvious, modern AR systems are leaning towards more inconspicuous form factors. This, in turn, can have an effect on legal rulings regarding captured footage.
·       AR systems might also be designed to allow remote environmental triggers to control when sensing capabilities should be disallowed (for example, a movie theater may limit the ability of devices to record while a movie is screening); a rough sketch of such a policy check follows this list. This possibility raises novel questions about limits on First Amendment information gathering rights, device ownership, intellectual property protections, and personal privacy.
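As a thought experiment, here is a minimal, hypothetical sketch of what such an environmental “do not record” check might look like. None of this comes from the paper; the names (VenuePolicy, detect_venue_beacon, should_capture) and the beacon mechanism are assumptions invented purely for illustration.

# Illustrative sketch only: an AR device honoring a venue's broadcast policy.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VenuePolicy:
    venue: str
    recording_allowed: bool

def detect_venue_beacon() -> Optional[VenuePolicy]:
    """Stand-in for sensing a nearby venue's policy (e.g., a BLE beacon or QR code)."""
    return VenuePolicy(venue="Main Street Cinema", recording_allowed=False)

def should_capture(policy: Optional[VenuePolicy]) -> bool:
    """Capture is permitted unless a detected venue policy forbids it."""
    return policy is None or policy.recording_allowed

if __name__ == "__main__":
    policy = detect_venue_beacon()
    if should_capture(policy):
        print("sensors on: recording permitted here")
    else:
        print(f"sensors paused: {policy.venue} disallows recording")

Even a simple check like this raises the questions the paper flags: who gets to set the policy, whether the owner of the device can override it, and what happens to the user's information-gathering rights when a third party can switch the sensors off.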
 
When it comes to the output or display of the data, the following issues have been raised:
·       Users of AR may rely upon data that leads to their injury or other harm. Information provided to AR users may be false, incomplete, or misleading. Scenarios range from obscuring a road sign or distracting a driver, to misidentifying a plant or mushroom as safe to eat or failing to inform a user when a potentially dangerous situation is sensed by the technology. This capacity will test the limits of product liability law, among other areas, and the specific design of these systems (e.g., whether they are designed for specific or general purposes) may alter the legal outcomes.
·       AR can furnish users with truthful information they should not have, or at least that they cannot legally use to make decisions. Thus, for instance, a system could use facial recognition to pull up a job candidate’s mug shot, social media profile, or relationship status in a jurisdiction that does not permit employers to discriminate based on arrest history, marital status, or other information that may be available through technological intervention. Thus, the use of AR could contribute to forms of illegal discrimination, raising possible legal liability for users and developers.
·       AR could even prove to be the source of a new category of “digital assault,” i.e., intentional interference with an AR user to cause fear or other harm. Tort law purports to cover such transgressions, but there are next to no test cases to date. There are, however, preliminary examples—for instance, hacking a website for epileptics to attempt to induce seizures, or advertising for exterminator services by creating the illusion that a spider has run across the user’s screen. These factors suggest that the use of AR to surprise, scare, or harm an AR user (particularly when the technology can sense the user is in a vulnerable situation; for example, while driving a car or when the person is depressed or unhappy) may lead to potential liability for something akin to digital assault.
 
So what does this all mean? It means there are plenty of hours of work and negotiation ahead for augmented reality companies and their lawyers. Privacy is a basic necessity that most of us crave as human beings, and companies will need to respect that if they want to make sales.



[1] Roesner, F., et al., “Augmented Reality: Hard Problems of Law and Policy,” Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 1283–1288. http://dl.acm.org/citation.cfm?doid=2638728.2641709
 
