Glass, feature creep and the ‘end of privacy’

It’s been a year since Google co-founder Sergey Brin introduced the world to Project Glass, igniting debates about what’s cool and creepy about the specs.

On the one hand, the technology has the potential to disrupt numerous industries – education, medicine, and law enforcement, to name a few. But at the same time, Glass raises obvious privacy concerns, as the web-enabled specs allow users to capture photos, take videos and share live footage.

Google has implemented various measures aimed at alleviating concerns about the privacy implications of Glass. For instance, Google incorporated a red light intended to put the public on notice when the camera is in use. The Glass developer policy also provides the following notice:

 “Don’t use the camera or microphone to cross-reference and immediately present personal information identifying anyone other than the user, including use cases such as facial recognition and voice print. Applications that do this will not be approved at this time.”

And Google states that it intends to remotely block apps and to disallow automatic software updates in an effort to prevent unintended uses of Glass.

However, for all of Google’s efforts, preventing feature creep will be a futile exercise. For example, hacker Stephen Balaban of Lambda Labs is developing an alternative operating system that lets users incorporate facial recognition into the specs. A quick review of the #ihackglass Twitter stream suggests that he’s not the only one… This reality has prompted some privacy advocates to suggest that Glass may be the end of privacy as we know it.

Is Glass the end of privacy? Probably not – and certainly no more so than other emerging technologies with widespread privacy implications, such as drones (which, at least for now, are much cheaper to obtain than Glass). But even if we accept that Glass is the most privacy-invasive technology on the immediate horizon, its net effect is not necessarily going to be bad for privacy.

Technologies like Glass provide immense opportunity for innovation in privacy. You know all of those people who are worried about Glass? They represent a mass consumer market for developers of responsive privacy-enhancing technologies.

Admittedly, the trajectory of technological innovation has favored privacy-diminishing products – but this need not be the case. For instance, NYU researcher Adam Harvey is reverse-engineering facial recognition technology with the goal of developing makeup that prevents the technology from reading human faces. As this example suggests, at least from the perspective of innovation in privacy, Glass feature creep may not be such a bad thing.
