The future of privacy: the necessary data or a march towards a future of corporatocracy?


The combination of governmental and corporate data tracking is enabling outside entities to record increasingly granular details about our lives and interactions. Most of the stated reasons are positive, focused on protecting citizens or providing services we find useful. To do these jobs more efficiently, access to more data is critical: enough channels must be queried to create accurate connections, whether identifying a terrorist organization's leader or helping us make certain we get enough exercise each day.

If we look at most science fiction, especially far future depictions of our living context, it usually involves technologies and interfaces that would require exactly the type of data we are so concerned about at the moment. Retinal scans, focus awareness, life details so the virtual butler can coordinate with others, effortless communication, self-driving vehicles and more. All of these, in order to simplify our lives and enjoy the parts we want, require constant tracking of some form.

There are, of course, exceptions. The Millennium Falcon wasn't registering flight plans when it shot out of Tatooine's space docks. And Frank Herbert didn't seem overly concerned about Big Data on Arrakis. But both of these worlds are nearly as much fantasy as science fiction and were not necessarily focused on technological minutiae (which is part of what makes them great art).

The concerning issues are not how easily Amazon ships our next box of cereal, whether by truck or by drone, or how readily the NSA is able to help Homeland Security identify a terrorist cell. These are the intended consequences of our technology development and why so many companies show up at CES every year trying to present the next big thing.

The concerning issues are the unintended consequences of new technologies and how they might cause us personal harm over time, especially in ways we cannot easily identify. Let's begin by understanding that, while most of this concern is currently front page in the mainstream press, mostly due to Edward Snowden's leak of NSA practices, it has been bubbling on the fringe for some time.

One of the major players watching with concern is the Electronic Frontier Foundation (Wikipedia article), which was formed nearly a quarter century ago in response to concerns about how new online communication channels could be used against citizens. While the work the EFF does is very important, I wonder what will win as we move forward. Will the citizens of democratic countries choose to refuse some useful technologies due to the risk, or will they adopt first and deal with the fallout later?

The healthiest path to a future where technologies manage minutiae while we humans take on higher-level issues would require a market in which new technologies, systems, and practices can still be introduced, but in which the most concerning of them are refused until they are released in a non-threatening form, or until the law changes to permit their use without unintended consequences.

In Everett Rogers' Diffusion of Innovations, the professor examines how and why new ideas and technologies are adopted or refused. (This was part of the reading for my Interactive Technologies course, and I highly recommend it if you're interested in the factors influencing wide-scale consumer behavior.) In his work, Dr. Rogers identifies five adopter categories (follow the link for details):

  • Innovators
  • Early adopters
  • Early majority
  • Late majority
  • Laggards
[Graph: Diffusion of ideas — two curves showing success and failure]

The graph (via Wikipedia) depicts both the adoption rate (in yellow) and the market share captured by each adopter category (in blue). For an idea or technology to succeed, it must reach critical mass, the point at which it begins to saturate the social system. Without critical mass, adoption slows and, in most cases, eventually fails.
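Rogers' model can be sketched in a few lines of code. The category percentages below are his published estimates; the logistic S-curve is a standard way to model the cumulative adoption he describes, not something specific to this article, and the rate parameters are illustrative assumptions:

```python
import math

# Rogers' five adopter categories with his approximate shares of the
# eventual adopting population (from Diffusion of Innovations).
CATEGORIES = [
    ("Innovators", 2.5),
    ("Early adopters", 13.5),
    ("Early majority", 34.0),
    ("Late majority", 34.0),
    ("Laggards", 16.0),
]

def cumulative_adoption(t, k=1.0, t0=0.0):
    """Logistic S-curve: cumulative fraction of the market adopted by time t.

    k (steepness) and t0 (midpoint) are illustrative parameters."""
    return 1.0 / (1.0 + math.exp(-k * (t - t0)))

# Cumulative boundary at which each category begins adopting. Critical
# mass sits early in this progression: an innovation that stalls before
# the early majority (about 16% cumulative) usually fails.
boundary = 0.0
for name, share in CATEGORIES:
    print(f"{name:15s} begins at {boundary:5.1f}% cumulative adoption")
    boundary += share
```

Tracing the loop shows why the early-adopter band matters so much: the early and late majorities together account for 68% of the eventual market, but they only come on board once the first 16% has already committed.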

While groups like the EFF are watching specific responses by President Obama on NSA reform, there is also concern as Google moves into our homes via the purchase of the increasingly popular Nest smart thermostat and Protect smoke alarm. At the end of 2013, Nest was a technology media darling, an impressive combination of design and capability that transformed what had been a relatively stagnant market. In 2014, these same two devices are suddenly viewed as a Trojan horse allowing Google to track granular data such as energy use, heating and cooling settings, and even when the family is home.

I should mention here that I fall into this category. I received a Nest for Christmas after my wife watched me drool over them, trying to decide if they were worth the cost. So my thermostat is now a Google node, capable of pulling data about my life in ways I did not originally consent to. I am not particularly concerned at this time, but this is a change worth watching in case my data can be misused.

Wired has two interesting articles on the subject. Marcus Wohlsen identifies the data Google gains from the purchase. Dan Hon delves into further concerns, asking whether we can trust Google, or other large corporations, with access to this information. While both are worth a quick read, I also suggest you read through some of the comments on each article. Both offer insightful and passionate viewpoints from those who are very concerned and from those who do not believe this is a major issue (the latter group appears to be in the minority, if article comments can be used as any accurate measure).

Still, the overriding concern seems to remain that too many consumers will simply focus on the perceived benefits of adopting new technologies and services in place of careful consideration of long-term issues. What are your thoughts? Can we, as consumers, reach a point where our awareness of risk overrides our interest in new, shiny objects? Or, are successful marketing and identity manipulation (it's a feature!) enough to assuage concerns and sell product?

About the author:

Interested in the social impact of our future advancements, Daryl developed and built Regarding Tomorrow as a platform to share and discuss our collective hopes and fears of the future. Daryl's background is in education, including graduate studies in special needs and a master's in instructional technology from UVA's Curry School of Education. He has worked as a high school teacher and has over 10 years of university experience in the US and Canada.
