It is striking that many in both groups see living a public life online as the new default, though they often made different arguments about whether this would be helpful in creating a widely accepted regime of privacy or a harmful development that would lead to the unstoppable erosion of privacy.
Beyond the broad thoughts listed above there were additional themes often touched upon among the diverse concerns and hopes of those who answered the initial question “no” or “yes.” A large sampling of the thousands of answers received is organized under these themes in the content that follows.
Themes commonly found in the answers of those who say they expect there will not be a widely accepted privacy infrastructure by 2025
- Living a public life is the new default. It is not possible to live modern life without revealing personal information to governments and corporations. Few individuals will have the energy or resources to protect themselves from ‘dataveillance’; privacy will become a ‘luxury.’
- There is no way the world’s varied cultures, with their different views about privacy, will be able to come to an agreement on how to address civil liberties issues on the global Internet.
- The situation will worsen as the Internet of Things arises and people’s homes, workplaces, and the objects around them will ‘tattle’ on them. The incentives for businesses to monetize people’s data and governments to monitor behavior are extremely potent.
- Some communities might plan and gain some acceptance for privacy structures, but the constellation of economic and security complexities is getting bigger and harder to manage.
Themes in responses of those expecting a trusted and reliable privacy arrangement by 2025
- Citizens and consumers will have more control thanks to new tools that give them the power to negotiate with corporations and work around governments. Individuals will be able to choose to share personal information in a tiered approach that offers varied levels of protection and access by others.
- The backlash against the most egregious privacy invasions will bring a new equilibrium between consumers, governments, and businesses—and more-savvy citizens will get better at hiding things they do not want others to see.
- Living a public life is the new default. People will get used to this, adjust their norms, and accept more sharing and collection of data as a part of life—especially Millennials and the young people who follow them. Problems will persist and some will complain but most will not object or muster the energy to push back against this new reality in their lives.
Themes commonly found in the answers of those who say they expect there will not be a widely accepted privacy infrastructure by 2025
Theme 1) Living a public life is the new default. It is not possible to live modern life without revealing personal information to governments and corporations. Few individuals will have the energy, interest, or resources to protect themselves from ‘dataveillance’; privacy will become a ‘luxury.’
A principal engineer at Cisco wrote, “I would like to eat all I want and lose weight, but that trick does not work either.” An anonymous respondent wrote, “Privacy rights will be managed by market solutions, with the affluent able to maintain better control of their privacy. Like luxury cars and summer homes, control over private data will be the privilege of winning financially.”
Kate Crawford, a professor and research scientist, responded, “The last 10 years have given us a discouraging surfeit of evidence that companies will privilege their ability to extract, sell, and trade data rather than establish simple, easy-to-use privacy protecting mechanisms. In the next 10 years, I would expect to see the development of more encryption technologies and boutique services for people prepared to pay a premium for greater control over their data. This is the creation of privacy as a luxury good. It also has the unfortunate effect of establishing a new divide: the privacy rich and the privacy poor. Whether genuine control over your information will be extended to the majority of people—and for free—seems very unlikely, without a much stronger policy commitment. Optimistically, people are better informed about how their data can be used to discriminate against them and demand greater security, privacy, and access to due process. Pessimistically, people may want those things, but they have no real power to get them.”
The executive director of a nonprofit that protects civil liberties online responded, “I do not think policymakers or technology innovators have the incentives to create a privacy-rights infrastructure, but even if they did, I do not believe governments will stop mass surveillance. It breaks my heart, but I do not think we are going to get this cat back into its bag. Sadly, I think individuals will get used to the fact that mass surveillance exists and will not expect privacy by 2025.”
Bryan Alexander, technology consultant, futurist, and senior fellow at the National Institute for Technology in Liberal Education, wrote, “Too many state and business interests prevent this. Governments, from local to national, want to improve their dataveillance for all kinds of purposes: war fighting, crime detection, taxes, and basic intelligence about economics and the environment. Companies badly want data about customers, and some base their business models on that. I do not see this changing much. Citizen action is probably the best option, much as it was for crypto in the 1990s. But, I do not see that winning over governments and big business… In the United States, both political parties and the clear majority of citizens cheerfully cede privacy.”
Clifford Lynch, executive director for the Coalition for Networked Information (CNI) and adjunct professor at the School of Information at the University of California-Berkeley, wrote, “Government and industry have aligned and allied to almost totally eliminate consumer and citizen privacy. This will not be allowed to change at scale—it is too convenient and too profitable for all parties involved. Today, it is almost impossible for consumers to opt out of the corporate side of this data collection and tracking because it is so pervasive, and, in 2013, the stunning scale of the government side of data collection has become clearer, as well as the government’s willingness to either purchase or legally demand data collected by corporations that the government cannot collect directly. You will see a small fringe of technically savvy people who will try to continue to deploy technology to protect some privacy for some purposes, but this will be small and periodically attacked or placed under particularly intense surveillance. You will also continue to see the government try to punish corporations who try to side with their customers, and reward corporations who are helpful to government objectives.”
Cathy Davidson, co-director of the PhD Lab in Digital Knowledge at Duke University, and co-founder and principal administrator of the MacArthur Foundation Digital Media and Learning Competition, wrote, “I fear the coming of walled Internets, where there is security but also pay walls—and the security is partial. The relationship of privacy, security, and openness is not resolved, and I fear it will not be done in a way that allows for openness in the future.”
An anonymous respondent replied, “There will not be a trusted privacy-rights infrastructure allowing for individual choice… The overall public perception will be that the right balance has been struck, as privacy will be only a concern for cranks. Employer concerns about employee behavior off-hours will fade, as a generation will have come of age with shared party photos and selfies, and will reject current norms requiring either privacy or sanitized private behavior—a concept which will have little meaning.”
Theme 2) There is no way the world’s varied cultures, with their different views about privacy, will be able to come to an agreement on how to address civil liberties issues on the global Internet.
Alice Marwick, researcher of the social and cultural impacts of social media and author of Status Update: Celebrity, Publicity, and Branding in the Social Media Age, wrote, “It will be quite difficult to create a popularly accepted and trusted privacy rights infrastructure. This is for a number of reasons. First, countries, regions, and cultures differ in their approaches to privacy. For example, the United States, European Union, and Canada all have different approaches to online privacy and what constitutes acceptable data collection.”
John E. Savage, chair in computer science at Brown University and a fellow of the IEEE and the ACM, wrote, “A secure, accepted, and trusted privacy-rights infrastructure on the Internet, at the global scale, is impossible for the foreseeable future. For too many large nations a tension exists between state security and privacy rights. They will not sacrifice the former for the latter—a position that is not going to change unless revolutions occur, which is highly unlikely in the more developed nations. In democratic countries, bilateral and multilateral agreements respecting the privacy of citizens for commercial purposes are likely to be developed. It is highly unlikely that nation states will forswear invasion of individual privacy rights for national security purposes.”
There is also the matter of cultural differences between Internet business interests and governments’ interests. An anonymous respondent replied, “I have difficulty foreseeing policymakers and corporations coming to agreement on privacy issues when there is little current agreement. Also, security is clearly not a high priority for corporations, and there seems to be little effort on the policy side to compel them to take it seriously. Content and apps will take care of themselves. I also do not see privacy becoming a major norm without some very major, personally affecting event. There are already tools that do not get taken advantage of to help with privacy, and people make little effort to change their behavior to promote privacy.”
Theme 3) The situation will worsen as the Internet of Things arises and people’s homes, workplaces, and the objects around them will ‘tattle’ on them. The incentives for businesses to monetize people’s data and governments to monitor behavior are extremely potent.
An anonymous respondent wrote, “As long as greed plays a role in our society, it will always be dominant in how policymakers and corporations treat the individual. There will be less privacy and more access to everything, including your DNA.”
Vickie Kline, an associate professor at York College, responded, “Medical privacy will be the most paradoxical; we will have unprecedented data at our fingertips to make proactive decisions about our health, but the objects around us, and even our clothes, will tattle in real-time about the choices we make. We have to work towards security, liberty, and privacy online, but government and corporate intelligence and hackers will always keep us outside of the comfort zone. I wonder if the expectation of privacy as a right will gradually fade as people experience less actual privacy in their lives.”
A self-employed software designer and policy researcher wrote, “Policymakers and private industry will do what it takes to convince consumers that they are reasonably secure, while also continuing to permit industry to exploit consumer information (at individual and collective levels) in new ways for profit and for purposes that suit state ‘needs’ (these needs being determined by the dominant value system, which usually is framed in terms of promoting free market-based ‘innovation,’ state security, taxation, etc.). If we are speaking about so-called Western states: the young people today will be adults. They already have too much control as the main consumers of technology and as the voices that industry caters to and tries to manipulate through ‘identity empowerment.’ At that point, they will be the value definers. They already have completely different concepts of personal identity, privacy, etc… We can expect that ‘private’ will not be an adjective that commonly precedes ‘space’ or ‘life,’ and that public disclosure and exposure of intimate life or economic details may not even be described as such, that associating corporate brands with personal identities will continue to perpetuate until people do not even recognize branding as branding (actually, that is already the case — cf. ‘sent from my iPhone’ and logos on clothing). Even physical ‘private property’ may become more exposed and less private, as we increasingly turn to home automation technologies to remotely control our door locks, IP security cameras, lights, alarms, etc.”
Kalev Leetaru, Yahoo fellow in residence at Georgetown University, wrote, “While… people publicly discuss wanting more privacy, they increasingly use media in a way that gives away their privacy voluntarily—for example, broadcasting their location via phone GPS when posting to social platforms, photographing their entire lives, etc. People seem to want to be famous, documenting their lives to the most-minute detail, in ways that would have been unheard of to a past generation. Moreover, each time a major social platform reduces privacy even further, there is a roar of public backlash and promises that people will leave en masse, but no one actually leaves the platforms, and in fact, more sign up. Thus, people are not voting with their feet. Companies have no incentive to increase privacy, which reduces revenue possibilities in terms of selling advertising and products based on identity and desires… For my detailed thoughts on this, see the chapter Tony Olcott and I wrote for a volume on changing norms on privacy.”
Theme 4) Some communities might plan and gain some acceptance for privacy structures, but the constellation of economic and security complexities is getting bigger and harder to manage.
A pioneering academic computer scientist from Princeton University wrote, “I do not expect a comprehensive solution in this area, nor one that makes everybody happy. These will continue to be contested areas, with different parties using legal, political, and technological means to advance their interests. We will have a stronger and better-defined notion of how to protect vulnerable populations such as children. We will have a better-defined set of social norms around the use of private information. We will have a better understanding of how ‘pseudonymous’ information about behavior and relationships affects people’s privacy interests.”
A distinguished engineer, working in networking for Dell, wrote, “There are too many challenges to maintaining privacy and providing security at the same time. In some ways, they are conflicting goals. People will become more aware of the lack of privacy, but, if anything, there will be less of it.”
Brian Butler, a professor at the University of Maryland, responded, “Within the United States, we have already largely decided to privilege the corporate use of personal data for ‘utilitarian’ purposes, to the point where it is difficult to see what could happen in the next twelve years to shift this… I suspect that there is a developing privacy ‘divide’—one group (the same group that ‘wins’ in the case of the digital divide) will have the technical and literacy skills to manage privacy—but because of this, they will be cavalier with this (hence the ‘privacy is outdated’ idea). On the other hand, individuals who lack the skills, ability, or time to manage this complex issue will be increasingly subject and resigned to less and less privacy and control. One thing that we have to remember is that it is hard to figure out the institutional and technical aspects of privacy if you are working two or three high-effort jobs and trying to stay awake.”
A principal engineer with Ericsson wrote, “The real danger here is not just the further invasions of privacy, but also the increasing impression that people can have their behavior modified ‘for the good’ through these means. The danger comes when people move from attempting to modify behavior for commercial reasons to trying to modify behavior for political ones, by examining what makes people think a certain way or prompts them to take action or causes them to believe certain things en masse. In fact, it can be argued that we are already seeing this sort of thing take place, with large data analytics firms, such as Google and Facebook, getting deeply involved in politics. The Internet will go in one of two directions: either people will reject behavior modification through data mining en masse, or we will become so habituated to having our behavior modified through data mining that we will not even consider the consequences by the time 2025 rolls around. It is hard to tell which direction things are going to go at this point, but if it is the former, the backlash against technology in general is going to be greater than we imagine, I think. By 2025, privacy will be a moot issue, most likely. Instead, we will be focusing on the moral issues behind using proven techniques of behavior modification, if there is any debate at all.”
Themes in responses of those expecting a trusted and reliable privacy arrangement by 2025
Theme 1) Citizens and consumers will have more control thanks to new tools that give them the power to negotiate with corporations and work around governments. Individuals will be able to choose to share personal information in a tiered approach that offers varied levels of protection and access by others.
David Weinberger, a senior researcher at Harvard’s Berkman Center for Internet & Society, observed, “They will because they have to. Unfortunately, the incentives are unequal: There is a strong incentive to enable strong privacy for transactions, but much less for enabling individuals to control their own info. So, of course, I do not actually know how this will shake out. I assume we will accept that humans do stupid things, and we will forgive one another for them. When your walls are paper, that is what you have to do.”
JP Rangaswami, chief scientist for Salesforce.com, predicted, “I suspect that, in times to come, privacy rights will begin to look like the ‘Four Drivers’ in the Nohria-Lawrence ‘Driven’ model: the right to ‘defend’ private information; the right to ‘bond,’ or share, it; the right to ‘learn,’ or gain insights, from it; and the right to ‘acquire,’ or own, it. As we learn more about the value of personal and collective information, our approach to such information will mirror our natural motivations. We will learn to develop and extend these rights. The most important change will be to do with collective (sometimes, but not always, public) information. We will learn to value it more; we will appreciate the trade-offs between personal and collective information; we will allow those learnings to inform us when it comes to mores, conventions, and legislation.”
An anonymous survey participant who works in the US executive branch commented, “Governments will have to learn to do more as public-private partnerships and active engagement with citizens to do crowdsourcing. The nation-state model is already being challenged; issues span borders and across sectors. The infrastructure will require transparency among governments as a trusted partner—but also recognizing that not all data can or should be made open. We will be trusting machines more; we will have our digital device (a smartphone, an embedded device in us, etc.) interface with systems to pre-negotiate what information we will and will not share. End-user licensing agreements will be machine-to-machine.”
David Bollier, a long-time scholar and activist focused on the commons, responded, “There are feasible alternatives already being developed, such as by ID3 in Boston. Here are two pieces that shine a light on this area—one by Doc Searls (summarizing Fred Wilson) … and the other, my own piece (with John Clippinger) on ‘authority and governance’ as the next big Internet disruption… The existing structures are highly unlikely to yield the infrastructure that we need—but an alternative system is still possible, if only because the latent network value of doing so is so huge. Assuming the infrastructure development pathway mentioned above comes to pass, privacy norms will be affirmatively structured and managed, mostly by tech systems amenable to meaningful human control, rather than ‘taken for granted’ as a natural social reality. This will require that ordinary individuals be empowered to protect their privacy rather than relying upon government surrogates to do so. We have seen how government is far too beholden to national security and incumbent corporate interests, and too centralized and bureaucratic in a networked age, to be an effective watchdog and implementer of larger collective concerns.”
Theme 2) The backlash against the most egregious privacy invasions will bring a new equilibrium between consumers, governments, and businesses—and more-savvy citizens will get better at hiding things they do not want others to see.
Peter McCann, a senior staff engineer in the telecommunications industry, responded, “There is a large momentum toward increasing privacy protections on the Internet in the wake of the Snowden revelations. A new infrastructure of pseudonymous communication and transaction will be created over the next few years, with robust privacy protections built in. These protections will take the form of a distributed database, where cooperation among many entities will be required to reveal personal information about a user, making the secret warrant useless, and warrantless intrusions on privacy impossible. There will be a broad expectation of privacy unless social norms are violated in an obvious way, in which case, the offender will be rapidly tracked down and sanctioned.”
Christian Huitema, a distinguished engineer with Microsoft, replied, “I expect many efforts to make the Internet more robust to attacks, including attacks by secret services. But, I do not think that privacy rights can be protected by an ‘infrastructure.’ They can, on the other hand, emerge from competition, i.e., ‘free as spy’ services competing with some ‘pay and trusted’ services. People are going to learn what to share and how to share it. We see that, already, among the young generation. Project a neat, public image, and keep your personal stuff actually private.”
Tom Standage, digital editor for The Economist, wrote, “As with financial regulation, privacy regulation makes progress as a result of regular crises. Technology firms (and security agencies) will repeatedly over-reach and then be brought into line by consumer pressure/boycotts and new regulations. In this way, we will discover where people would like to draw the line when it comes to paying for Internet services using personal data. I think this trade-off will become more explicit: use this service free by giving us access to your data, or pay for it. For a long time, it has been assumed that Gen Y-ers have a different attitude to privacy and are more inclined to make everything public; the success of Snapchat this year suggests otherwise. As people get older, they worry about this more. It is possible to have mass take-up of publishing tools, while also agreeing that it makes sense to keep some things private.”
Theme 3) Living a public life is the new default. People will get used to this, adjust their norms, and accept more sharing and collection of data as a part of life—especially Millennials and the young people who follow them. Problems will persist and some will complain but most will not object or muster the energy to push back against this new reality in their lives.
Stewart Baker, a partner at Steptoe & Johnson, a Washington law firm, wrote, “Security is a pain in the butt, a major inconvenience. It also hampers innovation. We will not give up convenience and innovation without living through a disaster. Almost everything we are shocked and worried about—including all the things we are saying the government should never do—will be commonplace by 2025. And, it will not really bother us that much. Privacy is the most malleable of expectations.”
Ben Shneiderman, professor of computer science at the University of Maryland, wrote, “There will continue to be pressures for increased security, liberty, and privacy, but there are powerful forces working to enable businesses to track behavior, as well as government to monitor activity. While I am not fearful of dystopian futures, doing things on the Internet will be much more like being in public than being in the protected privacy of your home. Recognition of the Internet as a public, and not private, space will be more widespread. There will still be scams, pornography, stalking, etc., but the worst cases will be stopped, and Internet benefits will outweigh threats. Premium services that offer more privacy will be valued.”
Marjory Blumenthal, a science and technology policy analyst, wrote, “There is a lot of pressure to do something—now. So, one can expect work on an infrastructure that will be relatively secure. Whether it will be popularly accepted—that is harder to say, since skepticism has skyrocketed. People will become more aware of the tradeoffs, which will drive an evolution of norms. They will also have become more sophisticated about choices regarding disclosures they make, exercising finer-grained control—in part because there will be more technical support for doing so—and there will also have been evolution of the legal and regulatory framework.”
Jeff Jaffe, CEO for the World Wide Web Consortium, the standards-setting body for the Web, wrote, “Today’s policy makers have difficulty in making basic policy tradeoffs in existing areas such as spending and taxes. They are not ready to step up to these new complex issues. The generation of teenagers growing to adulthood will have different norms for privacy than today’s adults.”
Jonathan Grudin, principal researcher at Microsoft Research, responded, “There is an inevitable tension between potential commercial exploitation of personal information by businesses, including those that are well-intentioned, and the desires of some individuals. Businesses will always be motivated to push infrastructure boundaries, whatever they are. In fact, the more work we invest in developing a framework that seems balanced, the more a business can find grey areas, workarounds, and loopholes in good conscience. Young people are more used to a world with cameras everywhere. They spend more time online and identified. The older generation developed behavioral habits that assumed a degree of privacy that young people have not experienced. What oldsters would have to give up, young people will not miss. In 2025, more of the population will have grown up in the new world, so concern about privacy will decrease and perhaps shift in emphasis. Of course, the dwindling ranks of dinosaurs may not see things much differently than they do now.”
Privacy is a passing artifact of the industrial age: One further insight emerged in several answers—that privacy might gradually fade and become recognized as a social construct of the industrial age. Several noted that the rise in urbanization that came once factories were built moved people from villages where they enjoyed little privacy into social settings where privacy among the masses could be achieved. Now, the pervasive social connectivity and awareness afforded by digital technology could be returning people to that village-like environment.
One version of that thought came from Bud Levin, a futurist and professor of psychology at Blue Ridge Community College in Virginia: “Increasingly, and gradually, people will realize that privacy, anonymity, confidentiality, secrecy, and similar constructs of the industrial age, are giving way to ubiquitous transparency. Consider how we might behave when we know that everything we do is or could easily become headline news. Privacy laws will become more obviously incompatible with normal behavior. They will be trying to push back the ocean. That is likely to generate increasing contempt for government.” And Vickie Kline, an associate professor at York College, responded, “Government and corporate intelligence and hackers will always keep us outside of the comfort zone. I wonder if the expectation of privacy as a right will gradually fade as people experience less actual privacy in their lives.”
One thoughtful writer synthesized a variety of “yes” and “no” themes and put them in the context of a future in which the Internet of Things, more powerful artificial intelligence, big-data analytics, and other factors combine to learn and infer things about individuals.
Barry Chudakov, founder and principal of Sertain Research, wrote:
“By 2025, we will begin to define an emergent problem: any secure, popularly accepted, and trusted privacy-rights infrastructure must balance transparency with intrusion, and as we develop subtler and more powerful technologies that become ever more intrusive, we will realize how difficult this is.
We will continue to monetize watching and tracking; cameras and recognition technologies will create ‘everyware.’ As we do, rights and choices will collide; we will struggle to satisfy forces of personal privacy, secure data, compelling content, and tracking and analytics. This entails ‘thinking fast and slow’—and in a decade, we will still struggle with statistical (probability) thinking versus quick-get thinking. We will be challenged to not fall in love with our invasive tracking, watching, and predictive technologies and their beautiful data displays, marketed in easy-to-use formats.
We will slowly realize the inherent conflicts between our data summations and the reality they are summarizing. Privacy will ostensibly be hidden behind this mask of abstract data; it may well be hidden by the seductive insights of simulation. This will create both intense interest and equally intense insecurity about personal information. As monitoring and statistical tools enable us to abstract behaviors to norms, trends, and predictions, it is inevitable, given our inclination to turn information into more information, that we will engage with the abstraction as if it were real—as if it were the concrete thing it is abstracting. This can lead to inaccurate, even wildly distorted, perceptions, as we saw in the credit default swaps of the 2008 Wall Street meltdown.
A healthy tension will arise: many of us will separate the thing (whatever it is we are tracking and analyzing) from the data abstraction; but equally, many others will not—either because they cannot (they are not trained or equipped to do so) or because they do not want to. Yes, more average persons will start to understand opt-in data capture and monitoring protocols that enable tracking and analytics. But, when our gestures and bodily identifiers—gait, ear lobes, eye movements, faces, emotional responses, or behaviors and choices—are the content of that tracking and analysis, we ourselves become the abstraction.
I do not believe that, in a decade, we will have resolved all the quandaries of this new reality. We will become smarter about it, but we will also be more conflicted. This abstraction of our actions and inclinations is bedeviling because privacy and tracking and analytics should be at odds with each other; they are strange bedfellows, and we are better served by understanding that tension among them is healthy. Further, once our movements and choices and behaviors are captured, digested, and brought to some enhanced understanding, we may know more about certain actions.
But, in our delight over data and analytics, will we match that with enhanced empathy for and understanding of each other? The public will slowly come to realize that privacy is what we are left with after our technology enables discovery of what we want to know. Do we want to know your actions, your behaviors, and your pathway through the city or store? Do we want to know your face, your emotions, and your demographic data? Do we want to know if you were in a certain place at a certain time? Such knowledge, and much more detailed knowledge of behavior patterns, trends, and predictions, will define privacy in the broader social context.
Like the remainder in a division problem, privacy is the problematic ‘answer’ after we divide our lives by the technology to watch and measure those lives. Knowledge is power (and profit) for some, and it is so-called ‘security’ for others. The struggle with the balance between the far-reaching knowing of technology and keeping our identity intact is the future of privacy in a broader social context. We now think of privacy as the ability to keep our information to ourselves. By 2025, we will expand that to include the ability to keep our identity and natures from being invaded by other techno-forces, as well as to keep our identity in line with our intent and volition.
The desire for fame has already put identity up for sale; as our technologies enable us to know virtually anything about others, privacy will become a commodity. It will be sold to the highest bidder; privacy will become the offshore bank account of identity. Regular people will be transparent; those who can pay for opacity will do so via new services and business models. While I think this will eventually be available to more than just the wealthy, we will not have sorted this all out in just a decade, and so, privacy will be available to those with the fattest bank account.”