May 2, 2013
“The ubiquity and power of the computer blur the distinction between public and private information. Our revolution will not be in gathering data — don’t look for TV cameras in your bedroom — but in analyzing information that is already willingly shared.”
Are these the words of a 21st century media critic warning us about the tremendous quantity of information that the average person shares online?
Nope. It’s from a 1985 article for the Whole Earth Review by Larry Hunter, who was writing about the future of privacy. And it’s unlikely Mr. Hunter could have more accurately predicted the Age of Facebook — or its most pervasive fears.
Hunter begins his article by explaining that he has a privileged peek into the computerized world that’s just over the horizon:
I live in the future. As a graduate student in Artificial Intelligence at Yale University, I am now using computer equipment that will be commonplace five years from now. I have a powerful workstation on my desk, connected in a high-speed network to more than one hundred other such machines, and, through other networks, to thousands of other computers and their users. I use these machines not only for research, but to keep my schedule, to write letters and articles, to read nationwide electronic “bulletin boards,” to send electronic mail, and sometimes just to play games. I make constant use of fancy graphics, text formatters, laser printers — you name it. My gadgets are both my desk and my window on the world. I’m quite lucky to have access to all these machines.
He warns, however, that this connectedness will very likely come with a price.
Without any conspiratorial snooping or Big Brother antics, we may find our actions, our lifestyles, and even our beliefs under increasing public scrutiny as we move into the information age.
Hunter outlines the myriad ways that corporations and governments will be able to monitor public behavior in the future. He explains how bloc modeling helps institutions create profiles that can be used for either benign or nefarious purposes. We can guess that the practice, begun in the early 1980s, of credit service companies selling much more specific demographic information to credit card companies generally falls into the nefarious column:
How does Citicorp know what your lifestyle is? How can they sell such information without your permission? The answer is simple: You’ve been giving out clues about yourself for years. Buying, working, socializing, and traveling are acts you do in public. Your lifestyle, income, education, home, and family are all deductible from existing records. The information that can be extracted from mundane records like your Visa or Mastercard receipts, phone bill, and credit record is all that’s needed to put together a remarkably complete picture of who you are, what you do, and even what you think.
And in 1985, all this buying, working and socializing didn’t even happen through mediums like Facebook or Twitter. Hunter explains that this information, of course, can be used in a number of different ways to build complex pictures of the world:
While the relationship between two people in an organization is rarely very informative by itself, when pairs of relationships are connected, patterns can be detected. The people being modeled are broken up into groups, or blocs. The assumption made by modelers is that people in similar positions behave similarly. Blocs aren’t tightly knit groups. You may never have heard of someone in your bloc, but because you both share a similar relationship with some third party you are lumped together. Your membership in a bloc might become the basis of a wide variety of judgements, from who gets job perks to who gets investigated by the FBI.
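The mechanics Hunter describes — lumping people into blocs because they share similar relationship patterns, even if they have never met — can be sketched in a few lines. This is a hypothetical toy illustration, not Hunter’s actual method; the names, the Jaccard similarity measure, and the 0.5 threshold are all assumptions:

```python
# A toy sketch of bloc modeling: people are grouped not by direct ties
# but by *similar patterns* of relationships to third parties.
# All names and data below are hypothetical.

# Who each person interacts with (e.g., payments, calls, memberships).
relations = {
    "alice": {"acme_bank", "city_gym", "union_local_5"},
    "bob":   {"acme_bank", "city_gym", "union_local_5"},
    "carol": {"acme_bank", "golf_club"},
    "dave":  {"golf_club", "yacht_club"},
}

def similarity(a, b):
    """Jaccard similarity of two people's relationship sets."""
    return len(a & b) / len(a | b)

# Lump people into the same bloc when their relationship patterns
# overlap strongly, even if they have never heard of each other.
THRESHOLD = 0.5
blocs = []
for person, ties in relations.items():
    for bloc in blocs:
        # Compare against the bloc's first member as a representative.
        if similarity(ties, relations[bloc[0]]) >= THRESHOLD:
            bloc.append(person)
            break
    else:
        blocs.append([person])

print(blocs)  # [['alice', 'bob'], ['carol'], ['dave']]
```

Alice and Bob land in one bloc because their relationship sets match, while Carol and Dave, despite each sharing one tie with someone else, fall below the threshold and stand alone — exactly the kind of judgment-by-association Hunter warns could decide "who gets job perks to who gets investigated by the FBI."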
In the article Hunter asks when private information is considered public, a question that is increasingly difficult to answer with the proliferation of high-quality cameras in our pockets and, for some of us, on our heads.
We live in a world of private and public acts. We consider what we do in our own bedrooms to be our own business; what we do on the street or in the supermarket is open for everyone to see. In the information age, our public acts disclose our private dispositions, even more than a camera in the bedroom would. This doesn’t necessarily mean we should bring a veil of secrecy over public acts. The vast amount of public information both serves and endangers us.
Hunter explains the difficulty in policing how all of this information being collected might be used. He makes reference to a metaphor by Jerry Samet, a Professor of Philosophy at Bentley College, who explained that while we consider it an invasion of privacy to look inside someone’s window from the outside, we have no objection to people inside their own homes looking out at those on the public sidewalk.
This is perhaps what makes people so creeped out by Google Glass. The camera is attached to the user’s face. We can’t outlaw someone gazing out into the world. But the added dimension that someone might be recording that for posterity — or collecting and sharing information in such a way — is naturally upsetting to many people.
Why not make gathering this information against the law? Think of Samet’s metaphor: do we really want to ban looking out the window? The information about groups and individuals that is public is public for a reason. Being able to write down what I see is fundamental to freedom of expression and belief, the freedoms we are trying to protect. Furthermore, public records serve us in very specific, important ways. We can have and use credit because credit records are kept. Supermarkets must keep track of their inventories, and since their customers prefer that they accept checks, they keep information on the financial status of people who shop in their store. In short, keeping and using the kind of data that can be turned into personal profiles is fundamental to our way of life — we cannot stop gathering this information.
And this seems to be the same question we ask of our age. If we volunteer an incredibly large amount of information to Twitter in exchange for a free communications service, or to Visa in exchange for the convenience of making payments by credit card, what can we reasonably protect?
Hunter’s prescription sounds reasonable, yet somehow quaint, almost three decades later. He proposes treating information more as a form of intangible property, not unlike copyright.
People under scrutiny ought to be able to exert some control over what other people do with that personal information. Our society grants individuals control over the activities of others primarily through the idea of property. A reasonable way to give individuals control over information about them is to vest them with a property interest in that information. Information about me is, in part, my property. Other people may, of course, also have an interest in that information. Citibank has some legitimate interests in the information about me that it has gathered. When my neighbor writes down that I was wearing a red sweater, both of us should share in the ownership of that information.
Obviously, many of Hunter’s predictions about the way in which information would be used came true. But it would seem that there are still no easy answers to how private citizens might reasonably protect information about themselves that’s collected — whether that’s by corporations, governments or other private citizens.
Chillingly, Hunter predicted some of our most dire concerns when Mark Zuckerberg wasn’t yet even a year old: “Soon celebrities and politicians will not be the only ones who have public images but no private lives — it will be all of us. We must take control of the information about ourselves. We should own our personal profiles, not be bought and sold by them.”
What do you think? Does our age of ubiquitous sharing concern you? Do you think our evolving standard of what is considered private information generally helps or hurts society?
June 27, 2012
In 1987, Bill Gates became the world’s youngest self-made billionaire, making the Forbes 400 Richest People in America list with a net worth of $1.25 billion, up from a measly $900 million the year before. Gates was just 32 years old and Microsoft Windows was still very much in its infancy, the operating system having been introduced just a couple of years earlier in November 1985. The world of 1987 was an exciting one for Gates and he saw even more exciting things ahead.
The January 1987 issue of OMNI magazine featured predictions from 14 “great minds” about what the future held; specifically the world of 20 years hence. Bill Gates predicted that the world of 2007 would be filled with flat panel displays, diverse forms of interactive entertainment, highly advanced voice recognition software and the ability to access vast quantities of information at the touch of a button — this was a capital I, capital A, Information Age.
Gates explains the typical home of 2007:
You’re sitting at home. You have a variety of image libraries that will contain, say, all the world’s best art. You’ll also have very cheap, flat panel-display devices throughout your house that will provide resolution so good that viewing a projection will be like looking at an original oil painting. It will be that realistic.
And the information that is accessed with the help of these displays will seem limitless. His idea of a world database sounds quite similar to the 1981 predictions of Neil Ardley that we looked at a few months back.
In 20 years the Information Age will be here, absolutely. The dream of having the world database at your fingertips will have become a reality. You’ll even be able to call up a video show and place yourself in it. Today, if you want to create an image on a screen — a beach with the sun and waves — you’ve got to take a picture of it. But in 20 years you’ll literally construct your own images and scenes. You will have stored very high-level representations of what the sun looks like or how the wind blows. If you want a certain movie star to be sitting on a beach, kind of being lazy, believe me, you’ll be able to do that. People are already doing these things.
Gates predicts the perfection of a technology that has been around for decades, but one that many people of 2012 might associate with the name Siri: voice recognition.
Also, we will have serious voice recognition. I expect to wake up and say, “Show me some nice Da Vinci stuff,” and my ceiling, a high-resolution display, will show me what I want to see—or call up any sort of music or video. The world will be online, and you will be able to simulate just about anything.
I would love to see an iPhone commercial where Zooey Deschanel or Samuel L. Jackson says, “Siri, show me some nice Da Vinci stuff.”
Gates continues by explaining that you’ll be able to realistically simulate racing formula cars at Daytona, but worries about what it might mean when people no longer have any reason to leave the house.
There’s a scary question to all this: How necessary will it be to go to real places or do real things? I mean, in 20 years we will synthesize reality. We’ll do it super-realistically and in real time. The machine will check its database and think of some stories you might tell, songs you might sing, jokes you might not have heard before. Today we simply synthesize flight simulation.
Gates believed that all of our technological advancements would also mean the end of credit cards and checks — old technologies replaced by voice and fingerprint recognition.
A lot of things are going to vanish from our lives. There will be a machine that keys off of physiological traits, whether it’s voiceprint or fingerprint, so credit cards and checks — pretty flimsy deals anyway — have to go.
Gates also welcomed the death of what he calls “passive entertainment.”
I hope passive entertainment will disappear. People want to get involved. It will really start to change the quality of entertainment because it will be so individualized. If you like Bill Cosby, then there will be a digital description of Cosby, his mannerisms and appearance, and you will build your own show from that.
Later in the article Gates is cautious and believes that we may eventually test just how much information the human mind can take.
Probably all this progress will be pretty disruptive stuff. We’ll really find out what the human brain can do, but we’ll have serious problems about the purpose of it all. We’re going to find out how curious we are and how much stimulation we can take. There have been experiments in which a monkey can choose to ingest cocaine and the monkey keeps going to create some pretty intense experiences through synthesized video-audio. Do you think you’ll reach a point of satisfaction when you no longer have to try something new or make something better? Life is really going to change; your ability to access satisfying experiences will be so large.
Gates ends his predictions by explaining that he doesn’t think we can really extrapolate with much accuracy from the year 1987.
But in the next 20 years you won’t be able to extrapolate the rate of progress from any previous pattern or curve because the new chips, these local intelligences that can process information, will cause a warp in what it’s possible to do. The leap will be unique. I can’t think of any equivalent phenomenon in history.
I’d argue that the vast majority of Gates’ predictions are actually fairly accurate. Here in the year 2012 we’ve seen many of his ideas about the world of 2007 become a reality. But perhaps the most interesting prediction of the bunch is about interactive entertainment. It’s fascinating that the internet has given rise to a remix culture that values slightly different modes of interaction — from the creation of a new video itself right down to the comments — though they’re typically unsanctioned by the original artists and rights holders.
For the time being, it would seem that modern copyright law makes these forms of remix entertainment targets for litigation — despite many obvious examples of fair use. And it’s not just remix culture, but the right to parody itself that has been under attack with the rise of the internet. An animated parody show about Bill Cosby himself, called House of Cosbys, received a cease and desist letter in 2005 for daring to imitate Bill Cosby’s voice and likeness. And if you’ve ever seen House of Cosbys, you can probably attest that it’s likely not what Bill Gates had in mind when he was picturing the future.