The Naked Future: What Happens in a World That Anticipates Your Every Move?

Jun 9, 2014


Highlights

Today we create information in everything that we do, and there is no going back. But instead of seeing this as a threat, we should seize the opportunity to use it to our advantage, says Patrick Tucker. Big data can improve our lives, offering everything from more informed consumer choices to more accurate and detailed medical data.

Introduction

JOANNE MYERS: Good afternoon. I'm Joanne Myers, and on behalf of the Carnegie Council I want to thank you all for joining us.

Our speaker is Patrick Tucker. He is the author of The Naked Future: What Happens in a World that Anticipates Your Every Move? In March of this year, this book was cited as an Amazon Best Book of the Month.

In the last couple of decades, technology, once solely the province of science fiction, has become a startling reality. It seems that every day we are besieged with new ideas, new advances, and more data, giving people around the world new opportunities to shape their own destinies and to accurately predict aspects of the future.

But what happens when technology is used to shape our reality in unwanted ways? What happens when the emergence of potentially intrusive technology—such as video facial recognition, biometrics, DNA profiling, or other such data—is analyzed and distributed in a manner that invades our privacy at every turn? What happens when we are able to predict guilt before a person commits a crime or quarantine someone 99 percent likely to have a deadly disease while they still appear to be healthy?

While acknowledging that new technology cannot be deployed without some element of risk and uncertainty as to future consequences, Patrick takes the view that personal data is nothing less than a superpower waiting to be harnessed. He posits that the worst possible mistake that society can make is to demand that technological progress reverse itself by trying to avoid its power. A better solution, he says, is to familiarize ourselves not only with how these tools work, but with how they can be used by individuals to live more healthily, realize more of our own goals in less time, and avoid inconvenience and danger.

Before you draw any conclusions, I would like to suggest some things you might want to think about as Patrick discusses the ways you and I might publicly and privately relate to new technology.

Questions that come to mind are: How private do you want to be? How secure do you believe you are? What obligation do you have to share some of your personal data for the benefit of society? In other words, how do we reconcile an ethic of public prudence with a culture of innovation?

In order to fast-forward into the future, please join me in welcoming our guest today, Patrick Tucker.

Remarks

PATRICK TUCKER: Thank you, Joanne. I apologize for being a little bit late. I didn't anticipate it, which is terrifying because I am a futurist.

An interesting thing just happened. It happens every time somebody gives a talk. Someone comes up and says, "Turn off your cell phones."

To do something a little bit crazy here for a moment, go ahead and take out your cell phone and turn it on.

Here is what is going to happen. Two things are going to happen as we do this. Number one, at some point during the presentation, someone's phone is going to ring. That person has to answer the phone and we all have to say "hello" to whoever is on the other end.

The second thing that happens whenever you turn on your phone is this: Your phone is contacted by a cell base station, what we call a cell tower, and they begin trading signals. A lot of people know this. What a lot of people do not know is that that cell tower then begins talking to other cell towers that are around it, and they begin trading signals too. The number of cell towers that engage in these sorts of conversations has been growing rapidly.

So it is a great big conversation among a whole bunch of different machines about where you are and where you are going. They are having that conversation so that cell towers can figure out which cell tower is the best one to provide your phone with service.
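[Editor's note: To make the handoff idea concrete, here is a minimal Python sketch of the decision those towers are negotiating. The tower names and signal values are invented, and real cellular handoff protocols involve far more than raw signal strength.]

```python
# Each tower reports how strongly it hears the phone (in dBm; closer to 0 is better).
signal_strength = {"tower_a": -85, "tower_b": -60, "tower_c": -97}

# The network assigns the phone to whichever tower can serve it best right now;
# as you move and the readings change, service is handed off to a new winner.
serving_tower = max(signal_strength, key=signal_strength.get)
print(serving_tower)  # tower_b
```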

A little bit weird, right? Now, if you are like me, you have a smartphone. This is mine [indicating]. I usually keep the GPS [Global Positioning System] capability on my smartphone turned off because it eats my battery life. I am going to go ahead and turn that on. Yours is probably on too. As I do that, see if you hear anything.

It is already starting. This is good. Listen to these. These are the sounds of the machine world talking about you and where you are and what you are doing. We do not pay attention to it nearly as much as we should—and this is the entire point of the talk.

So I am going to turn on my phone's GPS. As I do this, see if you hear anything unusual.

No. Nothing, right?

But the second I do this, my phone begins to receive signals from the Global Positioning System, which is a network of satellites originally created by DARPA [Defense Advanced Research Projects Agency]. They also begin having conversations with my carrier about where I am going and what I am going to do.

I think this is really miraculous. Raise your hand if you have a smartphone.

[Show of hands]

That's everybody, including the guy not raising his hand because he is playing with his smartphone. [Laughter]

Officially, smartphone penetration in the United States right now is 70 percent of the U.S. population. Every time I have given this talk, it has been 100 percent of the audience. I am not exactly sure why those numbers do not match up, but I think it has something to do with the people that I give this talk to—or Pew really screwed up. I am not sure.

You can walk into any room in this country and you can ask people, "Hey, are you carrying around a device that is constantly talking about you to a wide network of other machines and receiving signals from space?" and everyone will wave their hand, "Yes." But no one thinks to ask: What are those machines saying about me right now? We do not.

This is big data. This is what we are talking about when we are talking about big data: those conversations that are happening right now between all of our machines about us. With email and video and music streaming and posting and tweeting—mostly streaming—we create on the order of 1.8 million megabytes of data each year just through passive interactions with machines. That is about nine CD-ROMs a day. Those numbers are from IDC [International Data Corporation]. IBM has competing numbers that put the figure a little higher.

That is big data. It is you. It is 65 billion location-tagged payments that happen annually in the United States. It is 154 billion emails. It is the reason why 90 percent of all of the data in the world was generated within just the last three years. And it looks like that. It looks a lot like you streaming music or video on your phone.

Now, not all of it is kept or stored permanently and not all of it is necessarily useful, but all of it says something about you. We created a whole bunch of it just a moment ago when we all turned on our phones and accessed GPS.

Those are some very large numbers. But the thing to keep in mind about that is that, as big as they sound, there is going to be 44 times as much digital information in the year 2020 as there was in 2009—44 times. This is really key. It's not just a silly statistic.
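[Editor's note: As a back-of-the-envelope check of our own, not the speaker's, a 44-fold increase over the 11 years from 2009 to 2020 works out to roughly 41 percent compound growth per year:]

```python
# Implied compound annual growth rate for 44x growth between 2009 and 2020.
years = 2020 - 2009            # 11 years
cagr = 44 ** (1 / years) - 1
print(f"{cagr:.0%}")           # ~41% more digital information every year
```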

It really speaks to why I wrote this book: because there is no way out. You might hate everything about the digital ecosystem. You might really resent the fact that your phone seems to be conversing with unseen parties about you all the time in a language that you can't hear. You might really object to it—and you should.

But that doesn't mean we are going back. You cannot go from 44 times as much digital information in 2020 to 43 times to 39 times. We can all get up and walk outside and throw our phones in the street and really decide to inconvenience ourselves for a couple of weeks, until we finally burst. But, basically, we are going to be creating exponentially more data in the year 2020 than we do right now. There is no going back. Welcome. It's happening.

Think about it this way, too. In 2008, the number of interconnected devices first surpassed the human population. In the year 2020—this is according to Cisco—the number of interconnected devices will be 50 billion. If they all decided to join ranks against us, it is 50 billion to what is projected to be 7.6 billion people. So we are already toast, as long as we are talking about machines having conversations with one another about us.

With that in mind, for me the next question was: What if I understood more about what this phone is saying about me with those cell towers? What if I could make more use of that information?

Somebody out there is able to make use of that information—otherwise it wouldn't exist. It is ultimately for people. The machines themselves have no agenda; they want only to live. People make use of that information. What if I were one of those people? What if I decided I wanted to be someone who understood or could find relevance in my phone's conversations with cell towers and with satellites?

This, for me, is a really important point, because I believe that the most important theme of the next two decades is the rising tension between empowered individuals and declining institutions. Here is what I mean by that.

For a long time I was deputy editor of a magazine called The Futurist, and I am still their editor-at-large. It was a really fun job. Every so often I got to talk with science fiction authors. I talked with Cory Doctorow once. I asked him, "How do you come up with the ideas for your science fiction?"

He said, "Pick something that's difficult, complicated, and expensive for people to do, and then imagine that thing becoming simple, cheap, and inexpensive, and then write about it, because that's what's happening. "

That is the story of the future. That is the story of big data too. It is the story of computation. Something that was expensive and complicated, that lived in university warehouses at the University of Pennsylvania, that was really large and didn't perform very well, got cheaper. It became more numerous. It moved from an academic/military thing to a B2B [business-to-business] thing and finally to something that empowered the consumer revolution. That is the story of all computation.

It is the story of big data, too. Believe it. Big data right now is something that it seems like governments created, that businesses used to triangulate your behavior.

The good news is there is no reason why it cannot be something that you use as well. You created this data. You have to ask for it; you have to demand it. You have to demand changes in law; you have to demand laws to protect you. But it's yours; you made it.

That is the story of technology: As it becomes cheaper, it becomes available to more people—first to governments, then to business, and finally to consumers. So be excited about that. That is the first thing I want you to do. The future is inevitable. You have to be excited about it.

Over the course of the last year, this led me to write this book. It was a really fun, amazing project.

I went out to find people who were doing just that—who were learning how to listen and make use of the data that we create through all of our interactions. They happened upon some really remarkable insights. Here is a case in point.

Raise your hand if you know with 80 percent certainty or higher where you are going to be a year and a half from right now. Raise your hand if you know that.

[No response]

Every time I do this, there is always one person who raises their hand. This is the first time they didn't do it. I always look at them like, "Really?"

If you had asked me a little while ago, I would have totally said "maybe here." If you had asked me a week ago, I would have said "here," but I would have said "here at 5:30." And I was wrong. You can't know this.

If you know where somebody lives, you know where somebody works, you have two data points and you can make a reasonable inference. That is your baseline—50 percent accuracy. But it is a guess. It is a guess based on two data points. It is not accuracy.

Now, here's the thing: This is information that you are giving away all the time in your phone. Your phone knows. Consider two researchers, Adam Sadilek and John Krumm. Adam Sadilek is now with Google; he was at the University of Rochester. He's a very young man—brilliant. John Krumm has been with Microsoft forever.

They wanted to see how you could use someone's location data, if you had a large enough data set—a big data set—to predict where they were going to be.

John Krumm came up with this really remarkable experiment. He took some Microsoft money and he found a whole bunch of test subjects and he paid them money to carry around a GPS receiver, not unlike the one that is constantly talking to satellites in your phone. After 3,200 days of GPS recordings—that's a long time; nearly nine years of just watching people walk around and seeing every place that they go—he found that he could predict a subject's future location 80 weeks in advance with higher than 80 percent accuracy.

We are not quite as random as we think. There are these underlying patterns that bubble up that we can't pay attention to because we do not have perfect memory—or at least we didn't. But now we do. It is our memory too. We buy this device. Someone else gets the data. But we made it and it is stored here for us.
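[Editor's note: Here is a minimal Python sketch of the kind of pattern extraction involved, assuming a toy log of timestamped place visits. The actual Sadilek-Krumm model is far more sophisticated, but the core intuition—human schedules are strongly periodic—looks like this:]

```python
from collections import Counter, defaultdict
from datetime import datetime

# Toy visit log: (ISO timestamp, place the subject was observed at).
visits = [
    ("2014-06-02T09:00", "office"), ("2014-06-03T09:00", "office"),
    ("2014-06-09T09:00", "office"), ("2014-06-07T09:00", "park"),
]

# Bucket history by (weekday, hour): the periodic structure of a life.
history = defaultdict(Counter)
for stamp, place in visits:
    t = datetime.fromisoformat(stamp)
    history[(t.weekday(), t.hour)][place] += 1

def predict(when: str) -> str:
    """Guess the most frequent place for this weekday/hour slot."""
    t = datetime.fromisoformat(when)
    slot = history[(t.weekday(), t.hour)]
    return slot.most_common(1)[0][0] if slot else "unknown"

# Roughly 80 weeks after the log ends, a Monday 9 a.m. query still says "office."
print(predict("2015-12-28T09:00"))
```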

Next question: If I am like Adam Sadilek, if I am like John Krumm, a regular person, and I can predict my future location 80 weeks in advance with 80 percent accuracy, that is sort of cool. But what would I do with that?

We can be dystopian and we can say, "I do not want to live in a world where that is possible because that means someone is going to stalk me and grab me and ambush me. It's terrible."

That is true if only one person has that capability and nobody else does. But if everybody has it, then all of a sudden what was living in a dystopian world full of surveillance becomes living with a superpower. If you can predict for yourself where you are going to be, you have the superpower. If someone else has that on you, then you are living in a dystopian surveillance state. It is a subtle change. If you've got it, it is a superpower; if someone else has it, then you are living in a surveillance state.

So what would a regular person do with that power, just a normal person? I think if all of us had it, we would not use it to stalk people and ambush them or stuff like that. We would probably use it to avoid people we did not want to see. It is like the total opposite. That is what a normal person does with that. You're, like, "Oh, I am so not going to that thing," especially in New York.

Adam Sadilek teamed up with a couple of researchers from Johns Hopkins. What they found was that, using that formula, you could predict with a pretty good amount of accuracy the next person to give you a cold.

Here is how it works. They took 4 million tweets from about 6,000 New Yorkers. These were geotagged tweets—people posting to Twitter where they were, how they felt, where they were going. He extrapolated from that this enormous network. He saw all these different interlinks between these people. These were people who were making very public broadcasts, again, about their state of being and where they were headed. He found that with these 4 million tweets from these 6,000 New Yorkers, he could predict 18 percent of the person-to-person flu transmissions that happened between them. That's a remarkable level of resolution when you consider the huge number of variables that go into predicting the next person to give you the flu. It would be much higher if more people geotagged their tweets and, of course, if we could model for surface transmission, which is really difficult.

But in a future where we probably have UV [ultraviolet] lasers and UV sensors attached to different surfaces to pick up bacterial agents that land on them, and where everybody is geotagging their tweets—that's the main thing, where everybody is geotagging their tweets—when everybody is making public statements about "here I am, here's how I feel, here's who I'm hanging out with, for this long," then we will be able to predict the person who is about to give you the flu. It is pretty remarkable.
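[Editor's note: A minimal Python sketch of the co-location intuition behind that result, using made-up geotagged posts. The published model learns from many more features than simple proximity:]

```python
from math import hypot

# Made-up geotagged posts: (user, x, y, hour of day, self-reported sick?).
posts = [
    ("ann", 0.0, 0.0, 9, True),   # sick, posting from downtown
    ("bob", 0.1, 0.0, 9, False),  # healthy, but standing right next to ann
    ("cam", 5.0, 5.0, 9, False),  # healthy and far away
]

# Anyone recently co-located with a sick user gets an elevated-risk flag.
sick_sightings = [(x, y, h) for _, x, y, h, s in posts if s]
for user, x, y, h, s in posts:
    if s:
        continue
    exposed = any(hypot(x - sx, y - sy) < 0.5 and abs(h - sh) <= 1
                  for sx, sy, sh in sick_sightings)
    print(user, "exposed" if exposed else "low risk")
```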

Most of you are probably thinking to yourself, "I don't know if I'm comfortable with that behavior. I do not feel really great about telling everybody where I am going and how I am feeling." And this is true.

It turns out that people who are under the weather are a bit less likely to sick-tweet. It was a hard thing for the researchers to figure out how to detect. They had to teach a machine to understand it: because there were 4 million tweets, the machine had to semantically understand the difference between rhetorical sickness in a tweet and literal sickness. So there is "Bieber fever," which is not a thing, and dengue fever, which is a thing.

After they did that, they achieved this level of resolution. It was pretty remarkable.
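[Editor's note: A toy version of that rhetorical-versus-literal distinction, assuming a tiny hand-labeled training set and scikit-learn. The real system was trained at a vastly larger scale:]

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labeled toy tweets: 1 = literally sick, 0 = rhetorically "sick."
tweets = [
    "home with a fever and chills, feeling awful",
    "caught the flu, staying in bed all day",
    "I've got Bieber fever, best concert ever",
    "this new album is sick, love it",
]
labels = [1, 1, 0, 0]

# Word and bigram counts feed a simple linear classifier.
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)
print(model.predict(["in bed with a fever and a headache"]))  # likely [1]
```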

But most of us do not actually feel that comfortable yet geotagging our tweets. It is a behavior that rises roughly in correlation with smartphone ownership. So the more likely you are to own one of these things that lets you do that, then the chance that you are going to partake in that behavior goes up. In the same way that if you own a bunch of hammers you become somebody that starts hammering a lot, if you own one of these, you get to the point eventually where you just start using the services that this thing vends to you.

Having said that, many people are not comfortable making that data public. We are freaked out by the NSA [National Security Agency].

We are freaked out about Target. "Target is going to use that information that I just tweeted to target me with context-aware advertising." We hate this. This is the future, though. This is the future that we are all moving towards.

You are walking by a store. You already feel like you have the flu, and you get hit with a CVS coupon that says, "Hey, Patrick, we noticed that you just were hanging out with Stephanie, and, boy, does she have that dengue fever," or something awful. "You are already probably experiencing symptoms. Why don't you just pop on over here, because we've got a special just for you on Echinacea and vitamin C booster. You're welcome. We love you. CVS."

This is context-aware advertising. You do not have to be a brilliant futurist to know that that is going to be showing up in your phones—not necessarily in terms of a push notification, but probably, when you start accessing Facebook and Twitter more often in locations that give some information about where you are. It is just going to happen. We do not like that.

Now, what if—because, again, I want you to be excited about the inevitable future—what if the information that you are giving away can also be used to protect you from coercive advertising? Imagine getting a push notification on your phone telling you the probability of disliking a purchase that you are being pushed to make—telling you that you have a 60 percent probability of regretting buying vitamin C booster and Echinacea at CVS, because Echinacea is a remedy based on superstition, and you are going to pee out most of the vitamin C booster anyway. Skip it.

This is information that you also give away in your phone. You actually give it away all the time. It is just that you have not taken advantage of it yet.

There is a start-up, though, that can let you do this. I have no proprietary interest in it. It is called Tictrac. It was started by a guy named Martin Blinder. What it does is it allows you to take all of the different data streams that make up your life—it's your email; if you've got Fitbit or Nike+ or RunKeeper or something like that—and it lets you take all of your activity-level data that you are constantly streaming out—your geotagged tweets, your social network data, anything that you need—and you can see how they work in correlation with one another.

You can get this view of your life that we constantly feel like Target has on us, except, instead, it is actually, authentically like us. You can see, for instance, how all of your exchanges with one particular person affect your drinking, or how pushing all of your email to the end of the day affects your stress level, or how not working out affects the temperature you like to keep your house at—all of this stuff that you did not suspect was correlated but actually is related, because it is the full data stream of your life. You can actually get a glimpse of it, get sort of a dashboard view. This is what they are working towards. I think it is really remarkable.
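[Editor's note: A minimal sketch of that dashboard idea, assuming you have already exported your streams as one row per day. Every column name and value here is invented:]

```python
import pandas as pd

# Invented daily exports from the different data streams of one life.
df = pd.DataFrame({
    "emails_after_6pm": [2, 14, 3, 18, 1, 16, 4],
    "stress_rating":    [3,  8, 2,  9, 2,  7, 4],   # self-reported, 1-10
    "workouts":         [1,  0, 1,  0, 1,  0, 1],
})

# Pairwise correlations surface the relationships you never suspected.
print(df.corr().round(2))
```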

From that, you can protect yourself from marketing that you do not feel really applies to you or that you do not want to participate in. You can protect yourself from everything.

You have self-knowledge now. We can use this data that we create all the time for all sorts of things. We can use it to become better friends, to heal our relationships—very human things—to become better lovers.

I am married. The thing is I am married to someone who is very smart. She is getting a Ph.D. from a very prestigious school. She is brilliant. Despite this, I am not always the most interested or engaged conversationalist. I know, I am shocked! The problem is that I am a man, and so I cannot pay attention to when I am not paying attention because I am not paying attention.

This is a very common problem. We communicate all the time. We hear what we say. Other people hear the literal content of what we say, but we do not understand all of the other things that go into how we communicate with people. There is a researcher named Sandy Pentland from the Massachusetts Institute of Technology, who also has a book out now. Buy his after mine. He calls these honest signals. These are the hidden signals that make up real communication. We do not know we are giving them out, but we are giving them out all the time. If you can understand what they are, you can predict how different interactions are going to go.

If I stand here before you like this and I speak in sort of a monotone voice, somewhat Thomas Jefferson-like—think Hall of Presidents at Disneyland—I am taking a role, and that role is a leadership role. If you want to know more what that looks like, Mitt Romney demonstrated it perfectly in the first presidential debate, not so much the ones after that. It projects confidence over subject matter. It is very patriarchal.

Conversely, if you speak like me, if your voice goes up and down a lot, you are a little bit nervous, you use your hands, this suggests an explorer role. I want to actually stop talking really soon and start hearing from you. I am interested in going with you to a place that we can go to together and having an interaction.

Conversely, if I just stand here and I listen to you talk to me, then I am taking a very different role, one that men often aren't so great at taking, which is listening.

What Sandy Pentland wanted to do was figure out if you could get data on these invisible signals using a very simple machine. He created a machine called a "sociometer" that is no different from your phone. It had a microphone. It could pick up some gesture, sort of like the accelerometer in your phone can pick up gesture when you move it around like this.

What he found, when he had a large enough data set, was that with just 30 seconds of sociometer data, using a formula he could predict how any exchange was going to go.

If it was like a speed date and the treasure on the table was a number to be exchanged between two people, he could predict from this sociometer data whether that number was going to be swapped.

If it was a business negotiation, he could predict who was going to come out better and who was going to come out worse because of the roles people immediately took when they started talking to one another.

And what they were saying, the literal content of their conversation, didn't matter. This is all invisible signal that we never before in human history knew how to pay attention to, that now just exists, that we give away to the machines that we all carry around with us, that every one of us has.
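[Editor's note: A rough sketch of the sociometer idea, with invented features and thresholds: reduce a 30-second clip to a couple of "honest signals" and read off the role, ignoring the words entirely:]

```python
import statistics

def honest_signals(pitch_track, words_spoken, seconds=30):
    """Reduce a short clip to two crude honest signals."""
    pitch_variation = statistics.stdev(pitch_track)  # monotone vs. animated
    speaking_rate = words_spoken / seconds           # words per second
    return pitch_variation, speaking_rate

def role(pitch_variation, speaking_rate):
    # Invented thresholds: flat, measured delivery reads as the "leader" role;
    # variable, energetic delivery reads as the "explorer" role.
    if pitch_variation < 10 and speaking_rate < 2.5:
        return "leader"
    return "explorer"

# A near-monotone, Hall-of-Presidents delivery (pitch in Hz, 60 words in 30s).
pv, sr = honest_signals([118, 120, 119, 121, 120] * 6, words_spoken=60)
print(role(pv, sr))  # leader
```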

The next step is just to pick it up and start listening. This is the future that we can all look forward to. Imagine that superpower, where you can actually know that about yourself and your exchanges with people.

How would you use it? You would not use it to win every business negotiation ever if everybody has it, but you could probably use it to become a little bit smarter about yourself.

His partner in this research is a guy named Anmol Madan. He made a start-up invention off of this. You could download it on a Zaurus phone, which was sort of a precursor to the smartphone that wasn't around for too long.

It was an app called the Jerk-O-Meter. It worked just like you'd think it would work. You would be talking to someone and the Jerk-O-Meter would listen to your hidden signals and tell you if you were being a jerk. I would find that useful.

You can use this stuff for really anything in the future. We are going to be able to predict, from the data that we create, the ever-changing probability of falling victim to a crime, on the basis of just a small number of variables that used to be really hard to capture in usable data form but today are really easy.

They are your circumstance, your situation, and your environment. Your circumstance is who you are, your situation is what you are doing, and your environment is where you are. These are the big variables that go into what is called the victimology continuum.

If you ever watch crime shows, like Law & Order, all Law & Order episodes are basically premised on the victimology continuum: How is the subject related to the victim? What was the subject doing? Who is the subject?

If you know that you are a drug dealer who is dealing drugs in a bad part of town, your score on the victimology continuum is way up here. You have a very high likelihood compared to the general population of being the victim of a crime, based on who you are, what you are doing, where.

If you are a politician or a homemaker and you are on a golf course or in a church, then the probability of you being a victim of a crime is much more over here.

If you are a drug dealer in church or a homemaker selling drugs, then it sort of goes in between.

This is also information that used to be hard to compute that we now can create data around all of the time, based on where we are and who we are and what is going on.
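[Editor's note: A toy scoring sketch of those three variables. Every category and weight here is invented, purely to show the shape such a model might take:]

```python
# Invented risk weights for the three victimology variables (0 = low, 1 = high).
CIRCUMSTANCE = {"drug dealer": 0.8, "politician": 0.2, "homemaker": 0.1}
SITUATION    = {"dealing drugs": 0.9, "golfing": 0.1, "at worship": 0.1}
ENVIRONMENT  = {"high-crime area": 0.7, "golf course": 0.1, "church": 0.1}

def victimology_score(who: str, doing: str, where: str) -> float:
    """Average who you are, what you are doing, and where you are."""
    return round((CIRCUMSTANCE[who] + SITUATION[doing] + ENVIRONMENT[where]) / 3, 2)

print(victimology_score("drug dealer", "dealing drugs", "high-crime area"))  # 0.8
print(victimology_score("homemaker", "golfing", "golf course"))              # 0.1
print(victimology_score("drug dealer", "at worship", "church"))              # 0.33, in between
```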

So we can look forward to a future where it is not just police departments that are able to anticipate places where crime is going to happen—a program that sounds futuristic but has been in place in New York for decades, since Bratton's CompStat policing program.

There are now similar versions, ramped up with computerized power, in place all across the country. In the future, in the very near future, it is a power that you have—not just police departments, but you yourself.

[Cell phone rings in audience.]

Here we go. Are you ready? This is the best part.

PARTICIPANT: I think it's a telemarketer.

PATRICK TUCKER: That's great. Even better.

THE AUDIENCE IN CHORUS: Hello.

PATRICK TUCKER: Now you've got to hang up.

PARTICIPANT: They hung up. [Laughter]

PATRICK TUCKER: I should come up with a more elaborate script. I haven't done that yet. I should really come up with one where we all pretend to be on a plane or something.

So you can use this stuff for all sorts of things—predicting the likelihood of a particular civil uprising.

It is also data that in the future goes into helping objects that are directly in our surroundings anticipate what we are doing so that they can meet our needs much better. That sounds a little bit weird, but it is the basis of the self-driving car programs that are going to be on the streets of Manhattan before too long.

The hard part of that program has already been done, which is getting a machine to understand something about topography, to make predictions about the terrain ahead.

Once these machines are interconnected with sensors in their environment, with one another, and they are able to share a common situational awareness of all the people that they are interacting with, then you can look forward to a future where you do not even have to call the car; the car knows when you will be leaving. It arrives on its own to take you to the next place. It parks far away so that you do not have to think about where to park it, and we do not, as a city, have to allocate resources to it. All of that happens on the basis of information that we are now giving away all of the time.

We are going to have one-tenth as many cars in 20 years, not just because of a ramp-up in technology, but because of data we give away. And that is a good thing. That is a very good thing, because we do not need as many cars as we have right now. This helps the city; it helps all of us. It changes the way we all live.

We have arrived at an age where we create information in everything that we do. There is no going back.

What is unfortunate is that, instead of seeing that as the opportunity that it is, we have decided to see it as a threat. There is a lot to be concerned about. There are a lot of conversations that have to happen. But if we can just take a moment and learn to listen to the information we already make, then the future is as bright as anything we can imagine.

Thank you.

Questions

QUESTION: Thong Nguyen, International Peace Institute and Carnegie Council.

My question is about individual empowerment. You said that individuals are going to be more empowered by big data than, I guess, traditional actors today, such as the NSA or private corporations. What is the operational logic there? Why do you think individuals will be able to edge out the resource advantages, the connections to large data sets that governments have or private corporations have?

PATRICK TUCKER: Part of it has to do with the transference of data as a thing. When it becomes apparent to all of us that there are uses for our data, then I think you are going to see an ecosystem of start-ups, like Tictrac, begin to emerge to sell that information to us in a way that is useful to us as consumers. The B2B market at that point is going to look small and squalid in comparison.

This is somewhat of an article of faith, but I think that it is validated by everything that has happened in technology since the year 2000. Eventually, to reach the big money, you have to start hitting consumers, and not just working as a data scientist at Target. That is the way capitalism actually works. You reach consumers, not just other great big businesses. So that is one aspect of it.

When we begin to realize the utility of this, we will see more Tictracs. There are new ones sprouting up every day. I was just at Health Datapalooza down in DC. There are like four dozen.

Another one is Prime—also no proprietary interest—which just sprouted up very, very recently. All it does is give you access to your own digital medical records on your phone so you can send them wherever. It is doing the entire thing that Obamacare cannot quite do with the amount of money we are spending on it, and it does it sort of automatically, because it is user-empowered rather than top-down-empowered. I am very hopeful about that.

But also these big institutions, in an era of decline, cannot hold onto this stuff. A lot of times I get questions about Edward Snowden. How freaked out should we be by the NSA? This is a good question.

Whatever you think of Edward Snowden—whether you think he is a patriot or whether you think he is a traitor, or you think he is a patriot that should maybe be prosecuted or you think he is a traitor that should be let go because of the First Amendment—that does not matter.

History will regard him as this, with absolute certainty: Edward Snowden is the most famous systems administrator that has ever lived. He recently gave a talk where he was really adamant about this one point. I rarely see him this adamant in an interview. But he was really adamant to point out that he was not just some low-level contractor with the NSA. He was important, dammit.

He was not. He was not an unusual person at all. The NSA had this stuff. To use it, they had to give a little bit of it away; they had to expose it. So he did to them what we feel like they did to us.

The same with Target. A couple of years ago, the big data story on Target, as rendered wonderfully by Charles Duhigg in the New York Times, was "Target uses customer data to predict that a 16-year-old was pregnant." They figured it out before her father did. They sent coupons to her house on the basis of a correlational analysis that involved a number of variables. One was a switch from regular skin lotion to unscented skin lotion, of which she started buying more. There were a lot of things that went into that analysis, but that was one of them, I believe.

What is weird is that when you become pregnant, you have a life inside of you that wants a lot of moisture and water, and so it dries out your skin, so you need more skin lotion. A lot of people nowadays do not want to use the scented stuff because it has more chemicals in it. So that consumer shift indicated that she was in a motherly way. Everyone was, like, "Oh, that's creepy. I can't believe Target can know that about us."

Now the big data story on Target is how vulnerable they are to data breaches. It is not like plutonium. You can't stick this stuff in a box and not use it and have it be of any value. You have to use it, and as you use it you expose it, and as you expose it it becomes available to more people.

Those two things combined suggest to me that if we dare be ambitious enough, if we dare ask the right questions, if we all continue to be vigilant about government overreach with data and corporate overreach with data—and that is a big part of it—then we can absolutely realize the great potential of this new age.

JOANNE MYERS: And we should not be buying unscented cream from Target either.

QUESTION: Don Simmons.

There have been a lot of dramatic predictions based on extrapolation of current trends that have not come true. I am just wondering what caused you to be so confident about that 44 times as much data. Things could happen, it seems to me. The value of the data might not grow as rapidly as expected. The cost of storing it may become more burdensome. The cost of the electricity that is used, which is significant in the data industry, could become consequential. I just invite you to comment.

PATRICK TUCKER: At this point I would say that, if anything, that estimate is probably conservative. You are right, there are electricity costs associated with these enormous server farms that house a lot of this data. Most of the data that we create, though, as I mentioned, is not actually stored permanently at all. But that does not mean that insights cannot be gleaned from it.

I think it is inevitable that you will see an increase in the amount of usable data. That figure is sort of a proxy for that.

If you look at adoption rates of smartphone technology and the projections across the rest of the world—because Moore's law is still in effect—if you look at the cost forecasts for IT [information technology] and how that continues to trend, and if you look at all of the utility it offers, especially in the developing world, where it is a leapfrog technology, I do not see how we reach any sort of slowdown or cessation in consumer adoption of smartphone technology, in the United States or anywhere else.

But I think the biggest driver that suggests to me that that figure is credible and inevitable is the growth in the baby boomer generation, the elder baby boomer generation. As that generation gets older, you actually start making a lot more data, because you are going to be incorporating devices into your biological functioning that let you communicate with your doctor constantly and in real time, and that's telemetrics.

We have to have that, because the way we access health care now is really dumb. When you feel bad and you experience symptoms, you pay a co-pay to sit in a waiting room for a little while and then you have an exchange with someone who may perform some sort of analysis on you.

But for the most part, everyone agrees that what is much better is if you can passively communicate your health data on a continuous basis to a system. Then your health-care provider, your doctor, and your insurance company can look at all of that and make determinations about what you are about to get, symptoms you may experience in the near future. That is just a much better way to consume health care than what we've got today.
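[Editor's note: A minimal sketch of what one passive reading might look like from the device side. The message schema is invented; a real device would stream readings like this continuously to a provider's endpoint:]

```python
import json
import time

def vitals_reading(heart_rate: int, glucose_mgdl: float) -> str:
    """One telemetry message a wearable might stream to a care provider."""
    return json.dumps({
        "ts": time.time(),          # when the reading was taken
        "heart_rate": heart_rate,   # beats per minute
        "glucose_mgdl": glucose_mgdl,
    })

# In practice the device would send each reading as it is taken, so the
# provider sees trends long before the patient feels symptoms.
print(vitals_reading(heart_rate=72, glucose_mgdl=98.0))
```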

That is really data-intensive and also, I think, quite inevitable because it is a huge cost-saver. It is a cost-saver we have to have. There is plenty of money to be made in it. And it is going to result in better health for an extremely wealthy consumer bloc.

Look at MOOCs [massive open online courses] and how useful MOOCs are. Look at adoption of those in the classrooms.

I have yet to see any indication that we are going to cease our adoption of information technology. I have not seen a single indicator to suggest that that is something that is going to happen.

Whether or not that IDC figure remains the absolute figure—if I am wrong, then come see me in 2020 and I will get everybody a drink. But I won't be that wrong, so not a lot of drinks.

JOANNE MYERS: So now we know where you will be in 2020, which is really projecting into the future.

QUESTION: I'm Ed Marschner.

I wonder if you would comment on the European legislation of what the French call le droit à l'oubli, the right to be forgotten, that is now confronting Google. It sounds as if Google, as reported in the mainstream press, has caved in and agreed to some kind of right to be forgotten. It does not sound to me as if you believe this is at all remotely likely to happen.

PATRICK TUCKER: The specifics of whether or not they can pull off a right to be forgotten, I think remain to be seen.

Having said that, I think that everybody owes Europe a huge thank-you, not just for the right to be forgotten but for a legacy of extremely strong privacy legislation that really began in 1995 with a European directive on privacy, which established privacy as something that belongs to everybody. The right to be forgotten is a part of that.

So even if making the right to be forgotten, as they envision it right now, seems to be somewhat impossible, what is fantastic is that Europe can, if we allow it, drive privacy and, particularly, transparency laws in the United States in a way that is really helpful.

The right to be forgotten—I'm not sure it is going to work out the way they imagine.

But there is European law that I think everyone might want to check out, which is the 1995 directive on privacy. That is the reason why in Europe today, if you want your data from Facebook, they have to give it to you. This is something we need. That is a model piece of legislation that we can all ask for in the United States—demand from Congress and also from our social networks. It changes the way everybody interacts with Facebook in Europe, and it could change the way we interact with Facebook here if we had similar laws.

Having said that, I give the implementation of the right to be forgotten, as envisioned, a 50-50.

Even then, it is temporary, because you can't contain this stuff. We are going to keep using services. We are going to keep making more data. As that becomes the case, then any particular piece of legislation becomes toothless much faster. You enshrine this stuff in law, and the rate at which we create this information, demand this information, and that information becomes useful and marketable moves much faster than the legislative process. All the definitions in there are going to be obsolete much faster than the framers think they will be.

What is important is to have a continuous level of vigilance and to demand new legislation all of the time, changes to legislation—be on top of it. Subscribe to the Electronic Frontier Foundation's newsletter. Subscribe to the American Civil Liberties Union's newsletter on data privacy.

Keep on top of it, and when laws like that come around, send your congressman a link and say, "I don't know if they are going to pull it off, but I think that the right-to-be-forgotten legislation is pretty cool. Where do you stand on this?" Ask that question all the time.

It is still useful, even if they don't quite pull it off.

QUESTION: Eva Schweitzer. I am from Europe, not here.

I have a question about the dangers, especially of what you told us about the doctors. I heard a similar speech by Eric Schmidt, who thinks that is a great idea. If your doctor knows whether you have diabetes—

PATRICK TUCKER: He is going to make a lot of money on that.

QUESTIONER: Oh yeah.

Let's assume, just for the sake of the argument, there is an evil government who gets all that data and decides that everybody who has diabetes or any kind of genetic defect or anything at all should not live. Then they can very easily go after these people.

PATRICK TUCKER: Which government are you thinking of, specifically?

QUESTIONER: Like a government that has access to a lot of technology and—not Somalia—

PATRICK TUCKER: Yes, but what characterizes that government? Is that government this government?

QUESTIONER: No.

PATRICK TUCKER: Is it China? Is it Russia?

QUESTIONER: Any government that is high in technology and has resources, so not Somalia or Yemen.

PATRICK TUCKER: Any government with resources that is high in technology and also lacks any sort of accountability to its populace, or scruples, or humanity. Depending on your level of cynicism, that is either all governments, or only a handful, or none.

I do not see this happening in the United States.

What I do think is really important is a user-empowered revolution in health care, where, first, like we said before, we are transmitting that data entirely telemetrically and constantly to our doctor. By doing that, we are also sort of taking ownership, because we are beginning to understand it ourselves and its relevance to us.

What is really important, I think, is to protect ourselves from governmental medical malfeasance or government medical tyranny. I am not exactly sure how to describe the terrifying death panel scenario for this country, other than "terrifying death panel scenario." But I think the first step—

QUESTIONER: It doesn't have to be a death panel. Let's say the government decides you have diabetes and you are buying more than two pounds of sugar, because they have all the data. You are not getting Medicare or not getting Social Security because you haven't taken care of yourself.

PATRICK TUCKER: So sort of like nudgy government, angry, sort of penalizing legislation—"Oh, you are over your sugar quotient and now—"

QUESTIONER: "We know you have diabetes. You cost us a lot of money. "

PATRICK TUCKER: I do not see that happening in the United States. I do not think that we politically have any capacity to accept that at all. It is just not in our character right now. Our character could change, but I think you would see a violently hostile—perhaps not literally violent, but certainly enormous—reaction against that sort of thing.

Right now the people who are most inclined to create health-care data are the healthiest people. That changes as we talk about baby boomers being forced to adopt these to manage chronic, continuing medical conditions. But right now early adopters on this stuff are all these super-crazy-fit San Franciscans. I think that that's good.

Having said that, there is some legislation that I think provides a model that we should ask for from government. It is a 2007 bill, actually, that was created by John McCain. This is, to me, a piece of model legislation that we want to ask for more of from our legislators in this country.

It does not allow for discrimination against individuals on the basis of their genes. You go in and you get a full DNA map, like 23andMe used to be able to provide. This bill—a very foresighted piece of legislation—says that no insurance company can discriminate against you on the basis of that.

So I would say our protection from what you describe is this: If you feel concerned about it, familiarize yourself with the bill that makes it illegal for an insurance company to discriminate against you on the basis of your genetic information, and say to your legislator at the congressional level, at the local level, to your health insurance provider—to anybody you pay money to at all—"I want more of this. What are you going to do about it?" Then, depending on their reaction, you take it to the street, I guess.

But there is legislation that can protect against that, which we have passed in the past. So I think we can be hopeful that we would do it again.

QUESTION: Allen Young.

To some extent, this question was already broached. In a place like China, it is very difficult for dissidents, as it is, to get together and try to do something about their repressive government. We are on the 25th anniversary of Tiananmen Square. You are saying that the cell phone could predict where we are going to be two years from now. Cell phones can also predict for the government where the people will be in Tiananmen Square. It is all very well and good to say we don't have to worry about that in this country. But what about countries like China? What can we do to make sure that countries like China cannot get access to that kind of information?

PATRICK TUCKER: I do not necessarily agree that we shouldn't be too worried about that in this country. There was an incident in 2003 where police used some limited social network data to predict a nonviolent protest against the World Trade Organization. It is far less common in this country. I think we have every reason to be proud of that fact, but I do not necessarily—it is a very valid question.

China is actually a big point on that. I think we are going to see the powers of predictive policing in China—and I don't want to harp on China. China has been through a lot of changes lately. But I think that if you want to look at the place most likely to have perhaps a big nightmare scenario in terms of predictive policing, China is a very good example.

So what do we do about it here in this room? It really starts with all of us. How many of us subscribe to a human rights newsletter bringing us news out of China? That is a place to start.

At some point someone will be preemptively arrested, if not necessarily in China then perhaps in some place where there is an equal amount of anti-authoritarian strength. The question is, what do we do when we hear about that, because that hopefully will be sooner rather than later? How do we all respond en masse?

But yes, much of the hope that I have in this book is really predicated on the idea of living in a democracy and living in an economy that allows startups to flourish and provide services to people that they want, that provides mechanisms of accountability for government. In places where those are absent, then the potential of big data to be much more abused is very real.

I think eventually it does open up all of humanity to a much brighter future. But there is a good chance in authoritarian settings that a lot of people, innocent people, will go through a very difficult transition phase before that happens.

I do not have a clear solution on how to save China from predictive policing and big data future abuses, except to say that, for all the worry about China, in many ways it is getting better than it was in the 1990s. I think that there is something inevitable about that too.

JOANNE MYERS: Even though our time has come to an end, I can predict that if you stay and have a drink and join in the conversation, Patrick will be very happy to answer all your questions.

Thank you very much.
