2013 Sayada Community Network Building. CREDIT: Ryan Gerety

The Bright Side to Big Data: Good Intentions and Ethical Questions

Oct 31, 2014

We wrap up our three-part series on data and privacy with a look at some of the ways big data can improve our communities. Technology and big data are delivering big payoffs for our culture and society, while also posing some of the greatest risks. How can big data promote social good? And how might these efforts introduce big ethical questions of their own?

JULIA TAYLOR KENNEDY: You're listening to Impact from the Carnegie Council. I'm Julia Taylor Kennedy.

This is the final installment in our three-part series on data and privacy. The material is so rich, we will likely revisit the topic in the future. For now though, we'll wrap up the series with a look at some of the ways big data can improve our communities and why those efforts might introduce some of the biggest ethical questions.

Over the next half hour, we'll revisit some familiar voices from earlier podcasts and bring in new ones to discuss how big data is delivering some of the biggest payoffs for our culture and society—while also posing some of the greatest risks.

ALEX PENTLAND: My name is Alex Pentland. I'm a professor at MIT (Massachusetts Institute of Technology) and everybody calls me Sandy.

JULIA TAYLOR KENNEDY: Sandy Pentland is one of the most frequently cited scholars in computer science. That may be because he is incredibly curious and prolific. He's also an interdisciplinary thinker—his most recent book is called Social Physics.

I connected with Pentland via Skype after learning about him through a source in the first installment of this series, New Yorker ideas blogger Joshua Rothman. Remember him? He used Virginia Woolf to explain our relationship to social media. Rothman is also well versed in Sandy Pentland's work.

JOSHUA ROTHMAN: He is considered one of the most important researchers in this world of big data. He's done a huge amount of work in this big data universe, most of it at a very large scale.

JULIA TAYLOR KENNEDY: Pentland is both a pioneer in big data and author of the Consumer Privacy Bill of Rights. He mentored the inventors of Google Glass.

What drives all of this? Looking at how humans communicate without speaking a word.

ALEX PENTLAND: We have been a social species that could read each other and signal each other to coordinate for hundreds of thousands of years before we ever had language.

JULIA TAYLOR KENNEDY: For Pentland, our actions say far more than our words do.

ALEX PENTLAND: When you get computers watching people quantitatively and objectively—if you build computer systems to read people—it really is 50 percent of the message.

JULIA TAYLOR KENNEDY: Analyzing tone of voice and gestures can be used to improve human behavior in all sorts of ways—to make us more productive, to be more considerate of the environment, even to prevent crime.

To understand how Pentland looks to data to improve communities, let's bring back New Yorker blogger Joshua Rothman to explain how Pentland has experimented with a micro-society: the workplace.

JOSHUA ROTHMAN: He has this device that his lab has invented, called a sociometric badge. A sociometric badge is a little badge you wear on a lanyard around your neck. It's probably the size of a deck of cards or something. They're getting smaller all the time.

JULIA TAYLOR KENNEDY: This badge can track all sorts of variables about you.

JOSHUA ROTHMAN: It can track, obviously, your location, but also things like whether you're facing another person and talking face to face, the orientation of your body around a conference room—we're all in our swivel chairs. It can track which way you're looking. It can even track tonal changes in your voice, like are you upset, are you angry. Without having to know what the words mean, we can track the emotional stresses.

ALEX PENTLAND: We've gone out and measured what the people actually do, not what they say they do; what they actually do versus the performance of the team.

JULIA TAYLOR KENNEDY: Once he had built these sociometric badges, Pentland started trying them out with teams in labs—and then with teams at more than 20 companies.

JOSHUA ROTHMAN: One of the studies he did was in a call center run by Bank of America. In a call center, the main metric that's measured is how quickly a phone call can be processed—how quickly the customer's problem can be solved.

JULIA TAYLOR KENNEDY: After all, call centers need high productivity—and the managers figure the way to get the most calls in per day is to keep employees working independently at their desks instead of letting them slow down or get distracted by office chitchat. So, like many call centers, this workplace avoided having too much interaction among employees.

ALEX PENTLAND: They have a big meeting in the morning and they just work on the phones and take breaks independently.

JULIA TAYLOR KENNEDY: But Bank of America was noticing low employee productivity.

ALEX PENTLAND: So what I did was I talked them into jiggering the coffee breaks so that people would have more time to talk to each other.

JULIA TAYLOR KENNEDY: And he analyzed the data collected by the badges.

JOSHUA ROTHMAN: What he found was, the more those employees talked to one another in a coffee break—not even talking about work; in fact, he has no idea what they were talking about—the faster they processed the calls. Because they spent a lot of time mingling, and talking about who knows what, their productivity increased.

JULIA TAYLOR KENNEDY: By increasing face time among Bank of America call center employees, Pentland both improved their productivity and, one can assume, the quality of their lives at work day to day.

Now in some ways, these badges are hyper-intrusive. After all, every employee is wearing one around their neck, and their biorhythms are being recorded.

ALEX PENTLAND: People don't get too worried about it.

JULIA TAYLOR KENNEDY: Seems unbelievable, but Pentland insists it's the truth.

ALEX PENTLAND: As long as they know you're not recording the words, and you're not going to show the boss when you went to the bathroom or things like that—there are these things that everybody knows, like Bill just talks too much. But you can never talk about that, because you don't have any data; it's just your opinion against Bill's opinion. People would really love to have quantitative data to be able to say, "See, I told you Bill talks too much." They sort of get into it, right?

JOSHUA ROTHMAN: That's information that's actually quite scary—that someone has access to that information.

JULIA TAYLOR KENNEDY: Rothman isn't so sanguine about all of this.

JOSHUA ROTHMAN: If you can get employees to wear that sociometric badge, then you can collect a whole other kind of data that is really telling, really powerful—and the privacy implications there strike me as really profound.

JULIA TAYLOR KENNEDY: Managers have been using surveys and focus groups to try and crack team behavior for years. So why does this method stand out? Maybe because it almost completely does away with self-reporting. There's no survey, no focus group where people perform their jobs for an observer. They're just doing their work. And they wear the badges long enough that they forget they're being observed.

JOSHUA ROTHMAN: What Alex Pentland is measuring are actual emotions.

JULIA TAYLOR KENNEDY: Are we truly stressed?

JOSHUA ROTHMAN: Did we actually look at someone face to face when we talked to them, or did we look away?

JULIA TAYLOR KENNEDY: Those are actual facts about our physical responses.

JOSHUA ROTHMAN: That's why it's reality mining instead of data mining. The problem isn't like the employer is Google-stalking a job applicant and getting a false version of who that person is. Here the problem is, we're actually getting a very true version on a level that we've never been able to get before, and that's off-putting.

ALEX PENTLAND: Well, I think that employees ought to have rights to privacy and the law is unfortunately not very clear about it.

JULIA TAYLOR KENNEDY: Pentland admits there may be some privacy concerns with how employee data might be used, but he thinks we should look at policy solutions instead of closing ourselves off to using these tools altogether. For example, Pentland has developed a Consumer Data Bill of Rights with the World Economic Forum.

Maybe this Consumer Data Bill of Rights could work. But the stakes of complex behavioral analysis get even higher when you consider another method Pentland has pioneered in a large-scale effort to improve communities: police work.

ALEX PENTLAND: You can use things like cell tower activity—not where the people are, just how many people are there—to predict places that are more likely to have crime.

JULIA TAYLOR KENNEDY: It's a practice that's caught on with many urban police departments, and has spawned a mini-industry called "predictive policing."

ALEX PENTLAND: If you can see that, say, a town square or a park suddenly begins to have a very different population of people—the elderly don't go there anymore, people don't go there at night, things like that—those are places that are likely to have more crime going forward.
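[Editor's note: a minimal sketch, in Python, of the kind of signal Pentland describes—flagging locations whose recent activity departs sharply from their historical baseline. The data shapes, counts, and threshold below are invented for illustration.]

```python
# Hypothetical sketch: flag locations whose recent activity profile has
# shifted sharply from its historical baseline. All numbers are invented.
from statistics import mean, stdev

def shifted_locations(history, recent, z_threshold=2.0):
    """history: {location: [nightly visitor counts, past months]}
       recent:  {location: [nightly visitor counts, past week]}"""
    flagged = []
    for loc, past in history.items():
        mu, sigma = mean(past), stdev(past)
        now = mean(recent.get(loc, past))
        if sigma > 0 and abs(now - mu) / sigma > z_threshold:
            flagged.append(loc)  # the population profile changed sharply
    return flagged

history = {"town_square": [120, 130, 115, 140, 125, 135, 128],
           "river_park":  [60, 65, 58, 70, 63, 61, 66]}
recent  = {"town_square": [40, 35, 42],   # nighttime foot traffic collapsed
           "river_park":  [62, 64, 60]}
print(shifted_locations(history, recent))  # -> ['town_square']
```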

JAMES COLDREN: My name is James Coldren. I go by Chip. I'm the managing director for justice programs at the CNA Institute for Public Research.

JULIA TAYLOR KENNEDY: Through a program called the Smart Policing Initiative, Chip Coldren works with police departments to test and use data prediction tools in everyday police work. But he doesn't like the term predictive policing.

JAMES COLDREN: It's very difficult, especially in the social science world, to pinpoint with great accuracy what's going to happen, even in the near-term future, let alone the far-term future. I think that the term "anticipatory" relates more to what police agencies are trying to do.

JULIA TAYLOR KENNEDY: Well, whether it's about prediction or anticipation, here's how this kind of policing works: Let's say your precinct wants to cut back on vehicle theft. First, you look at the history.

JAMES COLDREN: You do a good analysis of the last three to five years' worth of vehicle theft reports and vehicle theft arrests and calls for service around vehicle theft. You identify certain areas of the city where vehicle thefts are more likely to occur. They're typically called "hot spots."
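[Editor's note: a minimal sketch of the hot-spot analysis Coldren describes—binning several years of geocoded theft reports into grid cells and ranking the busiest ones. The coordinates, grid size, and data layout are hypothetical.]

```python
# A minimal sketch of hot-spot identification: bin geocoded theft
# reports into a grid and rank the busiest cells. Illustrative only.
from collections import Counter

def hot_spots(incidents, cell_size=0.01, top_k=3):
    """incidents: list of (latitude, longitude) theft reports."""
    cells = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    return cells.most_common(top_k)  # [(grid cell, incident count), ...]

reports = [(33.72, -116.23), (33.72, -116.24), (33.721, -116.231),
           (33.75, -116.30), (33.72, -116.231)]
for cell, count in hot_spots(reports):
    print(cell, count)  # the densest cells are the candidate hot spots
```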

JULIA TAYLOR KENNEDY: You can also use this process to identify the types of cars likely to be stolen.

JAMES COLDREN: You get a trend line from five years prior to the current day on how much motor vehicle thefts have been increasing month to month and year to year and probably how much the thefts of certain types of vehicles have been increasing from month to month and year to year.

So now you're at the present day and you institute a motor theft task force and you assign some detectives and some patrol officers to this task force and you start to target these vehicles and these areas where the thefts are most prominent. And lo and behold, six months down the road or nine months down the road or a year down the road, you look at that hot spot and you find out that, in fact, there have been fewer motor vehicle thefts.

JULIA TAYLOR KENNEDY: Success! Well, Coldren says, "Not so fast." First, you have to make sure that motor vehicle thefts weren't already on the decline before you started responding to the predictions. Then, you also have to consider other factors.

JAMES COLDREN: There are a number of things that could have happened that would have caused motor vehicle thefts to go down whether there was a task force in place or not.

JULIA TAYLOR KENNEDY: For example, car manufacturers could have put better security on their cars. Or . . .

JAMES COLDREN: . . . somebody from a motor vehicle theft ring could have been caught and incarcerated and everybody maybe slowed down their behavior for a while because they were afraid.
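[Editor's note: one hedged way to run the check Coldren describes—fit the pre-intervention trend and ask whether post-intervention thefts fall below what that trend alone would have predicted. All numbers here are invented.]

```python
# A sketch of the pre-trend check: were thefts already declining before
# the task force? Fit the prior trend, project it forward, compare.
def fit_trend(values):
    """Ordinary least-squares slope/intercept over t = 0..n-1."""
    n = len(values)
    ts = range(n)
    t_bar = sum(ts) / n
    v_bar = sum(values) / n
    slope = (sum((t - t_bar) * (v - v_bar) for t, v in zip(ts, values))
             / sum((t - t_bar) ** 2 for t in ts))
    return slope, v_bar - slope * t_bar

pre  = [50, 48, 47, 45, 44, 42]   # monthly thefts before the task force
post = [40, 39, 38]               # monthly thefts after it started

slope, intercept = fit_trend(pre)
for i, actual in enumerate(post):
    expected = intercept + slope * (len(pre) + i)
    print(f"month {len(pre) + i}: expected {expected:.1f}, actual {actual}")
# If 'actual' merely tracks 'expected', thefts were already declining
# and the task force may deserve little of the credit.
```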

JULIA TAYLOR KENNEDY: So there are a few ways to evaluate whether a predictive policing initiative worked. But this basic premise—that historical data, combined with Pentland's cell tower data and other predictors of criminal activity, can anticipate crime—has inspired software companies to develop all kinds of platforms for police departments that are trying to prevent crime from happening in the first place.

One platform, called PredPol, gives police officers a block to go and patrol each morning for criminal activity. And police departments are finding that, while it may seem like voodoo, simply patrolling blocks that are likely to see criminal activity can actually have a dampening effect on crimes like auto theft and mugging. But if it becomes the sole driver of police work, hot spot policing can become a blunt instrument.

JAMES COLDREN: You could actually follow the numbers rather blindly and get involved in what we typically refer to as a "whack-a-mole" approach where you're simply running around from street corner to street corner trying to arrest people and disrupt these markets when you're not really focusing on the root causes of the problem.

JULIA TAYLOR KENNEDY: So, hot spots have their limitations. But another predictive policing model adds a layer on top of the so-called hot spots.

JAMES COLDREN: It's not just identifying the geographic areas but it's identifying the individuals in those areas, the small percentage of individuals that account for the greater percentage of the problems.

We know that there's a small percentage of individuals that are responsible for a great, great proportion of the problems that they present, and so now you'll find police agencies developing what you'll hear them call top offender lists or "WOW" lists, the "worst of the worst."

JULIA TAYLOR KENNEDY: What if somebody erroneously got on the worst of the worst list and then experienced a lot of scrutiny and attention from a police officer that the person didn't deserve?

JAMES COLDREN: That's a very good question and I've got two different responses to that.

So one thing that most police agencies do—and the ones that are really good at this do—is they validate those lists before they take any action. They'll develop some sound criteria for whether anybody can even get on a list in the first place. So it can't just be somebody's hunch. It's got to be a certain number of arrests over the past 12 months, or a certain number of incidents over the past 12 months, with corroboration from two or three different sources.

They'll go to their detective bureau. They'll go to the crime analysis unit. They'll go to their gang intelligence unit and they'll say, "Hey, look at this list of names that we have. These are the people that are coming up as the most prolific offenders. What does this list look like to you?"

JULIA TAYLOR KENNEDY: That's number one: make a list, check it twice.

Number two? Refresh the list regularly.

JAMES COLDREN: If someone's on a hot offender's list for nine or 12 months and there's no activity and no trouble, well, then they should be taken off that list. Some agencies do that, as well. They manage these lists. They don't just run them in perpetuity.
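[Editor's note: a sketch of the two safeguards Coldren outlines—sound entry criteria rather than a hunch, plus regular expiry. The thresholds echo his examples, but the record format and names are hypothetical.]

```python
# Sketch of list validation and refresh. Thresholds mirror Coldren's
# examples; the data structures are invented for illustration.
from datetime import date, timedelta

MIN_ARRESTS = 3               # arrests in the past 12 months
MIN_SOURCES = 2               # independent corroborating units
EXPIRY = timedelta(days=270)  # drop after ~9 months of no activity

def qualifies(record, today):
    recent = [a for a in record["arrests"]
              if today - a <= timedelta(days=365)]
    return len(recent) >= MIN_ARRESTS and len(record["sources"]) >= MIN_SOURCES

def refresh(offender_list, records, today):
    """Keep only people who still qualify and have recent activity."""
    return [name for name in offender_list
            if qualifies(records[name], today)
            and today - max(records[name]["arrests"]) <= EXPIRY]

records = {"suspect_a": {"arrests": [date(2014, 3, 1), date(2014, 6, 2),
                                     date(2014, 9, 9)],
                         "sources": {"detectives", "gang_intel"}},
           "suspect_b": {"arrests": [date(2013, 1, 5)],
                         "sources": {"detectives"}}}
print(refresh(["suspect_a", "suspect_b"], records, date(2014, 10, 31)))
# -> ['suspect_a']; suspect_b fails both the entry criteria and expiry.
```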

JULIA TAYLOR KENNEDY: And it's not like getting your name on a worst of the worst list means you're automatically arrested.

JAMES COLDREN: You can have a list and you can address people on a list without being heavy-handed about it. You're simply talking to them and saying, "Hey, what's going on today? Where you been? What you been up to?"

Now if there's been a shooting or a street fight, and you come to the scene with your crime intelligence bulletin and two or three of the people on your top 10 list are there—well, I think the police have reasonable cause to talk to these people and ask them some questions.

JULIA TAYLOR KENNEDY: Used responsibly, this sounds like more informed beat policing. It's just checking in on people who have trouble following the law, right?

VIKTOR MAYER-SCHÖNBERGER: A prediction is probabilistic. There is, at least, a decent chance that our predictions are wrong.

JULIA TAYLOR KENNEDY: That's our old friend Viktor Mayer-Schönberger. He's an Oxford professor and software developer who writes about privacy and the virtue of forgetting. And he hates the idea of predictive policing.

VIKTOR MAYER-SCHÖNBERGER: We are always running the risk of punishing people for something that they haven't done. By doing so, we deny them the most elemental of all human qualities: free will.

JULIA TAYLOR KENNEDY: By approaching someone on one of these worst of the worst lists, you're not giving them a chance to choose not to commit a crime. The same could go for hot spots.

VIKTOR MAYER-SCHÖNBERGER: If we look for something longer and more intensely in one area than in another area, then it's very likely that we'll find something in the one area rather than the other. Those who are not careful look at the results and say, "Oh, there is evidence that our system works, because where we are policing more, we find more crime."

They don't realize that this is a perpetuating, self-serving hypothesis—a hypothesis that we prove just by believing it and acting upon it. If we did that, if we created a world of self-fulfilling prophecies in law enforcement and in security more generally, we would fall right back into a world of discrimination, and a world of profiling.
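[Editor's note: a toy simulation of the feedback loop Mayer-Schönberger warns about. The two areas have identical true crime rates by construction; patrol hours simply chase last round's recorded crime.]

```python
# Toy feedback-loop simulation: two areas with the SAME underlying
# crime rate, with patrol hours reallocated toward wherever more crime
# was recorded last round. Everything here is invented.
import random

random.seed(0)
TRUE_RATE = 0.05            # identical chance a patrol-hour records a crime
patrol_hours = {"area_A": 100, "area_B": 100}

for round_ in range(5):
    recorded = {area: sum(random.random() < TRUE_RATE for _ in range(hours))
                for area, hours in patrol_hours.items()}
    total = sum(recorded.values()) or 1
    # Reallocate 200 patrol-hours in proportion to recorded crime.
    patrol_hours = {area: max(10, round(200 * n / total))
                    for area, n in recorded.items()}
    print(round_, recorded, patrol_hours)
# Whichever area gets an early lucky streak of detections attracts more
# patrols, which record more crime there—"evidence" the system works,
# even though both areas are identical by construction.
```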

JULIA TAYLOR KENNEDY: After all, algorithms can amplify bad assumptions. Chip Coldren knows that—and yet, he says predictive policing is too good of a tool to throw out.

JAMES COLDREN: The police don't just go to where the data tell them to go. You have to take into account the context, the type of problem, the type of neighborhood and, again, you really should be getting types of information other than arrest or call-for-service data, because we know that people of color are more likely to be arrested than white people.

So this is one tool in a pretty large toolbox that police carry around with them.

JULIA TAYLOR KENNEDY: It's all about context. Here's what he means:

Take a predictive policing initiative in Indio, California, a small city near Palm Springs with a majority Latino population and a median income slightly above the national median.

Indio has experienced an explosion of growth over the last 30 years. Its population grew by 50 percent between 2000 and 2010. Its police department worked with Coldren and his Smart Policing Initiative to dig into the causes of home burglaries in the city.

JAMES COLDREN: They found that if you track truancy rates in the neighborhoods in Indio where those rates are high, in about two years, burglary will be high in those same neighborhoods.

JULIA TAYLOR KENNEDY: So skipping school was a leading indicator of burglary. And based on truancy patterns, the police predicted that burglary rates were likely to increase in several neighborhoods that weren't yet experiencing high levels of burglaries.
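[Editor's note: a sketch of the Indio-style analysis—correlating each neighborhood's truancy rate with its burglary rate two years later. The rates below are invented; only the two-year lag follows Coldren's description.]

```python
# Sketch of a lagged-indicator analysis: does a neighborhood's truancy
# rate predict its burglary rate two years on? All rates are invented.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

LAG = 2  # years between a truancy spike and a burglary spike
# year -> {neighborhood: rate}
truancy  = {2008: {"north": 0.12, "east": 0.04, "south": 0.08},
            2009: {"north": 0.15, "east": 0.05, "south": 0.09}}
burglary = {2010: {"north": 0.031, "east": 0.010, "south": 0.019},
            2011: {"north": 0.038, "east": 0.011, "south": 0.022}}

xs, ys = [], []
for year, rates in truancy.items():
    for hood, t in rates.items():
        xs.append(t)
        ys.append(burglary[year + LAG][hood])
print(f"lag-{LAG} correlation: {pearson(xs, ys):.2f}")  # strongly positive here
```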

JAMES COLDREN: They created a community-policing-oriented preventive approach to working in those schools and in those neighborhoods to address the truancy problem before it turned into a burglary problem.

JULIA TAYLOR KENNEDY: The program has been in place for five years. And while it's looking like burglaries have increased a bit in the targeted neighborhoods, the preventive measures kept them from skyrocketing. Instead of taking a "whack-a-mole" approach, the community is trying for a more holistic solution.

JAMES COLDREN: One of the main things that they did is they launched a parenting class within the school district. So they gave parents skills and techniques to use when their children start getting in trouble with truancy. That starts to get at the root cause. It starts to get at the family life and the peer influences that these youth are dealing with and that came out of that predictive model.

JULIA TAYLOR KENNEDY: This could be a lovely world, where we are able to be more sensitive to kids who might be headed down a dangerous path, and improve their likelihood of success. A parenting class could even have the additional effect of increasing graduation rates, although Coldren's Smart Policing Initiative didn't evaluate that impact.

But what if the police department decided to use the information about truancy a different way? Or kept the list of truants and started to follow them extremely closely two years later?

What if one of the truants committed a robbery and was arrested because her name was on the list . . . but a classmate who attended school every day and later robbed a bank escaped the notice of the police because she wasn't on any of their lists?

Predictive policing has attracted a huge class of critics. Bruce Schneier, the renowned cybersecurity expert, author, and Harvard University fellow who appeared on our last podcast, strongly opposes the method.

BRUCE SCHNEIER: Predictive policing seems to have a lot of rhetoric and not a lot of results. What we do see is that it's thinly disguised racism.

What neighborhood should we target for policing? Well, it's going to be the minority place. It's going to be the poor neighborhoods. They don't actually need predictive policing to do that.

So I would be skeptical of a lot of predictive policing things right now because we're not seeing any real good evidence that they work, and a lot of evidence that they're really nothing more than thinly disguised racism.

SEETA PEÑA GANGADHARAN: It's never a question of, "Is this technology good or bad?" It's, "What's the context in which this technology is unfolding and being applied? What are the values that are informing its use and application?"

My name is Seeta Peña Gangadharan. I'm a senior research fellow with New America's Open Technology Institute.

JULIA TAYLOR KENNEDY: Gangadharan is another predictive policing skeptic. But as you heard, she agrees with Coldren that no tool is inherently bad—it just has to be used in the right way.

Gangadharan has thought about this a lot. She helps the New America Foundation create intentional network communities all over the world. These communities often use networking technologies that are independent of a traditional Internet service provider.

The idea is to help people think more deeply about context, about how they want to interact with each other and with the rest of the world online. This process often starts with a community workshop.

SEETA PEÑA GANGADHARAN: One of the exercises that we take people through when thinking about how to be more secure or safer online is a sort of mapping exercise where we ask, "What communications and information technologies do you use?"

Then we ask a second question, which is, "To what extent are those tools owned by you versus owned by somebody else, like a corporation or government?"

What's remarkable, every time and in every place that we've done this exercise, is to see how much of our information and communications technology really is not owned or controlled by us. You begin to realize, "Oh, I am very embedded in a communications infrastructure that sets the rules for me. I don't set those rules."

So naturally, in those environments, people will ask, "Well, what's the alternative?" Community networking is one of the alternatives that come up time and again.

JULIA TAYLOR KENNEDY: The idea is for the community to decide together on the purpose of the network—and the tools they use to achieve that purpose.

SEETA PEÑA GANGADHARAN: Some people will say, "Oh, we absolutely only need to use open source tools or free software, and we have to use only the communications networks or community networks that we establish and operate ourselves." Others think it's good to be in these quasi-public environments, like Facebook and Twitter, because that's how serendipity can happen—in these environments you touch people who aren't necessarily part of the in-group.

JULIA TAYLOR KENNEDY: And what if the community priority is security or privacy?

SEETA PEÑA GANGADHARAN: There are all sorts of things that you have to take on when you establish a community network. Your security and safety are not automatic. That's a principle that applies in any community-building context that you're in, right?

You, as a community, create safety together and it's negotiated and it's constantly reworked. And you rework that relationship by talking to people.

JULIA TAYLOR KENNEDY: Building a community network, even if its members are also using their normal Time Warner connections for most of their Internet needs, can be an exercise in setting and adhering to principles and norms online. In a way, they're deciding the guiding principles, the ethics, of their community.

Local community networks can provide Internet access to the underprivileged. They can provide backups in the case of a natural disaster. They can provide safe spaces for community dialogue in countries where the Internet is highly regulated.

Perhaps most importantly, they force community members to take control of their online environments, bringing a discussion of tradeoffs to a local scale that many think we desperately need on a global scale.

What should be kept private? What should be available publicly? What should corporations share with the government? The questions are intimidating enough that you almost want to go off the grid and say "Hey, slow down! Hands off my data!"

ALEX PENTLAND: I have no sympathy at all for people who say, "Oh, we shouldn't be using big data."

JULIA TAYLOR KENNEDY: Sandy Pentland points out that in poor countries, big data could make a huge difference.

ALEX PENTLAND: Almost all of the people in the world have no statistics. We really don't know where the babies die. We don't know where genocide is happening. We don't know where disease is happening. We have no clue. Millions of children die every year because we have no statistics.

JULIA TAYLOR KENNEDY: And yet, in exploding numbers, people all over the world walk around with a sensor: their cell phone.

ALEX PENTLAND: For the first time, you can hope to actually know what's going on in all these underdeveloped countries, the 5 billion people who have very little today. By knowing what's going on, you have the ability to potentially fix it. It's huge.

JULIA TAYLOR KENNEDY: These data analytics tools present great opportunities to save lives all over the world. They also introduce huge risks, like erasing free will by wrongly predicting someone will commit a crime.

Among all the different voices in this podcast series, I did hear a few points of agreement.

First, this conversation matters. Global society has gone too long without engaging in a widespread dialogue about governing principles online.

Second, we need better education in statistics and other relevant subjects, so we can build and use data tools in the best way as they become more common.

And third, we need to pay more attention to context. No technology is innately good or bad. It's how we view the technology, the importance we attach to it, and the other countervailing forces that shape the role data and privacy play in each of our lives.

Thanks for listening to Impact from the Carnegie Council. A special thanks to our production team: Mel Sebastiani, Terence Hurley, Deborah Carroll, and Amber Kiwan. I'm Julia Taylor Kennedy.

You can find out more about this podcast—along with the first two installments of our data and privacy series—at carnegiecouncil.org.
