What Robots Represent and Why it Matters, with Robert Sparrow

May 17, 2018

Robots are not neutral. People attach social meanings to them. Robert Sparrow discusses the politics of sex and race in the robots that humans create—a politics that hides in plain sight but is seldom discussed.

ROBERT SPARROW: I'd like to begin by thanking the Uehiro Foundation, Oxford University, and the Carnegie Council for the opportunity to present to you today.

What I want to present today is material that builds on some of my published work, as well as on some unpublished work that I have been completing recently.

In the last six to twelve months, I've been working on a paper on the race politics of robots. I've been doing some co-authored work with colleagues in robotics at the University of Canterbury and Bielefeld looking at people's responses to robots, in particular when it comes to the attribution of race to robots. And I've done some work on sex robots, looking at the gender politics of robots. This presentation will draw on and extend these papers. In the background here, there is a question of what's at stake when we talk about political futures, which I explored in some work on nanotechnology some time ago.

I'm hoping my presentation actually follows quite nicely from the preceding one. Had I seen its content before preparing my presentation, I could have spared you a few of my slides.

It is traditional to begin a presentation about the ethics of robotics with a definition of robots. We have heard a couple of definitions of robots already, most of which have focused on their mechanism. Usually the way people define robots—as Bill Casebeer did here—is as programmable machines with actuators. I want, instead, to emphasize the extent to which robots are what I call "machines with meaning." I mean this in three different senses.

First—and I think most of us already know this in our hearts—I mean that the popular enthusiasm for robots, and indeed a lot of the discussion of robots even by philosophers and engineers, is really about what it means to be human.

As machines, robots tend to be a little bit underwhelming, and when they work, we tend not to notice them. So if you think of your ATM or your dishwasher as robots—which they are—it's actually hard to get too excited about robots. What people do get worked up about is this idea of a mechanical person or an artificial person, and the reason why they get worked up about it is because they're wondering what it is that makes us human.

I think this is never clearer than when people start talking about Three Laws of Robotics, which were set out in stories written before there were any robots, as though they were a serious proposal for developing an ethical framework. In fact, it is very clear they were a structure for generating drama. The "laws" of robotics allowed Asimov to tell some cool stories, which asked his readers to think about their ethics. Yet people still keep referring to Asimov's laws as though they can tell us something about robots instead of something about human beings, who like to write and to hear such stories.

Even with production-line robots or industrial robots, it seems to me that they're only really interesting when they provoke a debate about the dignity of labor, the meaning of labor, and the centrality of work to a good human life. Indeed, there is usually a whole series of anxieties that are driving and are being explored in discussions of robotics. For instance, there are some interesting conversations about masculinity, about the relationship between being a man and being someone who works, that often take place when people start thinking about the social impacts of robots.

Thus, most of the conversations going on about robots today are really arguments about what it means to be human and about what sort of world we're going to be living in in the future. So robots are machines with meaning in the sense that they are machines that enable us to tell stories that help us think about what it means to be human.

Second, social robots, in particular, are machines with meaning because they are machines that only work to the extent that we attribute meaning to them. One might even say that social robots are machines that work with meaning. In order to build a robot that is capable of engaging socially with human beings, you have to make sure that people understand your robot's emotions, for instance, and that they respond in the right way to the emotions they recognize and to the task that the robot is supposed to be performing.

More generally, the project of designing a robot involves designing the user as much as it does designing the robot. In order for the robot to work, we must also get people to do the right thing, which often means changing their behavior. For instance, with industrial robots, you have to keep people away from the production line—which usually means restricting the freedom of humans by fencing off the production line—so that robots can do their work.

So robots are never just machines. They're machines in an ecosystem of relationships between human beings, which means that shaping the behavior of those who interact with robots is always a key part of the design process. So in that sense they're also machines with meaning.

Finally, robots are machines that have meaning for relationships between people. When robots shape the behavior of their users, and of other people who relate to the robot, they also impose new relationships between people, which means that they are machines with politics as well. Thus, at the end of my paper I want to offer some observations about the politics of robotics.

Let's have a look at some robots now. Here are what I take to be—and to a certain extent at least I am just guessing at this—the four most widely used social robots.

  • We've got "Pepper," a humanoid robot produced by SoftBank in Japan.
  • We've also got "Nao." Nao is most famous for its role in dancing videos on the Internet. There's a whole genre of mass dance performance involving Nao available online.
  • Here is "Jibo," developed by Cynthia Breazeal and her team.
  • Finally, there is "Buddy," an emotional robot designed and built in France.

These are pretty much the state of the art of social robots. Apologies if I've missed your favorite robot. I'll show you some more robots shortly, but first I just want you to take a look at this image of the four robots for a moment.

(In passing, in preparing this PowerPoint I have drawn heavily on a really nice resource, the Anthropomorphic roBOT (ABOT) Database. This is a database which basically consists of pictures of cool robots. You will see I have included a citation, which you are supposed to include when you draw on the database. If you are interested in social robots, I can't recommend that database highly enough.)

This next slide shows what I take to be the most "spooky" and "science fiction" looking robots around.

Notice in particular Atlas 2 from Boston Dynamics and Honda's ASIMO robot. ASIMO is a little bit dated now, but ASIMOs were kind of the shock troops of the robot army in a way. They generated a tremendous amount of public interest when they were first displayed.

People still completely freak out at Atlas. There was a video released to YouTube the other day of a successor to Atlas running and jumping, and there is a whole genre again of Boston Dynamics videos that are supposed to convince you that we are on the verge of the singularity, or at least of being hunted like rats by the Terminators in the future.

Again, these are fairly powerful cultural icons.

What do you notice about those robots? What do all of those images have in common?

They're all white! I think they have a race politics that almost no one has noticed—literally, there are only three or four papers on the race politics of robots.

There is, it has to be said, some sample bias here. It is possible to find a couple of non-white robots, and I will say a little bit about them in a moment.

But let's just fess up to the sample bias of my selection from the ABOT database and move on to some images of robots from another paper. This is the main set of images in a survey paper on robot appearances that came out a couple of years ago. Yes, there are some yellow robots and there are some blue robots—I'll talk a bit about these robots in a moment—but again whiteness is really front and center in the representation of robots.

It's also important to recognize that when it comes to the race politics of robots, the popular images of robots are as important as real robots. The audience in this room is kind of a privileged audience in that people here have got to see real robots . . . but most people don't see real robots; they see images of robots, and those images of robots are even whiter, if anything, than the images on my slides. If you do a Google image search for "android," for instance, you have to scroll several pages down before you find anything that isn't this glossy white plastic color.

We had a discussion of anthropomorphism earlier. It is an article of faith in the social robotics community and the human-robot interaction community that people attribute social meanings to robots and understand robots through the same sorts of categories that they use to understand human beings and animals.

So, for instance, people are remarkably quick to attribute gender to robots. They will tell you whether a robot is male or female very quickly. And the way they do so has all the traditional features of gender attribution. If you take the same robot and you give it a shovel, people say that it's male; you give it a mop, and they say that it's female. Similarly, you can have the same robot and change the tone of its voice, and it changes the gender attribution. You put it in a role where it commands, people are more likely to say that it's male. You put it in a role where it serves, people are more likely to say that it's female.

So robots have gender. That is pretty well understood.

Robots also have species. You can't build robot cats and robot dogs without thinking that robots have species.

However, what no one seems to have noticed is that robots have race as well and that, in fact, these lower-case-w "white" robots are also capital-W "White" robots.

Now, there are a couple of exceptions to my generalization about robots being White—I know some people are thinking of the exceptions—but I actually think they serve my case.

The classic exceptions that people point to are these two robots. This is Professor Hiroshi Ishiguro's famous Geminoid. This is a "robot" called BINA48, Martine Rothblatt's mysterious talking head that she built in the image of her partner.

There are two things to notice about these robots. First, they only work if they have race. These examples can't show that robots don't have race: these particular robots are clearly an Asian robot and an African American robot, so race is front and center in these robots. If you couldn't represent race in a robot, these robots wouldn't succeed as exceptions to my rule that robots are white.

Second, these robots are modeled on individuals, who are Asian and African American respectively. Again, if the robots didn't have race, they couldn't succeed in representing these individuals. More generally, it is striking that the first two examples that people tend to point to as non-white robots are in some sense robots of specific individuals rather than robots that represent types of individuals.

Interestingly, Japanese robotics laboratories are, as far as I can tell, the only national robotics laboratories that sometimes build robots that look Japanese—robots that clearly get raced as Asian. Indian robotics labs, for instance, still build white robots.

Thus, I don't think these are counter-examples to robots having race. I think these actually show us that robots do have race. The fact that these robots stand out just serves to highlight that the mainstream of robots are in fact White.

This slide shows another important "exception" that just goes to prove that robots have race. People writing about robots should be more aware of "Rastus, the mechanical man," an appallingly racist parody from the 1940s. Rastus was used in performances that involved shooting an apple off his head. You will notice that the accompanying text, taken from a newspaper report, says "He can do almost everything except shoot craps," which is also a racist stereotype.

Louis Chude-Sokei and Simone Browne have argued that Rastus's ideological role was to reassure white agricultural and factory workers that they would retain their race privilege even if they lost their jobs. Again, Rastus couldn't play this role except for the fact that everybody understood that "he" was Black.

However, there is a complication here: there is also clearly a sense in which robots don't have race. Washing machines don't have race, toasters don't have race. These are just machines—and machines, it seems, don't have race.

Yet, according to a "social constructionist" account of race, race is a social relationship. So if people treat machines as having race, this means that they do have race because that's just what race is: it's a set of social relationships. It's not some mysterious thing in the body. It is a set of social understandings and interactions.

So I think these things have race, and I think that this is a real problem for robotics because of the history of robotics. In his amazing book The Sound of Culture, which is basically a cultural studies take on the history of robotics, Louis Chude-Sokei argues that, in an important sense, all robots are also always Black, because fundamentally robots are slaves, and, historically, slaves are people of color.

The more a robot becomes a robot servant—a robot butler—the more people will think of that as being Black or Brown. So then you have this really interesting tension between the explicit race politics of robots, which is White, and the historical race politics of robots, which is Black, and there is a question about how that unfolds in practice . . .

Now, I still haven't fully worked out how that race politics goes. However, there is a real issue here, because if you respond to the whiteness of robots by saying, "We need to build more robots of color," you play into this second narrative and reinforce an idea of people of color as being essentially servile. So there is a really interesting problem here. If you're building a humanoid robot, you're either building a white one, which contributes to a lack of media diversity and risks being straightforwardly racist, or you're building a robot that reinforces a whole lot of racist narratives about how people of color are slaves by nature.

For the sake of time, I will move fairly swiftly through a similar discussion of sex robots. (If you are interested you can find a full discussion in: Sparrow, R. 2017. "Robots, rape, and representation." International Journal of Social Robotics 9(4): 465-477.)

Here we see the front page of the website for the famous TrueCompanion "Roxxxy" sex robot. Again, I want you to think about the politics of these images; I'm interested in the robot as media artifact here. What does it say about how our society thinks about women that male engineers build robots that look like this?

It's also worth keeping in mind the history of robotics as being driven by the desire to produce an artificial partner: the fundamental task of the engineer is to build a woman who can become his wife. This idea is in many of the foundational cultural stories of robotics. It's in Metropolis, and it is in the story of Pygmalion, for instance. Yet look at this slide and see what people build!

It has to be said that it seems likely that TrueCompanion was a media stunt and never really existed. This robot was carried around the porn industry and the website looks as though you can purchase it but as far as I can tell no one has actually ever purchased it. Interestingly, the cheap version comes with no arms or legs . . . there are so many creepy things about the TrueCompanion robot!

This image of a sex robot is getting a lot of currency in media at the moment. This is the Synthea Amatus robot designed by Sergi Santos, which has been in the news recently. This does seem to be a real production robot and not just an idea for a robot.

The American company that makes RealDoll is also making sex robots now, and there is a Korean company that makes sex robots. So it does look as though we will eventually see real sex robots and not just images of sex robots.

Again, importantly, I think these things only work if they represent women. They only function to allow sex if they can play the role of a woman in the sexual act. Otherwise they're just tools for male masturbation—if you know what a Fleshlight is, these sex robots are just very expensive versions of that. Interestingly, the actual market for existing sex dolls mostly seems to be doll fetishists: not people who want to have sex with a woman but can't find one, but people who fetishize having sex with dolls or with robots. Nevertheless, it is clear that sex robots are supposed to have gender and that most of them are supposed to represent women.

Again, there is a complicated politics here, because if these machines and their images are understood as representing robots, then certain issues fall away. However, if they are understood as representing women, the appalling sexism of the images is a real problem. I suspect that in different contexts they will represent both.

My work on the ethical issues raised by sex robots has been concerned with the representation of consent when people engage in sex with the robot or sex doll. When you have sex with these things, is it a representation of rape? If you think these things, particularly the sex dolls, represent unconscious women, then sex with an unconscious woman is rape, so this looks to be a representation or simulation of the rape of an unconscious woman. But if the robots always say yes to sex, then you're representing women as essentially always available for sex.

Just very quickly, this slide lists the personalities that the TrueCompanion robot, "Roxxxy," was supposed to ship with. Note the inclusion of "Frigid Farah," a personality that is essentially designed to encourage rape fantasy! She is frigid and shy and she doesn't want to have sex with you, but nonetheless you can have sex with her. So the idea that these things are designed for rape is actually more present in the design space of the robots than people might think.

Once we start thinking about the representation of consent in the use of sex robots, it looks as though there is no good way to build a sex robot! Either you represent the rape of women or you represent women as always interested in and willing to have sex with you. Both of those seem problematic to me for fairly obvious reasons.

Now, why does all of this stuff about the race and sex of robots matter? I think it matters when we return to consider the third sense in which robots have meaning, which relates to the way that they constitute an intervention into human social relations.

Robots and AI raise the same issues here. When we start to think about the web-enabled and "intelligent" devices we've all put on our kitchen bench and the relationship between the corporations and the data being gathered by these devices, you really begin to become aware of the power relationships that this technology establishes. When we adopt these technologies, we adopt new relationships between people, new ways of relating to each other, new ways of living, and that is itself a power relationship.

The people who design these machines are designing us and the future world we will live in. That means we desperately need to be having conversations about what sort of future we want to live in—conversations that should be democratic—and exploring means of political organization that would enable people to participate as citizens in building that future together.

With that, I think I will leave it and look forward to discussion.
