The Robot Explosion on Simulation Series


I was just interviewed on a multidisciplinary show called Simulation, which features the greatest minds of our time. The host, Allen Saakyan, and I covered a lot of territory: how robotics can help solve the grand challenges, the chances of rogue robots and AI wiping out society, how we can design morality and ethics into robots – but not by encoding them! – and what the real limits of AI and robotics are right now.

Here are the links to our interview:
Posts on LinkedIn | Twitter
Please share!
Best,
Andra


What is good robot design?

Let’s stop talking about bad robots and start talking about what makes a robot good. A good or ethical robot must be carefully designed. Good robot design is about much more than the physical robot, and at the same time it is about ‘less’. Less means no extra features, and in robotics that includes not adding unnecessary interactions. It may sound like a joke, but humanoids are not always the best robots.
‘Less’ is the closing principle of the “10 principles of good design” from world-famous industrial designer Dieter Rams – “good design is as little design as possible” – and design thinking has informed the discussion around guidelines for good robot design, as ethicists, philosophers, lawyers, designers and roboticists try to proactively create the best possible robots for the 21st century.
Silicon Valley Robotics has launched a Good Robot Design Council and our “5 Laws of Robotics” are:

  • Robots should not be designed as weapons.
  • Robots should comply with existing law, including privacy.
  • Robots are products: as such, they should be safe, reliable and should not misrepresent their capabilities.
  • Robots are manufactured artifacts: the illusion of emotions and agency should not be used to exploit vulnerable users.
  • It should be possible to find out who is responsible for any robot.

These have been adapted from the EPSRC’s 2010 “Principles of Robotics”, and we thank all the researchers and practitioners who are informing this ongoing discussion.
Silicon Valley is at the epicenter of the emerging service robotics industry: robots that are no longer just factory workers, but that will interact with us in many ways, at home, at work, even on holiday.
In 2015, we produced our first Service Robotics Case Studies featuring robotics companies: Fetch Robotics, Fellow Robots, Adept and Savioke. We will shortly release our second report featuring Catalia Health, Cleverpet, RobotLab and Simbe.
Design guidelines can not only create delightful products but can also fill the ethical gap between standards and laws.
After all, if our robots behave badly, we have only ourselves to blame.

Blended Reality With Robots


We are already living with robots. The future is here but, as William Gibson says, it’s not evenly distributed yet. Or as I like to say, we often don’t recognize the future when we see it.

How do we recognize robots? We usually look for humanoid robots, the stuff of science fiction. Even the classic robot ‘arm’ is part of a ‘human’. But technically, a robot is simply a machine that ‘senses, thinks and acts’. Even the ISO definition for industrial robots – the international standard covering industrial robot arms – is somewhat broad: a robot is “an actuated mechanism, programmable in two or more axes, with a degree of autonomy, moving within its environment to perform intended tasks.”
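
In code, the ‘sense, think, act’ loop is almost embarrassingly simple. Here is a minimal sketch; every name in it is an illustrative placeholder rather than any real robot’s API:

```python
import random

class Robot:
    """A minimal 'sense, think, act' control loop.

    All names here are illustrative placeholders, not a real robot API.
    """

    def sense(self) -> float:
        # Stand-in for a real sensor read: distance to an obstacle, in meters.
        return random.uniform(0.0, 2.0)

    def think(self, distance: float) -> str:
        # A trivial 'policy': stop when something is close, otherwise move on.
        return "stop" if distance < 0.5 else "forward"

    def act(self, command: str) -> None:
        # Stand-in for sending a command to the motors.
        print(f"actuating: {command}")

if __name__ == "__main__":
    robot = Robot()
    for _ in range(5):  # one control cycle per iteration
        reading = robot.sense()
        robot.act(robot.think(reading))
```

By this definition, anything running some version of this loop starts to qualify.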

Is a car a robot? Yes. Even without full autonomy, a car contains many autonomous systems. Elon Musk called the Tesla Model S ‘a computer in the shape of a car’. But really, it’s a robot.

Is a washing machine a robot? Visually, we would find it hard to think of it as a robot. All the ‘humanoid’ bits are hidden inside a box. Yet the modern washing machine, soon to be a washing-drying-folding machine, is a very sophisticated piece of machinery, sensing, thinking and acting in the environment.

Because we see the world through human eyes, it is very hard for us to see things outside human categories. We divide the world into humans and things. Robots change everything. As we build robots, we are really reshaping what it means to be human. What does it mean when our devices start to look like humans? What does it mean when they don’t? And what does it mean when we use so many different devices to communicate with other people?


Our technologically blended reality is asynchronous, mediated and indirect. Our technologies allow us to communicate across distance and time, and expand our scale, creating a larger, richer world. This is nothing new. Civilization is the story of technology taming space and time.

Since we invented writing, we’ve been able to communicate with other people at a distance, at different times and at larger scale than direct communication. And as we invented reproduction technologies, like the printing press and photography, the scale of our communications increased. This has had a huge impact on the world, reshaping our cultural, religious and political structures.

The last 200 years have seen the introduction of many new communication technologies: telegraph, telephone, radio and television. But they have all had one thing in common: until very recently, we’ve been able to see who is ‘pulling the strings’. The subject or object of communication has been visible or known.

In the last decade, we’ve seen an explosion of information and communication technologies and we’ve gone wireless and unplugged. Internet technologies in the 80s and 90s were supposed to usher in an era of anonymity, but in reality they largely just increased the scale of known communications. And our connections to the devices of communication were much more obvious.


As social beings, our reality is very much defined by our communication technologies. These days, even when we are in the same physical place as other people, we are no longer sharing the same reality. We are experiencing different worlds, as if we were in our own reality bubble.

And even when we are communicating, we are no longer certain to be communicating with other people. Ray Kurzweil predicts that in the future we will mainly communicate with machines and not other people. We will experience this technologically blended reality as an extension of ourselves, as a proxy for other people, and as something with its own ‘alien’ identity.


Sharp’s new RoboHon phone, created by Tomotaka Takahashi, is the epitome of a blended reality device. It acts as an extension of ourselves. It provides a proxy for other people, and it has its own very distinct identity. Our categories of ‘you’, ‘me’ and ‘it’ are more fluid than we think. Robots are blending the me and the you into the it.

Heidegger was one of the first to describe technology as an invisible extension of our identity. Heidegger’s hammer is ‘present-at-hand’ when we look at it and think about it. But when the hammer is in the hand of a builder, it becomes invisible. The hammer is ‘ready-to-hand’ when the builder thinks of building, not of hammering or the hammer. The tool is well known and the focus is on the task instead. The hammer becomes an extension of our identity, an expression of our intent in the world.

Our technological extensions also augment our senses. A woman with a feather in her hat, as described by Merleau-Ponty, has enhanced her spatial awareness: she has an increased sense of the whereabouts of walls and doorways. Just like the whiskers on a cat, we are augmenting our world with technological whiskers.

Similarly, as technology acts as an extension of others, or a proxy, it also becomes invisible to us. Telepresence robots offer an illusion of real presence and become transparent as technologies. Our focus shifts from the tool to the task. In this case the task is the social interaction. Suitable Technologies even prefers that we don’t call their telepresence devices robots, because they want our focus to be on the experience, not the device.

Robots are becoming popular, and as more of them enter our world, they bring their very own personalities and appearances. But any device with a screen, or speakers and connectivity, is capable of being a gateway for many other people. We can have relationships that are indirect, asynchronous and at scale. Our relationships can be with you, me and it, and many mixtures in between.

We are going to see more and more social robots in the service industry, including health, manufacturing and logistics, and on the consumer end, including the home, retail and hospitality. And we are just starting to understand the scope of this blended technological reality with robots.


People enjoy meeting Savioke’s Relay, the robot butler now at four hotel chains in California. You can communicate with Relay, although the robot behaves more like R2-D2 than C-3PO. Relay is functional too: it is designed to deliver small items to guest rooms when the front desk staff are busy.

After collecting a lot of feedback, Savioke found that as well as enjoying their communication with Relay, guests also appreciate not having to communicate with a person at a time when they are not feeling social, i.e. late at night. The robot starts to become an extension of their wishes, but still has just enough personality to improve the experience.


A robot like Mabu from Catalia Health acts as a proxy for a doctor or primary care physician. Mabu stays in the home of patients on a specialty pharma treatment, where Mabu’s AI engages the patient directly in conversation and only the data is communicated to the doctor. And while Mabu the robot may sit at home, Mabu the app can travel with the patient anywhere.

And Fellow Robots’ OSHBot really mixes all our relationships up. OSHBot can act as a simple extension: when you enter the hardware store, you can ask the robot for directions and then simply follow the map, or the robot can autonomously guide you to the correct location inside the store. You can engage the robot in conversation about the parts you’re looking for.


Robots are great at remembering tens of thousands of SKUs and where on the shelves they all are. But people are really great at problem solving and understanding complex communications. So if you ask a question like “What sort of glue should I use on a roof tile like…”, then OSHBot can call in an expert for a video call with you. So you can be talking to both the robot and another person.
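
To make that division of labor concrete, here is a toy sketch of the lookup-or-escalate pattern; the store data and the escalate_to_expert call are invented for illustration, not OSHBot’s actual software:

```python
# Toy sketch of a retail robot's lookup-or-escalate pattern.
# All data and function names are hypothetical.

SHELF_LOCATIONS = {
    "clay roof tile": "Aisle 31, Bay 1",
    "2 oz epoxy glue": "Aisle 12, Bay 3",
}

def escalate_to_expert(question: str) -> str:
    # Stand-in for opening a video call with a human store associate.
    return f"Connecting you to an expert about: {question!r}"

def answer(question: str) -> str:
    # Exact, structured lookups are where the robot excels.
    if question in SHELF_LOCATIONS:
        return f"You can find that in {SHELF_LOCATIONS[question]}."
    # Open-ended problem solving is routed to a person.
    return escalate_to_expert(question)

print(answer("clay roof tile"))
print(answer("What sort of glue should I use on a roof tile like this?"))
```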

For the customer, this is just a great shopping experience. But it could change the nature of daily work for store associates, leaving them free to focus on the problems they enjoy solving with their social and expert knowledge, rather than walking miles of aisles tracking thousands of small items.

So robots are really augmenting our reality in a multitude of ways. Robots are the embodiment of information. And in our new blended reality, they extend and augment our senses, they are proxies or avatars for others, and they also have their very own alien identity.

As a child, I wanted to be an astronaut, to explore the universe and to meet aliens, but it turns out the aliens are here, and they can teach us a lot about what it is to be human. In research areas from neuroscience, to biomechanics and psychology, we’re using robots to better understand humans.


Ex Machina: What happens when the Turing Test meets the Bechdel Test


Alex Garland’s first feature film as a director, Ex Machina, had its US debut at SXSW on March 14. This stylish idea film explores the Turing Test in a very Pinteresque fashion as a young coder falls in love with an advanced AI. Ex Machina is beautifully framed, but Garland’s stark script succeeds on the strength of the acting from Domhnall Gleeson, Alicia Vikander and Oscar Isaac.


Garland’s writing career launched in 1997 with the best-selling novel “The Beach”, which The Times called the Gen X answer to Lord of the Flies. After a string of cult successes like 28 Days Later, Sunshine, Dredd and drafts of Halo and Logan’s Run, Garland became fascinated with the emerging promise and perils of AI. In the Q&A following the SXSW screening, Garland talked about feeling a zeitgeist, a technological and cultural turning point, compelling him and other filmmakers and writers to address robots and artificial intelligence.

Although he says he’s on the side of the robots, it’s an uneasy truce. Garland describes his film as the story of ‘two brains torturing each other’. That’s true. In Ex Machina, Tony Stark meets John Searle in gripping drawing-room theater, as a billionaire tech genius recruits a young coder to administer the Turing Test to his secret, advanced embodied AI.

And it’s a stark film: there are only four characters, two men and two women; two AIs and two humans. And which two are the brains? That is supposed to be uncertain, but anyone who has used the Bechdel Test to analyze films or popular culture for gender issues knows exactly where the ‘brains’ are.

The Bechdel Test started as a gender litmus test and has become a remarkably useful indicator of power imbalance. The test is named after Alison Bechdel, the cartoonist who outlined the rules in a 1985 comic strip. To pass the Bechdel Test, a film has to have two women in it, who talk to each other, about something other than a man.

Sometimes the proviso is added that the women have to have names, because some films have many women characters, but if the characters are all “girl at checkout” and “girl with gun”, then they are just devices to add color or move the action forward. And of course, possession of a name is an important indicator of personhood, or identity awareness, so it’s always one of the first steps in separating the beings from the machines.
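
The rules are so crisp that they fit in a few lines of code. Here is a toy sketch of the test, with the ‘named characters’ proviso; the Conversation structure is invented for the example, and a real script would obviously need far more careful parsing:

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    speakers: list[str]  # named women taking part
    topic: str           # what the conversation is about

def passes_bechdel(conversations: list[Conversation]) -> bool:
    # Pass if any conversation has two or more named women
    # talking about something other than a man.
    return any(
        len(c.speakers) >= 2 and c.topic != "a man"
        for c in conversations
    )

# Alien passes: Ripley and Lambert talk about the creature.
print(passes_bechdel([Conversation(["Ripley", "Lambert"], "the alien")]))  # True
print(passes_bechdel([Conversation(["girl with gun"], "the hero")]))       # False
```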

Many films seem at first glance to have badass female characters, but when put to the Bechdel Test, it becomes clear that they never talk to anyone but the main man, or if they talk to each other, it’s about the main male characters. So really, they have no interiority, no self-awareness, and would probably fail a Turing Test. That’s where I think it would be very interesting if the Turing Test were to meet the Bechdel Test more often.

Garland is also playing games with gender and the alienness of AI in Ex Machina. There is a beautiful scene where Ava, the AI, performs a reverse striptease, putting on her human body.

But I’m afraid that Ex Machina falls at the final fence, as does just about every other science fiction film I’ve ever seen, aside from Alien. The Bechdel Test is useful for more than examining gender representation. It can be our Turing Test for creating believable alien or artificial life forms. If you look at our filmic or cultural representations of the other or alien, then you have to be struck by the singular nature of them all. From Frankenstein to Big Hero 6, do they have any reality without the central human characters?

No, they are alone. Even Alien is alone. At least in Frankenstein, it is the utter aloneness of the new form that is the whole story. Films that have pushed the envelope are few. And doing a quick mental check, I was left feeling empathy for the ‘others’ in only a couple, like Westworld, Blade Runner and Planet of the Apes, and in the books of writers like Brin and Cherryh.

How believable are our ‘other’ AIs and robots? Brad Templeton said that an autonomous vehicle isn’t autonomous until we tell it to go to the office and it decides to go to the beach instead. A life outside of our anthropomorphic story is what’s missing from our AIs, aliens and others. We don’t really care about them or their lives outside of their impact on our own. And this makes us poorer.

The final shot is a haunting homage to Plato’s Cave, although Garland credits his director of photography entirely for it. In The Republic, Plato posed the question: what if humans were born chained to face a cave wall, seeing the world only as the shadows passing in front of a fire behind them at the mouth of the cave? Imagine the difference when you see the world, unchained from the cave.

I can’t say more. Go see Ex Machina. And use the Bechdel Test on everything.