MASTERCLASS
Robots and the future
Transcript
Peter: Robotic cars are coming. We've seen increasing levels of automation in regular passenger cars in recent times: systems like backing alarms, automatic parking, adaptive cruise control and so on. Google in particular has recently gained a really high profile in this space, but I know that most of the major auto manufacturers have projects looking at developing fully self-driving cars, and the number that gets bandied around a lot is that this technology is going to be on the roads by the year 2020.
Lots of people worry about the legal issues around this: if a self-driving car causes an accident and kills somebody, whose fault is that? But you've got to balance that, I guess, against the fact that human beings are terrible drivers. Every year, a million people are killed, and maybe a hundred million people are injured, on the roads by human drivers. So if we had robot cars, and they did cause some injuries, but perhaps a hundredth of the number we have now, what does ethics have to say about that? Is it a good thing or a bad thing that robot cars would be accidentally killing people? Not as many, but perhaps still some?
Doug: Being over 50, I always look to the past to answer questions about the future, so I'm thinking, all right, let's think about the design of cars in the past. You and I were raised at a time when there were no seatbelts. When we were kids, we'd all jump in the back seat; there were five of us, and we were back there like hamsters or rabbits. We had a 1964 Chrysler Saratoga. Cars weren't designed for safety at all. They were designed to look good…
Peter: And they did at that time.
Doug: Yeah, and they were wonderful. And they were designed to go fast. We had a 383 V8 and it just flew. The mindset in those days wasn't about efficiency, because fuel consumption wasn't an issue when oil was cheap. And safety? Cars weren't designed for safety. If you got into an accident in those days, you got scrambled up inside the car, because the design wasn't forgiving. Mercedes-Benz, Volvo, you know the story: car makers started to design around safety in the later '60s and into the '70s. It became a major issue. Engines were designed to drop down, there were seatbelts…
Peter: Airbags…
Doug: Airbags came along, and then the brakes. It all changed as safety became a criterion. Automobile accidents are one of the biggest killers in most first-world countries. They are a big killer, there's no doubt about it, in terms of violent death.
Peter: Yeah.
Doug: So the question is: ethically, are automobiles, even today, being manufactured with the strictest regard for safety, to preserve human life? Because ethics would argue that they should be, that you should be designing the car so that it's the safest car possible within the economic constraints. So if you argue from the past to the future and look at this idea of robotics leading to fewer deaths and making the roads safer, I would argue that using robotics is completely ethically founded. I have no problem with that at all.
Peter: So in a greater good sense, it’s fantastic.
Doug: In a teleological sense, absolutely, yeah.
Peter: But for the individual whose partner was killed by a robotic vehicle, they’re not going to be happy. But I guess they’re not going to be happy if they’re killed by a human-driven vehicle either.
Doug: Yeah, so again, what's the difference? You're right. It then becomes a case of law: who gets sued, and how? Does it fall back onto the car manufacturer to be sued in that case, rather than onto the fallible individual, who would lose their license or get a jail term for causing it?
Peter: So then it becomes a legal issue and no longer a moral issue?
Doug: That’s my belief, yes.
Peter: Okay, so the higher moral consideration here is that we could dramatically reduce the number of people killed on the roads, and that's a good thing.
Doug: Absolutely, Peter. I believe that. Yeah.
Is it acceptable that self-driving cars will cause accidents if they kill fewer people than human drivers? When an accident occurs is it a moral or a legal issue?
Discussion
It was on the motorway near Phoenix, Arizona, that I realised fully driverless cars might be quite a distant dream. And that was because our Google Waymo robo-taxi seemed incapable of leaving that motorway.
We were in Arizona to record a radio documentary for the BBC World Service about the progress towards creating autonomous vehicles that would make our roads safer and replace human drivers with robots.
Google leads this race at the moment and for the past six months has been offering a robo-taxi service, Waymo One, to a select few early adopters in and around the Phoenix suburb of Chandler.
Our first ride with Waymo took us through the quiet suburban streets, where traffic is sparse and drivers well mannered.
Here, the minivan, fitted out with a battery of sensors and high-definition cameras, performed very impressively, handling slightly tricky left turns, spotting other road users and slowing down as it passed a school.
Although a Google engineer sat behind the wheel, she never intervened, and soon we relaxed and forgot that we were effectively being driven by a robot.