Robots and an Older Population – Discussion
Transcript
Peter: One of my favorite robot movies of recent times is called "Robot and Frank". It's about the relationship between an elderly man in the early stages of dementia - he used to be a cat burglar - and the robot who looks after him. Frank is a fairly curmudgeonly character, and his son finds it draining and exhausting to go and visit him and be abused. So he buys his father a robot, because the robot will look after him. It's less bother for the son, Hunter, who has effectively delegated the care of his father to a machine because it's less painful than visiting his father himself.
So you could perhaps argue that the father, Frank, is going to be better looked after with the robot. But do you think Hunter is doing the right thing here, buying his father a robot and walking away?
Doug: If we look at deontology - 'deon' means duty - key to deontology are the sets of duties required to meet the criteria I laid out earlier: autonomy, non-maleficence, right? So in this case we're talking about autonomy and the right of the individual to live a full and considered life. The father here is being looked after by a machine, and the son is abrogating his duty to his father, so one could say that's unethical. But to what degree? I think that having a machine look after somebody is fine, as long as it fulfills the autonomy criterion for the father. As long as he is autonomous and can make decisions independently of the robot - not a problem. The robot is serving his needs and increasing his autonomy. Looking after an individual in this way is really a paternalistic move, right? But at the point that the individual's autonomy is impacted - in other words, they can't make decisions, or the decisions they make are affected by the machine - then I'd argue it's unethical. And I'd also argue that the duty of care is being abrogated by the son.
Peter: Even if the machine would do a better job of care than the son would himself?
Doug: But that does not absolve the individual of the duty.
Peter: Okay.
Doug: You're with me?
Peter: Yup.
Doug: Correct, I agree with you on that. But the duty of care to a parent is a fundamental ethical quandary we all have to face as our parents get older.
Peter: Absolutely.
Doug: But ethics and deontology would argue that you have this duty and you fulfill it.
Peter: So what about the... let's go down a generation instead of up a generation. We have a lot of discussion in this country about the cost of preschool and childcare, and the fact that we can't get enough people to do that work. So what does ethics say about having machines - robots - looking after kindergarten-age children or younger? They're well fed. They're looked after. Their nappies and diapers are changed. Maybe they're even played with, perhaps more enthusiastically than by a human.
Doug: Yeah, I'm with you. But I think we use the same argument in terms of duty - in this case the parental duty, and what that entails. So again, if a robot supplements that parental duty, good. But if it replaces it - not good. So it becomes a grey area that you have to negotiate, because if, at the end of the day, a robot makes our children safer - well, I'm for that. You know, if it doesn't let them get onto the street and that sort of thing, I'd argue that's great. But a robot can't replace the parent's duty of care. It can't do it. At least not now. There's still that fundamental duty the individual has to their children.
Peter: And to their parents.
Doug: And to their parents, yeah. And you can't escape that.
Peter: So where is that written down, the duty to your children and your parents?
Doug: Oh, it's around.
At what point does it become unethical to replace a human carer with a robot carer for an elderly parent or a child?