Robots And Morality: Article Analysis


It is not possible for machines to learn morality because a robot does not know right from wrong. A robot programmed to hurt people will not stop until it is told to stop; it will not see on its own that the human has had enough. Robots do not have hearts like ours, and they do not have brains, so what makes people think they can develop morality when they cannot even develop feelings?

A robot that is programmed for one thing will not be able to have or develop morality because it does not know right from wrong. The article states, "...school bus carrying 40 innocent children crosses its path. Should your car swerve, possibly risking the life of its owner (you)?" This evidence shows that a robot will not be able to make the right decision for us humans. For example, if a robot points a gun at an innocent person whom it mistakes for a criminal, it has no way of knowing it is wrong. The robot will not believe it has made a mistake, because it is programmed to be right at all times; but if it faces a choice between a wrong decision and a right one, it will simply be clueless.

The article also states, "Would a drone be able to learn not to fire on a house if it knew innocent civilians were also in..." This evidence points to a scenario like this: suppose criminals hold hostages in a building and threaten to kill them if they do not get what they want. How would the robot know there are civilians inside? It is just going to do what it is programmed to do; it is not going to think about it. If the robot makes a mistake, it will simply carry on, believing it did what it was programmed to do; it will never know whether the decision it made was right or wrong.
