Should robots be punished for their crimes?



If a robot commits a crime, should it be charged? And how do you punish a robot?

In 2017, the European Parliament received a draft report on robots from its Committee on Legal Affairs. With robotics progressing at a rapid rate, the report focuses on the concept of liability: who is at fault if a robot injures someone or damages property? According to the report, depending on its level of automation and autonomy, a robot may bear greater accountability than its creators.

Who is in charge? Who do we charge?

When it comes to assigning blame, the report suggests that if a robot can only perform the tasks it’s been programmed to do, the creators are liable for any damage caused as the robot is acting as a tool.

However, if a robot uses machine learning and artificial intelligence to adapt to its environment, the robot itself would be at fault. This is helpful in determining who to blame, but it raises the question: how do we punish a robot that is found guilty?


Robot Punishment

The report outlines a few suggestions for how we might make a robot pay for its crimes.

Firstly, it is suggested that robots should be classified according to their sophistication, and upon reaching a certain point, they must be registered with the European Union.

Secondly, the report suggests a compulsory insurance scheme, under which manufacturers must insure the robots they make.

A slightly more abstract suggestion is paying robots ‘wages’. No, this is not so robots can save up for a nice holiday – the wages would go into a compensation fund to draw on if the robot is found liable for damages down the road.

Do we treat robots as humans?

The draft report even toys with the idea of granting robots human rights. Placing robots in the same category as humans is controversial, but the report insists this would serve the interests of humans, not robots.

For instance, what happens if robots and automation start replacing more jobs than they create? Systems like welfare and government benefits depend on employment taxes, and they can become underfunded if there are not enough people in the workforce. By classifying robots as people in an employment setting, business owners who use robots in automated roles would be obligated to pay taxes for those robots, as if they were actual human employees.


The start of a significant discussion

While the report’s ideas have not yet been reflected in law, it shows that government bodies are starting to take the issues of robots, artificial intelligence and ethics seriously. The report has since been frequently cited in the ongoing discussion of the ethical rights of robots, with lively debate continuing in 2019.

Are you interested in challenging the modern legal and political environment? A Bachelor of Business in Business Laws from Murdoch University tackles real-world issues and prepares you for the workforce of the future.

You can opt for the following Double Majors at Murdoch University: