More than half a century ago, science-fiction writers predicted that robots would interact and work closely with humans in everyday environments. Isaac Asimov formulated rules that robots should obey in order to coexist safely with humans. When the first robots appeared on the market, they broke all of them.


Asimov’s Three Laws of Robotics:
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Today, thanks to progress in control, design, and computing performance, we are starting to build robots that can cooperate with humans. In the past, robots always worked behind fences, so nobody investigated what would happen if a robot arm hit a worker. The presence of humans in the working area was strictly forbidden while the robot was operating.

Robot safety

Today, although we are able to build cooperative robots, we still lack standards that tell us how dangerous robots really are. ISO 10218-1:2006 requires cooperative robots to respect hard limits on maximum velocity (250 mm/s) and static force (150 N).
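A minimal sketch of how a controller could check the hard limits named above. Only the two numeric limits come from ISO 10218-1:2006 as quoted in the text; the function and the example readings are hypothetical, for illustration.

```python
# Hard limits for collaborative operation per ISO 10218-1:2006
# (as quoted in the text). The check itself is a simplified sketch;
# a real safety controller is far more involved.
MAX_TCP_VELOCITY_MM_S = 250.0  # maximum tool-centre-point velocity
MAX_STATIC_FORCE_N = 150.0     # maximum static force

def within_collaborative_limits(tcp_velocity_mm_s: float,
                                static_force_n: float) -> bool:
    """Return True if both measured values respect the hard limits."""
    return (tcp_velocity_mm_s <= MAX_TCP_VELOCITY_MM_S
            and static_force_n <= MAX_STATIC_FORCE_N)

# Example (made-up) sensor readings:
print(within_collaborative_limits(200.0, 100.0))  # True  - within limits
print(within_collaborative_limits(400.0, 100.0))  # False - velocity too high
```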

To evaluate the danger posed by cooperative robots, scientists decided to use criteria from EuroNCAP car crash tests. Interestingly, the first results show that even being hit by a high-payload robot at high velocity is not dangerous, as long as the person is not constrained and the body can move away from the impact.

The situation gets worse when the person is standing against a wall or cell equipment. A robot has large inertia and therefore cannot stop immediately. Heavy-duty robots weigh over a thousand kilograms; in a clamping situation they would simply crush a human body.
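To get a feel for the inertia involved, here is a back-of-the-envelope estimate. The mass, speed, and braking deceleration are illustrative assumptions (not figures from the text), and treating the whole robot mass as moving at one speed is a deliberate simplification.

```python
# Illustrative estimate of why a heavy robot cannot stop instantly.
# All numbers below are assumptions chosen for the example.
mass_kg = 1000.0          # heavy-duty robots can exceed 1000 kg
velocity_m_s = 2.0        # assumed effective speed of the moving mass
deceleration_m_s2 = 10.0  # assumed emergency-brake deceleration

# Kinetic energy: E = 1/2 * m * v^2
kinetic_energy_j = 0.5 * mass_kg * velocity_m_s ** 2

# Stopping distance under uniform deceleration: s = v^2 / (2 * a)
stopping_distance_m = velocity_m_s ** 2 / (2 * deceleration_m_s2)

print(f"Kinetic energy: {kinetic_energy_j:.0f} J")        # 2000 J
print(f"Stopping distance: {stopping_distance_m:.2f} m")  # 0.20 m
```

Even with hard braking, the robot travels a sizeable distance after the stop command, which is why a person clamped between the robot and a wall is at far greater risk than one who is free to move.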