Robot-assisted system with Ensenso 3D camera for safe handling of nuclear waste

The decommissioning of nuclear facilities poses major challenges for operators. Whether facilities are dismantled or safely contained, the amount of nuclear waste to be disposed of is growing at an overwhelming rate worldwide. Automation is increasingly required to handle this waste, but for safety reasons the nuclear industry is reluctant to adopt fully autonomous robotic control, and remote-controlled industrial robots are preferred in hazardous environments. However, complex tasks such as remotely gripping or cutting unknown objects using joysticks and video surveillance cameras are difficult to control, and sometimes even impossible.

Application

Anyone who has ever tried a fairground grab machine can confirm it: manual control of gripper arms is anything but trivial. As harmless as it is to fail when trying to grab a stuffed bunny, failed attempts can be dramatic when handling radioactive waste. To avoid damage with serious consequences for humans and the environment, the robot must be able to detect the radioactive objects in the scene extremely accurately and act with precision. The operator literally has it in his hands: it is up to him to identify the correct gripping positions.

Extreme Robotics Lab’s 3D vision-guided semi-autonomous robotic cutting of metallic object in radioactive environment.

With the help of the software, the Ensenso 3D camera takes over the perception and evaluation of the depth information for the operator, whose cognitive load is considerably reduced as a result. The assistance system combines the haptic features of the object to be gripped with a special gripping algorithm. "The scene cloud is used by our system to automatically generate several stable gripping positions. Since the point clouds captured by the 3D camera are high-resolution and dense, it is possible to generate very precise gripping positions for each object in the scene. Based on this, our 'hypothesis ranking algorithm' determines the next object to pick up, based on the robot's current position," explains Dr Naresh Marturi, Senior Research Scientist at the National Centre for Nuclear Robotics.
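The lab's actual grasp-generation and ranking code is not published, so the following Python sketch is only a minimal illustration of the general idea: it pairs points whose surface normals roughly oppose each other into parallel-jaw grasp candidates and ranks them by distance to the robot's current tool position. The function names, jaw width and ranking criterion are all assumptions, not the lab's implementation.

```python
# Minimal sketch only (assumed names and parameters): pair points whose
# surface normals roughly oppose each other into parallel-jaw grasp
# candidates, then rank the candidates by distance to the robot's tool.
import numpy as np

GRIPPER_MAX_WIDTH = 0.08  # assumed maximum jaw opening in metres

def antipodal_candidates(points, normals, angle_tol_deg=15.0, n_samples=2000):
    """points, normals: (N, 3) arrays; normals assumed unit length and
    outward-facing. Returns a list of (centre, axis, width) candidates."""
    rng = np.random.default_rng(0)
    # Normals of a valid pair must be within angle_tol_deg of anti-parallel.
    cos_tol = np.cos(np.radians(180.0 - angle_tol_deg))
    grasps = []
    for _ in range(n_samples):
        i, j = rng.integers(0, len(points), size=2)
        axis = points[j] - points[i]
        width = np.linalg.norm(axis)
        if width < 1e-6 or width > GRIPPER_MAX_WIDTH:
            continue  # degenerate pair, or object too wide for the jaws
        axis = axis / width
        if (np.dot(normals[i], normals[j]) < cos_tol
                and np.dot(normals[i], -axis) > 0.9
                and np.dot(normals[j], axis) > 0.9):
            grasps.append((0.5 * (points[i] + points[j]), axis, width))
    return grasps

def rank_by_reachability(grasps, tool_position):
    """One plausible ranking criterion: attempt the grasp whose centre
    is closest to the robot's current tool position first."""
    return sorted(grasps, key=lambda g: np.linalg.norm(g[0] - tool_position))
```

A dense, high-resolution cloud matters here: the more surface points with reliable normals, the more candidate contact pairs pass the antipodal test, and the more precisely the resulting grasp poses fit each object.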

Graphic: (c) Extreme Robotic Lab

The principle is similar to that of the skill game Mikado: one stick must be picked up at a time without moving any of the others. The computed path guidance enables the robot to move smoothly and steadily along a desired path to the target gripping position. Like a navigation system, the assistance supports the operator in guiding the robot arm to the safe grasp, if necessary also past other unknown and dangerous objects. For this purpose the system calculates a safe corridor and uses haptic feedback to help the operator stay inside it.
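The article does not describe how the corridor and its haptic feedback are computed. One common way to realise such a virtual guide, sketched below under assumed names, gains and radius, is a tube around a waypoint path: inside the tube the hand moves freely, and once the tool leaves it, a spring-like force pushes the operator back towards the path.

```python
# Hedged sketch (assumed names and gains): a "safe corridor" modelled as
# a tube of fixed radius around a waypoint path; outside the tube, a
# spring-like force pushes the operator's hand back towards the path.
import numpy as np

def closest_point_on_segment(p, a, b):
    """Orthogonal projection of p onto segment a-b, clamped to the ends."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

def corridor_force(tool_pos, waypoints, radius=0.05, stiffness=200.0):
    """Force (N) rendered on the haptic device: zero inside the safe
    corridor, restoring spring force once the tool leaves it."""
    nearest = min(
        (closest_point_on_segment(tool_pos, a, b)
         for a, b in zip(waypoints[:-1], waypoints[1:])),
        key=lambda q: np.linalg.norm(tool_pos - q),
    )
    error = tool_pos - nearest
    dist = np.linalg.norm(error)
    if dist <= radius:
        return np.zeros(3)  # free motion inside the corridor
    return -stiffness * (dist - radius) * (error / dist)
```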

The system maps the operator's natural hand movements exactly and reliably in real time to the corresponding movements of the robot. The operator thus always retains manual control and is able to take over in the event of a component failure: by switching off the “force feedback mode”, he can simply turn off the AI assistance and fall back on human intelligence. In accordance with the principle of shared control between human and machine, the system thus remains under control at all times - essential in an environment with the highest level of danger.
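As a rough illustration of this shared-control principle (the actual control law is not published; the blending factor, scaling and function names below are assumptions), the operator's hand velocity could be combined with an AI assistance term along these lines, with the assistance dropping out entirely when the mode is switched off:

```python
# Assumed structure, not the lab's code: the operator's hand velocity
# always drives the robot; when "force feedback mode" is on, an AI
# assistance term is blended in, and switching it off yields pure
# manual teleoperation.
import numpy as np

def teleop_command(hand_vel, assist_vel, force_feedback_on,
                   motion_scale=1.0, alpha=0.4):
    """Blend human input with AI assistance; the human term is never
    zeroed out, so the operator retains authority at all times."""
    human = motion_scale * np.asarray(hand_vel, dtype=float)
    if not force_feedback_on:
        return human  # AI off: direct human control
    return (1.0 - alpha) * human + alpha * np.asarray(assist_vel, dtype=float)
```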

Dr Naresh Marturi, Senior Research Scientist in Robotics - Maxime Adjigble, Robotics Research Engineer

Outlook

Researchers at the Extreme Robotics Lab in Birmingham are currently developing an extension of the method that allows a multi-fingered hand to be used instead of a parallel jaw gripper. This should increase flexibility and reliability when gripping complex objects. In future, the operator will also be able to feel the forces to which the fingers of the remote-controlled robot are exposed when gripping an object. Fully autonomous gripping methods are also being developed, in which the robot arm is controlled by an AI and guided by an automatic vision system. The team is also working on visualization tools to improve human-robot collaboration when controlling remote robots via such a "shared control" system.

This is a promising approach for the safety and health of all of us: the handling of hazardous objects such as nuclear waste is ultimately a matter that concerns us all. By reliably capturing the relevant object information, Ensenso 3D cameras are making an important contribution to this increasingly urgent global task.

Ensenso N35 - 3D vision, fast and precise
