Adaptive Automation
Using 3D data for the autonomous robot
Until recently, robots were "blind" command receivers that followed predefined, fixed paths. With 3D data, robots can adapt flexibly to the situation at hand and react to their surroundings. What was once a promise is becoming reality: the robot is turning into an autonomous coworker. The benefits: fast retooling times, support for a wide variety of workpieces, simple teach-in, and simplified part feeding at a consistently high degree of automation.
Every step in the process is considered, every eventuality ruled out. Thanks to automation, large quantities can be produced extremely efficiently, and a high degree of specialization improves efficiency further. However, this specialized but expensive equipment falls short when it comes to flexibility and rapid retooling: it is simply not cost-effective to produce a small batch of alternative parts, because every step in the process would have to be adapted. Small batches are therefore often laboriously manufactured by hand. While this may be flexible and save on equipment costs, it is a slow and unstable process.
Robots adapt according to the situation
The development of 3D cameras and 3D-capable software has opened up opportunities for industry to develop entirely new machine vision technologies. Thanks to 3D vision, tasks can now be solved that were impossible with 2D alone.
One robot safely and reliably removes unsorted, overlapping tube T-pieces directly from a small transport box. Another depalletizes large aluminum parts directly onto a conveyor belt: the delicate movements of its robust gripper find a firm hold on the first attempt, without the slightest collision with the workpiece. This despite the fact that parts on worn or dirty pallets are often skewed or leaning due to excess casting flash. For this kind of bin picking and correctly oriented part transfer, robotics has had to step up its game considerably.
The Freiburg-based systems integration company isys vision has developed a solution for this called "MIKADO Adaptive Robot Control" (ARC for short). It is a configurable robot controller with its own collision-free path planning. It uses its own inverse kinematics to calculate the joint angles of the robot arm for gripping positions and traverse paths. 3D information, such as the workpiece shape, its position and orientation, or a virtual image of the surroundings, serves as the input for these complex calculations. MIKADO ARC can control a large number of commercially available robots, making time-consuming robot programming unnecessary. Parts can be changed over quickly, so even small batches can be produced with this robot-assisted material handling.
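The inverse-kinematics idea can be illustrated with a minimal sketch. The closed-form solution below handles a planar two-link arm only; it is not MIKADO ARC's actual solver, and the link lengths and target coordinates are arbitrary illustration values.

```python
import math

def ik_two_link(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm.

    Returns joint angles (q1, q2) in radians (the 'elbow-up' branch),
    or None if the target point is outside the reachable workspace.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        return None  # target out of reach
    q2 = math.atan2(-math.sqrt(1.0 - c2 * c2), c2)   # elbow-up solution
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2

def fk_two_link(q1, q2, l1, l2):
    """Forward kinematics, used here only to verify the IK result."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y
```

A real six-axis solver must additionally resolve redundancy, joint limits, and collision constraints, which is exactly where a dedicated controller earns its keep.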
3D cameras capture the situation
The quality of the source data is crucial for optimum control of the robot. The integrator selects a suitable 3D camera technology for each project and application. This decision depends not only on the general suitability of a method, but also on cost, precision, speed, and the reliability of data acquisition.
Weighing the pros and cons of classical methods such as time-of-flight (ToF), stereo vision, or laser triangulation is only a first step in the selection process, because many of the 3D cameras used today are hybrid systems that combine characteristics of several methods to cover a broader range of uses and improve the results.
For bin picking and material handling, isys vision uses Ensenso 3D stereo vision cameras. These consist of two area scan cameras that work according to the stereo vision principle, combined with a powerful pattern projector to obtain robust 3D data even from workpieces with difficult surfaces. The compact cameras of the N series are particularly suitable for close range and are mostly mounted directly on the robot head as a mobile eye. The new 3D system of the X series, with its flexible baseline and a choice of cameras from IDS Imaging Development Systems GmbH, can capture large volumes from greater distances and is ideally suited for unsorted material handling from large wire-mesh pallets. Thanks to its 100 W LED, the projector generates the finest textures on the workpiece surface even at large working distances of up to 5 meters.
With a variable baseline and a 100 W texture projector, stereo vision cameras of the Ensenso X series achieve working distances of up to 5 meters, allowing objects with volumes of several cubic meters to be captured.
The system is therefore independent of ambient light and allows short exposure times. 3D resolutions of a few millimeters are possible even with just one or two image pairs. With short exposure times, a small number of images, and extremely fast stereo matching algorithms, the 3D data is ready for further processing after just 500 ms, enabling short cycle times in material handling.
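The relationship between baseline, focal length, and achievable depth resolution in stereo vision can be sketched with the ideal rectified-stereo model. The camera parameters below are illustrative assumptions, not Ensenso specifications: with a focal length of 2000 px, a 0.4 m baseline, and 0.1 px disparity precision, the model predicts roughly 3 mm depth uncertainty at 5 m, the same order of magnitude as the resolutions quoted above.

```python
def depth_from_disparity(d_px, f_px, baseline_m):
    """Ideal rectified stereo: depth Z = f * B / d.

    d_px: disparity in pixels, f_px: focal length in pixels,
    baseline_m: distance between the two cameras in meters.
    """
    return f_px * baseline_m / d_px

def depth_resolution(z_m, f_px, baseline_m, dd_px=0.1):
    """Depth uncertainty for a disparity error dd: dZ ~= Z^2 * dd / (f * B).

    Error grows with the square of the distance, which is why large
    working distances call for a longer baseline.
    """
    return z_m * z_m * dd_px / (f_px * baseline_m)
```

This quadratic growth of depth error with distance is the reason a variable baseline matters: moving the cameras further apart restores depth resolution at long range.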
The additional benefit of using two area scan cameras is obvious: besides 3D data acquisition via stereo vision, reference features of a scene can be extracted from the raw images of the area scan cameras and used to continuously readjust the vision system. The process results remain constant and robust, and recurring checks or time-consuming recalibration of the stereo vision system are no longer necessary.
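The readjustment idea can be sketched very simply. The function below is a hypothetical illustration, not the vendor's algorithm: it compares the observed 2D image position of a known reference feature against its expected position and, if the drift exceeds a tolerance, returns a translation correction to fold back into the calibration.

```python
import math

def recalibration_offset(expected_xy, measured_xy, tol_px=0.5):
    """Compare a known reference feature's measured image position with
    its expected position. Return a (dx, dy) correction in pixels if the
    drift exceeds tol_px, or None if the system is still within spec.

    A hypothetical sketch of drift compensation via scene references.
    """
    dx = measured_xy[0] - expected_xy[0]
    dy = measured_xy[1] - expected_xy[1]
    if math.hypot(dx, dy) <= tol_px:
        return None  # no correction needed
    return (-dx, -dy)  # shift that re-aligns with the reference
```

In practice such corrections would be estimated from several features and applied to the full camera-to-robot transform rather than as a plain 2D shift.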
With capabilities such as bin picking and correctly oriented part feeding, robotics in combination with MIKADO ARC and Ensenso 3D cameras closes the gap to adaptive automation. Even small batch production can be automated easily and cost-effectively.