
IRFO - Intelligent Robots for Handling of Flexible Objects

Project Description

Industrial production today relies heavily on automated tasks performed by robotic assembly lines. This approach works well if the objects are fully specified industrial assembly parts that do not change in size, appearance, etc., as in bin picking of predefined parts. Robots can perform such sorting and picking tasks nearly “blind”, with limited visual capabilities. If, however, the objects originate from the natural world and vary considerably in size and appearance, this approach will fail.

Hence there is a need for increased visual sensing capabilities, resulting in intelligent, “seeing” robots that decide what to grasp using visual sensors and object learning.
Such naturally changing objects occur frequently when handling food or other flexible and deformable goods, like cloth or rubber, where traditional robot assembly is useless and production has to resort to manual labor. Southern Denmark and Schleswig-Holstein have a rather large food industry, which could become endangered if production costs rise. Here it would be advantageous to upgrade production with intelligent robotic assembly, which at the same time will upgrade the qualifications and required skills of the workers: the tedious manual work can be performed by the robots, while human skills are upgraded to control the robotic tasks.


Project Goals


The IRFO project will enable companies in the region to upgrade their production facilities and to retain production in their region.

Possible areas of use for intelligent 3D vision systems include evaluating the status of livestock to automatically regulate feed supply, and sorting of livestock. Both meat factories and slaughterhouses can use 3D vision systems when meat is cut and has to be handled and packed.

In this project we intend to develop a robot-vision platform that will enable companies to automate the handling of natural goods, livestock, and deformable objects. The platform contains a 3D sensor that simultaneously captures 3D shape and color in depth-video sequences, which allows modeling of the 3D shape and of time-dependent object deformation. The object models will be utilized either to evaluate livestock for growth, meat status, etc., or to handle flexible objects with the robot. Modeling the internal forces of the object makes it possible to predict its deformation over time, so that the robot can grasp the deforming object correctly and handle it.
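The idea of predicting deformation from modeled internal forces can be illustrated with a toy mass-spring system, a common choice for simulating deformable objects. This is an illustrative sketch under simplified assumptions (a 1D chain of point masses, explicit Euler integration), not the project's actual deformation model:

```python
import numpy as np

def step_mass_spring(pos, vel, rest, k, damping, dt):
    """One explicit-Euler step for a 1D chain of unit point masses joined
    by springs. Internal spring forces pull each pair of neighboring
    nodes toward their rest spacing; 'damping' scales the velocities."""
    force = np.zeros_like(pos)
    for i in range(len(pos) - 1):
        stretch = (pos[i + 1] - pos[i]) - rest   # deviation from rest length
        f = k * stretch                          # Hooke's law
        force[i] += f
        force[i + 1] -= f
    vel = (vel + dt * force) * damping
    return pos + dt * vel, vel
```

Stepping this model forward predicts how the chain relaxes toward its rest shape, which is the same principle (at much smaller scale) as predicting the deformation of a grasped flexible object.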
In addition to the direct benefits of the project for regional companies handling natural goods, there is the indirect benefit for high-technology companies that will take over the developed technology and market it towards specialized market segments. For each market segment, adapted technology must be developed to optimize the approach and to ensure proper support.

The project will develop the base technology and will continue to support the regional companies, while the high-tech companies will spread the technology across the different market segments.

For more information, see http://www.interreg-robot.eu.


IRFO Vision System


Demonstrator Schematic

Our task in the IRFO project is to design the vision system and to develop the image processing algorithms. The current version of the vision system consists of two parts:
the laser stage, which has a line-structured-light setup scanning the objects on the conveyor, and a Time-of-Flight stage, which tracks the objects while the robot interacts with them.

The Laser Stage:
This section is designed to create 3D models of the objects that pass by on the conveyor belt. The high-resolution models are used to initialize an 'undeformed' geometry of the object and to calculate the robot grasp. The data from the laser stage form the basis for further computations.
Example scan of a piece of artificial meat
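As an illustration of how a line-structured-light setup yields 3D data, here is a minimal triangulation sketch: the lateral shift of the laser line in the camera image encodes surface height, and stacking successive profiles along the conveyor produces a point cloud. The function names, the linear pixel-to-mm calibration, and the single-angle geometry are assumptions for illustration, not the project's actual calibration or pipeline:

```python
import numpy as np

def profile_from_laser_line(pixel_shifts, mm_per_pixel, laser_angle_deg):
    """Convert detected laser-line shifts (pixels, relative to the line's
    position on the empty conveyor) into surface heights in mm via simple
    triangulation: height = lateral shift / tan(laser angle)."""
    shift_mm = np.asarray(pixel_shifts, dtype=float) * mm_per_pixel
    return shift_mm / np.tan(np.radians(laser_angle_deg))

def scan_to_point_cloud(profiles, x_step_mm, y_mm_per_pixel):
    """Stack successive height profiles (one per conveyor step) into an
    (N, 3) cloud: x = conveyor travel, y = lateral position, z = height."""
    points = []
    for i, profile in enumerate(profiles):
        for j, z in enumerate(profile):
            points.append((i * x_step_mm, j * y_mm_per_pixel, z))
    return np.array(points)
```

With the laser sheet at 45° the relation is particularly simple (tan 45° = 1), so a 10-pixel shift at 0.5 mm/pixel corresponds to a 5 mm surface height.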

The Time-of-Flight Stage:
The purpose of the ToF section is to track the deformation of the object that occurs when the robot starts to interact with it. The observed deformation can be used to calculate material parameters, e.g. flexibility. These parameters are then used to simulate the behavior of the flexible object.


Here is an example of the Time-of-Flight stage applied to surface tracking of livestock (cattle). The IRFO software is used to track significant points on the cow's back, which allows body condition parameters to be calculated.
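One simple geometric quantity that can be derived from such tracked back points is the angle at the spine between the two hook bones; a flatter back (larger angle) indicates more fat cover, which is one common ingredient of automated body-condition scoring. This sketch is illustrative only and is not the IRFO scoring algorithm:

```python
import numpy as np

def back_angle_deg(hook_left, spine, hook_right):
    """Angle (degrees) at the spine point between the vectors to the two
    hook-bone points, computed from tracked 3D coordinates."""
    a = np.asarray(hook_left, dtype=float) - np.asarray(spine, dtype=float)
    b = np.asarray(hook_right, dtype=float) - np.asarray(spine, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
```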



Created by ischiller. Last modified: Thursday, 16 July 2015, 11:31:37 CEST by sandro.