Constraint-based motion specification application using two robots


The ACM research group of the K.U.Leuven used the Orocos Real-Time Toolkit as the framework for an involved robotics application, together with the Orocos Bayesian Filtering Library and the Kinematics and Dynamics Library, as detailed in this paper.
The experiment consists of a complex task—“human-aware task execution”—involving two robot arms, five PCs interconnected by Ethernet, and half a dozen or so sensors. One robot moves an object, while the other robot’s tool performs a force-controlled “operation” on the boundary of that object. A Sick laser scanner observes the environment of the robots and detects moving persons within a given range. A camera mounted on the tool “looks ahead” of the contact point, providing a quadratic estimate of the contour in front of the tool, which makes it possible to give appropriate geometric feedforward to the force feedback control. The twelve degrees of freedom of both robots together provide the following flexibility in the task execution: (i) both robots can be kept away from their kinematic singularities; (ii) as soon as a person comes within a specified distance of the robots, they move the object away from that person; (iii) a camera mounted on one of the robots is aimed at the closest human, giving the humans visual feedback that the robot controller is aware of them.
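The human-aware behaviors (ii) and (iii) above can be illustrated with a minimal sketch. This is not the authors' code: the threshold value, function names, and the 2D planar simplification are all illustrative assumptions. It shows how laser-scanner person detections could drive both a retreat direction for the carried object and a pan angle for the camera tracking the closest person.

```python
# Hedged illustration of the human-aware behaviors; all names and the
# SAFE_DISTANCE value are assumptions, not the experiment's actual code.
import math

SAFE_DISTANCE = 1.5  # meters; illustrative activation threshold


def closest_person(robot_xy, persons_xy):
    """Return (distance, position) of the nearest detected person, or None."""
    best = None
    for p in persons_xy:
        d = math.hypot(p[0] - robot_xy[0], p[1] - robot_xy[1])
        if best is None or d < best[0]:
            best = (d, p)
    return best


def retreat_direction(robot_xy, persons_xy):
    """Unit vector pointing away from the closest person, or None when every
    detected person is farther than SAFE_DISTANCE (constraint inactive)."""
    hit = closest_person(robot_xy, persons_xy)
    if hit is None or hit[0] >= SAFE_DISTANCE:
        return None
    d, (px, py) = hit
    return ((robot_xy[0] - px) / d, (robot_xy[1] - py) / d)


def camera_pan_angle(camera_xy, persons_xy):
    """Pan angle (radians) that aims the camera at the closest person."""
    hit = closest_person(camera_xy, persons_xy)
    if hit is None:
        return 0.0
    _, (px, py) = hit
    return math.atan2(py - camera_xy[1], px - camera_xy[0])
```

In the real system these outputs would enter the constraint-based controller as additional task constraints rather than being applied directly.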
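The quadratic contour estimate can likewise be sketched. Assuming (hypothetically) that the tool camera yields contour sample points in a tool-aligned frame, a least-squares fit of y = a·x² + b·x + c gives the slope of the contour a short distance ahead of the contact point, which is the kind of geometric feedforward the force controller can use. The helper names are illustrative.

```python
# Hedged sketch: fit y = a*x^2 + b*x + c to contour points sampled ahead of
# the tool, then evaluate the tangent slope at a look-ahead distance.
# Pure-Python normal equations with a 3x3 Gaussian-elimination solve.

def solve3(A, b):
    """Solve a 3x3 linear system via Gaussian elimination with pivoting."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in range(2, -1, -1):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x


def fit_quadratic(points):
    """Least-squares coefficients (a, b, c) of y = a x^2 + b x + c."""
    # Normal equations A^T A p = A^T y with design rows [x^2, x, 1].
    S = [[0.0] * 3 for _ in range(3)]
    t = [0.0] * 3
    for x, y in points:
        row = (x * x, x, 1.0)
        for i in range(3):
            t[i] += row[i] * y
            for j in range(3):
                S[i][j] += row[i] * row[j]
    return solve3(S, t)


def tangent_feedforward(coeffs, x_ahead):
    """Slope dy/dx of the fitted contour at the look-ahead distance."""
    a, b, _ = coeffs
    return 2.0 * a * x_ahead + b
```

Feeding this slope forward lets the force controller anticipate the contour's curvature instead of reacting to it purely through force error.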
More details on the experiment are found in the MFI2008 paper; the underlying robotic task specification is described in the International Journal of Robotics Research paper.