Image-Based Visual Servoing using RTT 1.0
At KULeuven we successfully ran real experiments demonstrating image-based visual servoing. Every part of the application is built on RTT's TaskContext. The setup consists of a hacked industrial Kuka-361 robot, a FireWire camera mounted on the robot's end-effector, and a Pentium IV PC.
We use the existing hardware/camera components to capture the images from the camera, while the existing hardware/kuka components send the control outputs to the robot hardware. Reporting components log our results to a file for post-processing in Matlab. The motioncontrol/naxes and cartesian components let us initialize the robot for the visual servoing control.
A new TaskContext lets us initialize the visual servoing by selecting the target in the image; this component runs in soft real-time. A second component performs all the image processing and computes the control output in hard real-time. A third TaskContext displays the resulting images in soft real-time so we can supervise the visual servoing.
The camera captures 640x480 color images at a frame rate of 60 fps. This is the actual bottleneck of the control loop, which therefore runs at 60 Hz in hard real-time. The component talking to the robot hardware runs at 500 Hz in hard real-time.