Although modern cameras provide machines with highly developed vision, robots still lack a comparably comprehensive solution for their sense of touch. At ETH Zurich, in the group led by Professor Raffaello D’Andrea at the Institute for Dynamic Systems and Control, we have developed a tactile sensing principle that allows robots to retrieve rich contact feedback from their interactions with the environment. I recently presented our approach at the latest TEDxZurich. The talk includes a technical introduction to this novel tactile sensing technology, which aims to provide the next generation of soft robotic skins.
The sensing technology is based on a camera that tracks fluorescent particles densely and randomly distributed within a soft, deformable gel. The randomness of the pattern simplifies the production of the gel, while the density of the particles provides strain information at each pixel of the resulting images. Moreover, the technique makes no assumptions about the shape of the sensing surface, which may have an arbitrary geometry.
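To illustrate the underlying idea, here is a minimal sketch of correlation-based particle tracking on synthetic data. This is not our actual sensing pipeline (which recovers a dense deformation field, not a single rigid shift); it only shows how the motion of a random particle pattern between two camera frames can be recovered, here via phase correlation with plain numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the gel layer: bright "fluorescent particles"
# at dense, random locations, mirroring the random pattern in the sensor.
frame = np.zeros((64, 64))
ys, xs = rng.integers(0, 64, 200), rng.integers(0, 64, 200)
frame[ys, xs] = 1.0

# Simulate a rigid shift of the particle pattern by (dy, dx) pixels,
# as a crude proxy for the gel deforming under contact.
dy, dx = 3, -2
shifted = np.roll(frame, (dy, dx), axis=(0, 1))

def phase_correlation(a, b):
    """Recover the integer-pixel shift between two images from the
    normalized cross-power spectrum of their Fourier transforms."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint wrap around to negative shifts.
    return [int(p - s) if p > s // 2 else int(p) for p, s in zip(peak, corr.shape)]

print(phase_correlation(shifted, frame))  # → [3, -2]
```

Because the particle pattern is dense and random, every local window contains enough texture for such correlation- or optical-flow-based tracking to succeed, which is exactly what the randomness of the pattern buys in the real sensor.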
While capturing images of the particle motion is intuitive and, to some extent, visually interpretable, extracting accurate physical quantities from them is challenging. To avoid the complexity of modeling soft-material behavior in real time, the data extracted from the images are mapped to the distribution of the applied contact forces (both shear and pressure) in a data-driven fashion. Specifically, a neural network trained entirely on data from accurate finite element simulations is used to learn this mapping.
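The data-driven mapping can be sketched in a few lines. The example below is a toy stand-in, not our trained model: the synthetic input/output pairs play the role of the finite-element-generated training data (image-derived features in, contact force components out), and a tiny one-hidden-layer network is fit to them with plain gradient descent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in training set: in the real pipeline these pairs come from
# finite element simulations (particle-motion features -> contact forces);
# here a synthetic smooth function is used purely for illustration.
X = rng.uniform(-1, 1, (512, 8))          # e.g. local particle-motion features
Y = np.tanh(X @ rng.normal(size=(8, 3)))  # e.g. (shear_x, shear_y, pressure)

# Tiny one-hidden-layer MLP trained with plain gradient descent.
W1 = rng.normal(scale=0.5, size=(8, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 3)); b2 = np.zeros(3)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

losses = []
for _ in range(1000):
    h, pred = forward(X)
    err = pred - Y
    losses.append(float((err ** 2).mean()))
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)   # tanh derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    lr = 0.2
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")  # training error decreases
```

The appeal of this approach is that once such a network is trained offline on simulated data, evaluating it at run time is a handful of matrix multiplications, sidestepping any real-time soft-body simulation.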
This technique has the potential to impact several application fields, of which robotic manipulation is the most evident. In a recent proof of concept, we showed how highly dynamic manipulation tasks can be performed using tactile sensing alone, as shown in the video below.
In addition, the versatility of this approach makes it suitable for a variety of products beyond robotics. In fact, artificial tactile sensing can find applications in intelligent prosthetics, restoring the sense of touch to people who have lost limbs.
Carlo Sferrazza is a PhD student at the Institute for Dynamic Systems and Control at ETH Zurich, under the supervision of Professor Raffaello D’Andrea. His current research interests include the design and development of vision-based, data-driven tactile sensors, as well as learning-based predictive control.