Equipping machines with a sense of touch is still a major challenge. Will robots soon be able to feel their surroundings like humans and animals? As a step towards tactile robots, researchers at the Max Planck Institute for Intelligent Systems (MPI-IS) have developed an artificial finger that can perceive touch. In the journal Nature Machine Intelligence, the team presented a sensor called “Insight” which, with the help of machine vision and a deep neural network, can accurately estimate where objects come into contact with it and how large the acting forces are.
The sensor, which is modelled on a thumb, consists of a rigid skeleton that provides stability, similar to the bones of a human finger, enclosed in a soft shell. The shell is made of an elastomer mixed with dark but reflective aluminium flakes, which render it opaque. Inside this capsule sits a tiny 160-degree fisheye camera that captures colourful light patterns generated by a ring of LEDs. When one or more objects touch the sensor shell, the colour pattern inside the sensor changes. The camera takes pictures several times a second and feeds this data into a deep neural network. Because the algorithm detects even the smallest changes in the light at each pixel, the trained model can determine exactly where an object is touching the “finger”, how strong the acting forces are and in which direction they act. From this, a so-called “force map” is derived: a three-dimensional image of the forces acting on the artificial thumb.
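The pipeline described above, with camera frames going in and a three-dimensional force map coming out, can be sketched in very rough terms as follows. This is a purely illustrative stand-in, not the team's actual model: the frame size, the output grid, and the single linear layer used in place of the trained deep network are all hypothetical choices.

```python
import numpy as np

def frame_to_force_map(frame, weights, grid=(8, 8)):
    """Map one camera frame to a force map: a grid of (Fx, Fy, Fz)
    force vectors over the sensor surface.

    A single linear layer stands in for the trained deep neural
    network; the real system learns this mapping from data.
    """
    features = frame.reshape(-1).astype(float)   # flatten pixels to a vector
    out = weights @ features                     # hypothetical learned weights
    return out.reshape(grid[0], grid[1], 3)      # 3 force components per patch

# Toy usage with random stand-in inputs and weights.
rng = np.random.default_rng(0)
h, w = 16, 16                                    # toy frame size, for illustration
frame = rng.random((h, w, 3))                    # one RGB frame from the fisheye camera
weights = rng.standard_normal((8 * 8 * 3, h * w * 3)) * 0.01
force_map = frame_to_force_map(frame, weights)
print(force_map.shape)                           # (8, 8, 3)
```

Running this once per captured frame yields a stream of force maps, which is conceptually how touch location, magnitude and direction are read out several times a second.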
“Our sensor shows excellent performance thanks to the innovative mechanical design of the shell, the customised imaging system inside, the automatic data acquisition and thanks to state-of-the-art deep-learning methods,” says Georg Martius, head of the research group at MPI-IS. Katherine J. Kuchenbecker, director of the Haptic Intelligence Department at MPI-IS, confirms the usefulness of the new sensor: “Previous soft haptic sensors had only a small area in which they could detect things. They were fragile, difficult to manufacture and often could not sense forces that run parallel to the skin. But this is essential for a robot that holds a glass of water or moves a coin across a table.”