Robots learn to feel what vision misses


22/04/2026 TranSpread

That matters because robots are increasingly expected to work in places where people cannot safely intervene, including outer space, radioactive zones, and other hazardous sites. In such settings, camera-based perception can break down because of poor focus, occlusion, or unstable lighting. Touch can recover information that vision often misses, especially shape, protrusions, and local surface detail, yet many flexible tactile sensors still struggle to capture fine structural features at the millimeter scale. Tactile sensing alone, meanwhile, cannot reliably determine a target's broader position and orientation within its surroundings. These complementary weaknesses motivate in-depth research into combined vision–tactile strategies for precise robotic perception and manipulation.

Researchers from Yonsei University, Republic of Korea, and the University of Southern California, United States, reported (DOI: 10.1038/s41378-026-01190-8) in Microsystems & Nanoengineering in 2026 a complementary visual localization and tactile mapping approach for robotic perception of millimeter-sized objects with irregular surfaces. The system combines a single RGB-Depth camera for spatial and positional recognition with a soft tactile sensor array for surface-feature detection and three-dimensional reconstruction when visual information is restricted.

The tactile component is built around a soft capacitive pressure sensor fabricated with inkjet-printed deformable silver electrodes and an irregular-structured dielectric layer based on a soft substrate. The selected design reached a sensitivity of 0.3458 kPa⁻¹ in the 0–0.8 kPa range, detected a light load of about 23 Pa, responded in about 48 ms, and remained stable over 5,000 loading–unloading cycles. The expanded 10 × 10 array enabled pressure mapping and shape recognition, while its skin-like local deformation improved fine-structure detection. On the vision side, the RGB camera performed well under normal light, but under illumination below 10 lx its mean average precision at an Intersection over Union (IoU) threshold of 0.5 (mAP50) dropped to 0.706, from about 0.995 above 100 lx. Adding the depth module improved boundary and position recognition but still could not fully capture fine surface details. Tactile scanning closed that gap: it detected single and multiple protrusions, distinguished height differences of 2, 3, and 4 mm, and reconstructed pill-scale surface features on a tablet blister pack even when vision failed because of close-range focus issues, occlusion, or very low light.
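To make the array's pressure-mapping step concrete, here is a minimal sketch of how readings from a 10 × 10 capacitive array could be converted into a pressure map using the reported sensitivity of 0.3458 kPa⁻¹ (taking sensitivity as relative capacitance change per unit pressure, ΔC/C₀ = S·P, within the 0–0.8 kPa range). The array values, threshold, and protrusion-detection logic are illustrative assumptions, not the authors' implementation.

```python
S = 0.3458  # reported sensitivity in kPa^-1 (valid in the 0-0.8 kPa range)

def pressure_map(c0, c):
    """Convert per-taxel capacitance (baseline c0, loaded c) to pressure in kPa."""
    return [[((cij - c0ij) / c0ij) / S for c0ij, cij in zip(r0, r)]
            for r0, r in zip(c0, c)]

def protrusions(pmap, thresh_kpa=0.1):
    """Return (row, col) indices of taxels whose pressure exceeds a threshold."""
    return [(i, j) for i, row in enumerate(pmap)
            for j, p in enumerate(row) if p > thresh_kpa]

# Toy example: a flat 10x10 baseline with one taxel pressed by a protrusion
c0 = [[10.0] * 10 for _ in range(10)]
c = [row[:] for row in c0]
c[4][7] = 11.0  # +10% capacitance change at taxel (4, 7)

pmap = pressure_map(c0, c)
print(protrusions(pmap))  # prints [(4, 7)]
```

In a scanning setup, repeating this conversion as the array sweeps across a surface would yield the kind of height and protrusion map the paper describes.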

“This study points to a practical shift in robot perception: when cameras lose confidence, touch can take over the harder job of reading surface detail,” the findings suggest. “In that sense, tactile sensing is not just a supporting signal. It becomes a functional backup pathway for recognizing shape, protrusions, and height variation when visual information is incomplete or unavailable.” That conclusion closely matches the paper's core claim: tactile sensing can both complement vision and, under visually constrained conditions, serve as an alternative sensing modality for precise object perception.

The broader significance lies in giving robots a more human-like sequence of perception: first locate, then touch, then refine. Such a strategy could strengthen micromanipulation in hazardous response, maintenance, space operations, and other unstructured environments where reliable perception is critical. The paper also frames this work as a step toward sensory substitution in robotics, in which one sensing channel can compensate when another is degraded. Future work will focus on integrating visual and tactile signals in real time and further improving tactile resolution, with the goal of making robot manipulation more autonomous, adaptable, and precise.
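The locate-then-touch-then-refine sequence described above can be sketched as a simple fallback policy: use vision for coarse localization while detection confidence is high, and hand off to a tactile scan when confidence degrades (as it does below 10 lx). The confidence threshold, function names, and data shapes here are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch of a vision-first, touch-as-backup perception policy.
# conf_floor is an assumed cutoff, not a value from the paper.

def perceive(camera_conf, camera_pos, tactile_scan, conf_floor=0.8):
    """Return (modality, coarse position, fine surface map or None)."""
    if camera_conf >= conf_floor:
        # Vision is trusted: coarse localization is enough at this stage.
        return ("vision", camera_pos, None)
    # Vision degraded (occlusion, low light, focus loss): touch takes over
    # the harder job of reading fine surface detail at the located position.
    return ("tactile", camera_pos, tactile_scan())

# Toy usage: a simulated low-confidence frame triggers the tactile pathway
modality, pos, surface = perceive(
    camera_conf=0.70,
    camera_pos=(12, 5),
    tactile_scan=lambda: [[0.0] * 10 for _ in range(10)],
)
print(modality)  # prints tactile
```

This mirrors the sensory-substitution idea: the tactile channel is invoked only when, and where, the visual channel loses confidence.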

###

References

DOI

10.1038/s41378-026-01190-8

Original Source URL

https://doi.org/10.1038/s41378-026-01190-8

Funding information

This work was initiated by the Samsung Electronics Future Technology Development Center (Grant No. SRFC-TD2103-01). This study was supported by National Research Foundation grants funded by the Korean government (MSIT) (No. RS-2024-00338772, No. RS-2024-00461583, No. RS-2024-00411007). This research was also supported by a faculty research grant of Yonsei University College of Medicine (6-2024-0023, 6-2025-0023).

About Microsystems & Nanoengineering

Microsystems & Nanoengineering is an online-only, open access international journal devoted to publishing original research results and reviews on all aspects of Micro and Nano Electro Mechanical Systems from fundamental to applied research. The journal is published by Springer Nature in partnership with the Aerospace Information Research Institute, Chinese Academy of Sciences, supported by the State Key Laboratory of Transducer Technology.

Paper title: Complementary visual localization and tactile mapping approach for robotic perception of millimeter-sized objects with irregular surfaces
Attachments
  • From seeing to feeling: A complementary robotic perception strategy.
Regions: North America, United States, Asia, China, South Korea
Keywords: Science, Physics, Applied science, Nanotechnology

