Sestosenso develops technologies for the next generation of collaborative robots, capable of self-adapting to time-varying operational conditions and of switching safely and smoothly from autonomous to interactive operation when human intervention is required, either for collaboration or for training/teaching. The project proposes a new sensing technology, from the hardware level up to the cognitive perception and control levels, based on networks of proximity and tactile sensors embedded on the robot body. These networks provide a unified proxy-tactile perception of the environment, needed to control the robot's actions and interactions safely and autonomously. Within the project, the same technologies are also applied to wearable devices (such as exoskeletons) to give the user better spatial awareness and to enforce safety in critical human-robot interactive tasks.
Use Case 3
Agricultural harvesting via wearable devices and collaborative mobile manipulators
News
OptoSkin featured at the I3A Young Researchers’ Day
Sestosenso’s researchers presented OptoSkin at the I3A Young Researchers’ Day on June 25th, 2025, in Zaragoza, Spain. We showcased both the technical achie...
Online Meeting
Another technical meeting of the European Project Sestosenso took place online on June 12th, 2025. We discussed how to showcase the results of th...
Sestosenso demo at the Laimburg Integrated Digital Orchard
Sestosenso was featured at the Laimburg Integrated Digital Orchard, where Prof Karl von Ellenrieder’s team at the Free University of Bozen-Bolzano teste...
Best poster award for OptoSkin at CEIG 2025
Sestosenso’s researchers presented OptoSkin at the Spanish Conference on Computer Graphics (CEIG) 2025, held from June 2nd to 4th in Jaen, Spain. We showcas...