When a Robot Understands That It Is "In Pain"

Yana Orekhova, Exclusive

A new type of electronic skin addresses a key problem in robotics: building tactile systems that go beyond simple pressure detection to enable safe, adaptive behavior. The sensory system is built around a network of flexible pressure sensors integrated into the electronic skin. When touched or struck, these sensors convert mechanical pressure into electrical signals.

In early prototypes, the signals were routed directly to the robot's central processor. In the new system, however, a signal that exceeds a set threshold is sent straight to the motors. The innovation lies in how the signals are processed: rather than merely recording pressure, the system uses neuromorphic coding that mimics the functioning of biological nerves, converting force into electrical impulses whose frequency and character depend on the intensity and location of the impact.
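The routing scheme described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' implementation: the threshold value, the encoder ceiling, and all function names are assumptions chosen for clarity. It shows rate coding (stronger contact produces a higher spike frequency) and the reflex shortcut that bypasses central processing when the threshold is exceeded.

```python
# Illustrative sketch only; thresholds, units, and names are assumed.

PAIN_THRESHOLD_KPA = 50.0   # assumed safety threshold
MAX_SPIKE_RATE_HZ = 200.0   # assumed encoder ceiling

def encode_spike_rate(pressure_kpa: float) -> float:
    """Rate coding: map contact pressure to a spike frequency."""
    # Saturating linear map: stronger contact -> higher spike rate.
    return MAX_SPIKE_RATE_HZ * min(pressure_kpa / 100.0, 1.0)

def route_signal(pressure_kpa: float) -> str:
    """Reflex arc: above-threshold contact goes straight to the motors."""
    if pressure_kpa > PAIN_THRESHOLD_KPA:
        return "motors: withdraw"      # protective reaction
    return "processor: log contact"    # ordinary interaction

print(encode_spike_rate(50.0))  # mid-range pressure -> 100.0 Hz
print(route_signal(12.0))       # light touch
print(route_signal(80.0))       # harmful impact
```

The point of the sketch is the branch in `route_signal`: ordinary contact is handled by the central processor, while harmful contact triggers the motor response directly, which is what makes millisecond-scale reactions possible.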

Under normal conditions, the signals represent ordinary interactions, but when the threshold is exceeded, their characteristics change, triggering protective reactions in the robot.

Researchers emphasize that the system is designed solely for detecting mechanical impacts, without reflecting emotional pain or higher levels of perception. It simply generates a functional signal that allows robots to detect and respond to harmful influences.

“Our neuromorphic electronic skin has a hierarchical architecture inspired by neural networks, which ensures high-resolution sensory perception, active injury detection, and the possibility of modular repair,” the scientists note. This significantly enhances the sensory capabilities of robots and their interaction with humans, which is especially important for service robots working alongside people.

To evaluate the effectiveness of the system, the researchers conducted a series of tests, applying varying degrees of pressure to the electronic skin, from light touches to significant loads simulating dangerous contacts. These tests allowed the team to verify how quickly and accurately the system could detect the transition from safe to unsafe interaction.
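The test procedure described above can be mirrored in a small sweep harness. Again, this is a hypothetical sketch: the threshold, the pressure values, and the function names are assumptions, not details from the study. It applies increasing loads and reports where the classification flips from safe to unsafe.

```python
# Hypothetical test sweep; threshold and pressure values are assumed.

THRESHOLD_KPA = 50.0  # assumed boundary between safe and unsafe contact

def classify(pressure_kpa: float) -> str:
    """Label a contact as safe or unsafe against the threshold."""
    return "unsafe" if pressure_kpa > THRESHOLD_KPA else "safe"

def find_transition(pressures):
    """Return the lowest applied pressure classified as unsafe, or None."""
    for p in sorted(pressures):
        if classify(p) == "unsafe":
            return p
    return None

sweep = [5, 20, 40, 55, 70, 90]  # light touch -> heavy load (kPa)
print(find_transition(sweep))    # -> 55
```

A real evaluation would also measure response latency and repeatability, as the experiments below did; the sketch only captures the safe-to-unsafe detection step.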

During the experiments, the sensor network consistently generated clear signals and activated protective reactions depending on the force of the impact. The system responded within milliseconds, which is fast enough to ensure real-time reactions, such as withdrawing from dangerous contact or reducing the force of impact. It also demonstrated stable performance during repeated tests, indicating its durability.

These advancements are of immense significance for safety in human-robot interactions. As robots increasingly appear in everyday life, the ability to recognize dangerous contacts becomes particularly important, as tasks performed in close interaction can lead to accidental collisions and excessive force application.

Existing safety systems are often not designed for such close interactions, relying on external sensors or pre-set movement limitations. While these methods work, they can be slow or insufficiently flexible. Integrating the sensory function directly into the robot's skin allows for instantaneous responses to physical threats.

Moreover, this technology can make collaborative tasks that require physical contact, such as handling objects or service robotics, more efficient by enabling robots to better regulate grip strength and contact during interactions. This improves the handling of fragile items and performance in unpredictable conditions, minimizing the risk of damage.

The technology could also change the perception of human-machine interaction. Robots capable of responding to physical impacts appear more responsive and natural, even if the emotional aspect is not addressed.

This feedback can make interactions more intuitive, similar to how humans instinctively adjust their actions when another person withdraws. Visible feedback from machines can help guide user behavior and reduce the risk of accidental damage.

However, this technology raises broader questions about the acceptable level of realism in robots. While these sensory functions enhance safety and efficiency, they also provoke ethical questions regarding whether machines should imitate the reactions of living beings.

Some researchers believe that robots do not need signals that resemble pain. Others argue that employing biological strategies may be the most effective way to create adaptable machines. It is important to find a balance between functional advantages and potential negative consequences of excessive anthropomorphism. For instance, what would happen if such a system were linked to an emotional response program controlled by AI?

These philosophical questions about robot realism remain to be explored. For now, the system is at an early stage and not ready for commercial use: it covers only limited areas of the surface, and extending coverage to the full body of a humanoid robot will require significant improvements in manufacturing and energy efficiency.

In the future, the work will focus on increasing the coverage area with sensors and enhancing durability, which is necessary for transitioning the technology from laboratory testing to real-world application.

Source: New Atlas