In short
Technological advances are transforming the way machines interact with their environment. Thanks to an innovation directly inspired by bats, drones and robots can now navigate in darkness or severely reduced visibility. This technology, developed by researchers at the University of Michigan, uses artificial-intelligence-driven echolocation to detect obstacles invisible to the naked eye. Without the need for cameras or GPS, these machines rely on ultrasonic pulses to map their surroundings, paving the way for new applications in disaster zones and hostile environments.
Revolutionary sound perception
The system imitates biological echolocation by emitting high-frequency sound pulses. When the echoes return, they allow the machine to build a spatial map of its immediate environment. Unlike infrared- or camera-based systems, sound is not disturbed by darkness or visual obstructions, making it an ideal solution for operations in compromised visibility. The researchers developed a unique artificial intelligence model based on convolutional neural networks, capable of identifying the specific shapes of objects from their echo patterns. This modular approach allows the system to learn new shapes without requiring the network to be completely retrained.
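The core physics behind this kind of sensing is ultrasonic time-of-flight ranging: a pulse is emitted, and the delay before its echo returns gives the distance to the obstacle. The following is a minimal sketch of that principle only; the constants and the `sweep_to_map` helper are generic illustrations, not details of the Michigan system.

```python
# Minimal sketch of ultrasonic time-of-flight ranging.
# SPEED_OF_SOUND is standard physics (air at ~20 °C); everything else
# here is a hypothetical illustration, not the researchers' code.

SPEED_OF_SOUND = 343.0  # metres per second

def echo_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from the round-trip time of one echo.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0

def sweep_to_map(echo_times):
    """Turn per-direction echo delays into a crude range map.

    echo_times: dict mapping bearing (degrees) -> round-trip time (s),
    with None where no echo came back (open space or absorbed pulse).
    """
    return {bearing: (echo_distance(t) if t is not None else None)
            for bearing, t in echo_times.items()}

# A pulse returning after 10 ms implies an obstacle about 1.7 m away.
ranges = sweep_to_map({0: 0.010, 90: 0.004, 180: None})
```

A full system would repeat this sweep continuously and hand the raw echo waveforms, not just the delays, to the neural network for shape classification.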
Synthetic training for concrete results
Instead of relying on costly field tests, the team trained the system in a virtual environment. They simulated three-dimensional spaces with real-world distortions to teach the AI how echoes behave in chaotic conditions. This method enabled the AI to learn how different object shapes reflect sound from various angles. The use of synthetic data also reduced development cost and time while maintaining high precision. The specialized convolutional neural networks (SCNNs) were fed thousands of simulated echo patterns, each network focusing on a different type of object. In testing, the AI managed to identify geometric shapes from real ultrasound echoes, even when the shapes resembled one another.
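The synthetic-data idea described above can be sketched as a toy generator: each shape gets a characteristic echo envelope, the viewing angle shifts where the echo lands, and added noise stands in for real-world distortion. The shape signatures, angle model, and noise level below are all invented for illustration; the researchers' actual 3D simulation is far richer.

```python
import numpy as np

def synthetic_echo(shape: str, angle_deg: float, rng, n: int = 128):
    """Toy simulated echo envelope for one shape seen from one angle.

    Hypothetical signatures: a flat plate returns one sharp peak, a
    corner two peaks, a cylinder a broader smear. Gaussian noise stands
    in for the real-world distortions mentioned in the article.
    """
    t = np.linspace(0.0, 1.0, n)
    center = 0.3 + 0.002 * angle_deg  # viewing angle shifts the peak
    if shape == "plate":
        env = np.exp(-((t - center) / 0.02) ** 2)
    elif shape == "corner":
        env = (np.exp(-((t - center) / 0.02) ** 2)
               + 0.6 * np.exp(-((t - center - 0.1) / 0.02) ** 2))
    else:  # "cylinder"
        env = 0.7 * np.exp(-((t - center) / 0.08) ** 2)
    return env + rng.normal(0.0, 0.05, n)  # additive sensor noise

def make_dataset(n_per_shape: int = 100, seed: int = 0):
    """Build labelled (echo, shape) training pairs across random angles."""
    rng = np.random.default_rng(seed)
    shapes = ["plate", "corner", "cylinder"]
    X, y = [], []
    for label, shape in enumerate(shapes):
        for _ in range(n_per_shape):
            angle = rng.uniform(-30.0, 30.0)
            X.append(synthetic_echo(shape, angle, rng))
            y.append(label)
    return np.stack(X), np.array(y)

X, y = make_dataset()  # X: (300, 128) echoes, y: shape labels 0-2
```

Arrays like `X` would then be fed to a small 1D convolutional classifier, one per object type in the modular SCNN setup the article describes.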
Future applications and implications
The echolocation model developed by the researchers bridges the gap between natural and artificial perception. This promising innovation opens new perspectives beyond defense and robotics: it could be used in medical imaging, autonomous vehicles, and industrial diagnostics. With vision systems increasingly pushed to their limits, this sound-centered approach could soon become the preferred solution when sight fails. By drawing on the strategies of echolocating animals, the technology brings us closer to building machines that perceive the world the way living organisms do.
Funding and research perspectives
The project was financially supported by the U.S. Army Research Office and the Ground Vehicle Systems Center. This collaboration highlights the strategic importance of advanced navigation systems for both military and civilian applications. By exploiting the potential of ultrasonic perception, the researchers hope to extend the capabilities of machines in environments where traditional tools fail. The advance could also spur innovation in other areas of engineering where precise navigation and advanced imaging are essential. As the technology continues to evolve, what new applications might this approach make possible?