Computers are constantly evolving, but the way we use them has changed little so far. Thanks to artificial intelligence, however, we could be interacting with our PCs in a completely different way within a few years. That, at least, is the vision outlined in an interview by Pavan Davuluri, head of Windows at Microsoft.
“I think we will see computing become more ambient, more ubiquitous, continue to span form factors, and certainly become more multimodal over time […] Fundamentally, the idea that your computer can look at your screen and is aware of context is going to become an important modality for us in the future,” he said. He adds that we will be able to speak to the computer while we write or while we interact with someone else.
In short, the Windows boss suggests that, thanks to an AI able to analyze context and understand our intentions, we will rely less on our hands (and more on our voice) to interact with computers. He also sees this as an accessibility improvement for people who have difficulty using a mouse and keyboard.
Meta imagines computers controlled by a bracelet
While continuing to work on its artificial intelligence assistant, which can be controlled by voice, Meta has also devised another way to control machines. Instead of a mouse and keyboard, Mark Zuckerberg’s company is proposing a new technology based on electromyography. For its future augmented reality glasses, Meta has developed a wristband whose sensors detect nerve signals at the wrist and convert them into commands for a machine.
The result: it is possible to interact with an operating system using small hand gestures. “You can type and send messages without a keyboard, navigate menus without a mouse, and see the world around you while engaging with digital content, without having to look down at your phone,” Meta said in a blog post presenting this work.
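To make the idea concrete, here is a deliberately simplified sketch of a signal-to-command pipeline. Meta’s actual system relies on deep learning models trained on large sEMG datasets; the feature extraction, gesture templates, and command names below are all invented for illustration only.

```python
import math
import random

def rms_features(window):
    """Root-mean-square amplitude per EMG channel over a time window."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

# Invented calibration data: per-gesture RMS "templates" for two channels.
TEMPLATES = {
    "pinch": [0.8, 0.1],
    "swipe": [0.2, 0.9],
    "rest":  [0.05, 0.05],
}

def classify(window):
    """Nearest-template match of the feature vector to a gesture label."""
    feats = rms_features(window)
    return min(TEMPLATES,
               key=lambda g: sum((f - t) ** 2
                                 for f, t in zip(feats, TEMPLATES[g])))

def gesture_to_command(gesture):
    """Invented mapping from a recognized gesture to an OS-level command."""
    return {"pinch": "SELECT", "swipe": "NEXT_ITEM", "rest": None}[gesture]

# Simulated two-channel window dominated by channel-0 activity.
window = [[0.7 + random.uniform(-0.05, 0.05) for _ in range(64)],
          [0.1 + random.uniform(-0.02, 0.02) for _ in range(64)]]
print(gesture_to_command(classify(window)))  # SELECT
```

The real challenge, which this toy nearest-template classifier sidesteps, is making recognition robust across users and sessions without per-person calibration, which is precisely what Meta’s Nature paper addresses with models trained on data from thousands of participants.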
We’re thrilled to see our advanced ML models and EMG hardware — that transform neural signals controlling muscles at the wrist into commands that seamlessly drive computer interactions — appearing in the latest edition of @Nature.
Read the story: https://t.co/7G8qAdnGbh
Find… pic.twitter.com/nEeihClnjv
– AI at Meta (@AIatMeta) July 23, 2025
“Ultimately, sEMG could revolutionize the way we interact with our devices, help people with motor disabilities regain autonomy while improving their quality of life, and open up possibilities for HMI (human-machine interaction, editor’s note) that we haven’t even dreamed of yet,” the group also explains.
📍 To make sure you don’t miss any Presse-citron news, follow us on Google News and WhatsApp.