
Imagine silently saying something to yourself, and your AI assistant knows what you're trying to say. It could happen through your glasses, earbuds, or your phone's camera. Apple has just bought a company called Q.ai that attempts to do exactly that. It sounds like weird sci-fi, but to me, as someone who has been following smart glasses and wearables for a long time, it also feels very familiar.
Apple's investment in the Israeli startup is not small at all: the acquisition cost approximately $2 billion, according to an original report by the Financial Times and news agencies such as Reuters. That makes it Apple's biggest move since its stunning acquisition of Beats a decade ago. Unlike Beats, hardly anyone knows about Q.ai, at least not yet. But the potential for creating new interfaces could be very strong, and another key piece is being added to the ever-expanding puzzle of future personal technology interfaces.
Q.ai isn't a company I've ever met with or gotten a demo from, but one of its founders, Aviad Maizels, also created PrimeSense, the infrared-based technology that powered the 3D room-scanning capabilities of Microsoft's Kinect camera for Xbox years ago. PrimeSense was acquired by Apple in 2013. That technology became the TrueDepth camera array used for Face ID, and it also lives in the Vision Pro for close-range hand tracking.
Based on what has been reported about its patents, Q.ai uses optical sensors to track small facial movements and emotional expressions, which could enable silent commands to an AI interface or the recognition of other subtle facial signals. The Israeli website GeekTime goes into more detail, saying the technology measures muscle and lip movements and may need to be near your mouth.
CNET reached out to Apple and Q.ai for comment, but neither immediately responded.
The Vision Pro can already track facial movements, but it can't convert lip movements into speech.
I just wrote about how Apple is already showing signs of moving toward an ecosystem of wearable and connected AI devices: pins, glasses, earbuds, watches, or some combination of them. Any of these wearables could potentially use what Q.ai is developing. Earbuds and glasses seem the most likely candidates, and with reports suggesting the next generation of AirPods will have infrared cameras, the pieces look increasingly ready to connect.
Even mixed reality headsets like the Vision Pro could benefit from Q.ai's technology. The Vision Pro can already recognize facial expressions through eye-tracking cameras, downward-facing cameras and infrared sensors. But interacting with the Vision Pro still feels a bit awkward to me: I use my eyes to look and my hands to pinch at things, but I have to say "Hey Siri" to make voice requests. I'd prefer my interactions to be more natural and subtle. Maybe this new acquisition can help.
As augmented reality artist and researcher Helen Papagiannis notes in her latest newsletter, "Apple's rumored AI Pin makes less sense as a standalone product and more as a node in Apple's ecosystem, relying on shared sensing, intelligence, and context across devices that work in concert with AirPods and, eventually, Glasses."
Existing smart glasses like Meta's, and upcoming ones from Google, mostly rely on voice for interaction. Doing this silently could be a huge advantage, but inputs other than voice are emerging as well. Meta has a neural band worn on the wrist, with the ultimate goal of adding eye tracking to its glasses too. Google's glasses will also work with watch-based gestures.
I'm also a little concerned about privacy. Any technology that can read lips and recognize micro-expressions could be used to track your intent from a distance. How can this technology be used privately and reliably? Or would the ability to make requests silently actually be more private than the voice commands I use now?
I still want interfaces that don't use speech at all. Meta's electromyography-based neural band technology points to more complex ways wrist gestures could evolve to work with glasses and earbuds. Another Israeli company, Wearable Devices, has its own neural band, called Mudra, and aims to expand its micro-input capabilities, which are derived from the electrical impulses of motor neurons.
Electroencephalography, which measures brain signals, is another approach. While some companies are exploring EEG for brain-computer interfaces, it remains primarily a sensing system focused on health and medical applications.
Q.ai's technology is among the interfaces that could make the wearable computers we use feel more connected to us. That may sound weird and a little scary, but it's also where I think most glasses, wearables, and VR/AR companies are already headed. Apple's move is another part of this trend.