> Notably, this is the second time CEO Aviad Maizels has sold a company to Apple. In 2013, he sold PrimeSense, a 3D-sensing company that played a key role in Apple’s transition from fingerprint sensors to facial recognition on iPhones.

> Q.ai launched in 2022 and is backed by Kleiner Perkins, Gradient Ventures, and others. Its founding team, including Maizels and co-founders Yonatan Wexler and Avi Barliya, will join Apple as part of the acquisition.
Twice, well done!
What kind of tech does Q.ai bring to the table?
" As first reported by Reuters, Apple has acquired Q.ai, an Israeli startup specializing in imaging and machine learning, particularly technologies that enable devices to interpret whispered speech and enhance audio in noisy environments."
[puts on tin foil]
you mean something that improves the detection and transcription of voices when the person doesn't realize the mic is on, like when it's in our pocket?
I have a child who had an individualized education program due to a disability. I recorded many meetings with an iPhone in my front pocket while sitting. Crystal clear audio every time.
The new tech is likely just for noisy environments and/or to enable whispered voice control of the phone.
that was my first thought, big bump to their ad program
The perfect crime - easily detectable, reputation destroying, barely profitable compared to information people give up willingly. Only Apple could come up with something so clever and so easily defeated, thanks to their boundless evil.
Maybe to allow sub-vocalized commands when wearing airpods, for example? I think this was a theme in the later Ender's Game series books.
Yeah, so, I am never turning on Apple Intelligence...
Hope they do not adopt the MS approach to updates with the "shaken" Etch-a-Sketch for your settings on every update.
Could Q.ai be commercializing the AlterEgo tech coming out of MIT Lab? i.e. "detects faint neuromuscular signals in the face and throat when a person internally verbalizes words"
Yep, looks like that is it. Recent patent from one of the founders: https://scholar.google.com/citations?view_op=view_citation&h...
Yeah...
Pardon the AI crap, but:
> ...in most people, when they "talk to themselves" in their mind (inner speech or internal monologue), there is typically subtle, miniature activation of the voice-related muscles — especially in the larynx (vocal cords/folds), tongue, lips, and sometimes jaw or chin area. These movements are usually extremely small — often called subvocal or sub-articulatory activity — and almost nobody can feel or see them without sensitive equipment. They do not produce any audible sound (no air is pushed through to vibrate the vocal folds enough for sound). Key evidence comes from decades of research using electromyography (EMG), which records tiny electrical signals from muscles: EMG studies consistently show increased activity in laryngeal (voice box) muscles, tongue, and lip/chin areas during inner speech, silent reading, mental arithmetic, thinking in words, or other verbal thinking tasks
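For anyone curious what "detecting subvocal activity" looks like computationally, here's a toy sketch. EMG activity shows up as a rise in short-window signal energy above a resting baseline; this hypothetical example simulates that with synthetic noise and a bare RMS threshold (real systems use proper bandpass filtering and trained classifiers, and all the numbers here are made up for illustration):

```python
# Hypothetical sketch: flag windows of an "EMG" signal whose RMS energy
# rises above a quiet baseline, standing in for subvocal muscle bursts.
import math
import random

def rms(window):
    """Root-mean-square amplitude of one window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def detect_activity(samples, win=100, threshold=0.05):
    """Return indices of non-overlapping windows whose RMS exceeds threshold."""
    active = []
    for i in range(0, len(samples) - win + 1, win):
        if rms(samples[i:i + win]) > threshold:
            active.append(i // win)
    return active

random.seed(0)
# Quiet baseline noise, then a burst of simulated muscle activity, then quiet.
quiet = [random.gauss(0, 0.01) for _ in range(500)]
burst = [random.gauss(0, 0.2) for _ in range(200)]
signal = quiet + burst + quiet
print(detect_activity(signal))  # → [5, 6], the burst windows
```

The interesting engineering problem is not this energy detection but decoding *which words* the tiny bursts encode, which is where the machine-learning part comes in.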
So, how long until my Airpods can read my mind?
> So, how long until my Airpods can read my mind?
Or explode in your ear
“Q.ai is a startup developing a technology to analyze facial expressions and other ways for communication.”
This is an interesting acquisition given their rumored Echo Show / Nest Hub competitor (1). Maybe this is part of their (albeit flawed and delayed) attempt to revitalize the Siri branding under their Apple Intelligence marketing. When you have to say exactly the right words to Siri, or else she will add “Meeting at 10” as an all-day calendar event, people get frustrated, and the non-technical illusion of the “digital assistant” is lost. If this is Apple’s model of how customers perceive Siri, then maybe their thinking is that giving Siri more non-verbal, personable capability could be a differentiating factor in the smart hub market, along with the LLM rebuild. I could also see this tying into some sort of strategy for the Vision Pro.
Now, whether this hypothetical differentiating factor is worth $2 billion, I’m not so sure on, but I guess time will tell.
https://www.macrumors.com/2025/11/05/apple-smart-home-hub-20...
Sounds pretty invasive for privacy, if this was ever paired with smart glasses in public.
Hence the name, I assume.
and very expensive domain.
The website looks expensive as well. /s
https://www.q.ai
In case there are any Ender's Game fans here, the capability to understand micro-expressions reminds me of how Ender subvocalizes to Jane. Orson Scott Card predicted yet another technological norm.
Also earlier credit due to Isaac Asimov in Second Foundation [1953] "...
The same basic developments of mental science that had brought about the development of the Seldon Plan, thus made it also unnecessary for the First Speaker to use words in addressing the Student.
Every reaction to a stimulus, however slight, was completely indicative of all the trifling changes, of all the flickering currents that went on in another's mind. The First Speaker could not sense the emotional content of the Student's instinctively, as the Mule would have been able to do – since the Mule was a mutant with powers not ever likely to become completely comprehensible to any ordinary man, even a Second Foundationer – rather he deduced them, as the result of intensive training.
Why do I have the feeling that one of their reasons was so they can trademark "iQ", to match the iSomething "franchise", so to speak?
Apple dropped the "i" naming scheme many years ago.
iCloud, iPad, iPhone, iMac, iMessage, iOS/iPadOS, iMovie?
Granted, they are slowly but surely killing it, but it’s still going quite strong.
It's not used in new products.
The ability to impress CEOs and signal hotness to investors may not correlate at all with the ability to produce breakthrough technology. Thus companies like Google grow up unbought to then become ..
It's kind of sad watching Apple drift into irrelevancy. I know I'm not going to buy more products from them because nothing they have is worth the premium price.
I get the feeling Apple is the next Intel.
Intel went through a phase in the 2010s of buying gobs of companies with fancy tech and utterly failing to integrate those acquisitions.
And more fundamentally, Intel rested on its laurels of having good hardware and got bit hard in the end. Something similar seems to be happening at Apple.
This story has Apple + $2B acquisition + AI
...how is this not at the top of the page?
It's pretty crazy it's Apple's second-largest acquisition ever but it's kinda boring so nobody cares. Of course, Beats was a household name and founded by Dre... a much more accessible story
TBF this ‘startup’ has a somewhat vague product. I read the article and couldn’t come up with a reason for the $2B valuation, what are we missing here?
> I read the article and couldn’t come up with a reason for the $2B valuation
I think the potential applications in military-grade spyware explain the valuation.
Oh, you mean what are we missing as far as this software meriting that valuation for consumer-friendly uses?
Wake me up when they let one of these acqui-hires update Siri to be on par with a voice assistant I could make in an afternoon with off the shelf tools.
This. And next word prediction / autocorrect that doesn’t look like it’s from the previous century.
On both my Nokia and my BlackBerry it was far, far better than on my iPhone. That wasn't quite 199X but pretty close.
I wish the iPhone had word prediction and autocorrect that was from the previous century
BlackBerry's keyboards & autocorrect were top notch. Nothing has matched it yet when using a pure virtual touch screen keyboard.
Crazy he had pretty much perfected the tech of typing out text on a smartphone and then decided to throw it all away by moving to all-screen devices instead. A virtual keyboard with no tactile feel will never compare until we can have screens that can recreate the tactile bumps of a physical keyboard.
Apple autocorrect has actually gotten worse over the last decade. Before, it used to be duck instead of a similar-sounding word, and it took one action to correct it. Now it’s just fuschia and it takes 5 mins to correct the correction to the autocorrect.
I agree with this sentiment. It was so annoying that I turned autocorrect off. I've found that writing on the iPhone has gotten worse as well, or at least that's my own observation. On the other hand, voice dictation has improved enough that I can just dictate into my phone when needed. For more serious work I use a work device, not a consumption one.
That already made the news. It will be powered by Gemini and may launch before next WWDC.
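For what it's worth, the "voice assistant I could make in an afternoon with off-the-shelf tools" claim upthread maps to a pretty simple pipeline: transcribe, interpret, act. Here's a toy sketch where the transcription step is stubbed out (in practice you'd drop in an off-the-shelf speech-to-text model and an LLM for intent parsing; every function name here is hypothetical):

```python
# Toy assistant pipeline: audio -> transcript -> intent -> action.
# The keyword matcher below is a stand-in for an LLM call, and its
# brittleness (exact keywords or nothing) is exactly the Siri failure
# mode complained about elsewhere in this thread.

def transcribe(audio_bytes):
    """Stub: a real implementation would call a speech-to-text model."""
    return "set a timer for ten minutes"

def interpret(transcript):
    """Naive keyword-based intent matcher standing in for an LLM call."""
    t = transcript.lower()
    if "timer" in t:
        return ("set_timer", t)
    if "weather" in t:
        return ("get_weather", t)
    return ("unknown", t)

def act(intent):
    """Stub dispatcher: map a recognized intent to a device action."""
    name, transcript = intent
    return f"action={name} for: {transcript!r}"

print(act(interpret(transcribe(b""))))
```

The hard part, of course, isn't wiring this up; it's doing it on-device, in real time, for millions of users, without shipping their audio to a server.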