Apple has purchased Emotient, a start-up that uses artificial-intelligence technology to read people’s emotions by analysing facial expressions.
It isn’t clear what Apple plans to do with Emotient’s technology, which was primarily sold to advertisers to help assess viewer reactions to their ads. Doctors also have tested it to interpret signs of pain among patients unable to express themselves, and a retailer used it to monitor shoppers’ facial expressions in store aisles, the company had said.
Improving image recognition is a hot topic in Silicon Valley, where Apple rivals Facebook, Alphabet’s Google and others are investing heavily in artificial-intelligence techniques.
An Apple spokeswoman confirmed the purchase with the company’s standard statement after an acquisition, saying Apple “buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans.” She declined to elaborate on the deal terms.
Emotient, based in San Diego, had previously raised $8 million from investors including Intel Capital. The company had been seeking a new round of venture-capital financing, but wasn’t able to secure it on favourable terms, according to a person familiar with the matter.
Emotient Chief Executive Ken Denman declined to comment.
The company this week revised its website and removed details about the services it had been selling.
Apple has expressed interest in the field. In a 2014 patent application, it described a software system that would analyse and identify people’s moods based on a variety of clues, including facial expression.
In October, Apple confirmed that it had acquired another artificial-intelligence start-up, VocalIQ, which aims to improve a computer’s ability to understand natural speech.
In May, Emotient announced that it had been granted a patent for a method of collecting and labelling as many as 100,000 facial images a day so computers can better recognise different expressions.
Its technology makes some people skittish, including Paul Ekman, a psychologist who pioneered the study of reading faces to determine emotions and is an adviser to Emotient. In the 1970s, he created a catalogue of more than 5,000 muscle movements to show how even the subtlest facial tics could reveal a person’s emotions. Dubbed the Facial Action Coding System, it is the foundation for several start-ups trying to read emotions using artificial-intelligence algorithms.
In an interview with The Wall Street Journal last January, Dr Ekman said he was torn between the potential power of software that can read emotions and the need to ensure that it doesn’t infringe on personal privacy. He said the technology could reveal people’s emotions without their consent, and their feelings could be misinterpreted.
Dr Ekman said he still has these concerns, and has pushed Emotient to warn people if it is scanning their faces in public places, but the company hasn’t agreed to do so. An Emotient spokeswoman says the company doesn’t reveal information about individuals, only aggregate data.
Among more-established companies, Google in 2012 published a paper detailing how an artificial-intelligence program taught itself to recognise cats. The company has adapted that software to improve search results, though it has trodden more carefully around facial recognition. It banned any apps for its Google Glass Web-connected eyewear that used facial recognition, for instance.
Facebook has been more aggressive, rolling out facial-recognition software across its social network that automatically recognises faces to make it easier to tag people in photos. Chief Executive Mark Zuckerberg said this week he hopes to build a personal artificial-intelligence assistant that could recognise friends at the front door to let them in.
- Wall Street Journal