Affective Computing and AI Emotion Recognition

In a recent article on 5 Computer Vision and Image Understanding Companies, we talked about how artificial intelligence is enabling computers to recognize images as well as humans can, and in some cases even better. A company we wrote about before called Enlitic has developed a deep learning algorithm that can increase the accuracy of a radiologist’s interpretation by 50-70% while working 50,000 times faster. Other companies like Affectiva can analyze your family photographs and identify any of the 10,000 possible facial expressions your family members may be making, like the ones seen below:

[Image: examples of human facial expressions. Source: Kairos]

Not only that, but Affectiva’s technology can also evaluate your emotions in real-time through your webcam. They have an online demo you can try and see for yourself how it works.
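
To give a feel for what a real-time webcam pipeline like this involves, here is a minimal sketch of the face-detection front end using OpenCV’s stock Haar cascade. It is not Affectiva’s technology, and the emotion classifier itself would be a separately trained model, which is left as a placeholder comment.

```python
# Minimal sketch: detect faces in webcam frames with OpenCV's bundled
# Haar cascade. A production emotion-recognition system would run a
# trained classifier on each detected face crop -- that step is only
# indicated by a comment here.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        # An emotion model would classify the crop frame[y:y+h, x:x+w] here.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```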

Update 04/11/2019: Affectiva has raised $26 million in funding to advance its emotion and object detection AI for monitoring vehicle passengers. This brings the company’s total funding to $60.3 million to date.

The ability of a computer to detect human emotions falls into a field of study called “affective computing”. An “affect” is the experience of feeling or emotion that is characteristic of humans. Affects can be recognized not only from your facial expressions but also from the tone of your voice and your body language. For artificial intelligence to truly interact with humans, it needs to master the ability to empathize. The Gartner Hype Cycle tells us that affective computing is an area that is experiencing high levels of innovation:

[Image: Gartner Hype Cycle. Source: Gartner]

Affective computing technology is not just in the planning stage. At the beginning of this year, Apple acquired a San Diego-based company called Emotient which uses artificial intelligence to detect emotion from facial expressions. Founded in 2012, Emotient had taken in just $6 million in funding from two investors, one of which was Intel.

An article by Wired talks about Emotient’s business model, which was to charge advertisers to analyze how consumers responded to their ads. In other applications, Emotient’s technology was used by doctors to determine a patient’s pain level, and retailers were even using it to capture how shoppers react to products in stores. When a $579 billion company acquires an affective technology startup, it helps affirm that the technology is really here.

Another startup working in the affective computing space is Kairos. Founded in 2012, Kairos has taken in just $3 million in funding to develop their “human analytics platform”, which comprises the following three areas:

[Image: Kairos’ three product areas]

Kairos offers APIs for each of these affective computing applications so that you can code emotion analysis or facial recognition into your own apps (a sketch of what such an API call looks like follows below). One product offering by Kairos is called “Project Look”, which is powered by Watson from IBM (NYSE:IBM) and gives you access to tens of thousands of pre-qualified people all around the globe who can watch your videos and help you figure out how effective your advertising methods are. While you can try the technology for free, plans start at US$500 per month for 60 minutes of video.
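
To make the developer angle concrete, here is a minimal sketch of what submitting a video to an emotion-analysis REST API could look like. The endpoint URL, header names, and response fields below are illustrative assumptions rather than Kairos’ documented API, so consult their developer docs for the real contract.

```python
# Hypothetical sketch of calling an emotion-analysis REST API.
# The endpoint, credentials, and payload fields are illustrative
# assumptions, not Kairos' actual API contract.
import requests

API_URL = "https://api.example-emotion-vendor.com/v2/media"  # hypothetical endpoint
HEADERS = {
    "app_id": "YOUR_APP_ID",    # placeholder credentials
    "app_key": "YOUR_APP_KEY",
    "Content-Type": "application/json",
}

def analyze_video(video_url: str) -> dict:
    """Submit a hosted video for emotion analysis and return the raw JSON."""
    payload = {"source": video_url}
    response = requests.post(API_URL, json=payload, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = analyze_video("https://example.com/ad-spot.mp4")
    # A typical response would carry per-frame emotion scores (joy, surprise,
    # anger, etc.) for each face detected in the video.
    print(result)
```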

Conclusion

With IBM acquiring companies left and right in order to build revenues for their “strategic imperatives”, Kairos could be on their radar. The partnership with IBM’s Watson and the unveiling of “Project Look” both happened earlier this year, so 2016 will be telling as to how economically viable Kairos’ affective computing technology really is. The company already counts Nike, Volkswagen, and Unilever among its customers.
