According to a recent VentureBeat article, more and more companies are entering an era in which artificial intelligence is part of every new project.

The theory of "basic emotions" states that people all over the world communicate six basic internal emotional states using the same facial movements.

It seems reasonable to assume that facial expressions are an essential part of communication.

According to a recent paper from a tech industry analyst firm, emotion artificial intelligence is an emerging technology that allows computers and systems to identify, process, and mimic human feelings and emotions. The field blends computer science, psychology and cognitive science to help businesses make better decisions.

How emotion AI is being used

Emotion AI is used to score video interviews with job candidates on characteristics such as "enthusiasm", "willingness to learn", "conscientiousness and responsibility", and "personal stability". The software is also used by border guards to detect threats at checkpoints, as an aid in detecting and diagnosing mood disorders in patients, to monitor classrooms for boredom or disruption, and to monitor human behavior during video calls.

Use of the technology is growing. In South Korea, for example, job coaches often have their clients practice going through AI-driven job interviews. Lawyers can use the software to gauge which arguments will land with potential jurors. Researchers at Tel Aviv University developed a technique for detecting lies. Apple was granted a patent for "modifying operation of an intelligent agent in response to facial expressions and emotions."

Emotion AI is based on pseudoscience

One reason emotion AI is rife with confusion and controversy is that researchers have determined facial expressions vary widely between contexts and cultures. There is considerable evidence that facial movements are not consistent signals of emotion, and a growing body of research suggests the science behind emotion detection is flawed: there is simply not enough evidence to support the thesis that facial configurations reliably reflect emotional states.

Critics go further, arguing that the technology has at best no proven basis in science and at worst is outright pseudoscience.

emotion AI has “at its best no proven basis in science and at its worst is absolute pseudoscience.” Its application in the private sector, she said, is “deeply troubling.” https://t.co/M6tD7LMlSx

— Tracey Follows (@traceyfutures) July 5, 2022

In short, there is no strong evidence that facial expressions reliably reveal a person's feelings, and that uncertainty makes it difficult to justify decisions based on emotion AI.

Due to this concern, at least some companies are pulling back from developing or deploying emotion AI. Microsoft updated its Responsible AI Standard framework to help it build systems that are more beneficial and equitable. One outcome of its internal review of AI products and services under this framework was the retirement of emotion-inference capabilities within Azure Face. According to the company, the decision was made because of a lack of expert consensus on how to infer emotions from appearance, as well as privacy concerns. In doing so, the company is showing how to avoid potentially harmful impacts from the technology.

Nevertheless, the market for emotion AI is expected to grow at a compound annual growth rate (CAGR) of 12% over the next ten years, and capital is flowing into the field. Uniphore, for example, recently closed $400 million in series E funding at a $2.5 billion valuation.
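
To put that projection in perspective, here is a minimal Python sketch of what a 12% CAGR implies over ten years; the starting market size is hypothetical and serves only to illustrate the arithmetic.

```python
# Minimal sketch of compound annual growth: value * (1 + rate) ** years.
# The base value is hypothetical; only the growth multiple matters here.

def project(base_value: float, cagr: float, years: int) -> float:
    """Project a value forward assuming a constant compound annual growth rate."""
    return base_value * (1 + cagr) ** years

base = 1.0  # hypothetical starting market size, arbitrary units
future = project(base, cagr=0.12, years=10)
print(f"Growth multiple after 10 years at 12% CAGR: {future / base:.2f}x")
# Prints roughly 3.11x, i.e., a market growing 12% annually more than triples in a decade.
```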

Pandora’s box

Businesses have been using emotion AI technology in the workplace for some time, largely to improve productivity. According to an Insider article, employers in China use emotional monitoring to increase productivity and profits.

Businesses are not the only ones interested in this technology. The Institute of Artificial Intelligence at Hefei Comprehensive National Science Center in China created an AI program that reads facial expressions and brain waves to determine a subject's level of acceptance of political education. Test subjects were shown videos about the ruling party while the program assessed whether they needed more political education and gauged their loyalty. The score included the subject's "determination to be grateful to the party, listen to the party and follow the party."

Every wave of innovation creates winners and losers. In the case of emotion AI, many of its uses are questionable, the field rests on a shaky supposition, and its use remains largely unregulated around the world.

Even if it were possible to accurately read people's feelings, Neuroscience News asked, would we want our lives monitored this way? The question is ultimately one of privacy. There are positive use cases for emotion AI, but the technology presents a slippery slope that could lead to Orwellian thought police.

Gary is the senior VP of technology at the company.
