AI reads emotions via speech, facial expressions

Published 10.04.2019 00:12
AI experts train algorithms to understand people’s emotions such as feelings of happiness, boredom and sadness.

Neural networks generate results by combining traditional computer programming techniques with data. The fact that these techniques now interpret faces, behavior and human speech accurately shows how far neural networks have come.

Artificial intelligence (AI) experts can train algorithms to understand people's emotions and use these emotion analyses to determine shopping tendencies, see whether someone is depressed or even prevent murders.

Scientists at the University of Science and Technology of China studied how people use speech and facial muscles to show their emotions. Detailing their findings in an article, they said, "Automatic emotion recognition is a difficult task because of the abstract nature of emotions and the multiple ways of expressing them. However, inspired by this cognitive process in humans, visual and auditory information can be used together naturally in an end-to-end neural network."

Depression spotted

In the emotion analysis, the scientists examined 653 video and audio recordings from the AFEW8.0 database. The algorithms identified happiness, boredom, irritability, sadness and astonishment with a success rate of 62.48 percent. They were more successful at identifying anger, happiness and neutral feelings, which have more characteristic expressions.

However, they had difficulty with subtler expressions such as boredom and astonishment.
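The fusion of facial and speech cues described above can be sketched as a late-fusion classifier: embeddings from each modality are concatenated and scored against a set of emotion classes. The feature sizes, class list and random weights below are illustrative assumptions, not the researchers' actual architecture.

```python
import numpy as np

# Hypothetical late-fusion sketch: combine a face embedding and a speech
# embedding, then score each emotion class with a linear softmax layer.
EMOTIONS = ["anger", "happiness", "neutral", "sadness",
            "astonishment", "boredom", "irritability"]

def fuse_and_classify(face_feats, speech_feats, weights, bias):
    """Concatenate the two modality features and apply a softmax layer."""
    fused = np.concatenate([face_feats, speech_feats])
    logits = weights @ fused + bias
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    probs = exp / exp.sum()
    return dict(zip(EMOTIONS, probs))

# Toy example with fixed random weights (stand-ins for trained parameters)
rng = np.random.default_rng(0)
face = rng.standard_normal(128)       # e.g. a CNN face embedding
speech = rng.standard_normal(64)      # e.g. a spectrogram embedding
W = rng.standard_normal((len(EMOTIONS), 128 + 64))
b = np.zeros(len(EMOTIONS))

scores = fuse_and_classify(face, speech, W, b)
predicted = max(scores, key=scores.get)
```

In a trained system, the softmax probabilities would be compared against labeled clips to compute accuracy figures like the 62.48 percent reported in the study.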

Last year, Tuka Alhanai, Mohammad Ghassemi and James Glass, three researchers from the Massachusetts Institute of Technology (MIT), developed a neural network method capable of detecting depression in clinical interviews, tracing it in subjects' voices and written responses.

The researchers put questions to 142 people through a virtual assistant controlled by a human being. Some of the answers were spoken, while others were written. The virtual assistant was not given the questions in advance, and the people were free to answer as they wanted. The AI then tried to predict whether people were depressed by reading hidden clues in the language they used. According to the results of the study, the AI could determine whether a person was depressed after seven written or 30 spoken answers. The accuracy rate of these deductions was announced as 77 percent.
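The idea of reading "hidden clues in the language" can be illustrated with a deliberately simple bag-of-words scorer. The MIT model was a neural network over sequences of answers; the cue lexicon, weights and threshold here are invented purely for the demonstration.

```python
from collections import Counter

# Illustrative sketch only: score a subject's written answers against a
# tiny, made-up lexicon of depressive and positive language cues.
CUE_WEIGHTS = {"sad": 2.0, "tired": 1.5, "alone": 2.0, "hopeless": 3.0,
               "happy": -2.0, "great": -1.5}

def depression_score(answers):
    """Sum cue weights over all tokens in a subject's written answers."""
    tokens = Counter(w.strip(".,!?").lower()
                     for answer in answers for w in answer.split())
    return sum(weight * tokens[word] for word, weight in CUE_WEIGHTS.items())

def flag_subject(answers, threshold=3.0):
    """Flag the subject if their cue score crosses an arbitrary threshold."""
    return depression_score(answers) >= threshold

answers = ["I feel so tired and alone lately.",
           "Nothing seems to matter, it all feels hopeless."]
flagged = flag_subject(answers)
```

A real system replaces the hand-written lexicon with learned representations, which is what lets it pick up cues that are not obvious keywords.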

Tengai, the job interview robot

Tengai is a job interview robot developed by Furhat Robotics, an AI and social robotics company born out of a research project at Stockholm's KTH Royal Institute of Technology. It gives human resources a new dimension by adapting advanced technology to recruitment.

It is a social robot measuring 41 centimeters in height and weighing 3.5 kilograms. Furhat Robotics has spent four years trying to humanize the robot, including designing a humanlike face for it.

Tengai speaks with candidates identified by human resources and asks them questions. Furhat Robotics, which collaborates with Sweden's biggest recruitment company TNG, has begun using Tengai in the employment procedure in order to eliminate biases managers may have.

TNG will "recruit" Tengai full time next month, once the test process, which has been ongoing for several months, is concluded.

Elin Öberg Martenzon, the chief innovation expert at TNG's Stockholm office, said: "The first impression of a person forms in about seven seconds, and it takes five to 15 minutes to make a decision after meeting someone. We want to prevent our recruiters from being influenced by sex, ethnicity, voice, education, appearance and informal conversations before or after the interview, and from making decisions based on them."

"For example, when a recruiter asks an interviewee whether they like playing golf, a positive answer will make a positive impression on this person," Martenzon added. However, Tengai eliminates the possibility of a biased choice by a recruiter and asks questions with a neutral tone.

The recruitment manager then receives the answers in writing and evaluates them. To prevent Tengai, which was programmed by people, from inheriting human biases, it was tested with many different candidates. Tengai is scheduled to start conducting job interviews in English in 2020.

People involved in AI earn most

According to research carried out in 2018 by Indeed, the largest job advertisement site in the U.S., the most popular profession on the rise in the country was machine learning engineering.

Machine learning engineers took first place on the list of the 25 most in-demand jobs with the highest salaries in the country. Demand for machine learning engineers increased by 344 percent compared to the previous year, and the average annual salary of those in the profession was $146,185.

Looking at the overall list, nine of the 25 professions are in technology or related to it. Demand for software developers continues to rise, as does demand for machine learning engineers.

Computer vision engineers, meanwhile, were among the top earners, with an average annual salary of $158,303; with demand up 116 percent, the profession ranked 13th on the list.

Computer vision is a subfield of AI built on neural networks and used in many different sectors, from medicine to industrial production and from the military to autonomous vehicles. It covers tasks such as identifying images, detecting objects in an image, repairing images and computing a three-dimensional model from multiple images.
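The building block behind most of these vision tasks is the convolution: sliding a small filter across an image to detect local patterns. The toy image and edge-detecting kernel below are illustrative, and the loop computes the unflipped "convolution" (strictly, cross-correlation) used in deep learning.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid sliding-window filter (no padding, stride 1, no kernel flip)."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: dark left half, bright right half
image = np.zeros((5, 6))
image[:, 3:] = 1.0

# Sobel-style vertical-edge kernel
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-2.0, 0.0, 2.0],
                   [-1.0, 0.0, 1.0]])

edges = conv2d(image, kernel)
# The response is largest where brightness jumps from 0 to 1
```

Neural networks for image identification or object detection stack many such learned filters, rather than hand-designed ones like the Sobel kernel here.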
