Voice bots have evolved to the point where simply recognizing and reproducing speech is no longer enough. They need not only basic intelligence but emotional intelligence as well. Modern services can read a user’s mood to offer more relevant services, assess their psychological state, and provide support.

The CTO of Brainy Solutions, Andrey Kirsanov, talks about which emotions can already be recognized with high accuracy and how a business can get the most out of the technology even at this early stage of its development.

EQ Innovation

Emotional intelligence has long been the missing link in machine learning systems. Algorithms already perform well at synthesizing speech and text, and they are getting better at recognizing context and sustaining dialogue, but they still find it difficult to show empathy.

That said, research shows that people prefer to interact with a bot that is capable of showing empathy and compassion. Many developers are trying to endow algorithms with a high level of EQ; an entire field, “affective computing”, has even been designated for this purpose.

It’s not just researchers who are interested in developing empathy in machines. The market for emotion recognition and analytics systems is already estimated at $21.6 billion, and by 2024 it is expected to more than double. Gartner experts believe that by 2022, 10% of all devices will have an “emotional diagnosis” function.

The technology is already being explored by tech giants. For example, Facebook plans to add emotion recognition to its Portal video calling device. Amazon recently introduced the Halo fitness bracelet, which analyzes the user’s voice in the background and determines their emotional state.

Affective computing is not limited to voice technologies: it also recognizes emotions from facial expressions, gestures, posture, gait, biometrics, text messages, and even aggregated data from social networks or situational content.

Scientists from the University of Maryland have created the ProxEmo algorithm, which analyzes a person’s gait in real time and then determines their mood. There are also services that determine mood from the subcutaneous blood flow of the face or from facial expressions with relative success.

This is potentially a huge market spanning dozens of areas, from education and healthcare to HR and gaming. Many companies whose speech recognition algorithms have already reached parity with humans are now switching to the development of EQ.

From polygraph to smart AI bot

The key challenge for the industry is the complexity of data processing. Emotions are highly subjective and often difficult to categorize. Training any algorithm is a long and painstaking process of labeling data and building relevant data sets, and in the case of emotions it becomes even more complicated.

As our practice has shown, even people do not always identify emotions accurately enough to label them correctly in a data set. We tried working with third-party services, but in the end we decided to build our own team of annotators who specialize in identifying emotions.

The annotators listen to dialogues, highlight passages with pronounced emotional coloring, label them, and add them to the data set.
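As a purely illustrative sketch of what one entry in such a data set might look like (the field names, label vocabulary, and validation rules below are assumptions for demonstration, not the company’s actual schema):

```python
from dataclasses import dataclass

VALID_LABELS = {"positive", "negative", "neutral"}

@dataclass
class LabeledUtterance:
    dialogue_id: str
    start_sec: float   # beginning of the emotionally colored passage
    end_sec: float     # end of the passage
    transcript: str
    label: str         # one of VALID_LABELS

def validate(entry: LabeledUtterance) -> None:
    # Reject malformed entries before they reach the training set:
    # a single bad label can degrade the whole model.
    if entry.label not in VALID_LABELS:
        raise ValueError(f"unknown label: {entry.label!r}")
    if entry.end_sec <= entry.start_sec:
        raise ValueError("passage must have positive duration")

entry = LabeledUtterance("call-0317", 12.4, 18.9,
                         "I've been waiting for two weeks!", "negative")
validate(entry)  # a well-formed entry passes silently
```

Enforcing checks like these at ingestion time is one way to keep a hand-labeled emotion data set consistent across many annotators.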

A single recognition mistake can degrade the accuracy of the entire algorithm, so this is demanding work that a person without experience cannot handle.

Many methods of determining mood from external parameters are far from perfect: a large meta-study has shown that emotions cannot be determined from facial expression alone. Often, such promises from companies border on pseudoscience, especially when it comes to assessing a person’s criminal inclinations from their face, checking trustworthiness by voice, or determining psychological stability from a few phrases. Meanwhile, it’s worth noting that polygraphs (commonly called “lie detectors”) are also an example of emotion recognition technology.

So far, the most scientifically grounded approach is to determine the valence of an emotion: positive, negative, or neutral. We are working on this task and have already achieved some success: our specialists have trained a neural network to recognize the three basic emotion classes from the interlocutor’s voice with 93-95% accuracy (most of the remaining 5-7% of errors are attributable to connection quality).
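The real system is a trained neural network, but the shape of the task can be illustrated with a toy sketch: extract simple acoustic features from a waveform and map them to one of the three classes. Everything below (the features, thresholds, and rule-based stand-in for the model) is invented for demonstration:

```python
import math

def frame_energy(samples, frame_len=400):
    """Mean energy per frame -- a crude proxy for vocal intensity."""
    frames = [samples[i:i + frame_len] for i in range(0, len(samples), frame_len)]
    return [sum(s * s for s in f) / len(f) for f in frames if f]

def classify_tone(samples):
    """Toy stand-in for a trained classifier: maps loudness and its
    variability to one of the three basic emotion classes."""
    energies = frame_energy(samples)
    mean_e = sum(energies) / len(energies)
    var_e = sum((e - mean_e) ** 2 for e in energies) / len(energies)
    if mean_e > 0.5 and var_e > 0.05:
        return "negative"   # loud and erratic delivery
    if mean_e > 0.2:
        return "positive"   # animated but steady
    return "neutral"        # quiet, flat delivery

# Synthetic "speech": a quiet, flat 220 Hz tone at 8 kHz sample rate
# should come out neutral under these toy rules.
quiet = [0.05 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(8000)]
print(classify_tone(quiet))  # → neutral
```

A production system would replace the hand-set thresholds with a model trained on labeled utterances like those described above.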

The module we developed works correctly with almost any speech (except whispering). This allows a client to be switched from one bot to another (or to a human operator) depending on their emotional state.

Emotions for business

Gartner predicts that by 2024 more than 50% of online advertising will be based on emotion recognition. The technology allows brands to be more responsive to customer requests. For example, a streaming platform can determine whether a viewer likes the content by observing their facial expressions; Disney already uses this technique.

In many ways, voice assistants, already used by many companies in FinTech, logistics, insurance, and e-commerce, have become a growth driver for emotion recognition technology. Virtual assistants are becoming an additional communication channel for these types of businesses. With their help, a company can give customized recommendations, broadcast ads, and increase user loyalty.

In the US alone, about 45% of residents use voice assistants for shopping. For example, Walmart cooperates with Google Assistant and Siri: with their help, one can add goods to the cart with a voice command and pay with a linked credit card.

The technology is also widely used in banking. While emotion recognition is most often used to collect feedback and analytics, in the future, algorithms will help banks factor clients’ emotions into credit decisions.

For example, in China the technology is already being used to assess the reliability of borrowers, and the FinTech conglomerate Ping An claims to have used it to cut loan losses by 60%.

Emotions for customers

Emotion recognition is most often used to process requests or alert customers. For example, a customer might call a transport company to find out about a delay in their shipment. They are angry, and the system detects negative emotions in their voice. A bot that does not yet know how to show empathy is likely to annoy the person even more. In this situation, it is important to switch the client to an operator who can find a solution to the problem. Using template dialogues and scripts in situations like this is dangerous, as the risk of losing the client is very real.
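The escalation logic described above can be sketched in a few lines. The destinations, confidence threshold, and function name here are illustrative assumptions, not a real system’s API:

```python
def route_call(emotion: str, confidence: float) -> str:
    """Decide where to send the caller based on the detected emotion.
    The 0.8 threshold and the destination names are hypothetical."""
    if emotion == "negative" and confidence >= 0.8:
        return "human_operator"   # clearly angry callers skip the script
    if emotion == "negative":
        return "empathic_bot"     # softer, non-templated dialogue flow
    return "standard_bot"         # calm callers stay on the normal script

print(route_call("negative", 0.93))  # → human_operator
```

The key design choice is that a confidently detected negative emotion bypasses scripted dialogue entirely, since a template response is exactly what risks losing the client.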

Techniques from emotional AI development can also be used to create chatbots. In such cases, emoji serve as emotion markers. For example, an angry or upset emoji will intensify a message, while a smiling emoji, on the contrary, will create a friendly atmosphere. An entire dialogue with the client can be built on icons, and the mechanism works in both directions: the model allows the bot to recognize emoji and apply them in the right context when communicating back.
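A minimal sketch of this two-way emoji mechanism might look as follows (the emoji-to-sentiment table and the majority-vote rule are simplifying assumptions for illustration):

```python
EMOJI_SENTIMENT = {
    "😀": "positive", "🙂": "positive",
    "😐": "neutral",
    "😠": "negative", "😢": "negative",
}

def message_sentiment(text: str) -> str:
    """Treat emoji as explicit emotion markers; default to neutral."""
    votes = [EMOJI_SENTIMENT[ch] for ch in text if ch in EMOJI_SENTIMENT]
    if not votes:
        return "neutral"
    # Majority vote across all emoji found in the message.
    return max(set(votes), key=votes.count)

def reply_emoji(sentiment: str) -> str:
    """The reverse direction: pick an emoji matching the bot's tone."""
    return {"positive": "🙂", "negative": "😢", "neutral": "😐"}[sentiment]

print(message_sentiment("My parcel is two weeks late 😠"))  # → negative
```

In practice, emoji signals would be combined with text sentiment analysis rather than used alone, since many messages carry strong emotion without any icons.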

How to get the most out of emotion recognition technology

  • Determine which business tasks an algorithm with developed EQ can help solve: reducing costs, improving the quality of service, or expanding coverage. Run pilot tests to validate the hypothesis. The technology can enrich existing products by, for example, making voice bots more emotional and empathic.
  • Collect and organize data to help extract valuable insights and clearly articulate a goal for developers and vendors. A business that doesn’t invest in analytics is unlikely to benefit from emotional AI. For example, a company may find that 20% of customers refuse future services if the manager caused them to experience negative emotions. Such insights help one ascertain where and how to apply emotion recognition to reduce customer churn.
  • Do your own custom development, but don’t give up on ready-made solutions. Hybrid services will help launch and improve products faster.
  • Analyze the market and review research on emotion recognition. Which products are speculating on the hype, and which ones are useful and really work? How do computer scientists respond to developments? Don’t try to implement emotional AI just because everyone does it.
  • Pay attention to ethical issues and customer privacy. Users are becoming increasingly wary of new technologies, so it is important not to abuse their trust.
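The kind of churn insight mentioned in the second recommendation can be computed from a simple interaction log. The log layout and figures below are invented for illustration:

```python
# Toy interaction log: (customer_id, detected_emotion, returned_later)
log = [
    ("c1", "negative", False), ("c2", "negative", False),
    ("c3", "negative", True),  ("c4", "positive", True),
    ("c5", "positive", True),  ("c6", "neutral",  True),
]

def churn_rate(records, emotion):
    """Share of customers with a given detected emotion who did not return."""
    subset = [r for r in records if r[1] == emotion]
    lost = sum(1 for r in subset if not r[2])
    return lost / len(subset) if subset else 0.0

print(f"{churn_rate(log, 'negative'):.0%}")  # → 67%
```

Comparing churn rates across emotion classes like this is one concrete way to decide where emotion recognition would pay for itself.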
Published On: October 12th, 2020 / Categories: Artificial Intelligence