Researchers have developed a technology that detects emotions in human speech, enabling more natural conversations with robots.
According to a recent report, a team of researchers from RMIT University's School of Engineering has discovered how to give machines emotional capabilities, making communication with them more natural and more socially acceptable.
The voice-activated technology currently used in virtual assistants is limited by its inability to decipher human emotions. As a result, these assistants tend to give irrelevant responses or miss the point of a conversation entirely.
The Australian university's Associate Professor Margaret Lech, who is leading the research, explained that when humans talk to each other there is always an emotional context, and both parties understand it. Machines do not.
Calling an automated call centre, for instance, frustrates people: they talk to the machine, and it cannot tell that they are sad, or that they are anxious and want things done quickly. Machines do not understand the emotion associated with the task at hand. In many of the recordings the team has heard, people say, "I want to talk to a person, not a machine".
There is no way to explain certain things to a machine, including those subtle cues that can be expressed through emotions when people talk to each other.
The team of researchers spent eleven years creating new machine learning techniques that allow technology to understand human emotions from speech signals.
The technology also analyses and predicts patterns of emotional interaction in human conversations.
With these capabilities, voice-activated devices can now understand both the linguistic and the emotional content of speech.
They can recognise seven emotional states: anger, boredom, disgust, fear, happiness, sadness and neutral, and can therefore provide appropriate responses.
The challenge in making machines read human emotions lay in measuring the unspoken cues in the voice, such as subtle changes in tone, volume and speed.
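To illustrate what measuring such cues might involve, here is a minimal Python sketch, not the RMIT team's actual method: it extracts two crude prosodic features from a waveform, frame energy as a volume proxy and zero-crossing rate as a rough tone proxy. A real system would feed far richer features into a model trained on labelled emotional speech; the function names and frame sizes below are illustrative assumptions.

```python
import numpy as np

# The seven emotional states named in the article.
EMOTIONS = ["anger", "boredom", "disgust", "fear",
            "happiness", "sadness", "neutral"]

def prosodic_features(signal: np.ndarray, sample_rate: int) -> np.ndarray:
    """Extract coarse prosodic cues from a mono waveform:
    per-frame energy (a volume proxy), zero-crossing rate (a crude
    tone/pitch proxy), and the spread of each over time."""
    frame_len = int(0.025 * sample_rate)  # 25 ms analysis frames (assumed)
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, frame_len)]
    energy = np.array([np.mean(f ** 2) for f in frames])
    # Each zero crossing flips the sign, so |diff(sign)| / 2 counts crossings.
    zcr = np.array([np.mean(np.abs(np.diff(np.sign(f)))) / 2 for f in frames])
    # Mean and spread of each cue form a tiny feature vector that a
    # trained classifier could map onto one of the seven emotions.
    return np.array([energy.mean(), energy.std(), zcr.mean(), zcr.std()])

# Demo on a synthetic one-second 220 Hz tone standing in for speech.
sr = 16000
t = np.arange(sr) / sr
features = prosodic_features(0.5 * np.sin(2 * np.pi * 220 * t), sr)
print(features.shape)  # (4,)
```

In practice, variation in these cues over the course of an utterance is often as informative as their averages, which is why the sketch keeps the standard deviations alongside the means.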
Emotion recognition will unlock many more applications and wider benefits from voice-activated technology.
People will be more accepting of machines, trust them more, and feel that the machine truly understands them and can help them better.
In addition, people, particularly the elderly, will not be so reluctant to use automatic call centres.
It will also open new roles for machines, such as companion robots.
An older person may actually enjoy talking to a machine that can laugh with them, sympathise and understand their feelings.
The technology could also benefit children's toys: robotic toys that talk with emotion would help children learn more about emotions as they interact with them.
Lifelike artificial intelligence may have once been a futuristic dream. But with developments like these, making human-machine interactions simpler and smoother, the future has seemingly arrived.