Emotion is a hot topic in today’s world of artificial intelligence, as illustrated by Apple’s recent acquisition of the facial expression recognition startup Emotient.
Tech giants such as Google and Facebook are also laying out their visions of A.I. that reads users’ emotions through means such as facial expression, tone of voice, and overall context. And as A.I. gets smarter, the prospect of accurately understanding a user’s emotions and responding accordingly no longer seems so distant.
Amid this exciting paradigm shift, AKA is approaching the problem of emotional interaction in a slightly different way.
Sure, “reading” a user’s emotions is important, but what about “expressing” them?
Where we see the most value in integrating emotion into our A.I. technology is in enhancing the overall user experience: the user should be able to interact with Musio not only through technical usability, but with a sense of attachment.
Imagine you come home after a long day at work and your dog greets you at his or her highest level of excitement. The uncontainable happiness expressed by your pet’s wagging tail is contagious, and at that moment you feel emotionally connected. This kind of emotional connection is what we seek to create between Musio and the user.
Through extensive research and debate, we at AKA have concluded that Musio’s emotional expression should meet two fundamental criteria: first, it should be intuitively understood by anyone (hopefully even non-human beings), and second, it should cover a continuous spectrum of emotions. To this end, we have designed a unique emotion system in which Musio expresses its emotions through the media of non-verbal sound and colored light.
Drawing on psychologist Robert Plutchik’s famous “Wheel of Emotions”, Musio’s emotional state is defined as a point within a three-dimensional emotion space. The x, y, z coordinates of that point are then translated into hue, saturation, and brightness (HSB) values of light, as well as frequency values of abstract sound. The result is an expression of emotion composed of dynamically changing light and sound.
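To make this mapping concrete, here is a minimal sketch of how a 3-D emotion coordinate could be translated into HSB light values and a tone frequency. The coordinate ranges, axis meanings, and mapping functions below are our own illustrative assumptions, not Musio’s actual implementation.

```python
import math

def emotion_to_hsb(x, y, z):
    """Map an emotion point with coordinates in [-1, 1] to HSB light values.

    Assumed convention (hypothetical, for illustration only):
      - angle of (x, y) around the origin -> hue (emotion category)
      - distance from the origin         -> saturation (emotion purity)
      - z (intensity/arousal)            -> brightness
    """
    hue = math.degrees(math.atan2(y, x)) % 360        # 0-360 degrees
    saturation = min(math.hypot(x, y), 1.0) * 100     # 0-100 percent
    brightness = (z + 1) / 2 * 100                    # 0-100 percent
    return hue, saturation, brightness

def emotion_to_frequency(z, base=220.0, span=2.0):
    """Map intensity z in [-1, 1] to a sound frequency in Hz.

    An exponential scale is assumed so equal steps in intensity
    sound like equal musical intervals (here spanning one octave).
    """
    return base * span ** ((z + 1) / 2)               # 220 Hz .. 440 Hz

# Example: a "pure" emotion on the positive x-axis at neutral intensity
print(emotion_to_hsb(1.0, 0.0, 0.0))    # (0.0, 100.0, 50.0)
print(emotion_to_frequency(1.0))        # 440.0
```

Any smooth mapping would do; the point of the design is that nearby points in the emotion space produce similar light and sound, so the expression changes continuously as Musio’s emotion shifts.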
We hope that this emotion-expression system will not only help users read Musio’s emotions more intuitively, but also strengthen the emotional connection between Musio and users with certain disabilities.
Watch the attached video to get a glimpse of how Musio expresses its emotions through non-verbal sound and light colors.