Jingyu Lee

How to Visualize Emotional Expressions of Artificial Intelligence

Studium: Master Visuelle Kommunikation
Jahr: 2020
Mentor*innen: Mischa Leiner, Invar Torre Hollaus, Ted Davis

With the ongoing development of Artificial Intelligence technology, AI services are a major influence on people’s everyday lives. As AI technology has become a convenient part of daily life and people have shifted their focus from the physical world to the digital world, there is an increasing demand for AI services that can produce emotional interaction. As a result, emotional AI systems and emotion computation have become a central topic of IT-industry research. Existing AI assistant services mostly focus on improving practical functions based on useful information and data, but lack personalized emotional services for users.

The key aspect of this thesis project is to develop a simulation of a natural conversation between a human and an artificial-intelligence service. Through this research and the accompanying practical experiments, the author tried to find out how a visual interface could improve the interaction between a human and an AI.

The research begins by exploring the process of communication in general human interaction. The study is based on a series of images that investigate the basic characteristics of human facial expressions, drawing on Paul Ekman’s theory as explained in Universal Facial Expressions of Emotion. Psychological research has classified six facial expressions that correspond to distinct universal emotions: disgust, sadness, happiness, fear, anger, and surprise. In addition, Robert Plutchik’s color theory describes basic human emotions in detail, while a form study conducted at the Pennsylvania State University shows that graphic-shape features are related to emotions; these features were used to create the graphic images in the practical experiment.
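As a rough illustration of how such theory could inform a design system, the associations between emotions and visual attributes can be sketched as a simple lookup table. The hex colors below are loosely inspired by Plutchik’s emotion wheel, and the shape and motion labels by the kind of shape–emotion associations the form study describes; all concrete values are illustrative assumptions, not the palette actually developed in the thesis.

```python
# Illustrative sketch only: maps Ekman's six universal emotions to
# candidate visual attributes. Colors are loosely Plutchik-inspired;
# all values are assumed for demonstration, not taken from the thesis.
EMOTION_VISUALS = {
    "happiness": {"color": "#F5D547", "shape": "round",    "motion": "bouncy"},
    "sadness":   {"color": "#4A6FA5", "shape": "drooping", "motion": "slow"},
    "anger":     {"color": "#C0392B", "shape": "angular",  "motion": "sharp"},
    "fear":      {"color": "#4F7A5B", "shape": "jagged",   "motion": "trembling"},
    "surprise":  {"color": "#7FD1E0", "shape": "burst",    "motion": "sudden"},
    "disgust":   {"color": "#7D5BA6", "shape": "warped",   "motion": "recoiling"},
}

def visual_for(emotion: str) -> dict:
    """Return the color/shape/motion attributes for a recognized emotion."""
    try:
        return EMOTION_VISUALS[emotion.lower()]
    except KeyError:
        raise ValueError(f"unknown emotion: {emotion!r}") from None
```

A real system would drive an animated interface from such a table, for example tinting and reshaping an abstract graphic in real time as the recognized emotion changes.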

Many digital services by globally leading IT companies show that digital products should have a clean and simple interface for the sake of readability and usability. To achieve this goal, simplified graphic images were developed in the practical experiment. The experiment used various graphic elements, from basic lines and dots to complex solid shapes. The study and experiment conclude that simplified graphic images containing emotional information can serve as the main graphic element of the emotional interaction between users and an AI assistant service.

The results of the practical experiments are intended to improve the AI’s emotional expressions in an AI-enabled emotion-communication system, as well as the accuracy of emotion recognition. The practical experiments produced a visual language comprising graphic shapes, a color system, movement, and more. The graphic images in the visual experiments showed that visual emotional expressions can be applied to an AI service and trigger a real-time emotional reaction. They can also be applied to AI-enabled emotional communication offered by an emotional social AI-assistant service to provide users with personalized emotions. For the final output, a mockup scenario based on a specific situation was used to simulate the AI’s voice-recognition service. The prototype was compared with existing AI services as evidence of how the thesis project can improve emotional communication between a human and an AI assistant service.

I am convinced that the contribution of this thesis project will take us closer to understanding emotions in graphic images. A color system, graphic forms, and movements can all be used to create emotional graphic images. The process of developing an AI assistant service able to have an emotional conversation with a human could then build on the study and experiments done to date. With the ongoing technological development of AI, coexistence between humans and AIs has become a necessity. I hope this research will help humans and AIs coexist in the future.