
The Structure of Emotion I: The Basic Emotion Model

June 9, 2025

The word emotion is widely used in everyday language, yet it is surprisingly difficult to define precisely. We often describe someone as an “emotional person” or refer to an advertisement as “emotionally appealing,” but emotion itself is far more complex than a simple mood or personal preference. Affective engineering attempts to transform this vague concept into something that can be described, analyzed, and measured. Over time, emotion has moved from being primarily a topic of philosophy to becoming a subject of investigation in neuroscience, psychophysiology, and cognitive science. One of the earliest scientific attempts to understand the structure of emotion is the basic emotion model, which proposes that complex emotional experiences are built from a small set of fundamental emotional states. Understanding this model helps clarify how emotion can be categorized and eventually measured within scientific and technological systems.

Researchers have long tried to classify emotions despite their complexity. Emotional experiences are fleeting, subjective, and expressed differently across individuals. Nevertheless, psychologists and physiologists have consistently sought ways to organize emotional phenomena into structured categories. The motivation behind this effort is straightforward: emotions play a crucial role in shaping human behavior. Without understanding emotion, it becomes extremely difficult to explain decision-making, social interaction, and human motivation.

Early emotion research often began with language. Scholars examined how people across different cultures describe emotional experiences using words such as happiness, sadness, anger, or surprise. By comparing emotional vocabulary across languages, researchers began identifying patterns that suggested certain emotional categories might be shared universally. Language therefore served as one of the earliest tools for structuring emotional experience.
However, purely linguistic approaches eventually revealed their limitations. Words can describe emotional states, but they do not necessarily explain how those states arise or how they can be objectively measured. As psychology evolved into a more experimental science, researchers began searching for ways to link emotions to observable signals. Physiological responses, facial expressions, and neural activity became key candidates for capturing emotional states in measurable form.

This search for measurable emotional units led to the emergence of basic emotion theory. The central idea of the basic emotion model is that complex emotional experiences are derived from a small set of core emotional states. Just as countless colors can be created by combining a few primary colors, the wide spectrum of human emotions may arise from combinations of a limited number of fundamental emotional categories. This approach attempts to explain both the biological foundations and the apparent universality of emotional expression.

One of the most influential versions of this theory was proposed by psychologist Paul Ekman. Through cross-cultural studies of facial expressions, Ekman argued that certain emotions appear consistently across different societies and cultures. He identified six emotions that seem to be universally recognized: happiness, sadness, anger, surprise, fear, and disgust. These emotions were considered “basic” because they are associated with distinctive and recognizable facial expressions. Regardless of cultural background, people tend to interpret these expressions in remarkably similar ways. This observation suggested that certain emotional responses might be rooted in shared biological mechanisms.

Ekman’s work played a critical role in shaping modern emotion research, particularly in fields that rely on facial expression analysis. Computer vision systems designed to detect emotional states often begin with these six categories as foundational labels.
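Because categorical labels like these are the typical output of expression classifiers, the idea can be sketched in a few lines. The label ordering, the `predict_emotion` helper, and the example scores below are illustrative assumptions rather than any particular model's API:

```python
# Ekman's six basic emotions as classification labels. The ordering
# here is an arbitrary choice for illustration.
EKMAN_LABELS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def predict_emotion(scores):
    """Map a classifier's per-class scores to a basic-emotion label."""
    if len(scores) != len(EKMAN_LABELS):
        raise ValueError("expected one score per basic-emotion class")
    # Pick the label with the highest score (argmax).
    best = max(range(len(scores)), key=lambda i: scores[i])
    return EKMAN_LABELS[best]

# Hypothetical classifier output, e.g. softmax probabilities:
print(predict_emotion([0.70, 0.05, 0.05, 0.10, 0.05, 0.05]))  # happiness
```

In practice a trained network produces the score vector; the point here is only that the six categories act as fixed reference labels for whatever signal the system measures.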
Following Ekman’s research, other scholars proposed expanded or alternative models. Robert Plutchik introduced the well-known wheel of emotions, which organized emotional states into a circular structure that illustrates relationships such as opposites and intensity levels. Carroll Izard proposed a model including twelve basic emotions, arguing that a broader set of emotional categories might better capture the diversity of human experience.

Despite these variations, debate continues regarding what should truly count as a “basic” emotion. Some critics argue that reducing emotional experience to a fixed list oversimplifies the complexity of human affect. Emotional experiences are influenced by culture, context, memory, and interpretation, making it difficult to fully capture them within a single universal classification.

Even with these criticisms, the basic emotion model remains highly influential, particularly in applied fields such as affective computing and affective engineering. One of the key strengths of the model lies in its ability to translate emotional experience into measurable units. When emotions are defined as identifiable categories, they can be linked to physiological signals, facial expressions, vocal characteristics, or neural activity. This connection allows researchers to detect and analyze emotional states using biosensors, computer vision, and machine learning systems.

In domains such as user experience research and emotion-aware AI systems, this measurability becomes extremely valuable. Designers and engineers need practical frameworks that allow emotional responses to be quantified and compared. Basic emotion categories provide a relatively simple structure for building such systems. The model is also particularly useful in the context of human–machine interaction. As artificial intelligence, robots, and digital interfaces become more interactive, machines increasingly need the ability to interpret and respond to human emotional signals.
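The structural relationships Plutchik's wheel encodes, polar opposites among the eight primary emotions and intensity gradations along each spoke, can be hinted at with a small data sketch. The opposite pairs and the anger spoke follow Plutchik's published wheel; the dictionary encoding itself is purely illustrative:

```python
# Plutchik's four opposite pairs among the eight primary emotions.
OPPOSITES = {
    "joy": "sadness",
    "trust": "disgust",
    "fear": "anger",
    "surprise": "anticipation",
}
# Make the mapping symmetric so lookups work in both directions.
OPPOSITES.update({v: k for k, v in list(OPPOSITES.items())})

# One spoke of the wheel, ordered from mild to intense.
ANGER_SPOKE = ["annoyance", "anger", "rage"]

print(OPPOSITES["fear"])  # anger
print(ANGER_SPOKE[-1])    # rage
```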
Basic emotional categories serve as one of the earliest “languages” that machines can use to interpret human affective states. For example, machine learning systems trained to detect facial expressions often classify images according to Ekman’s emotional categories. Similarly, emotional datasets used in AI research frequently label data according to basic emotion classes such as happiness, anger, or fear. In this sense, the basic emotion model provides a foundational framework for constructing emotion databases and training emotion recognition algorithms.

It offers a standardized way to map facial expressions, voice patterns, and physiological signals onto emotional categories. While the model does not capture the full richness of emotional experience, its simplicity and clarity make it highly practical for technological applications. Machines benefit from clearly defined categories, and measurement systems require stable reference points.

For affective engineering, the basic emotion model therefore functions as a first map of emotional structure. It offers a way to translate the abstract world of emotion into something that can be measured, modeled, and integrated into technological systems. At the same time, researchers recognize that emotional experience cannot be fully reduced to a small set of categories. Human emotions are dynamic, continuous, and deeply influenced by context. For this reason, the study of emotional structure does not end with basic emotions. Another major approach, often called the dimensional model of emotion, attempts to describe emotional experience along continuous axes such as valence and arousal.
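As a bridge to that dimensional view, each basic emotion can be thought of as occupying a point in valence–arousal space. A minimal sketch, in which the coordinate values are illustrative guesses rather than empirical measurements:

```python
# Rough placement of the six basic emotions in valence-arousal space.
# Valence: unpleasant (-1) to pleasant (+1); arousal: calm (-1) to
# activated (+1). All coordinates are illustrative assumptions.
VALENCE_AROUSAL = {
    "happiness": (0.8, 0.5),
    "sadness": (-0.7, -0.4),
    "anger": (-0.6, 0.7),
    "surprise": (0.2, 0.8),
    "fear": (-0.8, 0.8),
    "disgust": (-0.7, 0.2),
}

def is_pleasant(emotion):
    """Positive valence marks an emotion as broadly pleasant."""
    valence, _ = VALENCE_AROUSAL[emotion]
    return valence > 0

print(is_pleasant("happiness"))  # True
print(is_pleasant("fear"))       # False
```

The contrast with the categorical sketches above is the point: here emotion is a continuous coordinate rather than a discrete label, which is the subject the next article takes up.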
