Research · Emotion Engineering
The Origin of Emotion: Does the Feeling Come First, or the Body’s Reaction?
Jun 02, 2025

When we experience emotions, we usually assume that the feeling comes first. We believe that we feel angry, joyful, or afraid, and then our body reacts accordingly. However, affective engineering and emotion science often view this process differently. Emotions are not simply psychological events that arise in the mind. Instead, they emerge from the interaction between physiological responses and cognitive interpretation.
Understanding where emotions originate is therefore one of the fundamental questions in affective science. Do we feel emotions because our body reacts? Or does the brain generate emotions and then trigger bodily changes? Over the past century, several influential theories have attempted to answer this question, each offering a different perspective on how emotional experience is formed.
One of the earliest explanations comes from the James–Lange theory of emotion, proposed in the late nineteenth century by psychologist William James and physiologist Carl Lange. Their argument was radical for its time: emotions are not the cause of bodily reactions but the result of perceiving those reactions. According to this view, when an external stimulus appears, the body first undergoes physiological changes—such as increased heart rate, muscle tension, or changes in breathing. The brain then interprets these bodily signals, and the subjective feeling we call an emotion emerges from that interpretation.
This idea is often summarized in a famous phrase: we are not crying because we are sad; we are sad because we are crying. In other words, emotional experience begins with the body. The perception of physiological changes—such as a racing heart, rapid breathing, or tightened muscles—forms the basis of emotional awareness.
For researchers in affective engineering, this perspective carries important implications. If emotions arise from physiological signals, then emotional states may be measurable through bodily data. Signals such as heart rate variability, electrodermal activity, electromyography, or brain activity can potentially serve as indicators of emotional states. Modern emotion-sensing technologies, including wearable biosensors and computer vision systems, are built upon this fundamental assumption that emotional experience leaves measurable traces in the body.
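As a concrete illustration of measuring emotional state through bodily data, the sketch below computes one widely used heart-rate-variability feature, RMSSD (root mean square of successive differences), from RR intervals (the times between consecutive heartbeats). The sample recordings and the contrast they show are hypothetical, chosen only to illustrate the kind of trace the James–Lange view predicts.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical 10-beat recordings: a calm state tends to show more
# beat-to-beat variability than a highly aroused (stressed) state.
calm_rr = [812, 795, 830, 801, 825, 790, 818, 806, 822, 798]
aroused_rr = [612, 608, 615, 610, 607, 613, 609, 611, 606, 612]

print(f"calm RMSSD:    {rmssd(calm_rr):.1f} ms")
print(f"aroused RMSSD: {rmssd(aroused_rr):.1f} ms")
```

In practice such features feed downstream emotion classifiers rather than being read directly as emotions, which is exactly where the theoretical debates below become relevant.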
However, the James–Lange theory was not without criticism. Physiologists Walter Cannon and Philip Bard argued that the relationship between emotion and physiological response could not be explained so simply. They pointed out that many different emotions share similar physiological patterns. If emotions were merely the perception of bodily changes, how could the same physical response lead to distinct emotional experiences?
Their answer became known as the Cannon–Bard theory of emotion. Instead of a sequential process where the body reacts first and emotion follows, they proposed that emotional experience and physiological reactions occur simultaneously. According to this view, when a stimulus is perceived, the brain processes the information and triggers both the subjective feeling and the bodily response at the same time.
This theory emphasized the role of central brain structures, particularly the hypothalamus and the cerebral cortex. When a threatening or emotionally significant stimulus appears, these neural systems coordinate both the conscious emotional experience and the physiological response. In this framework, the feeling of fear and the racing of the heart are not cause and effect but parallel outcomes generated by the brain.
The Cannon–Bard perspective shifted the focus of emotion research toward neural mechanisms and brain-centered explanations. Emotions were no longer interpreted solely as reactions to bodily signals but as processes emerging from central neural control systems. This idea helped expand emotion research beyond physiological measurement and toward understanding how the brain constructs emotional experience.
Later developments in psychology introduced a more integrative explanation. In the early 1960s, psychologists Stanley Schachter and Jerome Singer proposed what is now known as the two-factor theory of emotion. Their argument combined elements of both previous theories while adding an important new component: cognitive interpretation.
According to this theory, emotion arises from two interacting factors—physiological arousal and cognitive labeling. A bodily state of arousal alone does not determine a specific emotion. Instead, individuals interpret that arousal based on contextual information and situational cues. The same physiological state may therefore lead to very different emotional experiences depending on how the situation is understood.
Schachter and Singer demonstrated this idea through experiments in which participants were given injections of adrenaline (epinephrine) that produced physiological arousal. Participants who were placed in different social environments interpreted the same bodily state in completely different ways. Some described feelings of excitement and amusement, while others reported anger or irritation. The physiological signal itself did not determine the emotion; rather, the interpretation of the situation shaped the emotional experience.
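The two-factor logic can be caricatured in a few lines of code: the same arousal level maps to different emotion labels depending on the situational interpretation. The contexts, thresholds, and labels below are invented for illustration; real cognitive appraisal is of course far richer than a lookup table.

```python
def label_emotion(arousal, context):
    """Toy two-factor model: arousal (0..1) plus a situational interpretation."""
    if arousal < 0.3:
        return "calm"
    # High arousal alone is ambiguous; context disambiguates it.
    return {
        "playful party": "excitement",
        "hostile argument": "anger",
        "dark alley": "fear",
    }.get(context, "unlabeled arousal")

same_arousal = 0.8  # identical bodily state in every case
for ctx in ["playful party", "hostile argument", "dark alley"]:
    print(f"{ctx}: {label_emotion(same_arousal, ctx)}")
```

The point of the toy is the shape of the function: emotion is a function of two arguments, not one, which is precisely what a purely physiological sensor cannot capture.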
This perspective introduced a critical insight for modern affective science. Emotional experience cannot be understood purely through physiological data or purely through cognitive interpretation. Instead, it emerges from the interaction between bodily signals, contextual information, and cognitive meaning-making.
For affective engineering, this insight has profound implications. While biosignals such as heart rate, skin conductance, and neural activity provide valuable information about emotional arousal, they do not fully explain emotional experience on their own. Emotional interpretation depends on context, personal history, expectations, and situational meaning.
As a result, contemporary emotion research increasingly adopts multimodal approaches that integrate physiological data with behavioral signals, environmental context, and cognitive factors. Facial expressions, voice patterns, biosignals, and interaction logs are analyzed together to construct a richer understanding of emotional states.
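One simple form such integration can take is late fusion: each modality produces its own score, and the scores are combined with a weighted average. The modality names, scores, and weights below are hypothetical placeholders, not a validated model.

```python
def fuse(scores, weights):
    """Late fusion: weighted average of per-modality arousal scores (0..1)."""
    total_w = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_w

# Hypothetical per-modality arousal estimates for one moment in time.
scores = {"face": 0.7, "voice": 0.6, "eda": 0.9}    # eda = skin conductance
weights = {"face": 0.3, "voice": 0.2, "eda": 0.5}   # trust the biosignal most

print(f"fused arousal estimate: {fuse(scores, weights):.2f}")
```

Weighted late fusion is only one design choice; early fusion (concatenating raw features) and learned fusion (training a model over all channels jointly) are common alternatives.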
The long-standing debate about whether emotions begin in the body or in the brain remains unresolved. Yet modern research suggests that this question may be too simple. Emotions are neither purely bodily reactions nor purely mental constructs. They are complex processes that emerge from the dynamic interaction between physiology, cognition, and context.
From the perspective of affective engineering, emotion can therefore be understood as a structured form of information—an integrated signal generated by the body, interpreted by the brain, and shaped by the surrounding environment.
Understanding this structure is essential for building technologies that can sense, interpret, and respond to human emotional experience.
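One minimal way to make "emotion as structured information" concrete is a record that keeps the three ingredients discussed above side by side: bodily signals, situational context, and the interpreted label. The field names and values here are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class EmotionEstimate:
    physiology: dict      # measured bodily signals, e.g. heart rate, EDA
    context: str          # situational cue used for interpretation
    label: str            # cognitively interpreted emotion
    confidence: float     # how certain the interpretation is (0..1)

estimate = EmotionEstimate(
    physiology={"heart_rate": 96, "eda": 0.62},
    context="public speaking",
    label="anxiety",
    confidence=0.7,
)
print(estimate.label, estimate.confidence)
```

Keeping the raw physiology and the interpreted label as separate fields mirrors the theoretical lesson of this article: the signal and its meaning are related but not identical.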