
EMOTIONAL SUSTAINABILITY IN THE EMPLOYEE EXPERIENCE
Master Thesis | 2024
University of Europe for Applied Sciences

Why I chose this topic
I've always been more interested in people than in products. So when it came time to write my master's thesis, I didn't want to research something abstract. I wanted to understand something that affects real people in real ways every day.

Work is where most of us spend the majority of our waking lives. Yet the emotional reality of that experience, how people actually feel, what drains them, what makes them feel seen, is largely invisible in most organizations. That gap bothered me.

At the same time, Emotion AI was emerging as a technology that claimed to detect and respond to human emotions. I wanted to understand what that actually meant, whether it could help, and what could go wrong.
The question I started with
-How can artificial intelligence be used to enhance workplace wellbeing and foster emotional sustainability among employees?
But as I researched, three smaller questions kept pulling at me:
-How can organizations use AI to build healthier relationships with employees?
-How can AI help HR departments actually improve the employee experience?
-What does responsible use of this technology even look like?
How I approached it

I treated this as a systems problem from the beginning. Workplace wellbeing isn't just an HR issue. It sits at the intersection of three things that don't often talk to each other well.
Employee Experience: how people actually feel at work and what shapes that feeling.
Emotional Sustainability: what it means to stay emotionally healthy inside an organization over time.
Emotion AI: what the technology can detect, what it can respond to, and at what human cost.
I spent a lot of time in the literature, covering organizational psychology, HR strategy, affective computing, and AI ethics. But the most interesting moments came from two expert interviews I conducted. One with L. Holz, an HR specialist. One with B. Demir Allan, a cognitive scientist.
What the experts taught me
L. Holz surprised me. He was honest that he doesn't currently use AI in his HR work. But he wasn't dismissive. He talked about something specific that stuck with me: the idea that Emotion AI could be genuinely useful for cross-cultural communication in the workplace. Not as surveillance, but as a guide. Like Grammarly, but for emotional tone. He also flagged something important: in Germany especially, data privacy and consent aren't just ethical questions. They're legal ones.
One of the biggest barriers to emotional wellbeing in the workplace is that emotions are still considered unscientific: too personal and too subjective to take seriously. Organizations can dismiss how people feel because feelings are invisible.
B. Demir Allan's research challenged that directly. She showed that emotions leave real, measurable traces in voice patterns, physiological signals and physical responses. And once you have objective data, dismissal becomes much harder.
"Nonverbal feedback you get from people is usually the most accurate one. When it comes to employee wellbeing, employees are usually scared to raise their voices about being burnt out. People don't talk about it. But an AI system would be very quick to identify this."
— B. Demir Allan, Cognitive Scientist
Data doesn't just detect emotion. It makes emotion impossible to ignore.
B. Demir Allan gave me one of the most memorable insights of the whole research process. She said that when she studied immigrants and refugees and asked how they were feeling, people said they were fine. But when she looked at voice measurement data, they weren't. Her point was simple and powerful: nonverbal signals are often more honest than what people say out loud. That's exactly why Emotion AI has potential, and exactly why it needs to be handled with enormous care.
What I found
Five things became clear through the research:
AI is here to stay. The question is how we use it. Companies don't get to decide whether Emotion AI enters the workplace. It already has. The real question is whether it arrives with transparency and consent, or quietly, without people knowing.
Emotions are not a weakness. This sounds obvious, but it isn't. Most organizations still treat emotional expression as unprofessional. The research shows the opposite. Emotional awareness drives creativity, reduces burnout, and builds commitment. Ignoring emotions doesn't make them disappear. It just makes them invisible and harder to act on.
Mental health has to stay front and center. Burnout, anxiety, disconnection. These aren't personal failures. They are systemic problems that live inside organizational structures, communication patterns, and the environments companies design for their people. Wellness apps don't fix structural problems.
Employees come first, and that has to mean something. One line from the research stuck with me: employees are not productivity units. A positive employee experience builds genuine commitment. Not compliance. Not performance metrics. Real human investment in the work.
Regulation is urgent and incomplete. Emotion AI collects deeply personal data, including facial expressions, voice patterns, and physiological signals. GDPR covers personal data but doesn't explicitly address emotional data. That gap is a real risk. Without clear frameworks, the same technology that could support wellbeing could just as easily discriminate, surveil, and harm.

What I believe
The most important thing I took from this research isn't about AI at all.
It's that the most complex human problems are almost never solved by adding more technology. They are solved by understanding people more deeply first, and then designing the systems, services, and tools around them with care and responsibility.
Emotion AI has real potential. But only if the humans designing and deploying it ask the right questions first. What data are we collecting? Why? What is the benefit to the person, not just the organization? And does the employee have a real choice?
Those are design questions. That's why I believe service design has a critical role to play in how this technology develops.
What this thesis gave me
It pushed me to think across boundaries I hadn't crossed before, covering human emotion, organizational systems, emerging technology, ethics and law. It confirmed something I already felt but couldn't fully articulate: that the most interesting and important design challenges aren't about objects or interfaces. They're about the invisible systems that shape how people experience their lives.

-The full thesis is available on request: eceatc.work@gmail.com