Year
2024
Role
Strategist, UX Researcher, Mobile Designer
Product Duration
4 Months
The How We Feel app has been a breakthrough tool for me to log emotional states, but it was missing an opportunity to make that data actionable. The app was designed to build a user's ability to differentiate and label emotions with specificity and precision, a.k.a. their emotional granularity. Users log what emotion they're experiencing and what might be causing it, and tag various themes to track long-term patterns. The app then puts the onus of analyzing that data on the user. I wanted to challenge myself to design a solution for this on my own, but I had no idea how close I'd get to the real team behind the app.
Before diving into design, I spent some time researching the space. Despite my longstanding interest in psychology, I wanted to make sure my approach was as grounded in real-world science as possible. I explored secondary sources, did a competitive analysis of existing solutions, and conducted several one-on-one interviews.
Body-based logging is seen as important but unsupported
The interviews confirmed the secondary research finding that emotions are experienced primarily through bodily sensations. This wasn't reflected in the app's design. In my personal use of the app, I found myself using the tagging system to track bodily sensations even though it wasn't designed for that. If this product was going to empower users with impactful information about their emotions, it would need to meet them where they are: in their bodies.
How We Feel visualizes data well, but it's up to the user to find insights
While the data visualization in How We Feel was strong, there was little scaffolding to help users translate what they saw into insight and change. I wanted to shift the cognitive load of turning data into actionable insights from the user onto the product.