Experience & Interface
Now, with a visual and a concept that work for the data, I started thinking about the features and the experience of the application.
Now for the boring stuff - er, really important part of the artefact: how will a user actually record their emotion experiences through a digital interface?
I started by breaking down the application’s structure into three main features:
Each feature of the app is meant to encourage reflection and build the user's awareness of their emotions.
I started with recording by dividing it into phases: Naming, Dimensions, Triggers, and Actions. Because this is going to take more time than tracking a run or checking into a location, the recording feature is a linear user experience, designed to be fluid and relatively quick to travel through. Its simplicity acknowledges the need for as few barriers as possible between the user and entering their data consistently.
And here’s what recording dimensions looks like after I skinned the wireframes, with a couple alternate UI sliders:
As a user interacts with the sliders, the visualization builds out in real time. Because of the number of entries and the complexity of emotion recording, it is imperative that the user see how each individual dimension changes the visual representation.
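One way to guarantee that link is to make each slider drive exactly one visual parameter through a pure mapping function, re-run on every slider event. The dimension names and visual parameters below are assumptions for illustration, not the app's actual model.

```typescript
// Hypothetical sketch: each dimension slider maps directly onto one visual
// parameter of the iceberg scene, so moving a slider updates it immediately.
interface IcebergVisual {
  peakHeight: number; // driven by intensity
  hue: number;        // driven by valence (cold blue -> warm red)
  drift: number;      // driven by arousal
}

type Dimension = "intensity" | "valence" | "arousal";

// Pure mapping from slider values (0..1) to visual parameters; calling it on
// every slider "input" event gives real-time feedback.
function render(dimensions: Record<Dimension, number>): IcebergVisual {
  return {
    peakHeight: 40 + dimensions.intensity * 160,
    hue: 220 - dimensions.valence * 200,
    drift: dimensions.arousal * 5,
  };
}

const visual = render({ intensity: 0.5, valence: 1, arousal: 0 });
console.log(visual.peakHeight); // 120
```

Keeping the mapping one-to-one means a user can wiggle a single slider and watch exactly one thing change, which is how they learn what each dimension means.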
Throughout the entire recording experience, the application animates just one screen: the iceberg landscape. Once they are finished tracking dimensions, they move on to triggers, which takes them underwater. Afterwards, they move into the sky, where they record actions taken during and just after the emotion experience.
I use a fairly conventional user experience and interface for recording emotion in the application. An interesting extension would be to explore direct manipulation: each dimension could have a specific gesture (e.g. for intensity, pinch and zoom to record more intense experiences). It would add another language for the user to learn and memorize, which could be a huge barrier for those who don't find emotion recording intuitive. However, it would be much more expressive than the application's current solution.
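The direct-manipulation idea boils down to a gesture-to-dimension table. Only the pinch/intensity pairing comes from the text above; the other pairings and the update logic are purely illustrative assumptions.

```typescript
// Hypothetical sketch of direct manipulation: each dimension gets one
// gesture, and a recognized gesture updates only its own dimension.
type Gesture = "pinch" | "swipe" | "rotate";
type Dim = "intensity" | "valence" | "arousal";

const gestureToDimension: Record<Gesture, Dim> = {
  pinch: "intensity", // pinch out = more intense (the example from the post)
  swipe: "valence",   // assumed pairing
  rotate: "arousal",  // assumed pairing
};

function applyGesture(
  state: Record<Dim, number>,
  gesture: Gesture,
  delta: number,
): Record<Dim, number> {
  const dim = gestureToDimension[gesture];
  // Clamp to the same 0..1 range the sliders use.
  return { ...state, [dim]: Math.min(1, Math.max(0, state[dim] + delta)) };
}
```

The table makes the learning cost visible: every row is one more pairing the user has to memorize, which is exactly the barrier described above.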
The analysis feature has proved to be a much more complicated experience to design. The breadth to which a user can break their data down could easily push it into the “pro tool” category.
What became clear quickly is that first-time users - or even frequent users in their first few weeks - are going to need some help. So I've begun to piece together a wizard-like interface that shows the user how to pick and order filters to display what they'd like to see. If I wanted to push this further, developing a few presets might be a good idea as well: simple comparisons, like seeing how valence varies across the frequent activities in a user's life (e.g. I record more pleasant emotions when I am running than when I am working on my thesis).
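The pick-and-order filter idea can be sketched as composable functions over the entry list, with a preset being nothing more than a saved, ordered list of filters. The `Entry` fields here are my own assumption, not the app's actual data model.

```typescript
// Sketch: each filter narrows the entry list; a preset is a saved, ordered
// list of filters. Entry shape is hypothetical.
interface Entry {
  valence: number;  // 0 = unpleasant, 1 = pleasant
  activity: string;
}

type Filter = (entries: Entry[]) => Entry[];

const byActivity = (activity: string): Filter =>
  (entries) => entries.filter((e) => e.activity === activity);

// Filters apply in the order the user picked them, left to right.
const applyFilters = (entries: Entry[], filters: Filter[]): Entry[] =>
  filters.reduce((acc, f) => f(acc), entries);

const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

// Example "preset": average valence for one activity.
const entries: Entry[] = [
  { valence: 1.0, activity: "running" },
  { valence: 0.75, activity: "running" },
  { valence: 0.3, activity: "thesis" },
];

const running = applyFilters(entries, [byActivity("running")]);
console.log(mean(running.map((e) => e.valence))); // 0.875
```

Because the order of filters is explicit, the wizard only has to teach one concept - pick a filter, then pick the next - rather than a full query language.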
So far, I’ve worked on options for the user to view their filtered emotions as small multiples and in a list view. At any point, a user can click through to a specific emotion experience and see its details. How detailed this screen needs to be, I’m still not sure. The analysis feature will be the focus of my design work in the coming weeks.