Flick is a gesture-controlled interface for music streaming on a mobile device.
Redesign an existing music streaming app interface to accommodate a language of hand gestures to allow for touch-free control and legibility when used from a distance. Produce a video demonstration of the app's capabilities using green screen video editing techniques.
Figma, Adobe Premiere Pro, Adobe Audition, Adobe Dimension
Wireframing, UI design, sketching, prototyping, video editing
Objects + Space, Graham Plumb
4 weeks, Fall 2020
A video demonstration of Flick.
My immediate concerns upon starting this project were around the usability and accessibility of a gesture-controlled mobile interface. The screen itself is quite small, and even more so when standing several feet away from it. I found that existing music streaming apps used a lot of small text, which isn't an issue up close, but becomes problematic in the context of this project.
As I tried to envision use cases for this type of product, one that occurred to me was an outdoor gathering where you don't want to be bothered to carry your phone around with you. However, as a glasses wearer, when I imagined myself in this position, I realized I would have a hard time making out the small characters on my phone's screen from a distance.
This informed my direction for the design. With gesture control being so unfamiliar compared to the ubiquitous touch screen, my solution for this project would have to balance two things:
1. The need to onboard the user and teach them not only how to use this interface, but also help them to memorize a set of hand gesture controls, and
2. The need for users to be able to clearly read and interpret screen-based text and icons from a distance.
HMW . . . ?
With this in mind, I generated a "How Might We" statement to guide me in my next steps on the project. I focused on my two main concerns relating to usability.
How might we design a gesture-controlled mobile interface that is simple to learn and legible from a distance?
WHAT IS A GESTURE?
My understanding of gesture prior to starting this project was that it was simply a nonverbal movement of one's body to express a piece of information. However, while conducting research for this project, I learned that gesture and its meaning can be highly subjective and might vary based on one's biases, culture, and past knowledge and experiences. To better understand the range of information that can be conveyed through gesture, I conducted two observation exercises.
1. In the first, I recorded myself as I went about my day. I then watched the recording and captured moments in which I made a gesture. These gestures were then sorted into two categories: intentional and unintentional. Understanding which gestures were made absentmindedly or unconsciously gave me insight into which gestures to exclude from my app's controls.
2. My gestures and their meanings were all quite clear to me, but that was to be expected. The next part of this exercise was to observe someone else's gestures to see what movements they naturally use to express certain feelings and pieces of information. For this activity, I took screen captures from an interview on YouTube. I learned that it can be challenging to guess the intended meaning behind a gesture and that what might seem obvious to one person is incomprehensible to another.
Images and descriptions of gestures I observed during my research
This left me with more insight into what it meant to create meaning with gestures, but there is another aspect to creating a language of gesture controls: depicting them visually. I chose a gesture – prayer hands – and experimented with different ways to represent it.
Experiments in different ways of representing a gesture.
Of course, there is a list of key functions that a music streaming app needs to have. For the sake of this project, I determined that this app must allow users to: scroll up and down, skip forward and backward, pause and unpause, increase and decrease volume, select albums and songs, and shuffle songs.
I quickly sketched 2 options for gesture controls for 4 of the functions I had identified. I then tested these with 2 user testers, both of whom use music streaming apps. I was looking to see:
1. Can they physically make this gesture? Is it comfortable and natural?
2. Can they correctly guess what function this gesture would have?
3. Does the meaning of this gesture make sense? Would it be hard to learn?
After learning from user testers, I selected one gesture for each of the functions to move forward with and iterate on.
Sketches for gestures used in user testing session.
With feedback on my rough sketches in hand, I moved on to Procreate on an iPad Pro to create higher-quality versions of my instructional drawings. After creating these, I held another user testing session over Zoom. The tester I was working with was a student who is a serious user of a music streaming app.
I shared my screen and presented 1 gesture at a time to my user tester. My goals for this session were the same as before: Can she make this gesture? Is she able to tell what this gesture is for?
Instructional gesture drawings used in the app.
After hearing positive feedback on most gestures from user testers, classmates, and course instructors, I made some small adjustments before moving on to designing the app interface.
My initial challenge focused on onboarding the user and teaching them the gestural language. The primary issue that I grappled with was screen space – or lack thereof. I struggled to find the best way to provide the user with necessary instructions and feedback without taking up too much of the already small screen.
In version 1, I tried placing images, instruction, and semantic feedback in the center of the screen.
In version 2, I tried out different placements for the instructions and tried to incorporate echo feedback.
After creating a Figma prototype, I tested it with 3 user testers. There were concerns raised about the design of the gesture for volume control, and there were also issues understanding the echo feedback. However, the testers found the drawings themselves clear to interpret and understand. After making some modifications to one set of gestures, it was time to move on to the next step of the project: making wireframes of an existing music streaming app.
Due to the amount of time the video editing process for this project would take, we were prompted to modify an existing interface rather than create a brand new one. I chose to use Figma to modify Apple Music to allow for touch-free control.
My Apple Music wireframes in Figma.
At first, I tried to keep the modifications to the design as minimal as possible. My thinking was that since gesture control is such an unfamiliar concept to many people, learning a brand-new interface on top of learning how to operate with gestures instead of a touch screen might be too overwhelming for the average user.
However, it quickly became clear that this design did not lend itself to being read and used from a distance. The type was simply too small to make this app usable.
Screens from the first iteration of my UI design.
After hearing feedback from user testers, classmates, and my course instructor, I decided to go back to the drawing board. Usability issues aside, I felt there were also problems with the visual design of the app, even though I had made few changes:
1. Items on the screen are too small – album covers, text, and icons alike.
2. The blue color was used to differentiate Flick from the original, standard version of Apple Music, but it ultimately weakened the connection to the brand in the eyes of users.
To push the design forward, I resized and rearranged elements on the screen, increased text sizes, redesigned some icons, redrew and recolored my instructional hand gesture drawings, and shifted the color scheme back to the original one used by Apple in Apple Music.