ProjectFit

Research app for myAir users

ProjectFit is a research app designed to capture 3D facial scans of ResMed myAir users. Each scan is collected alongside survey questions about a user's demographics, their equipment, and their subjective experience of therapy. This data is then fed back to mask product development to improve mask design.


For this project, ResMed partnered with a scanning technology company, Fuel 3D, which was responsible for adapting the iPhone’s TrueDepth sensor, fine-tuning the imaging output and developing the native app. Our team defined the UX, designed the UI, user-tested the app and developed the email campaign.

THE SCAN

A precise scan was needed for the anthropometric data to be useful to our mask designers and engineers, so the user had to satisfy a number of sensitive parameters before the scan would lock on:

DISTANCE

Head must remain within the frame

HEIGHT

Hold at eye level throughout

ANGLE

Head not tilted up, down, left or right

Once the user met all the criteria, they were told to hold the position and a countdown would start. When the countdown reached zero, the user was prompted to turn their head to the left, then to the right, and finally back to the centre. During the scan, it was important the user was aware of their:

SPEED OF ROTATION 

Too fast and the scan would fail

DEGREE OF ROTATION

At least 70 degrees of rotation
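Taken together, the lock-on and rotation parameters above behave like a small validation check. The sketch below is purely illustrative, written in Python rather than the app's native code, and every threshold value is a hypothetical placeholder rather than an actual tolerance used in the app:

```python
# Illustrative thresholds only; the app's real tolerances are not stated here.
MAX_HEIGHT_OFFSET_CM = 3.0     # HEIGHT: how far from eye level is acceptable
MAX_TILT_DEG = 5.0             # ANGLE: allowed tilt up/down/left/right
MAX_ROTATION_DEG_PER_S = 30.0  # SPEED OF ROTATION: any faster and the scan fails
MIN_ROTATION_DEG = 70.0        # DEGREE OF ROTATION: minimum turn to each side

def is_locked_on(in_frame: bool, height_offset_cm: float,
                 pitch_deg: float, roll_deg: float) -> bool:
    """All three lock-on criteria (distance, height, angle) must hold at once."""
    return (in_frame
            and abs(height_offset_cm) <= MAX_HEIGHT_OFFSET_CM
            and abs(pitch_deg) <= MAX_TILT_DEG
            and abs(roll_deg) <= MAX_TILT_DEG)

def rotation_ok(speed_deg_per_s: float, reached_deg: float) -> bool:
    """During the scan: rotate slowly enough, and far enough to each side."""
    return (speed_deg_per_s <= MAX_ROTATION_DEG_PER_S
            and reached_deg >= MIN_ROTATION_DEG)
```

Because every criterion must hold simultaneously, even a small miss on any one of them prevents lock-on, which is what made the component so sensitive for users.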

Scan Prepare

Scan Demonstration

THE INSTRUCTION

Due to the precise conditions needed for the scanning component to work properly, we created two short instructional clips that appeared in the app before the scanning component initialised.

The first clip showed how to prepare for the scan: glasses off, hair tied back and soft ambient light.

The second clip demonstrated how to successfully capture a scan. The scan itself also included on-screen text instruction in order to guide the user to the correct position.


USER TESTING

In user testing, most of the app experience raised no usability issues. One area, however, the scanning component, proved difficult, if not impossible, for all but a few users.

KEY FINDINGS

  • Only 20% of users could complete the scan unassisted

  • Users were removing their glasses when instructed and then encountering on-screen text instructions which they couldn’t read *face palm*

  • Users were confused and began moving their heads left to right during the pre-scan instructional video

  • Users had an incredibly difficult time locking on to the correct position. For those that did manage, the sensitivity of the lock meant they flicked in and out of the correct position many times. Doing so reset the countdown and made these users incredibly frustrated. 

  • Once locked on, some users were unable to turn their heads far enough to the side to fill all the progress bars. For some this was a mobility issue; for others it was because they kept their eyes fixed on the screen, which was not necessary.
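The countdown-reset behaviour behind that frustration can be made concrete with a small sketch. This is a hypothetical illustration of the mechanic as we observed it in testing, not the app's actual implementation:

```python
class HoldCountdown:
    """Hold-still countdown that restarts whenever lock-on is lost,
    which is exactly what made the position flicker so frustrating."""

    def __init__(self, seconds: int = 3):
        self.seconds = seconds
        self.remaining = seconds

    def tick(self, locked_on: bool) -> bool:
        """Advance one second of the countdown; returns True when the hold completes."""
        if not locked_on:
            self.remaining = self.seconds  # any flicker out of position resets it
            return False
        self.remaining -= 1
        return self.remaining <= 0
```

Each brief flick out of the correct position sends the count back to the full duration, so a user who was almost done had to start the hold over.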


In summary, what we saw in user testing was that users were overloaded with information when it came to the scanning component. 

“Give me the right information, only when I need it”

— Participant 5

THE CHALLENGE

Deliver contextual instructions to users to improve the success rate of the scan.

SMART INSTRUCTIONS

To do this I redesigned the scanning process itself to include contextual instructions, so users encountered only the direction they needed, at the time they needed it. The scanning process now had four steps:

  1. An instructional sequence that gave users the 3 initial prompts 

  2. A live sequence that helped users correct any misalignments (blue screens)

  3. A holding countdown once they had achieved the correct alignment

  4. The scanning process where data was finally captured as users rotated their heads

The instructions would also need to have a voice-over to assist users who couldn’t read them without their glasses. 
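The four steps above can be thought of as a simple state machine, which is what makes the instructions contextual: only the prompt for the current state is ever shown. A minimal sketch follows, in Python for illustration; the step names and transition conditions are my assumptions, not the shipped implementation:

```python
from enum import Enum, auto

class ScanStep(Enum):
    INSTRUCTIONS = auto()  # step 1: the three initial prompts
    ALIGNING = auto()      # step 2: live correction of misalignments (blue screens)
    HOLDING = auto()       # step 3: countdown while the alignment is held
    SCANNING = auto()      # step 4: data captured as the user rotates their head

def next_step(step: ScanStep, aligned: bool, countdown_done: bool) -> ScanStep:
    """One transition of the flow; the on-screen prompt matches the current step."""
    if step is ScanStep.INSTRUCTIONS:
        return ScanStep.ALIGNING
    if step is ScanStep.ALIGNING:
        return ScanStep.HOLDING if aligned else ScanStep.ALIGNING
    if step is ScanStep.HOLDING:
        if not aligned:
            return ScanStep.ALIGNING  # losing alignment drops back to correction
        return ScanStep.SCANNING if countdown_done else ScanStep.HOLDING
    return ScanStep.SCANNING
```

Because each state carries its own instruction (and voice-over), the user is never shown guidance for a step they have not reached yet.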

 

USER TESTING ROUND 2

In a subsequent round of testing, the updated scanning component performed far better. We ran an internal bug bash and found:

  • 88% of users were able to successfully complete the scan

  • Users were less likely to encounter error states because they achieved correct alignment during the first instructional step. When users did encounter error states, they recovered quickly.

RELEASE

ProjectFit is currently rolling out to 800k myAir users and is available on the US App Store.

Download ProjectFit