FloowDrive

Content design to improve findability

An iPhone with a journey on a map

I led quantitative and qualitative research and the creation of a feature that informs and guides drivers about their driving behaviour.

I collaborated with the commercial, data science and support teams, as well as members of my squad, to design a better experience for finding the educational content. My role:

  • Lead UX and UI designer

  • Running a series of workshops within the company

  • Planning and running remote usability tests

  • Delivering executive and company-wide presentations

Problem

Drivers complained about the scores’ “inaccuracies” and about being penalised for things they did not do while driving. This caused a lot of frustration and a high volume of support requests for our team to handle.

Solution & Impact

The goal of the project was to educate drivers about their driving by improving the findability of content, including explanations of how scores work and tips on how to improve them. Impact:

  • Reduced support burden by 63%

  • Increased user satisfaction by 34%

Product background

FloowDrive is a mobile app that automatically detects and scores the journeys a driver makes. Insurance companies use it to assess their drivers and offer good drivers better rates.

The app gives drivers guidance on how to become safer drivers and, in return, be less likely to be involved in a car accident and pay less for their car insurance.

Running workshops

In my research, users kept telling us the app was broken because they were penalised for things they did not do. The biggest area of concern was a misunderstanding of how the “Mobile use” score was calculated, so this became the first thing we worked on solving.

I organised and ran three remote workshops: one to ideate on the problem and potential solutions, one to critique those solutions, and one to evaluate the improved ones.

I worked with:

  • the commercial team (who work with our clients on a daily basis),

  • the data science team (who know the ins and outs of how our scores work),

  • my product team (who work on the product on a daily basis).

Testing

I tested the potential solutions with six participants. All tests were conducted remotely. The areas I tested:

  • Discoverability of content. Designed to learn whether users could find the information that previous participants had struggled to find.

  • Tone of voice. Designed to understand whether the vocabulary and keywords were clear.

  • Use-case-based questions. Designed to learn whether participants understood how the mobile distraction score was calculated.

I iterated on the designs three times, attempting to fix all of the problems the participants had found.

A collection of post-it notes from a workshop
Post-it notes with voting dots from a workshop

High-fidelity screens

""
""
""
""
""
""

New Map view screen

Dashboard screen

""
""

Resources screen

Lesson screen

Journey detail screen

Journey screen

Score notification

More on journey screen