Emergency contraception questionnaire

Asda Online Doctor

Introduction

Asda Online Doctor (AOD) is an asynchronous online service that lets users request prescription medication without seeing a doctor in person. This includes emergency contraception (EC) for women.

To request medication, users must complete an online ‘consultation’. The first step is a medical questionnaire (form) where users answer questions about their medical history and reason for treatment.

Conversion is measured by a user completing this and all other steps in the consultation (i.e. selecting a product, entering delivery and payment details) and submitting their request.

The problem

All services saw some drop-off at the questionnaire step, though it was particularly high for the EC service (see ‘Why this service?’ below).

Our Clinical team had created the original EC questionnaire. The page content broke many basic content design principles for effective, user-centric forms, with issues such as:

  • unintuitive question formats

  • complicated language and jargon

  • long, dense content blocks, both above the fold and throughout the questionnaire

We were confident that we could make the page content work harder for users. And unlike other, hardcoded steps in the consultation journey, we could make questionnaire changes quickly and without dev resource.

If we did this work again, it would definitely be worth learning more about users’ expectations of the questionnaire - and the consultation as a whole - through user testing. Off the back of this, we might look at earlier steps in the journey (e.g. the service page). For example, we could provide more information about how long the questionnaire takes, to align users’ expectations with reality.

Why this service?

Using Looker, I identified the EC service as a priority area because it:

  • saw high drop-off at the questionnaire step

  • was popular all year round

  • had a high average order value

  • had a good repeat usage rate, with a possibility of satisfied users recommending the service to others


The EC questionnaire was particularly long and complicated due to the nature of the service. To prescribe safe, appropriate treatment, we had to ask about sexual activity, current contraception and so on. This level of detail wasn’t typically needed for other services.

We also surfaced a lot of chunky clinical advice about treatment options throughout the questionnaire, based on users’ responses.

Digging into the problem

Incoming user feedback (via Hotjar) didn’t reveal any significant pain points with the page to explain the high drop-off. Nor could we implement scroll, tap or click maps on questionnaire pages to show where users dropped off and identify specific problem areas.

Our hypothesis

By optimising the page content and making the questionnaire easier to complete:

  • we could improve step success rate - our primary metric for measuring performance

  • overall conversion would see a knock-on positive impact too

We didn’t have an objective goal set in stone (e.g. an x% uplift in step success). However, an uplift of around +5% would have been looked at favourably based on traffic volume and average order value.
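
For illustration, here’s a minimal sketch of how these two metrics relate. All counts below are hypothetical, not real service data:

```python
# Hypothetical weekly funnel counts for the EC service (illustrative only).
funnel = {
    "questionnaire_started": 1000,
    "questionnaire_completed": 620,  # step success = completed / started
    "request_submitted": 430,        # conversion = submitted / started
}

step_success = funnel["questionnaire_completed"] / funnel["questionnaire_started"]
conversion = funnel["request_submitted"] / funnel["questionnaire_started"]

print(f"Step success rate: {step_success:.0%}")  # 62%
print(f"Overall conversion: {conversion:.0%}")   # 43%

# A ~5% uplift in step success feeds more users into the later steps, which
# is why we expected a knock-on improvement in overall conversion too.
```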

Actions

I began by speaking to our Clinical team to get a comprehensive understanding of the service.

This covered:

  • the typical user flow, from consultation to a prescription being issued

  • why we asked certain questions, or asked them in a particular way

  • the typical audience and key concerns when seeking emergency contraception

  • how the questionnaire was built, including technical limitations of the platform

  • any problems which content could help with (e.g. we cannot prescribe treatment if a man has completed the consultation on behalf of their partner)

I also referred to Google Analytics and Looker, mainly to fill in blanks about the audience demographic and to benchmark performance.

I then reviewed the full questionnaire in collaboration with an in-house clinician.

This involved:

  • rewriting copy to improve comprehension (e.g. replacing medical jargon with simpler or familiar phrases ‘a family member might use’)

  • amending above-the-fold content to set expectations and communicate key information only

  • changing question formats to user-friendly alternatives, particularly for mobile (e.g. free text fields to radio buttons)

  • reordering questions, putting simpler ones at the beginning to engage users quickly

  • moving all clinical advice to the end so users could focus on the primary task (i.e. completing the questionnaire) before consuming this detailed information

Challenges and trade-offs

We encountered some challenges, and moments where we had to make trade-offs with the Clinical team.

  • We suggested changing one question (‘What time did it [unprotected sex] happen?’) from a long radio button format to a simpler, more intuitive time input field. However, our clinicians were reluctant to touch this, given the complex backend logic tied to the question and the time it would take to unpack.

  • To meet clinical regulations, we had to show specific clinical advice based on users’ responses to individual questions; this advice would then guide users when they selected a product at the next step. Originally, it was scattered throughout the questionnaire, and in edge cases conflicting advice could be pulled through, which would be confusing - further Product work would be needed to explore a better solution. As part of this project, we focused on moving the advice to the end of the questionnaire (see the sketch after this list) so users could focus on the primary task of answering all the questions, then consume the advice in one go, keeping it fresh in their minds as they moved to the next step.

  • Clinical were not well placed to quickly determine whether a user’s current contraceptive pill was a mini pill or a combined pill based on the name alone, given the wide range of pills available in the UK and overseas. Therefore, we couldn’t consolidate these questions.
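
To make that advice pattern concrete, here’s a minimal sketch of the approach we settled on: per-answer rules still fire, but the triggered messages are collected and rendered together at the end of the questionnaire. The answer keys, thresholds and messages below are placeholders, not the real clinical logic:

```python
# Each rule pairs a predicate over the user's answers with an advice message.
# Keys, thresholds and messages are hypothetical placeholders.
ADVICE_RULES = [
    (lambda a: a["hours_since"] > 72,
     "Placeholder advice for the 72-hour-plus window..."),
    (lambda a: a["current_pill"] == "mini",
     "Placeholder advice about the mini pill..."),
]

def collect_advice(answers: dict) -> list[str]:
    """Gather every triggered message so it can be shown in one block at the
    end of the questionnaire, rather than scattered between questions."""
    return [message for predicate, message in ADVICE_RULES if predicate(answers)]

# Example: both rules fire, and the user sees both messages together.
answers = {"hours_since": 80, "current_pill": "mini"}
for message in collect_advice(answers):
    print(message)
```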


At the end of the day, our priority was helping users complete the questionnaire quickly (but accurately) and engaging them in the overall consultation. We didn’t want to distract them with rambling questions or low-priority information.

Results

Questionnaire step success increased by 7% following our changes, consistently hitting X% week on week (note, I’ve redacted this figure for confidentiality reasons). We had only achieved that twice in the 8 months since launch.

Overall conversion saw a small increase too. This was a secondary metric given the scope of our work. We made changes to the questionnaire step only and understood that users may drop off at subsequent steps for other reasons.

View the live questionnaire pages: