‘How it works’ CRO test

ZAVA

Introduction

ZAVA is an asynchronous online service that lets users request prescription medication without seeing a doctor in person. To request medication, users must complete an online ‘consultation’, which they can start from service pages.

All service pages display a static, iconised ‘How it works’ module halfway down the page that explains how the ZAVA service works in 3 simple steps.

The problem

I spotted a small trend on a handful of service pages while reviewing click map data (via Hotjar): users regularly hovered over or clicked on the static ‘How it works’ module, specifically the first step, which mentioned the medical questionnaire used to request treatment.

This suggested* that:

  • users were interested in this content

  • users believed this element was clickable or interactive based on the design

  • users expected it to take them to that questionnaire (to request treatment)

Unfortunately, the CMS did not allow editors to add links to this static module. This seemed like a missed opportunity to drive conversion (and a potential source of frustration from a user experience perspective).

This heatmap data was a good starting point for building a case for development work to give editors this control. However, we needed further evidence on click volume and on the impact on conversion (i.e. completed orders).


*Note: usability testing was not possible at the time. It would have been another way to understand how users interacted with the module and to gauge their expectations of it.

Actions

I pitched an A/B test to compare the original page against a variation where we linked through to the consultation from this module.

Once the test was approved by the wider team (including Product and SEO), I built it in our A/B testing platform, SiteSpect. We ran it on a high-traffic service page so we could reach significance quickly.
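
Platform specifics aside, a variation like this usually comes down to a small client-side DOM change. Below is a minimal sketch of what that change could look like; the selector and consultation URL are hypothetical stand-ins, not ZAVA’s real markup or SiteSpect’s actual configuration:

  // A/B variation sketch (assumption: not ZAVA's actual code or markup).
  // Wraps the first 'How it works' step in a link to the consultation.
  const FIRST_STEP_SELECTOR = '.how-it-works .step:first-child'; // hypothetical selector
  const CONSULTATION_URL = '/consultation';                      // hypothetical entry URL

  function linkFirstStep(): void {
    const step = document.querySelector<HTMLElement>(FIRST_STEP_SELECTOR);
    if (!step || step.querySelector('a')) return; // module missing, or already linked

    // Move the step's existing content inside an anchor pointing at the consultation.
    const anchor = document.createElement('a');
    anchor.href = CONSULTATION_URL;
    while (step.firstChild) anchor.appendChild(step.firstChild);
    step.appendChild(anchor);
  }

  linkFirstStep();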

The primary key performance indicator (KPI) was set to checkout success (i.e. users completing the consultation, which we counted as a conversion). We also tracked clicks on the module element.
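
SiteSpect configures metrics in its own interface, so purely as an illustration, a hand-rolled equivalent of the click metric might look like the sketch below; the selector and endpoint are hypothetical:

  // Illustrative click tracking (assumption: the platform's real metric setup differs).
  const MODULE_LINK_SELECTOR = '.how-it-works a'; // hypothetical selector
  const ANALYTICS_ENDPOINT = '/events';           // hypothetical endpoint

  document.addEventListener('click', (event) => {
    const target = event.target as HTMLElement;
    if (!target.closest(MODULE_LINK_SELECTOR)) return;
    // Fire-and-forget beacon, so tracking never blocks navigation to the consultation.
    navigator.sendBeacon(
      ANALYTICS_ENDPOINT,
      JSON.stringify({ event: 'how_it_works_click', ts: Date.now() })
    );
  });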

Results

Once the test reached significance, the variation showed a +12% difference in conversion over the original. On the variation, the link in the module was one of the most clicked areas on the page, despite sitting halfway down it.
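
For context on what ‘reached significance’ involves: a conversion test like this is commonly evaluated with a two-proportion z-test. The sketch below uses invented visitor and conversion counts to show the mechanics; the real figures from the test are not reproduced here.

  // Two-proportion z-test sketch (the counts below are invented for illustration).
  function twoProportionZ(convA: number, nA: number, convB: number, nB: number): number {
    const pA = convA / nA;               // original's conversion rate
    const pB = convB / nB;               // variation's conversion rate
    const pooled = (convA + convB) / (nA + nB);
    const se = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
    return (pB - pA) / se;               // |z| > 1.96 ≈ significant at the 95% level
  }

  // Hypothetical: 15,000 visitors per arm, 5.0% vs 5.6% conversion (a +12% relative lift).
  console.log(twoProportionZ(750, 15_000, 840, 15_000).toFixed(2)); // ≈ 2.32, significant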

This gave us evidence to continue our conversations with Product around future development work.