No A/B Test for the Wicked
The ‘Give feedback on care’ (GFC) team took part in a team-based A/B workshop, where we collaborated on potential solutions to the key issues exposed in the data and through user research. The outcome? More A/B testing.
The workshop was a great way to bring together innovation from the team working on the GFC service and brainstorm ideas that the UX team were then able to refine and implement through A/B testing.
This was inspired by a similar workshop run by HMRC.
Previously we’d run A/B tests looking at similar issues; however, the variants proved unsuccessful when compared to the original.
Key issue areas
- High drop-off rate on the free text feedback page
- Users don’t always understand what we do with the feedback information
- Large numbers of users aren’t providing contact details
- How do we reassure users about their contact details?
Test 1 — Addressing drop-off rate on the feedback page
Hypothesis: By changing the messaging to explain what actions the CQC can take in response to poor care, we can encourage more users to provide their feedback and make them less likely to drop off the journey.
The Variant adds a drop-down the user can click to find out more about what happens with their feedback after they submit, the expectation being that this would act as a push to encourage users to continue.
The result?
We ran the test from 13 to 30 July 2020, during which 2,761 sessions were measured:
- Original: 1,404 sessions
- Variant: 1,357 sessions
Once the experiment completed, the data showed that the Variant marginally outperformed the Original, by 2.4%.
This was further supported by the exit rates over the same period:
- Original — 15.75%
- Variant — 13.02%
Using Google Optimize’s analysis, we were able to deduce that the Variant had an 84% probability of being best.
Action: Implement the Variant in place of the Original.
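As a rough sanity check on those exit rates, here’s a sketch (in Python) of a standard two-proportion z-test. The exit counts are back-calculated from the session totals and percentages above, so they’re approximations, and this isn’t the analysis Google Optimize itself runs.

```python
from math import erfc, sqrt

# Exit counts reconstructed from the reported session totals and exit rates,
# so treat them as approximations of the underlying data.
n_original, n_variant = 1404, 1357
exits_original = round(n_original * 0.1575)  # ~221
exits_variant = round(n_variant * 0.1302)    # ~177

p_original = exits_original / n_original
p_variant = exits_variant / n_variant

# Standard two-proportion z-test using a pooled exit rate.
pooled = (exits_original + exits_variant) / (n_original + n_variant)
se = sqrt(pooled * (1 - pooled) * (1 / n_original + 1 / n_variant))
z = (p_original - p_variant) / se
p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
```

On these approximate counts it comes out around z = 2 (p roughly 0.04), which supports the Variant’s lower exit rate being more than noise.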
Test 2 — Encouraging users to provide their contact details
Hypothesis: We can increase the number of non-anonymous users submitting feedback by changing ‘we’ to ‘inspectors’ and replacing the vague ‘information’ with more specific wording, emphasising who would be making contact and what they would be contacting about. The end goal is to encourage more users to provide their contact details.
This test required us to return to the ‘Can we contact you’ page and review how we could improve the messaging.
Main metric: the number of users saying ‘yes’ to contact details
Secondary metrics: exit rate, completion rate, clicks on drop down
The result?
We ran the test from 13 to 30 July 2020. This covered 1,857 sessions:
- Original — 878 sessions
- Variant — 979 sessions
The Variant continued to outperform the Original on a daily basis. For the overall number of users agreeing to provide their contact details:
- Original — 47.78%, 419 submissions
- Variant — 50.46%, 494 submissions
Exit rates between the two were only marginally different, and both pages saw the same amount of click engagement on the drop-down.
Google Optimize’s analysis gave the Variant an 80% probability of being best.
Action: Implement the Variant in place of the Original.
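For anyone curious where a ‘probability to be best’ figure comes from: it can be approximated with a simple Beta-Binomial simulation, which is broadly the style of Bayesian comparison tools like Google Optimize use. The sketch below (Python) plugs in the submission counts quoted above; Optimize’s own priors and session counting differ, so it won’t reproduce the 80% exactly.

```python
import numpy as np

rng = np.random.default_rng(42)

# Contact-detail opt-ins out of total sessions, as reported for Test 2.
sessions_original, optins_original = 878, 419
sessions_variant, optins_variant = 979, 494

# Beta(1, 1) prior updated with opt-ins (successes) and refusals (failures),
# sampled many times to compare the two arms.
draws = 100_000
rate_original = rng.beta(1 + optins_original,
                         1 + sessions_original - optins_original, draws)
rate_variant = rng.beta(1 + optins_variant,
                        1 + sessions_variant - optins_variant, draws)

# Share of simulations in which the Variant's underlying opt-in rate wins.
prob_variant_best = (rate_variant > rate_original).mean()
print(f"P(Variant is best) = {prob_variant_best:.0%}")
```

On these counts the simulation lands a little above Optimize’s 80%, with the gap down to differences in priors and in how sessions are counted. Either way the reading is the same: the Variant is probably better, but not conclusively so.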
Note: the team do have a concern about the amount of text on this page for mobile users. There is a lot of information before the user reaches the end of the page, by which point it may not be clear what question the simple ‘yes’/‘no’ responses refer to.
Next steps are to review further areas we can tackle and start developing those ideas into hypotheses for testing.