Does our data ‘spark joy’?
‘Give feedback on care’ (GFC) uses a number of tools to analyse its performance, observe user behaviour, and provide insights. Therefore, it’s important to ensure that we’re using valid sources of data when reporting. We need to be certain we can trust our data sources and ultimately make sure they continue to ‘spark joy’.
We use Google Analytics as our primary tool for granular, page-by-page detail, examining user acquisition, engagement and interactions.
Incomplete picture
When moving into Private and Public Beta, the ‘Give feedback on care’ journey was split across the main domain and GFC sub-domain.
- cqc.org.uk — the main website, where the start page of the GFC journey is hosted. (www.cqc.org.uk/give-feedback-on-care)
- give-feedback-on-care.service.cqc.org.uk — the subdomain where users search for the service they wish to provide feedback on and complete the form.
Considering most of the GFC journey is linear, where one section of the form always comes before the next, we can review exit rates to understand the pain points of the user journey.
An exit rate is the number of exits divided by the total number of page views for a particular page, expressed as a percentage. If we notice that a particular section of the site has a larger number of users leaving the website, we know there is room for improvement.
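As a quick illustration of the calculation (the figures below are made up for the example, not taken from our GFC data):

```typescript
// Exit rate: the percentage of a page's pageviews that were the last
// pageview of a session. Numbers here are illustrative only.
function exitRate(exits: number, pageviews: number): number {
  return (exits / pageviews) * 100;
}

// e.g. 240 exits out of 2,100 pageviews gives an exit rate of roughly 11.43%
console.log(`${exitRate(240, 2100).toFixed(2)}%`);
```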
Pain points by exit rates
The table above shows the pages of the form in journey order and their respective exit rates on the sub-domain. It indicates the following:
- ‘Find a service’ has one of the highest exit rates, at 19.32%. This is the first point of the journey after the user has clicked ‘start now’
- ‘Give Your Feedback’ — exit rate of 11.48%
- ‘When and where’ — exit rate of 8.6%
- ‘Tell us which service’ — exit rate of 11.41%. However, this page is only for users who have been unable to find their service through the site’s search functionality
In addition to this, we can see on the main CQC domain that 44.55% of users would exit the website on the GFC start page.
After reviewing the way we collect our data, we could see there was a gap in how we were monitoring users between the main domain and the sub-domain.
If we weren’t connecting user data between the start page and ‘Find a service’, how could we justify saying that almost half of users leave on the start page? In reality, they are likely continuing their journey to provide feedback. Furthermore, how could we assess submissions in more specific detail? For example:
- how many submissions were from a particular campaign or referral?
- which CQC content drives submissions?
- if we make changes to the start page, can we fairly assess their impact?
Mind the gap
GFC was initially set up with a distinct tracking code from the main CQC site. Whilst there isn’t anything wrong with this approach, it does mean that we need to look at two different sets of data that aren’t linked together to understand our users.
The main CQC dataset tells us how we acquire users and what they do before reaching the GFC start page, but it gives us no information on how successful our channels are at getting users to complete the feedback process. In addition, the main CQC site treats the GFC start page as the end point of the journey, where users leave the CQC domain.
The GFC dataset treats the ‘Find a service’ page as the main landing page of the journey. Whilst technically true for the sub-domain, this isn’t how our users experience the GFC journey.
This causes confusion for the UX team, as we’re unable to rely on the data. So how do we fix this and bridge the gap?
Sub-domain tracking — by placing the same Google Analytics tracking code on both the main domain and the sub-domain, and making a few adjustments to the cookieDomain setting and the Referral Exclusion List, we can tell Google Analytics that the user leaving one domain is the same user appearing on the other.
For more information on how to implement sub-domain tracking: https://www.directom.com/google-analytics-subdomain-tracking
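As a rough sketch of what that looks like in practice, assuming the pages run analytics.js (the property ID below is a placeholder, and the exact snippet will differ for gtag.js or Google Tag Manager setups):

```typescript
// Illustrative analytics.js setup shared by cqc.org.uk and
// give-feedback-on-care.service.cqc.org.uk. 'UA-XXXXXXX-Y' is a placeholder.
declare function ga(...args: unknown[]): void;

// cookieDomain: 'auto' writes the _ga client ID cookie at the highest shared
// domain (cqc.org.uk), so both domains read the same client ID and a user's
// sessions are stitched together rather than split in two.
ga('create', 'UA-XXXXXXX-Y', { cookieDomain: 'auto' });
ga('send', 'pageview');

// In the GA admin, cqc.org.uk also needs adding to the Referral Exclusion
// List, so moving between the two domains doesn't start a new referral session.
```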
What’s the impact been on our data?
Initially, the ‘Give feedback on care’ start page showed a 44.55% exit rate; this has now reduced to 22.55%. The ‘Find a service’ page had an exit rate of 19.32%, which has now reduced to 8.44%.
Why is this important? Our data needs to reflect the pain points in the journey for our users, so that as a team we can focus on pages that are obstacles for the end user.
The above table displays the revised exit rates for the ‘Give feedback on care’ form. It highlights that the top 3 exit points for our users are:
- the start page (22.55% exit rate)
- Give your feedback (12.5% exit rate) — the main section to provide your feedback
- When and where (9.13% exit rate) — free text fields for ‘when’ and more specific detail on ‘where’ at the provider.
We’ve also been able to collect information on which channels are most successful in driving submissions. Organic Search and Direct channels provide the most submissions, at 66% and 23.6% respectively. Referrals provide a decent share of submissions at 9%, and in the next few weeks we’ll be able to interrogate which referral links are most successful. Social channels are dramatically lower, at around 1.4% of submissions.
Further to this, upcoming tests of the start page messaging can be analysed using more metrics than just the start page exit rate.
Removing the noise
We use filters in our Google Analytics data to ensure that we aren’t collecting data on internal traffic i.e. members of the GFC and NCSC team. These groups will actively be using the CQC and GFC websites to test and improve the service, therefore impacting the data being collected.
Normally, we could create a simple filter that excludes our office locations based on IP address. However, two major changes in 2020 mean that we are no longer able to exclude users by this method:
- Anonymized IPs — based on the recommendations from our Private Beta assessment, GDPR and privacy policy commitments required us to remove the IP addresses of users. On the flip side, if we aren’t collecting users’ IP addresses, we also can’t exclude by them.
- Working from home — in the midst of a global pandemic, the switch from working in the office to working from home presents an additional challenge to our data collection. The team is now based in multiple locations, meaning multiple IP addresses, and keeping track of every potential internal user’s IP address would be difficult to organise. In any case, anonymized IPs rule this approach out anyway.
As an alternative, we’ve created a link that, once clicked, places a first-party cookie in our team’s browsers. Through a custom dimension, we mark this group of users as internal traffic and exclude them from the main data view.
For more information on how to remove internal traffic with anonymized IPs: https://www.amazeemetrics.com/en/blog/how-to-filter-internal-traffic-with-anonymized-ips-in-google-analytics/
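A minimal sketch of how these pieces fit together, again assuming analytics.js; the cookie name, custom dimension index and property ID are placeholders rather than our exact configuration:

```typescript
// Illustrative analytics.js snippet: anonymize IPs and flag internal traffic
// with a first-party cookie plus a custom dimension, since IP-based filters
// no longer work once IPs are anonymized.
declare function ga(...args: unknown[]): void;

ga('create', 'UA-XXXXXXX-Y', { cookieDomain: 'auto' });

// Truncate the final octet of the user's IP address before it is stored.
ga('set', 'anonymizeIp', true);

// If a team member previously clicked the special link, that page set a
// first-party cookie ('internal_traffic' here is a placeholder name).
if (document.cookie.split('; ').includes('internal_traffic=true')) {
  // 'dimension1' is a placeholder index for the internal-traffic dimension.
  ga('set', 'dimension1', 'internal');
}

ga('send', 'pageview');

// A view-level filter in the GA admin then excludes hits where the
// internal-traffic dimension equals 'internal'.
```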
The graph below represents internal team activity from 29 June to 14 July 2020, for sessions where our team visited the ‘Give feedback on care’ start page. The link was more widely shared in the week commencing 6 July 2020, hence the spike to more than 400 page views in a day.
Why is this important? Our internal team aren’t the primary users of GFC or CQC. Internal interactions create noise in our data and distort the measurement of our KPIs.
In short, our recent changes will remove noise and bridge the gap in our data — sparking joy? Possibly.
What’s next?
- Hurrah — We can now map the entire journey from CQC all the way through to form completion. I’ve updated the metrics for conversion rate, so we’re only looking at the one dataset. We’ll continue to monitor the data closely.
- Conversion rate has dipped in the last week to 39%.
- CQC launched the ‘Because we all care’ campaign on the 8th July 2020. With sub-domain tracking in place, we’ll be able to gain insight into the success of the campaign across both the main and sub-domain. Tracking links for service providers and non-service providers have also been set up and distributed in the campaign toolkit.
- The engagement team has been in contact for specific tracking links to understand the impact of ‘Choice Support’ engagement services with GFC.
- Further A/B tests are in motion to test messaging on the ‘Give Your Feedback’ page and to encourage users to provide contact details on the ‘Can we contact you?’ page.
- Last sprint, we implemented changes to how users filter their search results on desktop. I’ll be providing more detail in the next few weeks on the effect of successful searches for desktop users.