There’s Still a Gap
by Ali Carmichael, Managing Director and Owner

How confident are you when you make an important decision about your customer experience? What data are you relying on?

There was likely a moment – a strategy day, a board meeting, a new head of digital – when data became the thing. More of it. Better organised. More visible. And somewhere along the way it also became the assumption: that if you were measuring enough, you were understanding enough.

Dashboards were built. Surveys were sent. Analytics platforms were procured. AI tools were introduced to synthesise it all.

You’re not short of data. That’s not the problem.

The problem is that none of it quite tells you what it feels like to be your customer.

What’s still missing isn’t more data. It’s a clearer picture of what that experience actually feels like.

There’s a gap between the picture your data paints and the experience your customers are actually having. In my experience working with organisations across the UK, that gap is wider than most people realise – and it tends to show up in five specific places.

The AI Gap

There’s a lot of noise right now around AI and what it can do with customer data. Some of it is justified. AI tools can synthesise large volumes of research quickly, surface patterns, and make information more accessible across an organisation. That has genuine value.

But there’s a risk that’s easy to overlook.

If you feed years of accumulated data into a tool and assume the output is a reliable picture of your customer, you’re making a significant assumption about the quality and currency of what went in. Data ages. Behaviours shift. Research that was valid three years ago may now be quietly steering decisions in the wrong direction.

The more subtle risk is that confidence can grow faster than understanding.

AI-synthesised outputs look polished. They reference real research. They feel authoritative. But if that research isn’t being regularly refreshed through direct observation, what you’re looking at is a processed version of the past – not a window into the present.

The AI Gap isn’t a technology problem. It’s the assumption that better processing of existing data is the same as better understanding of real people. A research repository is only as valuable as the human insight going into it – and that insight needs to be ongoing, not historical. The organisations that get the most from AI-assisted research treat their repositories as living resources, not archives.

If your AI tools synthesised everything in your research repository tomorrow, how useful and accurate would that picture of your customer actually be?

The Customer Service Gap

Customer service is a useful signal. When something goes badly wrong, it tends to surface there. But it’s a narrow signal, and it’s worth being clear about what it captures – and what it doesn’t.

The people who contact customer service are the ones who’ve hit a wall hard enough to complain. That’s a small proportion of the people who’ve experienced friction.

Most don’t complain. They find a workaround. They lower their expectations. They quietly decide that’s just how it works. Some abandon the journey altogether and say nothing at all.

What customer service data rarely gives you is context. Someone calls frustrated at a specific point in a process, but the real friction may have started much earlier. Without seeing the whole journey, you end up treating symptoms without ever understanding the cause.

There are positive signals too – moments where something works better than expected, where a small improvement could unlock real value – that never reach customer service at all.

That’s the Customer Service Gap. The loudest voices get heard. The quieter signals – friction never raised, opportunities never surfaced – remain invisible.

If you looked beyond the feedback you receive, how much of your customers’ wider experience do you think you’re actually seeing?

The Analytics Gap

Analytics are very good at telling you what’s happening. Drop-off at checkout. Time on page. Conversion by channel. The data is precise. The what is often clear.

The why is a different matter.

An 8% drop-off at a particular step tells you there’s a problem. It doesn’t tell you whether the form is confusing, the language isn’t landing, trust signals are missing, or people are simply distracted.
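This is the kind of step-level view a funnel report gives you. The sketch below is illustrative only – the step names and counts are hypothetical, not real data – but it shows how precisely analytics can locate the *what* while saying nothing about the *why*:

```python
# Illustrative funnel drop-off report. Step names and counts are
# hypothetical examples, not real customer data.

funnel = [
    ("Basket", 1000),
    ("Delivery details", 920),
    ("Payment", 846),
    ("Confirmation", 778),
]

# Compare each step with the next to find where people fall out.
for (step, entered), (next_step, completed) in zip(funnel, funnel[1:]):
    drop_off = (entered - completed) / entered * 100
    print(f"{step} -> {next_step}: {drop_off:.1f}% drop-off")
```

Every one of those percentages is precise, and not one of them tells you whether the cause was a confusing form, missing trust signals, or a distracted customer.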

You can make educated guesses. You can test and iterate. But without understanding the why, you’re investing time and resource in solutions to a problem you haven’t properly diagnosed.

When you watch real users move through a journey in real time, guessing stops. You can see hesitation, confusion, workarounds, misinterpretations. You can also see the hidden opportunities – the places where relatively small changes would have disproportionate impact – that analytics alone will never surface.

The Analytics Gap is the space between having the data and understanding what it’s telling you.

For the critical data points that concern you most – do you have a true grasp of why they’re happening, or just that they are happening?

The Persona Gap

It’s a pattern most organisations recognise when they look closely – personas that have quietly drifted toward reflecting internal assumptions rather than observed reality.

Personas, when they’re current and well-maintained, are a genuinely useful tool. Marketing teams use them to shape campaigns. Product teams use them to guide decisions. They represent a real return on the investment made in creating them.

But personas are a snapshot. They reflect what was understood about customers at a specific point in time. And time moves on.

The challenge isn’t usually that teams are blindly following obviously outdated personas. It’s that validation quietly slips down the priority list. Other things feel more pressing. The personas are still on the wall. They still feel familiar. And gradually, often without anyone noticing, they start to reflect what the organisation wants to be true rather than what customers actually are.

We call this the Convenient Customer – the unintentional drift of user personas toward internal assumptions rather than observed reality. It’s a pattern worth watching for. You can read more about it at experienceux.co.uk/ux-blog/the-peril-of-the-convenient-customer.

Behaviours shift. Expectations change. The digital landscape moves quickly. How people interact with products and services today may look very different from how they behaved 18 months ago. Keeping personas useful requires ongoing investment in real research – not a one-off project, but a regular overlay of what’s being observed against what’s being assumed.

Without that, the Persona Gap opens quietly. Familiarity replaces validation, and nobody quite notices when the picture drifts.

If you were to use your personas today to plan and design a new service, how confident are you that they still reflect your customers as they are now?

The NPS Gap

NPS is a useful benchmarking tool. Year on year, it gives you a directional read on whether sentiment is improving or declining. That has value.

But NPS is a prompted response. You’ve asked customers to react to a number scale, often at a single point in their journey, often some time after the experience itself. The score reflects a moment, a mood, a general impression. It isn’t a description of what actually happened.
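As a reminder of how little the number itself encodes, the standard calculation is simple: respondents scoring 9 or 10 count as promoters, 0 to 6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch, with illustrative scores:

```python
# Standard NPS calculation: % promoters (scores 9-10) minus
# % detractors (scores 0-6). Passives (7-8) are not counted.
# The response lists below are illustrative, not real survey data.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round((promoters - detractors) / len(scores) * 100)

print(nps([10, 9, 8, 8, 7, 6, 9, 3, 10, 8]))  # 20

# Two very different experiences, identical scores:
print(nps([10] * 5 + [0] * 5))  # 0 - half delighted, half furious
print(nps([8] * 10))            # 0 - everyone mildly satisfied
```

The last two lines make the point: wildly different distributions of experience can collapse into exactly the same headline number.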

And people aren’t very good at articulating their own experience, even when invited to. The follow-up question – what would make it a 10? – gets you language straight from customers, which is worth having. But verbatim feedback is still feedback out of context. Someone writes “the process felt slow” without being able to tell you which part, or why, or what they were trying to do when it happened.

A customer who scores an eight may be gritting their teeth through a clunky interaction every single time they log in. A promoter may be reacting to one genuinely brilliant moment, while the rest of the experience is merely tolerable. The score doesn’t distinguish between the two.

At best, NPS is a signpost – it can tell you where to look, but not what you’ll find when you get there. That’s the NPS Gap: sentiment captured at a distance, without the context that makes it actionable.

Think about your most recent NPS score – how well do you understand the experience that produced it?

What the data can’t do

These five gaps have something in common.

They all stem from the same underlying condition: the data organisations rely on is collected and processed from the inside. It reflects what was measured, when it was measured, and through the lens of how the organisation chose to measure it.

Customers are on the outside.

What they notice, what confuses them, what frustrates them, what they quietly give up on – those things don’t always find their way into a dashboard.

Closing the gap isn’t about replacing data. It’s about complementing it with direct, outside-in understanding. Bringing decision-makers closer to the lived reality of the people they’re designing for.

The ‘Why Behind the What’ exercise

Pick a data point that’s been nagging at you. Gather a small group of colleagues and explain what the data is showing. Then ask each person to write on a post-it note why they think it’s happening – independently, no discussion first.

If everyone agrees, you have a hypothesis worth acting on. If any notes diverge, you don’t yet understand the problem well enough to solve it.

This is a simple exercise, but it surfaces something important. Are we acting on guesswork?

The ‘Why Behind the What’ exercise is a starting point. It will highlight some of the gaps in understanding your customer’s experience. But to tackle all five gaps – with confidence, and with findings you can act on – you need to get closer to your customers than any dataset will allow.

Closing the distance

When teams watch real people try to use their product or service, something changes.

Assumptions get challenged. Conversations become more grounded. Decisions feel calmer – because they’re based on what people actually did, not what we assumed they would do.

At Experience UX, this outside-in perspective is what we focus on. Through moderated usability testing and qualitative user research, we help organisations see where the gap really is – and what to do about it.

Working with us is straightforward. We handle recruitment, research design, and facilitation – bringing real users from your audience into moderated sessions that you can watch live. What you get at the end isn’t a data dump. It’s a clear, prioritised account of what we observed, why it matters, and what to do about it.

We’re not here to ignore or replace data. We’re here to make better sense of it through an understanding of real human behaviour. To provide the why behind the what, and to close the distance between your organisation and the people you serve.

If any of this feels familiar, I would welcome a conversation. Great things happen through connection. Contact Us or drop me an email and let’s find a time to chat:

alicarmichael@experienceux.co.uk

Other reading you will enjoy:

The peril of the ‘Convenient Customer’ | Experience UX
