You Can’t Ignore Poor Customer Journeys Any Longer

The mid-2010s were a tumultuous period for Dell’s customer experience.

The computing giant had just merged with EMC Corporation, creating a monstrously large organization packed with complex political dynamics. The new company lacked a unified design or experience function. Instead, more than 100 independently funded teams each crafted their own small part of the customer journey, hoping it dovetailed with the pieces on either side.

“People were excited about journey mapping so it had spun up lots of different efforts, but we lacked common understanding across the board,” says Deb Zell, former Director of Ecosystem-level Experience Strategy at Dell. “Nothing was tying all the pieces together.”

The end customer experience was “crunchy,” with components fitting together poorly. Dell’s website was a prime example. As soon as a user arrived, it funneled them into a specific product silo: laptops, desktops, workstations, thin clients, and so on. The problem was that people didn’t always know what they wanted.

“Prospects and customers were really frustrated trying to understand what we offered that would help them solve their problems,” Zell explains. “They really didn’t like what they described as the ‘marketecture’ approach to content: gorgeous visuals with buzzwords that were light on telling them what those buzzwords meant in practical terms.”

Problems like these didn’t just hurt specific business units. Each underwhelming interaction or awkward transition from one segment of the journey to the next hurt the whole organization because customers didn’t see the underlying organizational nuances.

The impact of off-brand experiences is immense. Speaking to Freshworks, customer experience expert Greg Tucker quantified the impact of poor customer experience: “In Net Promoter Score (NPS) terms, we see about a 50 point difference between an on-brand experience and an average experience.”

With customer feedback highlighting dissatisfaction with the company’s user journeys, Dell’s leadership could not ignore the problem. They embraced the criticism, recognizing that the company had to evolve or risk slipping behind its competitors. This challenge isn’t unique to Dell. Indeed, it’s a well-known risk shared by organizations around the world.

VisionCritical once pegged the overall cost of bad customer experiences in the United States at more than $537 billion. Across the Atlantic, in the United Kingdom, the figure stands at £234 billion. Customer journey work is not merely a loss-prevention strategy, though.

Research from the Massachusetts Institute of Technology showed that companies that focus equally on transforming customer experience and achieving operational excellence significantly outperformed their peers, earning net margins 16 percentage points above their industry average.

In other words, customer journeys are not design frivolity. Effective customer journey research, implementation, and optimization drive positive commercial performance. But selling an organization on change isn’t always easy. To earn buy-in for sweeping change, you must first quantify the risks of inaction and the potential impact of improvement. Then you must build a case for transformation.

Create a means of measurement

Perhaps the greatest barrier to customer journey investment is quantification. How do you measure the impact of a customer’s interaction with your entire business? How do you demonstrate the potential losses of inaction and the prospective gains of investment? 

While there are abstracted and objective measures—the studies mentioned earlier from VisionCritical and MIT—organizations require a personalized yardstick, too. Here, some design leaders are turning to an old organizational framework: the Balanced Scorecard, or BSC.

The concept dates back to the early 1990s, when Robert Kaplan and David Norton pitched the idea in Harvard Business Review. The pair spent a year studying “companies at the leading edge of performance management,” distilling their best ideas down into a universal scorecard.

Source: “The Balanced Scorecard—Measures That Drive Performance,” Harvard Business Review

Think of the BSC as an airplane dashboard. Just as pilots require information on fuel, airspeed, altitude, and bearing, business leaders require an instant summary of what’s important to them. To apply this framework to customer journey performance, CX leaders need to tweak the original measures. A customer experience scorecard may include categories like these (a code sketch follows the list):

  • Financial metrics (top line, bottom line, margin)
  • Operational metrics (productivity, time saved, etc.)
  • Customer metrics (customer effort score, lifetime value, NPS, etc.)
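
As a concrete illustration, a scorecard like this can be modeled as a small data structure with one field per critical measure. This is a minimal sketch, not Kaplan and Norton’s specification or Dell’s implementation; every field name is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class CXScorecard:
    """One snapshot of customer journey health; all fields are illustrative."""
    # Financial metrics
    revenue_usd: float              # top line
    net_income_usd: float           # bottom line
    margin_pct: float
    # Operational metrics
    hours_saved_per_quarter: float
    # Customer metrics
    ces: float                      # customer effort score
    ltv_usd: float                  # customer lifetime value
    nps: float                      # Net Promoter Score, from -100 to 100
```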

Like the original BSC, a journey measurement scorecard should be succinct. Kaplan and Norton believed brevity forces leaders to focus on a handful of critical measures, rather than a deluge of peripheral KPIs and statistics.

Although Zell didn’t implement Kaplan and Norton’s BSC at Dell, she did help the company design its own customized scorecard. The specifics varied between projects — brand campaigns weren’t attached to the pipeline; conversion optimization plays didn’t include awareness metrics. But every initiative had some means of quantification and measurement.

“There’s an engineering culture at Dell so you need numbers to get buy-in, because if work doesn’t move the needle on business outcomes, why are we doing it?” Zell explains. “Finding ways to measure improvement in the experience was really important.”

When Zell tackled Dell’s overly siloed website, one of her first tasks was laying a foundation for evaluation. Here, she designed a scorecard around customer engagement:

  • Bounce rate: How many people engaged with a page vs. those who immediately left?
  • Engagement: How many users engaged with recommended offerings?
  • Pipeline: How much dollar value were user journeys adding to Dell’s pipeline?

For each initiative, Zell and her colleagues ran a scorecard exercise with quantitative and qualitative research to figure out what metrics to include. The output formed their new baseline, allowing them to track changes and improvements.
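
The article doesn’t describe the tooling behind this exercise, but as a minimal sketch, a baseline like Zell’s could be computed from raw session data along these lines (the Session fields are hypothetical, not Dell’s schema):

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One website visit; field names are hypothetical."""
    pages_viewed: int
    clicked_recommendation: bool
    pipeline_value_usd: float  # 0.0 if the visit created no pipeline

def scorecard(sessions: list[Session]) -> dict[str, float]:
    """Compute the three engagement metrics that form the baseline."""
    n = len(sessions)
    return {
        "bounce_rate": sum(1 for s in sessions if s.pages_viewed <= 1) / n,
        "engagement_rate": sum(1 for s in sessions if s.clicked_recommendation) / n,
        "pipeline_usd": sum(s.pipeline_value_usd for s in sessions),
    }

baseline = scorecard([
    Session(1, False, 0.0),      # bounced immediately
    Session(5, True, 12_000.0),  # engaged and added pipeline
    Session(3, True, 0.0),       # engaged, no pipeline yet
])
print(baseline)
```

Re-running the same computation after each change yields the before-and-after comparison the scorecard exists to enable.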

Build a case for change

Today’s consumers expect frictionless end-to-end customer journeys. As they move through the buying process, they don’t want to feel clunky transitions between departments or endure awkward handoffs from one employee to another. However, creating holistic journeys is not easy, especially in large enterprise companies.

Recall the environment Zell arrived in: post-merger, complex political dynamics, more than 100 independent teams, separate budgets, and no overarching design system. Change of any sort would be difficult, let alone something as fundamental as the customer journey.

“It was terrifying,” she says, recalling her early days at Dell. “I was thinking, ‘How are we actually going to do this?’ But you may have heard this riddle: How do you eat an elephant? One bite at a time.”

Zell didn’t try to implement wide-reaching change from day one. Instead, she and her team focused on smaller, short-term pilots, where they could drive impact and prove their worth.

Dell’s siloed website was one of them. After identifying the problem (the website forced users to niche down into a product silo before they understood their requirements) and a means to track change (the customer experience scorecard), Zell turned her attention to research and improvement.

Previously, Dell had relied on a data science-centric approach. For example, Zell’s team once focused on a metric called High-Value Engagement, which she describes as “a fancy way of saying that once someone looked at product X, Y, and/or Z, they were likely to take the valuable action of A.” 

Optimizing an experience to deliver High-Value Engagements makes sense quantitatively. But in reality, the end action may have been the culmination of frustration or boredom. “You don’t get that understanding without qualitative observational data,” explains Zell.
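
The article doesn’t define High-Value Engagement formally, but a metric of this shape can be read as a conditional rate over session logs: of the sessions that viewed certain trigger pages, how many took the valuable action? A minimal sketch, with hypothetical page and action names:

```python
def high_value_engagement_rate(sessions, trigger_pages, target_action):
    """Share of sessions that viewed a trigger page and took the target action."""
    triggered = [s for s in sessions if trigger_pages & s["pages"]]
    if not triggered:
        return 0.0
    converted = sum(1 for s in triggered if target_action in s["actions"])
    return converted / len(triggered)

sessions = [
    {"pages": {"product_x", "home"}, "actions": {"request_quote"}},
    {"pages": {"product_y"}, "actions": set()},
    {"pages": {"home"}, "actions": set()},
]
rate = high_value_engagement_rate(
    sessions, {"product_x", "product_y", "product_z"}, "request_quote")
print(rate)  # 0.5: half the sessions that viewed a trigger product took the action
```

As Zell notes, a number like this says nothing about whether the action came from delight or frustration; that is what the qualitative steps below add.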

During her tenure, Dell introduced a new process that combined qualitative and quantitative data. It worked like this:

  • Observation: Researchers studied customer behavior at an individual level by observing customers in their own environments.
  • Design thinking process: They framed the friction points surfaced during observation as problems, which they ran through the design thinking process to identify potential solutions.
  • Prototyping: They tested rapid, early concepts with those same design thinking participants, refining, retesting, and ultimately delivering products and experiences.
  • A/B testing: Once they had built a working, usable experience, they deployed it in an A/B test to understand how well the solution performed against the baseline (a minimal statistical sketch follows this list). Data from the preferred experience then fed Dell’s machine learning-driven smart experiences.
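
The article doesn’t say what statistical machinery Dell used for these comparisons. As a minimal sketch of the final step, assuming a simple conversion metric, a standard two-proportion z-test can indicate whether the candidate journey genuinely beats the baseline:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Compare conversion rates of baseline (A) and candidate (B).

    Returns (z statistic, two-sided p-value) from the pooled z-test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # via the normal CDF
    return z, p_value

# Hypothetical numbers: baseline converts 420 of 10,000; new journey 510 of 10,000
z, p = two_proportion_ztest(420, 10_000, 510, 10_000)
print(f"z={z:.2f}, p={p:.4f}")  # z=3.02, p=0.0025: significant at the 0.05 level
```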

In the case of Dell’s “crunchy” website experience, Zell learned that customers needed an outcome- and specification-based understanding of the company’s products. This led her team to develop a tool called Product Finder.

“Gathering a small amount of information around what they were trying to do and what sort of environment they were operating in allowed us to return options from across the portfolio to consider,” says Zell. “Those options were further narrowed down based on more specific data about their needs, at which point they had the option to engage with a technically knowledgeable person via email or call.”

To better streamline the move from research to recommendation, the tool passed all relevant context to the knowledgeable person. Instead of starting from scratch, the prospect was able to seamlessly continue their research.
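
Product Finder’s internals aren’t described, but the core idea (start from what the customer is trying to do and the environment they operate in, then search the whole portfolio) can be sketched in a few lines. The products, tags, and the 4K rendering scenario are hypothetical, echoing Zell’s example later in the piece:

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    outcomes: set[str]       # problems it helps solve, e.g. "4k-rendering"
    environments: set[str]   # where it fits, e.g. "multi-site"

PORTFOLIO = [
    Product("Workstation X", {"4k-rendering", "cad"}, {"single-site", "multi-site"}),
    Product("Server Y", {"4k-rendering", "virtualization"}, {"multi-site"}),
    Product("Thin Client Z", {"virtualization"}, {"single-site"}),
]

def find_products(outcome: str, environment: str) -> list[Product]:
    """Return portfolio-wide options matching the user's goal and environment."""
    return [p for p in PORTFOLIO
            if outcome in p.outcomes and environment in p.environments]

# A user handling 4K rendering across multiple sites
for product in find_products("4k-rendering", "multi-site"):
    print(product.name)  # Workstation X, Server Y: options that cut across silos
```

The design point is that matching begins with the customer’s problem rather than with a product silo.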

But this was only one of many small improvements. Zell lifted another project straight from Sun Tzu’s The Art of War. Sun Tzu wrote that generals need a common strategy so that, if they lose communication, they can still act independently. Zell and her colleagues centralized the design of the overarching customer journey and communicated it out to contributing teams. Suddenly, subteams could see how their work slotted into the larger picture.

“I see the customer journey map as the playbook,” she explains. “It shows what, why, and when people are doing things. If you can put that artifact together, you encourage open lines of communication with your peers.”

Another initiative reassessed industry best practices.

“It sounded kind of crazy but we said, ‘Hey, we found that industry best practices are wrong,’” recalls Zell. For example, when tested, the industry’s obsession with C-level stakeholders didn’t hold water. Zell discovered that functional leaders and individual contributors often acted as “boots on the ground” gatekeepers, who needed to be won over before a deal could close.

Buyer personas had changed, too. Centralized system administrator roles splintered into “a wide variety of technical and non-technical positions.”

“This is another manifestation of how problematic selling products versus solutions can be,” Zell explains. “They may be looking for a way to handle 4K rendering across multiple sites, and have no idea whether they should look at your server, storage, or other products. Selling based on those product silos simply does not work for them.”

By testing ideas held up as best practices, Zell cut away the misconceptions and gave her teams a stable foundation to build from. For example, Dell’s multi-billion dollar marketing strategy was predicated on a high-level story that didn’t match customer reality. By clarifying how buyers actually researched, compared, and bought products, Dell’s marketers could recraft their customer journeys around reality, not guesswork.

Each small project improved Dell’s customer experience—as evidenced by positive trends on their scorecard. After a few years, the company had all the proof it needed to make a significant bet.

A universal starting point for a world of diversity

All companies must consider their customer journeys, but no two will follow the same trajectory. Some businesses like Amazon will build whole customer journey atlases around a process like self-service. Others like Nordstrom will design their journey maps around buyers, leveraging personal shoppers and white-glove service.

Simply put: there is as much diversity in customer journeys as there is in customers themselves.

But despite this heterogeneity, success stems from the same place. 

To deliver exceptional customer journeys, companies must relentlessly advocate for their customers’ success. Every decision must be prefaced with a simple question: Does this maximize the chance of success for our customers? Identifying what success looks like allows organizations to translate that into impactful customer journeys. But setting the destination isn’t sufficient on its own.

Companies must have a method to measure performance and progress. Without it, they’re floating on the ocean at the whim of the currents. With a solid foundation and a performance yardstick, all companies can ensure successful journey design and execution—and better outcomes for their customers.
