Using Design Analytics to Identify and Solve Actual Problems

January 14, 2020 | 6 minute read
Kaiti Gary
Senior Director of Analytic & Strategic Services, Oracle Digital Experience Agency

Marketers have access to innovative technologies and massive amounts of data to enable connected customer experiences, but time and time again we see our clients struggling to activate it all. Marketers are defining their strategies, selecting key performance indicators, and determining the customer experiences they need to drive those KPIs. Then they’re working with their data scientists and analytics experts to get the reports and models they need.

And that’s the source of the problem. 

To understand why, let’s compare marketing to the evolution of car engineering. As automotive technology advanced, the industry recognized the need for a new kind of worker, the engineer, who became critical to the design and innovation phases of product development. While modern marketing has acknowledged the need for engineers to execute as our technology and capabilities have advanced, we’ve left our data and analytics teams on the production line.

We’ve siloed our analytics team members, so even though they’re doing what we’re asking them to do, we’re often not driving the results we want. The problem isn’t in the execution, but in the design phase. We’re strategizing and building solutions without one of the key engineers in the room—the analyst.

Enter Design Analytics

Design analytics is a method for developing analytics solutions that enable business strategies and drive measurable business impact. It operates under three beliefs:

  1. That analysts are designers, not producers. 

  2. That analysts provide solutions, not facts. 

  3. That analysts’ work should be measured in impact, not reports created.

In practice, design analytics has two major components. First, it brings together a holistic team that at least includes the following:

  • Strategist: This person owns the business strategy and must clearly outline the state of the business, its goals, and key performance indicators.

  • Analyst: The analyst and strategist work together to provide analytic-based solutions that enable the strategy and address the business goals.

  • Data Steward: Once the strategist and analyst are aligned on the best strategy, they work with a data steward to determine how to get and then leverage the necessary data.

  • Project Manager: After helping identify any additional stakeholders needed to execute a strategy, the project manager works with all stakeholders to ensure the solution fits within budget, scope, capabilities, etc.

And second, to ensure that the team comes up with the most effective marketing and customer experience solutions, design analytics applies the Design Thinking methodology so that any solution truly addresses root causes and actual problems rather than symptoms and assumed problems. The five steps of the Design Thinking process are:

  1. Empathize: Research and understand your users’ needs.

  2. Define: State your users’ needs and problems.

  3. Ideate: Challenge assumptions and brainstorm solutions to those needs and problems.

  4. Prototype: Quickly build and experiment with solutions.

  5. Test and Optimize: Iterate on the most impactful solutions.

What does all of that look like in action? Let’s explore an example…

Design Analytics: A Sample Scenario

Meet marketer Margaret, who wants to know, “How many promotional emails should I send to my customers per month?” To determine the answer, let’s task our analytics expert with determining the ideal number of emails to send based on subscriber feedback and customer engagement.

But wait… We’re practicing design analytics, so we’re not going to do that. Instead, we’re going to stop and Empathize. Why does Margaret want to know this? How does it impact her company? What are customers currently experiencing?

“My CEO thinks we send too many communications to customers around promotions,” says Margaret. “We’re afraid our customers are opting out because they get too much. If we knew how many to send to each person, we’d be able to adjust accordingly.”

This is a key point of discovery: Margaret assumes their problem is email frequency. However, do we even have data to support that frequency is a problem? Have we quantified that opt-outs are a real problem? Do we understand the customer experience?

In design analytics, we take what started as a request for an answer and focus on identifying a problem that requires a solution. Rather than trying to find a solution to a frequency problem, an analyst might recommend that we begin by looking at what is currently happening to customers in a baseline analysis. For instance, we’d determine…

  • How many communications do customers get on average? Does this vary greatly between segments or by persona?

  • Are subscribers opting out of communications from the brand at a higher rate than is deemed acceptable?

  • Is there a relationship between frequency and opt-outs?

In this case, answering these questions reveals that customers are opting out from certain communications from Margaret’s company at a higher rate than is acceptable, but there is no relationship between frequency of communications and engagement. However, there is a correlation between content and engagement. We see that while some customers respond to almost all content, a significant number of customer segments are only engaging with certain types of content. Simply put, our problem is content, not send frequency. Now we’ve been able to accurately Define the problem.

Knowing that, we can begin to hold some brainstorming sessions and Ideate some potential solutions that might remedy the content problem we now know we have. Together, Margaret and an analyst might come up with several solutions, including a content testing strategy, a content recommender solution, and a new send strategy altogether.

At this point, our stakeholders need to align to consider the feasibility, scope, and impact of our new ideas. We ultimately want to land on a solution that is…

  • Feasible, in that the data and technology allow it to be designed

  • Actionable, in that a marketer can easily understand and leverage the output

  • Profitable, so that the time spent to develop it is justified by the impact on the business

In this scenario, the content recommender ticks all three of those boxes. Our data steward tells us we have the data to know who the unengaged customers are and what they engage with. Our project manager confirms access to this data takes little effort. Our analyst knows we can quickly build a prototype content recommender, given that we have access to this data. And our strategist, knowing they can leverage the content recommender to send the content each subscriber is most likely to engage with, can shift their overall send strategy. Once all stakeholders have given the green light, we quickly Prototype a content recommender solution.
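A prototype content recommender at this stage can be very simple, for example a lookup of the content type each segment has historically engaged with most. The sketch below is an illustrative assumption, not the article’s actual solution; the segment and content-type names are invented:

```python
from collections import defaultdict

def build_recommender(engagement_log):
    """Recommend, per segment, the content type with the highest
    historical engagement rate.

    engagement_log: iterable of (segment, content_type, engaged) tuples,
    where engaged is True if the subscriber opened or clicked the message.
    """
    sends = defaultdict(int)
    opens = defaultdict(int)
    for segment, content_type, engaged in engagement_log:
        sends[(segment, content_type)] += 1
        opens[(segment, content_type)] += int(engaged)

    # For each segment, keep the content type with the best engagement rate.
    best = {}
    for (segment, content_type), n in sends.items():
        rate = opens[(segment, content_type)] / n
        if segment not in best or rate > best[segment][1]:
            best[segment] = (content_type, rate)
    return {seg: content_type for seg, (content_type, _) in best.items()}

# Illustrative log: bargain hunters engage with discounts, loyalists with news.
log = [
    ("bargain_hunters", "discount", True),
    ("bargain_hunters", "discount", True),
    ("bargain_hunters", "new_arrivals", False),
    ("loyalists", "discount", False),
    ("loyalists", "new_arrivals", True),
    ("loyalists", "new_arrivals", True),
]
recommendations = build_recommender(log)
print(recommendations)  # {'bargain_hunters': 'discount', 'loyalists': 'new_arrivals'}
```

A production version would add smoothing for content types with few sends and recency weighting, but even this naive rate-based lookup is enough to let the strategist start sending each segment the content it actually responds to.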

However, sometimes things don’t move forward this smoothly. For example:

  • The project manager or data steward might have discovered that we already have an existing solution for what we are trying to do. This happens more than you’d think, especially in large organizations. At this point, we’d take a step back and Ideate more about how we might use the newly discovered solution.

  • The data steward may tell us we don’t have the necessary data to build the solution we want to. At this point, we might design a plan that allows us to collect the needed information, assuming our project manager confirms the effort to do so would be appropriate.

  • The marketer may hear a solution from the analyst and identify a hiccup in how it applies to their marketing strategy or process. This might result in the analyst approaching the solution differently.

But thankfully, our content recommender is feasible, actionable, and profitable, so we move on to the Test and Optimize step. The analyst delivers the content recommender and Margaret applies it to her send strategy over the next month. The results are verified, and we see that by applying the model we are improving engagement and reducing opt-outs. We then begin to dive into discovery for future iterations that might do even better.
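Verifying the results in the Test and Optimize step comes down to comparing the key rates before and after the recommender goes live. A minimal sketch, with purely illustrative baseline and test-period numbers (the article reports no specific figures):

```python
def rate_change(baseline, test):
    """Relative change in a rate from the baseline period to the test period."""
    return (test - baseline) / baseline

# Illustrative month-over-month rates, not results from the article.
engagement_lift = rate_change(baseline=0.18, test=0.24)     # engagement up
opt_out_change = rate_change(baseline=0.050, test=0.035)    # opt-outs down

# The model is verified if engagement rises while opt-outs fall.
model_verified = engagement_lift > 0 and opt_out_change < 0
print(f"engagement lift: {engagement_lift:+.0%}")
print(f"opt-out change: {opt_out_change:+.0%}")
```

Holding out a control group that keeps receiving the old send strategy would make this comparison far more trustworthy than a simple month-over-month delta.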

Enabling Design Analytics at Your Organization

In our example above, a design analytics approach helped our marketer, Margaret, identify her real problem, ultimately helping her focus on a solution that would impact her company’s marketing communications. You can do the same.

Applying design analytics to your organization will take time, as it requires a big shift in thinking and alignment, but getting started doesn’t have to be hard. Here are a few tips:

  1. Introduce the key stakeholders to the method, being sure to give your analysts a seat at the table. Ensure your internal processes enable this new way of collaborating.

  2. Focus on one or two key business problems that you want to solve. Work through them with pilot groups.

  3. Check in regularly. As you work through this within your own organization, ensure you’re recording and paying attention to what’s working and not working with the process. Every organization has its own unique needs and may need to customize the process to fit them.

Once you get the hang of it, design analytics—and Design Thinking, more broadly—will help you come up with better solutions and avoid wasting time on misdirected solutions that address symptoms and non-problems. And who doesn’t want that?


Need help with design analytics or Design Thinking? Oracle Marketing Cloud Consulting has more than 500 of the leading marketing minds ready to help you to achieve more with the leading marketing cloud, including a Strategic Services team that’s experienced with design analytics and a dedicated Design Thinking & Innovation Services team.

Kaiti (Livermore) Gary is a Senior Director on the Analytic & Strategic Services team at Oracle Digital Experience Agency. Her background includes over 16 years of client and agency consulting experience in a variety of marketing capacities, including product management, customer experience, and digital marketing. Given her diverse background, she excels in the development of holistic and innovative marketing solutions that balance strategy, technology, and operational needs.
