Welcome to the Oracle Modern Marketing Blog:
The latest in marketing strategy, technology, and innovation.

How You Test Your Marketing Is Just As Important As What You Test

A Methodology to the Madness

Optimization testing is key to successful digital marketing. What makes it so valuable is that it enables companies to build customer experiences that speak to the needs, goals, and interests of each individual customer. The goal is to produce customer journeys that inspire loyalty, confidence, and conversions by giving people content that actually means something to them.

But how does a company go about creating such effective journeys? It is a process that involves collaboration across teams, careful campaign coordination and orchestration, the right testing platform, and a willingness to constantly learn about your customers. For this reason, organizations must consider how they operate before they attempt to use optimization testing in their marketing efforts. This "how" can also be defined as the methodology that is used to ideate, build, execute, and maintain marketing programs.

Agile vs. Waterfall: A Raging Debate

The two leading approaches are agile and waterfall. Because both terms originated in software development, not all marketers are familiar with them, so some definitions are in order.

The waterfall approach is “the grandfather of all software methodologies,” writes W. Clay Richardson in his book Professional Java, and it is called this because steps, officially called phases, “flow through each other sequentially.” Waterfall starts with managers and teams agreeing on a project’s expectations and requirements. These guidelines do not change over the project’s lifecycle, and once a phase is completed and approved, it cannot be revisited.

Agile emerged in response to waterfall. In this methodology, while requirements are laid out at the start of the project, teams can—and are encouraged to—revisit these requirements as the project moves forward. Teams have regular meetings to update each other on progress, successes, and challenges, which can help teams reexamine and redefine short-term goals. In agile, previous phases can be revisited as teams learn more about their limitations, their abilities, what the market wants, and more.

Both approaches have pros and cons. While waterfall “is simple and easy to use” because each phase has a rigid start and end, this same rigidity creates “high amounts of risk and uncertainty.” That’s because the inability to go back and reexamine previous phases often means a team won’t know whether its project was worthwhile until it’s over; the factors that made it seem valuable may have changed in the time it took to complete.

The ever-evolving nature of an agile project is also its biggest pro and con. “Although collaboration, coordination, and knowledge sharing are critical to large projects,” writes Jim Highsmith in his book Agile Project Management: Creating Innovative Products, “the downside … can be endless meetings and wading through tons of documentation and emails.” With the freedom of being able to constantly change your mind comes the torture of being able to constantly change your mind!

Which Methodology Should You Use?

So, should you use agile or waterfall to run optimization tests? While it is possible to run a successful test using either approach, agile’s core principles better enable marketers to make data-driven decisions that can adapt to real-time industry trends and data discoveries. This approach lets analysts use test results to make fast updates to personalized sites and emails.

For example: If a client is running a multivariate test (MVT) but its pages aren’t receiving as much traffic as expected, it should consider running an A/B test instead, as these typically require less traffic to generate usable data.
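To see why an A/B test needs less traffic than an MVT, consider that the visitors you need per variation stays roughly constant, so total traffic scales with the number of variations. Here is a rough back-of-the-envelope sketch using a standard two-proportion sample-size formula; the conversion rates, lift, and variation counts are illustrative assumptions, not figures from this post or from any testing platform:

```python
def sample_size_per_variant(p_base, relative_lift):
    """Approximate visitors needed per variation to detect a relative
    lift in conversion rate with a two-sided z-test.
    Hypothetical helper for illustration only.
    z-values below correspond to alpha = 0.05 and power = 0.80."""
    z_alpha = 1.96
    z_beta = 0.84
    p_var = p_base * (1 + relative_lift)
    numerator = (z_alpha + z_beta) ** 2 * (
        p_base * (1 - p_base) + p_var * (1 - p_var)
    )
    return numerator / (p_base - p_var) ** 2

# Assume a 5% baseline conversion rate and a 10% relative lift to detect.
per_variant = sample_size_per_variant(0.05, 0.10)

# A/B test: 2 variations. MVT with 3 elements x 2 options each: 8 variations.
print(f"visitors per variation: {per_variant:,.0f}")
print(f"A/B test total traffic: {2 * per_variant:,.0f}")
print(f"MVT total traffic:      {8 * per_variant:,.0f}")
```

With these assumed numbers, the MVT needs four times the total traffic of the A/B test for the same statistical confidence, which is why a low-traffic page is a natural candidate for switching to A/B.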

Another possibility: Say a client has slated a test to run for two months, but the data reveals an obvious pattern in customer behavior after only two weeks. Would there be more value in letting this test continue to run, strengthening proof for an already-clear trend, or in using the remaining six weeks to answer follow-up questions about that trend? Whichever path the client chooses, its willingness to deviate from the original plan as new insights emerge is agile thinking at its best.
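What might "an obvious pattern" look like in the data? One simple way an analyst could check an interim result is a two-proportion z-test on the conversion counts so far. This is an illustrative sketch with made-up numbers, not any platform's API; note that a rigorous early-stopping decision would also correct for peeking at the data repeatedly (for example, with an alpha-spending approach):

```python
from math import sqrt

def interim_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for an interim look at an A/B test.
    Hypothetical helper for illustration only."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Two weeks in: variant B is converting well ahead of variant A.
z = interim_z(conv_a=400, n_a=10_000, conv_b=520, n_b=10_000)
print(f"z = {z:.2f}")
```

With these assumed counts, z lands far beyond the usual 1.96 significance threshold, which is the kind of early signal that would prompt the "keep running, or repurpose the remaining weeks?" conversation above.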

Contrast this with the waterfall approach, in which a marketing team would be encouraged to build a test and run it to completion before looking at the results. Agile is not only more responsive and adaptable (hence the name); it can also help companies use their resources more efficiently.

The long and short of it? A marketing team most likely has to run optimization tests in the agile framework—or at least adopt some agile thinking—to ensure customers get dynamically personalized experiences.

Do Go Chasing Agifalls

But what if your team or company doesn’t work in the agile approach? That’s OK! While agile and waterfall both have staunch, purist defenders, the reality is many teams use a hybrid model (often called “agifall”) because neither approach is perfect. “Agifall combines the best of both worlds,” says Mark Fromson, founder of LocalSolo.com, “injecting agile methodologies into a loose waterfall process to increase speed, decrease cost, and improve quality.”

Part of what makes optimization so special is its utilization of real-time insights. Agile is better than waterfall at helping marketers use data as it comes in to tweak their tests, enabling better, faster test results for a better customer experience.

And speaking of CX, did you know more than 90 percent of CMOs and VPs of e-commerce state that customer experience optimization (CXO) is a must-have for their organization to increase revenue growth, engagement, and ROI?

Download the CXO Buyer’s Guide and see why successful CXO requires an optimization platform that supports all aspects of testing and personalization for web, mobile, and apps.
