Writing Survey & Progressive Profiling Questions that Deliver Valuable Insights

June 13, 2024 | 9 minute read
Chad S. White
Head of Research, Oracle Digital Experience Agency

A version of this post was originally published on MarketingProfs.com.

With the impending end of third-party cookies, brands are increasingly focusing on collecting more zero-party data directly from their customers and prospects so they can better understand their needs and wants. While larger surveys will play a role, most brands will gain insights a few answers at a time through signup forms, profile pages, polls, and other forms of progressive profiling.

It sounds simple, but writing a good question and collecting reliable answers is harder than it seems. Indeed, things can go wrong:

  1. Before you write a question
  2. When writing the question
  3. When writing answer choices
  4. Around the timing
  5. When analyzing the responses
  6. When repeating data collection

Let’s talk about best practices and things to look out for during each of those steps.

1. Before You Write a Question

It’s possible to go wrong before your progressive profiling efforts even begin. Consider these issues:

Always start by understanding exactly what your brand wants to learn from your audience. What’s your objective? Why do you want a certain piece of information from them? How are you going to use or operationalize that data point? Does that data give you the insights you want?

That last question gets at a disconnect that many brands struggle with. The goal isn’t to collect data. Not really. It’s to gain insights you can use to drive the desired outcome. Yes, you need data to get insights, but the two aren’t the same thing.

My favorite example of this from B2C marketers is when they ask customers for their gender. Most of the time this data is used to personalize message content with products for men or women. However, a person’s gender doesn’t tell you what kind of products they want to buy, because they could be buying primarily for someone else or could be interested in products for the opposite gender or both genders.

So, the better question is the more direct one: Are you interested in products for men, women, or both?

Don’t ask questions you won’t act on. Simply asking a question sets an expectation that you’ll use that information to make the customer experience better in some way, even if you’re just going to share the results back with the community. If you don’t do anything with it, that can lead to disappointment and lower response rates for future progressive profiling efforts.

Also, gone are the days when you’d collect information because you might need it in the future…at some point…maybe. You don’t want the liability of retaining data you’re not using, so don’t collect it in the first place.

2. When Writing the Question

Once you’re clear on your objective, then it’s time to craft your question. Here are some things to keep in mind:

Craft questions that are universally understood. To the degree that it’s appropriate, avoid jargon and technical language. If it’s needed, consider providing quick definitions in parentheticals. If your audience is international, think about non-native speakers, who might struggle to understand some long words, colloquialisms, and cultural references. And lastly, use unambiguous time windows, such as saying the past 12 months instead of the past year, which some might interpret as the last calendar year. 

Provide any needed context before the question. The primary concern here is that some people, once they’ve read a question, will skip straight to the answer choices because they’re in a hurry. (And everyone’s in a hurry.) If your context comes after the question, they’ll never see it. Another reason is that in the absence of immediate context, people bring their own context to a question, which forces your post-question context to work harder to override the respondent’s initial thinking.

Avoid context and introductory statements that might impose a bias on answers. For example, you shouldn’t ask, Given the current state of the economy, do you think now is a good time to change supply chain management software providers? You’ll get more accurate answers without that introductory clause.

Ask judgment-free questions. Marketers are great at asking leading questions in marketing copy, but you don’t want to do that in polls and surveys if you want meaningful results. Sometimes that means you need an introductory statement or clause that gives the respondent cover to answer truthfully about something that might otherwise make them look or feel bad. For example, you might preface a question with a clause like Recognizing that you don’t have full control over your program… to make it easier for respondents to answer truthfully.

Recognize that people are bad at remembering past behavior. They provide the most reliable answers about now and the recent past. When you’re asking about the actions of their organization, things can get even hazier, since the respondent may be relatively new to their company. Consider asking about actions or behaviors from the past 12 months at most.

Avoid redundant questions. The more questions you ask, the lower your completion rate will be, so ask as few questions as possible. For example, I saw a recent B2B lead-gen form that asked for both the person’s country and world region. If you have the person’s country, you can derive their world region, so the second question was completely unnecessary.
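
As an illustration, here’s a minimal Python sketch of deriving region from country at data-ingestion time instead of asking for both. The COUNTRY_TO_REGION table and region_for helper are hypothetical names, and the mapping is deliberately incomplete.

```python
# Hypothetical country-to-region lookup, so the form never needs a
# separate region question. Illustrative entries only, not exhaustive.
COUNTRY_TO_REGION = {
    "United States": "North America",
    "Canada": "North America",
    "Germany": "EMEA",
    "United Kingdom": "EMEA",
    "Japan": "APAC",
    "Australia": "APAC",
}

def region_for(country: str) -> str:
    """Return the world region for a country, or 'Unknown' if unmapped."""
    return COUNTRY_TO_REGION.get(country, "Unknown")

print(region_for("Germany"))  # EMEA
```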

3. When Writing the Answer Choices

The vast majority of the polling and surveying you do will most likely involve answer choices rather than open-ended questions. So, consider the following when crafting those answer choices.

Make answering easy. While this has a lot to do with the questions you ask, the answer choices you provide also have a major impact on how easy a question is to answer. For example, a recurring question I’ve asked marketers is:

What percentage of your company’s email marketing revenue is generated by automated and transactional emails?

  • Less than 20%
  • 20% to 50%
  • More than 50%
  • Not sure

Ranges make answering the question much easier, because the chances of respondents knowing the exact percentage are really low, and you absolutely don’t want people to go hunting for information, because they probably won’t come back. Five- and three-point rating scales (e.g., Always, Sometimes, Rarely) generally produce better results than larger scales while keeping things easy.

Be careful when using subjective measures. Sometimes, beauty is indeed in the eye of the beholder. Other times, it’s not. For instance, some people think that an email deliverability rate of 50% is good, but it’s actually horrible. So, if you asked brands about their email deliverability, you’d likely get very different distributions depending on whether the answer choices were Excellent, Good, or Poor versus Over 95%, 90% to 95%, or Below 90%.

Provide an N/A option. Even if answering a question is optional, give people the option to not answer the question by selecting N/A, Not sure, or Don’t know—or a combination of those, like Not sure or don’t know. Otherwise, they’ll guess or put down the answer they think you want to hear, degrading the accuracy of your responses.

4. Around the Timing

When to ask your audience questions depends on several factors, but the most consequential is whether the responses are useful long-term or short-term.

Answers that are useful long-term—for many months to years—include demographic information like a prospect’s name, mailing address, company name, and industry; and technographic information like details about their tech stack. This information doesn’t change often, so it’s useful over a long period of time. That also means you can collect it throughout the year over the course of multiple campaigns.

Answers that are useful only in the short-term—for a few weeks or months—include, for example, whether a prospect is attending an upcoming industry event. This kind of information is incredibly valuable. However, this kind of question needs to be asked close enough to the event that respondents are sure they’re going, but not so close that you don’t have enough time to act on the responses. 

5. When Analyzing the Responses

Get all that right and you can still stumble when it comes time to interpret the results. Consider these issues:

Whether to report N/A, Not sure, and Don’t know responses. Generally, these answers aren’t meaningful, so it’s best to remove this noise from your reported results. However, it can be telling if, say, a majority of respondents select these answers. That can signal the technology, tactic, product, or whatever else you’re asking about has low awareness, which can be interesting in and of itself. (It could also signal that your question is confusing.)
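
If you’re working with tabular response data, that filtering is straightforward. Here’s a minimal pandas sketch; the column name and answer values are hypothetical, modeled on the automation-revenue question above.

```python
import pandas as pd

# Hypothetical single-question results: one row per respondent.
responses = pd.DataFrame({
    "automation_revenue_share": [
        "Less than 20%", "20% to 50%", "Not sure",
        "More than 50%", "Not sure", "Less than 20%",
    ]
})

NOISE = {"N/A", "Not sure", "Don't know"}
col = responses["automation_revenue_share"]

# Check the noise share first: a high rate can signal low awareness
# (or a confusing question) and may be worth reporting on its own.
print(f"Noise rate: {col.isin(NOISE).mean():.0%}")

# Report the answer distribution with the noise removed.
print(col[~col.isin(NOISE)].value_counts(normalize=True))
```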

Look for opportunities to simplify the story. Just like you don’t have to report your N/A answers, in some cases it’s okay to roll answer choices together to tell a cleaner story. For example, say you asked respondents to respond to a statement using a 5-point Likert scale of (1) Strongly Disagree, (2) Disagree, (3) Neither Agree nor Disagree, (4) Agree, and (5) Strongly Agree. In many circumstances, and especially if the percentages for the two Strongly answers are small, it will probably make sense to combine the two disagree answers and the two agree answers when reporting results.
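
Here’s what that roll-up might look like as a minimal pandas sketch; the sample data and the three-point labels (Disagree, Neutral, Agree) are illustrative assumptions, not a fixed convention.

```python
import pandas as pd

# Hypothetical 5-point Likert responses to a single statement.
likert = pd.Series([
    "Strongly Disagree", "Disagree", "Neither Agree nor Disagree",
    "Agree", "Agree", "Strongly Agree",
])

# Collapse five points to three for reporting: both disagree answers
# roll up to Disagree, both agree answers roll up to Agree.
ROLLUP = {
    "Strongly Disagree": "Disagree",
    "Disagree": "Disagree",
    "Neither Agree nor Disagree": "Neutral",
    "Agree": "Agree",
    "Strongly Agree": "Agree",
}

print(likert.map(ROLLUP).value_counts(normalize=True))
```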

Reporting statistically significant results across segments. Particularly with demographic questions, brands often provide a long list of answer choices because they want more granular data. For example, they might ask about company size and provide lots of choices, like Fewer than 10 employees, 11-25, 26-50, 51-100, 101-200, 201-500, 501-1000, 1001-2000, 2001-10000, More than 10000.

Reporting those responses is fine, as it gives your audience valuable perspective on your respondents. However, sometimes brands then try to report out how each of those groups answered other questions. Depending on your brand’s audience and how the survey or poll was fielded, you might have a relatively small number of respondents in some of those buckets—too small for the results to be meaningful.

In many instances, it makes sense to combine several answer choices. For example, in past surveys, I’ve rolled together respondents by company size into two buckets: 500 or fewer employees and More than 500 employees. That dividing line created two groups that were fairly even in size, each large enough to yield meaningful insights about the differences between what Smaller companies and Larger companies were doing.
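
A minimal pandas sketch of that approach might look like the following; the column names, sample data, and minimum-sample threshold are all hypothetical. The guard at the end also addresses the small-bucket problem described above.

```python
import pandas as pd

# Hypothetical respondents: granular company-size answers plus one
# other question we want to segment by size.
df = pd.DataFrame({
    "company_size": ["11-25", "501-1000", "Fewer than 10 employees",
                     "2001-10000", "51-100", "More than 10000",
                     "26-50", "101-200"],
    "uses_automation": ["Yes", "Yes", "No", "Yes", "No", "Yes", "No", "Yes"],
})

SMALLER = {"Fewer than 10 employees", "11-25", "26-50",
           "51-100", "101-200", "201-500"}
df["size_bucket"] = df["company_size"].map(
    lambda s: "500 or fewer employees" if s in SMALLER
    else "More than 500 employees"
)

# Only report a segment if it has enough respondents to be meaningful.
MIN_N = 30  # illustrative threshold; choose one appropriate to your sample
for bucket, group in df.groupby("size_bucket"):
    n = len(group)
    if n < MIN_N:
        print(f"{bucket} (n={n}): too few respondents to report reliably")
    else:
        print(f"{bucket} (n={n}):")
        print(group["uses_automation"].value_counts(normalize=True))
```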

Understanding intent. Just like people are bad at remembering things that happened more than 12 months ago, they also tend to be bad at predicting what they and their organization will do in the year ahead. Of course, it varies depending on what you’re asking about and how much effort, cost, and buy-in is required, but in my experience fewer than half of respondents follow through on what they say they’ll do in polls and surveys.

That’s fine; your respondents are answering to the best of their ability. Just take care not to overstate the findings of intent-based questions when you report the results.

6. When Repeating a Question, Survey, or Poll

Collecting data on the same question over and over is powerful, but there are a couple of issues to be mindful of.

Be mindful of changing previously asked questions and their answer choices, as that will likely render historical answers useless for comparison or trend purposes. Even changing an introductory statement or clause can change the responses so much that you can’t compare them to past responses. That said, if you’ve identified a serious flaw in the wording of a past question, don’t hesitate to reword it so you get more reliable answers going forward. Similarly…

Be mindful of controlling your audience. For example, if you promoted a poll question to your email and social audiences last year, but then this year you work with partners to have them also share your poll question, the poll results could be materially different simply because of that audience change. That’s not to say that expanding your audience is bad, but don’t ignore that change when analyzing results.

Tighter Privacy Protections

As protections strengthen, both in terms of privacy regulations and platform privacy features (e.g., MPP, LTP, and HME), brands need more ways to collect information about their customers and prospects so they can understand their audience better and create more relevant experiences. Asking your audience questions through forms, surveys, polls, and other progressive profiling mechanisms is a highly valuable way of staying close to them and serving them better.

—————

Need help with your progressive profiling and targeting? Oracle Digital Experience Agency has hundreds of marketing and communication experts ready to help Responsys, Eloqua, Unity, and other Oracle customers create stronger connections with their customers and employees—even if they’re not using an Oracle platform as the foundation of that experience. With a 96% satisfaction rate, our clients are thrilled with the award-winning work our creative, strategy, and other specialists do for them, giving us an outstanding NPS of 70.

For help overcoming your challenges or seizing your opportunities, talk to your Oracle account manager, visit us online, or email us at OracleAgency_US@Oracle.com.

Chad S. White

Head of Research, Oracle Digital Experience Agency

Chad S. White is the Head of Research at Oracle Digital Experience Agency and the author of four editions of Email Marketing Rules and nearly 4,000 posts about digital and email marketing. A former journalist, he’s been featured in more than 100 publications, including The New York Times, The Wall Street Journal, and Advertising Age. Chad was named the ANA's 2018 Email Marketer Thought Leader of the Year. Follow him on LinkedIn, Twitter, and Mastodon.

