Research Panel Quality Check: Trust Then Verify

Authored by Krista Daly and Philippe Boutros

“Quality, quality, quality: never waver from it, even when you don’t see how you can afford to keep it up. When you compromise, you become a commodity, and then you die.”—Gary Hirshberg

The market research industry has struggled for years to recruit the right respondents and ensure the best research panel quality. It’s a well-documented problem that industry publications have covered extensively.

We’ve talked about issues like this on our blog before, but we figured it was time for a more in-depth exploration of the problem. So, we’re writing a multipart series where we’ll assess what it means to have a quality panel, how to make a great survey, and where panels and panel firms came from.

First Things First: Why Do Panel Firms Exist?

You might wonder why a market research firm would outsource its recruiting at all, especially when having the right respondent is so crucial to the success of a research study. After all, the best questions and analysis won’t generate meaningful insights if you’re talking to the wrong people.

Yet, outsourcing a recruiting effort is remarkably common. Nearly every market research firm outsources its recruiting of research participants. And nearly everyone in the world has been targeted for recruitment at some point in their life.

For decades, both B2C and B2B surveys have used the same methods: phone calls, emails, text messages, engagement through online communities, and even in-person intercepts in public spaces. Were you ever asked to take a quick survey while you were at the mall? Have you recently received a phone call or text message from someone asking you to participate in a survey?

This is the panel provider contacting you on behalf of the market research firm. Even if you haven’t been contacted yet, it is likely that you are in one or several panel provider databases already.

For example, panel firms like Dynata and Innovate MR have pre-built databases of qualified respondents who are B2B professionals. These constantly maintained databases allow panel providers to quickly find respondents for quantitative and qualitative research studies.

These same panel providers can quickly source international respondents, and they continuously look for new respondents to update and add to the database. As a result, going with a panel firm can be an affordable option, especially when compared with building your own database of respondents and maintaining it over time as job titles, roles, and employers change.

Seeking Out the Experts

You might also be familiar with expert networks. These organizations take a slightly different approach than panel firms. Expert networks tend to focus on qualitative research recruitment more so than quantitative research. Additionally, expert networks tend to focus on candidates who have skills that line up with a particular niche.

Expert networks were initially popular with hedge funds and investors who needed a quick crash course on an industry they wanted to invest in. Expert networks have grown substantially in the last five years, especially as data about businesses and their employees has become readily available through platforms like LinkedIn.

Expert networks also do a degree of vetting to ensure that an expert is truly a knowledgeable person to interact with. Finally, they also ensure that each expert is willing and able to be contacted for research efforts.

Unfortunately, many expert networks require you to sign an agreement that limits your ability to engage with the experts you are put in contact with unless you use that same expert network provider. Over time, if you make extensive use of expert networks, this can substantially limit the number of experts you can contact for research efforts.

How Do I Work With A Panel Firm?

Let’s say you need to conduct a survey of 1,000 IT professionals across multiple categories: manufacturing, hospitality, education, finance, and marketing. You know you don’t have the resources to field all these respondents yourself via LinkedIn, your own database of customers and prospects, or any other method. So, you decide to hire a panel firm.

Before you decide who you’re working with, there are some important questions you should get answers to. Here are just a few:

  • How does this panel firm build up its panel?
  • Who is in the panel?
  • How often is profile information updated for panelists?
  • How often are panelists contacted or invited to participate in surveys?
  • What is the panel firm doing to guard against bad data?
  • How do I know respondents to my survey are providing quality answers I can rely on?

Once you’ve decided on the right panel firm, you’ll need to meet with a panel manager to kick off your project. Start by relaying all your project specifications. Explain the length of the interaction, the number of respondents you’re aiming for, the project timeline, and—most importantly—who your target audience is.

Actual outreach will begin with a soft launch, where only a subset of potential respondents receive the survey. The soft launch lets you make sure the participants you’re getting line up with your expectations.

During the soft launch, you’ll be able to evaluate the following:

  • Are you targeting the right people, or does your audience need to change?
  • Is your questionnaire too lengthy, resulting in high levels of respondent drop-off? (A quick way to measure this is sketched below.)
  • Do your questions flow well from one to the next?
  • Do respondents understand the questions, or do the responses suggest they’re answering something other than what you intended to ask?

Validate these questions first. Then the panel firm can move forward with the rest of the respondents and complete the study.
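For the drop-off question in particular, even a small soft launch gives you numbers you can check directly. Below is a minimal Python sketch, assuming a simple export with per-respondent completion status and the last question each person saw; the column names are illustrative, not a standard survey-platform schema.

```python
import pandas as pd

# Hypothetical soft-launch export. "completed" and "last_question_seen"
# are assumed column names for this illustration.
soft_launch = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5],
    "completed": [True, True, False, True, False],
    "last_question_seen": [25, 25, 12, 25, 18],
})

# Overall drop-off rate across the soft launch.
drop_off_rate = 1 - soft_launch["completed"].mean()
print(f"Drop-off rate: {drop_off_rate:.0%}")  # 40%

# Where do people abandon? A cluster at one question points to a
# problem with that question: length, confusing wording, or bad flow.
print(soft_launch.loc[~soft_launch["completed"], "last_question_seen"])
```

If drop-offs cluster at a single question rather than spreading evenly, the fix is usually that question, not the overall survey length.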

Working with Panel Firms: Trust Then Verify

The Russians have a proverb: “Doveryai, no proveryai,” or “trust, but verify.” It’s fine to build a relationship on good faith, but you should verify expectations and commitments throughout that relationship.

Based on over 15 years of experience doing B2B tech market research, we’ve identified nine common problems that undermine research panel quality, along with ways to verify the quality you’re getting. By asking the right questions and analyzing the answers from early respondents, you can catch these problems before they negatively affect your study.

1. Fishing in the Wrong Pond

It’s true panel firms may have enormous databases of B2B professionals. But do you know how often they keep on top of changes in their panels?

Unlike in consumer surveys, in B2B it’s critical to remain current on promotions, job changes, and new careers.

For example, a financial director in a mid-market company who moves to an enterprise is no longer the right fit for a survey targeting leaders at companies with fewer than 1,000 full-time employees. Or, an IT manager at an enterprise who becomes a chief information officer at a startup will no longer be your target audience when you’re surveying non-executives.

To highlight these differences in B2C vs. B2B participants, we made a video focusing on one persona: B2B Brian.

In the video, you’ll learn that Brian is a 38-year-old who loves playing guitar and likes pasta. One year later, his interests are still about the same. And since personal preferences don’t change much year-to-year, it’s easy to recruit him for a B2C study.

Brian’s career path, however, might change relatively quickly. At 38, he might be a senior software engineer working on data center initiatives at Microsoft. The following year, he could become the CTO of a tech startup focused on biotech.

While 38-year-old senior software engineer Brian might be a good fit for one B2B quant study, he fits a completely different type of study once he transitions to his CTO role.

In sum, to get the most accurate panel profile information, ask your panel firm how frequently they update this information and what they do to stay current on these changes.

2. Slow and Steady Wins the Race

You should have a sense of how long your survey should take. When there’s a significant difference between estimated and actual time, you know there’s a problem.

“If the survey is supposed to take eight minutes, but someone finishes in two, that’s extremely suspicious.”—Alissa Ehlers, Senior Research Analyst, Cascade Insights

Panel firms that aren’t diligent might have a number of professional survey takers in their panels. In a web-based survey, this type of respondent will select answers at random and type filler into write-in responses. Essentially, the respondent completes the bare minimum in an effort to rush through the survey just to get to the incentive payment at the end.

To disqualify extreme outliers like this, you should check each participant’s actual completion time against an average or estimated time to complete. Then remove anyone who clearly didn’t read through the questions.
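As a rough illustration, here’s what that check might look like in Python. The one-third-of-median cutoff and the column names are assumptions for this example, not an industry standard; tune the threshold to your own survey.

```python
import pandas as pd

def flag_speeders(responses: pd.DataFrame, min_fraction: float = 1 / 3) -> pd.DataFrame:
    """Flag respondents who finished far faster than the median time."""
    cutoff = responses["duration_seconds"].median() * min_fraction
    out = responses.copy()
    out["speeder"] = out["duration_seconds"] < cutoff
    return out

# Example: an eight-minute survey where one respondent finished in two.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "duration_seconds": [480, 510, 120, 465],
})
print(flag_speeders(df)[["respondent_id", "speeder"]])  # respondent 3 is flagged
```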

3. Beware of Bogus Claims

Companies often include survey questions to measure perceptions of themselves and their competitors. To narrow down which brands each respondent is asked about in depth, initial questions often ask which brands from a list the respondent is aware of and/or has experience using.

For example, a survey may include a question like, “What is your familiarity with the following: Oracle, Snowflake, VirtuaCloud, and Salesforce?” The respondent would select from a multiple-choice list to indicate if they currently use, have used in the past, have heard of but never used, or have never heard of each of these brands.

Panel firms that fail to do their due diligence can present you with respondents who say they are familiar with fake company names (like VirtuaCloud) that are often incorporated into the survey as a quality check.

Check if these types of participants have fallen into other traps, and if they have, disqualify them. Also, immediately disqualify those who say they’re actively using a fake brand or tool.

“While it’s possible to misremember hearing a name, it’s far less reasonable to claim you’re actively using something that doesn’t exist.”—Alissa Ehlers
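A simple post-field check along these lines might look like the sketch below. The answer codes and column naming are assumptions about how a survey platform might export the familiarity question; VirtuaCloud is the fictional brand from the example above.

```python
import pandas as pd

def fake_brand_flags(responses: pd.DataFrame, brand: str = "VirtuaCloud") -> pd.DataFrame:
    """Flag respondents who claim familiarity with a fake brand."""
    col = f"familiarity_{brand}"
    out = responses.copy()
    # Claiming active use of a nonexistent product: disqualify outright.
    out["auto_dq"] = out[col] == "currently_use"
    # Claiming past use or mere awareness: flag for review against other
    # traps rather than disqualifying on this answer alone.
    out["review"] = out[col].isin(["used_in_past", "heard_of"])
    return out

df = pd.DataFrame({
    "familiarity_VirtuaCloud": ["never_heard_of", "heard_of", "currently_use"],
})
print(fake_brand_flags(df)[["auto_dq", "review"]])
```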

4. Keep Them On Their Toes

Do you know how well research panel participants maintain their attention as they’re going through your survey? While this is less of a problem in a phone interview, respondents taking an online survey might be tempted to pick answers at random, or to choose a pattern like “CAB” and repeat it throughout just to finish.

To catch respondents like this, you can ask your panel firm to incorporate survey questions such as “What is 3×20?” or “Which of the following is not a US state? Regardless of the true answer, please select Oregon.” These are easy traps that will catch respondents who aren’t paying full attention to the questions.

Additionally, patterns like “CAB” are quick to find as you skim through the survey data. Many survey platforms (such as Alchemer) include data quality validation tools that automatically flag respondents giving patterned responses. It’s best to look further into respondents with patterned responses, and if they’re hitting multiple traps, throw them out.
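If you’d rather not skim by hand, a small pattern check is easy to write. The sketch below flags a respondent when one short sequence of answer choices explains most of their answer string; the pattern length and 90% coverage threshold are arbitrary starting points, not established values.

```python
def has_repeated_pattern(answers: str, max_len: int = 4, min_coverage: float = 0.9) -> bool:
    """Return True if one short repeating pattern covers most of the answers."""
    n = len(answers)
    if n == 0:
        return False
    for size in range(1, max_len + 1):
        pattern = answers[:size]
        repeated = (pattern * (n // size + 1))[:n]  # tile the pattern to length n
        matches = sum(a == b for a, b in zip(answers, repeated))
        if matches / n >= min_coverage:
            return True
    return False

print(has_repeated_pattern("CABCABCABCAB"))  # True: "CAB" repeats cleanly
print(has_repeated_pattern("BDACCABDBCAD"))  # False: no short cycle
```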

5. Are You Seeing Gibberish?

Quantitative surveys tend to have at least a few questions that ask for a write-in answer. Low-quality respondents often give themselves away by answering in ways that don’t make sense. They might write a random series of letters or numbers, or phrases that look like a real answer at a glance but make no sense in the context of the question.

While data cleansing tools, like those from Alchemer, can catch obvious issues, you need an expert to skim through responses to determine whether something is truly gibberish. For example, “we’ve virtualized all our containers” isn’t word salad; it’s a true statement for a new VMware customer.

Read through these write-ins yourself early on to make sure legitimate respondents aren’t being flagged and disqualified by mistake.
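For the obvious cases, a first-pass screen can be as simple as the sketch below. Anything it flags should still go to a human reviewer, since legitimate domain language can look odd at a glance; the length and vowel-ratio heuristics are rough assumptions, not validated rules.

```python
import re

def looks_like_gibberish(text: str) -> bool:
    """Heuristic first pass: flag very short, non-alphabetic, or vowel-free answers."""
    stripped = text.strip()
    if len(stripped) < 3:
        return True
    letters = re.sub(r"[^A-Za-z]", "", stripped)
    if not letters:
        return True  # all digits or symbols, e.g. "12345" or "!!!!"
    vowel_ratio = sum(c in "aeiouAEIOU" for c in letters) / len(letters)
    return vowel_ratio < 0.15  # keyboard mashing tends to skip vowels

print(looks_like_gibberish("asdfjkl"))  # True
print(looks_like_gibberish("we've virtualized all our containers"))  # False
```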

6. Do the Math

Before fielding your survey, you should identify questions that you can apply consistency checks to. Across pairs of questions, there are combinations of answers you’d never expect from a valid respondent. Data cleansing tools can help flag these respondents.

For example, it wouldn’t make sense for a person to say they’ve never heard of a specific tool but then later indicate that they use that tool. Or for a person to say they’re extremely satisfied with a particular platform but then say they’re also extremely likely to buy a new one. It doesn’t add up.

The respondent may have made an honest mistake. However, like the awareness questions, you should flag that survey for further review to see if the respondent has fallen into other traps.
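Consistency checks like this are easy to express in code once you’ve identified the question pairs. Here’s a minimal sketch using the never-heard-of-it-but-use-it contradiction above; the column names and answer codes are illustrative assumptions.

```python
import pandas as pd

def inconsistency_flags(responses: pd.DataFrame, tool: str) -> pd.Series:
    """Flag respondents whose awareness and usage answers contradict."""
    never_heard = responses[f"awareness_{tool}"] == "never_heard_of"
    uses_it = responses[f"usage_{tool}"] == "currently_use"
    return never_heard & uses_it

df = pd.DataFrame({
    "awareness_Snowflake": ["never_heard_of", "heard_of"],
    "usage_Snowflake": ["currently_use", "currently_use"],
})
print(inconsistency_flags(df, "Snowflake"))  # first respondent is flagged
```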

7. Non-Reflective Sampling

Your panel firm’s respondents might not be as representative of the market as you’d like. You may have too many respondents from the same industry, or be missing a demographic entirely.

It’s a good idea to check the responses against a separate data source. For example, if all respondents in the sample are AWS users, but the market isn’t 100% AWS, flag that.
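One way to operationalize this is to compare the sample’s mix against an external benchmark and flag large gaps. In the sketch below, the benchmark shares and the 15-point tolerance are placeholders, not real market data.

```python
import pandas as pd

# Hypothetical external benchmark: each provider's share of the market.
benchmark = {"AWS": 0.45, "Azure": 0.30, "Google Cloud": 0.25}

# A skewed sample: 90% of respondents are AWS users.
sample = pd.Series(["AWS"] * 90 + ["Azure"] * 8 + ["Google Cloud"] * 2)
observed = sample.value_counts(normalize=True)

for provider, expected in benchmark.items():
    share = observed.get(provider, 0.0)
    if abs(share - expected) > 0.15:  # tolerance threshold is an assumption
        print(f"Flag {provider}: sample {share:.0%} vs. benchmark {expected:.0%}")
```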

In one study for a large cloud-computing platform, we surveyed data scientists about which cloud provider they viewed as the leader for deep learning workloads. Some odd patterns emerged from the very beginning.

Based on our knowledge of the market, it seemed strange that many data scientists we sampled named Oracle as the leader, given Oracle’s low penetration in the market we were surveying. We flagged it for our panel provider and kept a close eye on the results to ensure the responses reflected the major players (AWS, Azure, Google Cloud).

Alternatively, you might discover the insights you’re receiving are coming from the wrong role or job title, even if you specified it up front.

For example, during one study we conducted for an enterprise cybersecurity company, we found an early anomaly in the responses we received. The responses, while technically correct, didn’t fit what we knew to be true in the market.

That’s because the survey was initially targeting directors of IT. Once the panel firm switched its focus specifically to cybersecurity-focused roles, the results were much more aligned with expectations.

As B2B tech experts, we understand nuances in the market that other firms wouldn’t necessarily be able to catch. As a result, we catch issues sooner, and our surveys produce more accurate results.

8. Sidestepping Responsibility

Panel firms sometimes hand things off to a sub-contracted panel firm without informing you that they’re not doing the work themselves. Since you don’t know who is actually doing the work, you can’t easily check the research quality of the sub-contracted firm.

If you start seeing a new and suspicious pattern of low-quality samples from a formerly reliable panel, that’s one clue that they’ve begun outsourcing some work. Aim for mutual transparency: ask panel firms to communicate any contractor usage to your firm, and apply extra caution and quality checks if a contractor is brought in.

9. Clever Bots

Even at the best of times, the quality of panel data is questionable.

Quality checks eliminate a large percentage of poor-quality participants, but they don’t catch every single one. Some manage not to trip the traps through sheer luck.

Other “participants” are bots with the ability to learn about new traps from each additional survey they take. As the bots get smarter, they’ll trip fewer and fewer traps, and ultimately get through more surveys.

To stop bots from picking up on traps, consider adding “soft disqualification questions.” Don’t build in logic that automatically disqualifies someone for poor quality (which bots might learn from). Instead, allow these respondents to complete the survey, and then filter them out of the final data analysis.
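In practice, that means recording trap results quietly and doing the filtering in analysis rather than in survey logic. A minimal sketch, assuming your platform exports one boolean column per trap (the column names are illustrative):

```python
import pandas as pd

# Assumed trap-result columns; however your platform records them, the
# survey itself never branches on these, so bots get no signal.
TRAP_COLUMNS = ["failed_math_trap", "claimed_fake_brand", "patterned_answers"]

def soft_dq(responses: pd.DataFrame, max_traps: int = 1) -> pd.DataFrame:
    """Drop respondents who tripped more than `max_traps` quality traps."""
    trap_count = responses[TRAP_COLUMNS].sum(axis=1)
    # Every respondent completed the survey; filtering happens here,
    # after fielding, not during it.
    return responses[trap_count <= max_traps]
```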

Constant Vigilance

Considering the questionable quality of panels, you need to ask yourself, “What am I doing to remain vigilant?” The best answer to this question is to verify, verify, and verify again that you are getting the quality you expect.

Neglecting quality checks can do enormous harm to your business strategy. You might be misled into thinking a feature is worthwhile or that the market is primed for your new product. But ultimately, you might just be plunging off a cliff into failure.

To make smarter decisions, based on quality research, consider each of the nine hurdles in front of you. Tackle them one by one, and you’ll be able to come away from your market research study with some real insights.


With 15 years of experience in B2B tech market research, Cascade Insights can help ensure your research is of the highest quality. For more information on B2B market research, see our guide, What is B2B Market Research?

Special thanks to Alissa Ehlers, Senior Research Analyst, Scott Swigart, President & CTO, and Philippe Boutros, Director & Chief of Staff, for advising on this piece.
