Survey Panel Pains – What is Really Going On?

Matt Dusig recently published a great article on his Innovate MR blog. It was a solid counterpoint to the growing lack of confidence researchers feel in the state of online panels.

Commenters like Adrianna Rocha and Dave McCaughan have argued that online panels are no longer responsive, that email click-through rates are abysmal, and that panelists must attempt 10 surveys before earning a meaningful reward. So is there a solution for online panel quality, or are we doomed for all eternity?

Well, good news from Matt: the sad state of online survey panels is really no one’s fault!

Panel companies create databases of people interested in participating in surveys to earn a reward. For the most part, these people genuinely want to provide honest data in exchange for that reward. Survey panelists participate in their spare time and don’t really value that time at the hourly rate they earn at work. While building these databases, the panel sites ask each user for basic demographic, geographic and psychographic information.
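To make that concrete, here is a minimal sketch of the kind of profile record a panel might build at signup. The field names are illustrative only, not any panel company’s actual schema.

```python
from dataclasses import dataclass, field

# Illustrative profile record a panel might build at signup.
# Field names are hypothetical, not any vendor's real schema.
@dataclass
class PanelistProfile:
    panelist_id: str
    # Demographic
    gender: str
    age_band: str
    # Geographic
    country: str
    region: str
    # Psychographic / lifestyle answers accumulated from profiling questions
    attributes: dict = field(default_factory=dict)

panelist = PanelistProfile(
    panelist_id="p-1001",
    gender="male",
    age_band="35-54",
    country="US",
    region="CA",
    attributes={"has_psoriasis": True},  # learned from an earlier profiling question
)
```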

When researchers have a study that needs sample, they send a specific request for the types of people that need to be targeted. Sometimes the sampling firm can match 100% of the client request data fields to the panelist data. But more often than not, the sampling firm doesn’t have a 100% data match in its database. Why? Because there are millions of possible questions that could be asked of a respondent. For example: “I want men, 35-54, living in California, who have psoriasis but don’t actively treat it with medicine.” A sampling firm may know everything except whether or not the respondent is using medicine. Therefore, either the sampling firm asks the user to answer extra questions before being redirected to the client survey, or it sends the user into the client survey to be screened. The latter increases the possibility of a poor user experience, as the user will most likely answer many other questions before reaching the specific termination point regarding medicine usage.
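Here is a minimal sketch, under the same assumptions as the profile record above, of how a sampling firm might compare a client’s targeting spec against what it already knows about a panelist and isolate the fields that still need to be asked. This is illustrative logic, not any vendor’s actual matching engine.

```python
# Compare a client's targeting spec with a panelist's known profile:
# known non-matches rule the panelist out; unknown fields must be asked
# up front (or inside the client survey, with the risks described above).
def split_targeting_spec(spec: dict, profile: dict):
    mismatches = {}
    unknown = []
    for field_name, wanted in spec.items():
        if field_name not in profile:
            unknown.append(field_name)
        elif profile[field_name] != wanted:
            mismatches[field_name] = profile[field_name]
    return mismatches, unknown


client_spec = {
    "gender": "male",
    "age_band": "35-54",
    "region": "CA",
    "has_psoriasis": True,
    "uses_medication": False,  # the one field the panel may not already hold
}

known_profile = {
    "gender": "male",
    "age_band": "35-54",
    "region": "CA",
    "has_psoriasis": True,
}

mismatches, unknown = split_targeting_spec(client_spec, known_profile)
# mismatches == {}                  -> nothing rules this panelist out
# unknown == ["uses_medication"]    -> ask this one question before redirecting,
# rather than letting the panelist terminate deep inside the client survey.
```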

Technical integration

(Note: Matt is talking about common problems with most survey software providers – and it truly is a big problem! But the Voxco platform freely exchanges information with panel suppliers, including demographics, quotas, and more. It pays to support a panel with a solid survey software platform.)

Whether the sample company is programming the survey or just sending panelists to a client-hosted survey, there’s almost no technical integration between sampling and most survey software. That means sampling firms usually cannot pass a panelist’s already-known demographic or geographic data into most lower-end survey software. Sampling technology systems also usually have no real-time visibility into open quotas via API. So a panelist clicks into a survey only to be told, “Sorry, you don’t qualify” or “Sorry, the survey is closed.” That notification usually arrives after the panelist has already answered many questions, most of which the panel company already knew the answers to. And because sampling firms are only paid for each user who completes a survey, there are financial constraints on how much a rejected panelist can be compensated when they don’t qualify.
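To illustrate the kind of integration this paragraph calls for, here is a hedged sketch of checking open quotas in real time and passing already-known demographics into the survey link. The endpoint, parameters, and response fields are hypothetical and do not represent Voxco’s or any vendor’s actual API.

```python
import json
import urllib.parse
import urllib.request

SURVEY_API = "https://survey.example.com/api"  # hypothetical base URL


def quota_is_open(survey_id: str, cell: dict) -> bool:
    """Ask the survey platform whether this quota cell is still open (hypothetical endpoint)."""
    query = urllib.parse.urlencode({"survey": survey_id, **cell})
    with urllib.request.urlopen(f"{SURVEY_API}/quotas?{query}") as resp:
        return json.load(resp).get("open", False)


def build_survey_link(survey_id: str, panelist_id: str, profile: dict) -> str:
    """Prefill known demographics as URL parameters so the panelist isn't re-asked."""
    params = {"pid": panelist_id, **profile}
    return f"{SURVEY_API}/start/{survey_id}?{urllib.parse.urlencode(params)}"


cell = {"gender": "male", "age_band": "35-54", "region": "CA"}
if quota_is_open("S123", cell):
    link = build_survey_link("S123", "p-1001", cell)
    # redirect the panelist to `link`
else:
    # don't send the invitation at all: no "sorry, the survey is closed" dead end
    pass
```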

The solution is multi-faceted.

We (The Research Industry) need better integration between survey software and sampling technology platforms, AND we need agreement on best practices for survey design that help panelists provide valuable data. Years ago there was an industry-wide initiative for research agencies and sampling companies to create a shared data warehouse, but I don’t think there was much interest in bearing the costs of managing the platform. I still think this would be a valuable asset for the entire industry.

Mobile mobile mobile.

Do you check your email on your mobile phone? So do panelists! Do you want to take a 20+ minute survey staring at your phone? Neither do panelists! If you write and design surveys, you have to ask yourself, “Would I enjoy taking this survey?” When the answer is “Yes,” we’ll all be helping to build better-quality databases of respondents. I think the saying is: “Help me, help you.”

Read the source article from Innovate MR

Let us help you.

Discover how we can help YOUR organization solve its current survey needs.