6 ways technology can help you tackle sensitive survey questions

Members of the market research industry find themselves fighting tooth and nail for decent response rates and representative sample. So when your study is about drug use, sexual behaviors, political and religious beliefs, or other sensitive topics, you’ll need to ensure that you’re doing everything possible to keep response rates steady.

Asking sensitive questions via surveys or interviews can be considered intrusive or offensive, regardless of what the respondent’s answer may be. And if the respondent feels that their response could be contrary to popular societal norms, they may be more inclined to provide a dishonest answer that conforms to their perception of normal (this is called ‘social desirability bias’). Compounding the issues of question intrusiveness and social desirability bias, respondents may not trust that their responses will remain anonymous, or that the data will be kept secure.

Sensitive questions can negatively affect three important survey measurements: overall response rates, single question decline rates, and response accuracy – due to a higher percentage of respondents who answer sensitive questions dishonestly. That means that the inclusion of sensitive questions in a survey needs to be handled in an intelligent and delicate way whenever possible. Fortunately, technology can help.

The chosen survey channel can compound these issues, significantly increasing respondents’ hesitation to answer sensitive questions. There are three main channel types to consider:

  1. Self-completion surveys (e.g. Voxco Online): in which respondents complete surveys on their own time and in privacy;
  2. Telephone interviews (e.g. Voxco CATI): in which survey call center interviewers ask respondents questions over the phone;
  3. Personal interviews (e.g. Voxco Mobile Offline): in which interviewers use mobile devices to ask respondents questions directly in a face-to-face setting.

In general, respondents are more likely to answer sensitive questions honestly via self-completion surveys. But there’s tremendous value in choosing personal interviews over self-completion surveys, or in choosing a complementary multichannel approach, so interviews should not be ruled out immediately.

Here are six ways to use technology to boost response rates and response accuracy on sensitive questions:

1. Channel selection

As we mentioned above, self-completion surveys that can be taken by respondents in privacy offer a better atmosphere of anonymity which helps respondents feel more comfortable answering sensitive questions. The elimination of an interviewer can also reduce the social desirability bias introduced by the presence of another person.

2. Multichannel studies

Your studies don’t need to stick to only one survey method. Using an integrated survey platform, you can target a single respondent database and follow sample across multiple channels. For example, after a phone interview is completed, invite a segment of the respondents to complete an online survey. The logic of the online survey can trigger follow-up questions based on their CATI responses and probe into more sensitive topics that respondents would be less likely to confide to an interviewer.

3. Live channel switching

Consider redirecting interviewees mid-interview to a self-completion channel to ask sensitive questions. This temporarily gives respondents the anonymity and privacy needed for sensitive topics. Redirect CATI respondents to an IVR system to answer a few sensitive questions using their phone keypad, then return them to the interviewer to finish the survey. Or pass the CAPI tablet to respondents and let them enter their answers directly into the questionnaire without having to say the words aloud to an interviewer.

4. Live question wording changes

As your survey progresses, be sure to keep an eye on real-time response analytics. If you note significant drop-off rates at a specific question, or higher-than-average item nonresponse rates on specific questions, take action! Alter question wording, and instantly push the updates live. Add in some language that emphasizes survey anonymity and data security when surveys start veering into sensitive territory. That’s the benefit of live survey updating.

5. Survey logic/flow changes

If early results show that respondents are being turned off by sensitive questions, you could also adjust the logic and flow of the survey. Move sensitive questions further into the survey to build more trust and rapport with respondents before they get to them. If you have a sensitive question earlier in the survey, trigger ease-in questions and language to show only for those respondents who chose ‘refuse’ on the first sensitive question. Or go the other way: build trust with those respondents who did answer the sensitive question by hiding demographic or identifying questions at the end of the survey that could make respondents feel that their responses will be connected back to them.

6. Data hosting

Sometimes it’s enough for respondents to know that the survey is anonymous. But other respondents could want assurances that their data is being stored with the highest possible security and encryption, or on internal servers versus stored “out there” in the cloud. So give them those assurances.

Use technology to maintain response rates!

In a period of declining response rates, it is likely that respondents will be even more reluctant to take part in surveys that tackle sensitive topics. And in an age of growing concerns over data security, those who do respond to your surveys could be less inclined to reveal potentially embarrassing information about themselves.

Even in self-administered surveys, respondents may still misreport or refuse answers on sensitive questions. But there are a number of ways to use survey technology to your advantage when asking the tough questions. Get in touch with the team at Voxco if you’d like to discuss more unique ways to use a flexible survey platform to collect sensitive respondent data.

December Feature Updates


At the end of another calendar year, we’re continuing to crank out new features for Voxco Online. Think of it as a holiday gift for our clients – we’re answering their wishes for new features that boost their productivity. We couldn’t have planned it better. 😉

Here are December’s new features, which mostly aim to better support Triple-S (SSS XML v1.2) exports. Get in touch with us if you’d like help with executing any of them:

  • Added ‘Dichotomize multiple’ options for TXT exports. Previously, the ‘dichotomize multiple’ option was only available for CSV, Excel and SPSS formats. Starting this month, you will have the option to also export as TXT files (not available for SAS, SCT, SPS, or QAX formats).
  • Export all columns. If your data has multi-response questions (e.g. checkbox questions) and numeric codes, you will now be able to select ‘dichotomized multiple’ and a .SSS format to create a data file with dichotomized numeric codes.
  • Exporting loops in questionnaire order. Traditionally, the default order for exporting looped question lists followed a response-focused order (e.g. Q1_R1, Q1_R2, Q1_R3, Q2_R1, Q2_R2, Q2_R3, Q3_R1, etc.). Now you can export looped question lists in a question-focused order (e.g. Q1_R1, Q2_R1, Q3_R1, Q1_R2, Q2_R2, etc.). This will include the option to export in TXT (SSS, Triple-S XML), Excel and CSV formats.
  • Added filter option to exclude surveys where panelists are inactive. We’ve added a filtering property to the Web API for Panel Manager that allows you to exclude surveys for which the panelist is inactive. Enabling this will only return results where the respondent status is active.
  • Delete results via API after exporting. If you integrate with our system through an API, you can now delete a survey export task from a third party system. This prevents export task lists from growing indefinitely on the third party system.
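The difference between the two loop-export orderings can be sketched with a few lines of code. This is an illustrative example, not Voxco’s implementation; column labels like `Q1_R1` follow the naming shown in the feature note above.

```python
def loop_columns(questions, loop_items, question_focused=False):
    """Generate an export column order for looped questions.

    Default (response-focused): all loop items for Q1, then all for Q2, etc.
    Question-focused: every question for loop item R1, then for R2, etc.
    """
    if question_focused:
        # Outer loop over loop items, inner loop over questions.
        return [f"{q}_{r}" for r in loop_items for q in questions]
    # Outer loop over questions, inner loop over loop items.
    return [f"{q}_{r}" for q in questions for r in loop_items]

print(loop_columns(["Q1", "Q2"], ["R1", "R2"]))
# ['Q1_R1', 'Q1_R2', 'Q2_R1', 'Q2_R2']
print(loop_columns(["Q1", "Q2"], ["R1", "R2"], question_focused=True))
# ['Q1_R1', 'Q2_R1', 'Q1_R2', 'Q2_R2']
```

The question-focused ordering groups all answers for a single loop iteration together, which is often more convenient when analyzing one loop item at a time.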

Updates rolling out now!

Our SaaS servers are getting this update as early as this week. If you’re a client who hosts Voxco Online on your own premises, get in touch with us to schedule an update that will include these new exporting options.

Online surveys are not enough


Part 1: The oversaturation of online surveys.

Online will remain the #1 survey channel for the time being, and deservedly so. The channel offers the ability for respondents to privately complete surveys on their own time. For researchers, data can be processed immediately, and there are no additional costs for interviewers.

But because of the simplicity and efficiency of online surveys, the market is quickly becoming oversaturated and respondents are getting overwhelmed. Online survey invitations are everywhere, and the surveys they link to are often sloppy. And the resulting online survey clutter leads to survey fatigue, which leads to a range of data collection issues, none of which are good news for researchers:

  • Declining response rates. Many online surveys see single-digit response rates as low as 2%. And respondents who do complete the surveys generally have an extreme opinion: they’re either ecstatic or livid. It’s difficult to get a balanced opinion when you’re only talking to outliers.
  • Reduced respondent attention span. For respondents who still complete surveys, the increased frequency can affect their in-survey attention span. The more survey questions they see, the higher their tendency to burn through questions too quickly without adequate thought.
  • Market saturation. With so many surveys vying for a respondent’s attention, how can one organization get their survey noticed and completed?

So are online surveys doomed? Of course not – but researchers tasked with deploying them are certainly being challenged. We’ve discussed numerous times in the past how to make online surveys more engaging. Here is the TL;DR version:

  • Design surveys well. A no-brainer, but so important. Keep online surveys short and sweet on the surface. Nowadays, if your survey looks cheap or takes longer than five minutes, you’ll start losing a portion of the respondents who were willing to click on the link in the first place.
  • Listen and Adapt. Surveys are a two-way conversation. Ensure that the respondent knows you’re listening to their answers. Use logic to skip irrelevant questions or pipe in past responses. If a respondent feels like they are being heard, they’re more likely to share.
  • Incentivize. Incentives increase response rates which offsets your sample cost. Unincentivized survey requests are a primary target in the rising pushback against online survey invitations.
  • Personalize the invitation. Ensure that survey invitations speak to the right people, and acknowledge the respondent’s situation. Use a clean sample source and personalize the message to clarify why they were chosen (e.g. “Thanks for your purchase of X last Tuesday…”).
  • Create a community. Cut to the heart of your customer base and nurture your own panels of your loyal consumers. It’s a lot of work to create and manage, but an incentivized, permanent panel will give you a constant finger on the pulse of your most important customer segment.

Online surveys are an essential part of any Voice of Customer or research program. But in today’s industry reality, you need to accept that getting a respondent’s attention online is extremely challenging. Get your surveys noticed by complementing well-designed online surveys with surveys conducted via alternate channels.

Break through the heads-down, clutter-ignoring patterns of an average respondent’s daily routine. Think differently about how to get your survey noticed by respondents. Taking your survey project out of your own comfort zone can take you into a new zone where respondents actually notice your surveys.

We’ll expand more on this idea through this three-part blogpost series. Read Part 2 (Get surveys noticed in the offline world) now and Part 3 (The multi-channel advantage) will land in the coming weeks!

Avoid these 10 mistakes when managing panel communities

11 mistakes to avoid when running an online MR community

Stephen Cribbett of Dub Research in London recently published a post that was picked up by Quirks. The article perfectly highlights the downfalls that many researchers face when planning and launching a new panel community.

The article was written from the POV of focus groups and qualitative communities, but the concepts are still heavily influential for launching and nurturing survey panel communities. We’ve shortened it a little and highlighted the most important tips below.

In short, remember that panelists are busy people. Busy people who have a genuine interest in participating in your research. So be sure to cater to them – acknowledge them, make them feel welcome, and ensure that they feel comfortable enough that they will stick around and share their experiences with you. On with the list:

1. Creating a virtual ghost town

When your research community launches and the first person to arrive finds a community devoid of people, conversations and any life forms, they’ll be experiencing what’s referred to as a virtual ghost town. And nobody wants to be the first person at a party.

Establish some energy by seeding content from the very beginning, and sometimes even before the community has launched to participants.

2. Failing to welcome and brief participants

Eager to begin the research activities, researchers can sometimes forget the critical importance of establishing rapport and warming up participants. Encourage them to open up and express themselves, making certain that they understand that there is no right or wrong answer.

To facilitate this, consider using ice-breaking activities such as an open discussion or mini ‘fun’ surveys. When new participants come into the research community and see these existing activities, it gives them a head start in getting to know the other respondents (if structured as open discussion), or understanding the structure of the panel and future surveys (if structured as introductory surveys). Here are some example questions:

  • What do you always carry with you in your bag?
  • Who do you most admire in life and why?
  • What can’t you live without?
  • When did you last do something for the first time?

Note that some of these ‘fun’ questions can double as panelist profile attributes for future filtering!

3. Not sharing the purpose and objectives

Both the community and the individual tasks and discussions that you are launching need clear explanation. The more you can clarify the objective of asking a question or setting a task, the more likely you are to get a great response.

Trust leads to greater openness and expression from the members, which means you are more likely to uncover unexpected insights in your research. Transparency with regards to the community mission will be vital in order for you to accomplish this endeavor.

4. Not being flexible with time and structure

Structure is important but rigid adherence to pre-determined tasks can be your enemy. You want to leave enough in your agenda to allow members to touch on further topics, since this can lead to interesting ideas and themes bubbling up to the surface via qualitative discussion. Remember to build in time to explore these unanticipated topic areas.

5. Peaking too soon

The energy and excitement around the initial launch of a community can sometimes result in it peaking too early, with moderators losing focus or getting distracted with other work commitments as time passes. Be on guard against this, even for relatively short communities. Working to establish solid relationships within the community early on can help keep things going strong and lead to productive discussions and valuable insights.

6. Not getting to know your chosen community platform

When it comes to selecting the best and most appropriate market research technology, spend time doing your due diligence and learning how it works. You don’t want to make the common mistake of assuming the technology can do something it can’t. Choose a panel portal tool with a robust backend management toolbox and an engaging portal experience for panelists.

Once you’ve made your choice, make time to experiment or run trials. Give yourself time to fully understand its capabilities and how far you can push the tasks. Promotional note: Voxco Panel Manager offers a Day 1 integration kit that gives you the source code and design templates you need to get started immediately. Get in touch with us to see how it can help!

7. Expecting too much of your members

Avoid cramming in too many tasks and discussions because, ultimately, it will overload members, who will quickly lose interest and slope off. Always try to take a step back and ask yourself, “Would I be prepared to do this myself?” or “Would I be able to achieve this in the allotted time?”

8. Thinking, “If I build it, they will come.”

Simply building a research community is not enough. Communities are organic, living things that require time, effort, intellect and resources in order to be successful and reveal the insights that you need.

Take the community through its life stages, investing in people and relationships along the way. When obstacles come – and they will – try to see them as opportunities and be confident that you can always find a way to work around them.

9. Not being prepared for the volume of data

Many first-timers end up underestimating the amount of data they will collect, and then don’t have a good plan in place for how to analyze and report on that data. So a top tip is to formulate hypotheses, and be prepared for rolling analysis techniques and tools.

Typically, you should aim to begin your analysis about halfway through the community’s lifespan. If you have resources available, consider how to combine efforts and work together as a team.

10. No plan in place

You can’t launch your community on day one and then wait until day two to figure out what to do! Be sure to know how the next few weeks or months will play out. What will your role and input be? Only then can you manage client expectations. Community planning requires a careful balance: although you should avoid over-planning, it’s recommended that you have a loose framework to use your members’ time effectively.

The welcoming phase is crucial, demanding effort and energy, so having early activities set up and ready to roll will make your life so much easier, leaving you time to concentrate on planning and re-shaping further activities in the light of feedback that you gather.

Read the source article at Research Industry Voices

Survey Panel Pains – What is Really Going On?

OMG Survey Panel Pains – WTH is Really Going On?

Matt Dusig recently published a great article on his Innovate MR blog. It was a solid counterpoint to the growing lack of confidence that researchers are feeling about the state of online panels.

Commenters like Adrianna Rocha and Dave McCaughan have argued that online panels are no longer responsive, that email click-through rates are abysmal, and that panelists must attempt 10 surveys before earning a significant reward. So is there a solution for online panel quality, or are we doomed for all eternity?

Well, good news from Matt: the sad state of online survey panels is really no one’s fault!

Panel companies create databases of people interested in participating in surveys to earn a reward. For the most part, these people genuinely want to provide honest data in exchange for this reward. Survey panelists participate in their spare time and don’t really weigh the time spent against their hourly rate at work. While building these databases, the panel sites will ask the user for basic demographic, geographic and psychographic information.

When researchers have a study that needs sample, they send a specific request for the types of people that need to be targeted. Sometimes the sampling firm can match 100% of the client request data fields to the panelist data. But more often than not, the sampling firm doesn’t have 100% data match in their database. Why? Because there are millions of possible questions that could be asked of a respondent. For example: “I want men, 35-54, living in California, have psoriasis, but don’t actively treat it with medicine.” A sampling firm may know everything except for whether or not the respondent is using medicine. Therefore, either the sampling firm asks the user to answer extra questions before being redirected to the client survey or they send the user into the client survey to be screened. The latter increases the possibility of a poor user experience as the user will most likely answer many other questions before getting to the specific termination point regarding usage of medicine.
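The matching problem Matt describes can be sketched in a few lines. This is a toy illustration under assumed field names (all of them invented for the example), not any sampling firm’s actual logic: fields the firm already knows can be checked against the target spec, while unknown fields must be screened for.

```python
def match_status(target, panelist):
    """Return which targeting fields are met, violated, or simply unknown."""
    unknown, violated = [], []
    for field, wanted in target.items():
        if field not in panelist:
            unknown.append(field)       # must be screened for
        elif panelist[field] != wanted:
            violated.append(field)      # known mismatch: don't invite
    return {"qualifies_so_far": not violated, "needs_screening": unknown}

# Hypothetical target spec matching the psoriasis example above.
target = {"gender": "male", "age_band": "35-54", "state": "CA",
          "has_psoriasis": True, "treats_with_medicine": False}
panelist = {"gender": "male", "age_band": "35-54", "state": "CA",
            "has_psoriasis": True}  # medicine usage is unknown

print(match_status(target, panelist))
# {'qualifies_so_far': True, 'needs_screening': ['treats_with_medicine']}
```

The `needs_screening` list is exactly the set of extra questions the panelist has to answer, either before the redirect or inside the client survey, and it is why an unknown field late in the questionnaire makes for such a poor user experience.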

Technical integration

(Note: Matt is talking about common problems with most survey software providers – and it truly is a big problem! But the Voxco platform freely exchanges information with panel suppliers, including demographics, quotas, and more. It pays to support a panel with a solid survey software platform.)

Whether the sample company is programming the survey or just sending panelists to a client-hosted survey, there’s almost no technical integration between sampling and most survey software. Meaning, sampling firms usually cannot send a panelist’s demographic or geographic data that’s already known about them into most lower-end survey software. Also, sampling technology systems usually do not have visibility into open quotas, in real-time, via API. So, a panelist clicks into a survey only to be told, “Sorry, you don’t qualify” or “Sorry, the survey is closed.” That notification usually occurs after the panelist has already answered many different questions, and most of these answers are already known by the panel company. But because sampling firms are only paid for each user who completes a survey, there are financial constraints on how much a rejected panelist can be paid when they don’t qualify.

The solution is multi-faceted.

We (The Research Industry) need better integration between survey software and sampling technology platforms AND we need agreement that there are best practices surrounding survey design which help panelists provide valuable data. Years ago there was an industry-wide initiative for research agencies and sampling companies to create a shared data warehouse, but I don’t think there was much interest in bearing the costs of managing the platform. I still think this would be a valuable asset for the entire industry.

Mobile mobile mobile.

Do you check your email on your mobile phone? So do panelists! Do you want to take a 20+ minute survey staring at your phone? Neither do panelists! If you write and design surveys, you have to ask yourself, “Would I enjoy taking this survey?” When the answer is “Yes,” then we’ll all be helping to build better quality databases of respondents. I think the saying is: “Help me, help you.”

Read the source article from Innovate MR

Software Update: Voxco Online v5.5.1


At Voxco, we are constantly developing software features that make our platform better, faster, and more intuitive. We are excited to announce an upcoming minor update to Voxco Online from version 5.5 to version 5.5.1.

The new version introduces two brand new, in-demand features: panelist attribute synchronization and SMS text invitations.

Panelist Attribute Synchronization

Panelist attribute synchronization is a key new feature for clients who have integrated Voxco Panel Manager into their Voxco Online survey platform. The new feature automatically synchronizes panelist attribute updates with the responses of each panelist’s linked Voxco Online surveys (past, present and future).

These features will improve insights and boost panelist engagement, ensuring that no basic attribute questions need to be asked repeatedly.

  • Panelist attribute auto-synching. Panelist attribute data (e.g. email, language, gender) can now be automatically synchronized with the results of questions in your surveys. So if a survey question captures panelist attribute information (e.g. ‘What is your language preference?’) and an existing panelist answers it, the answer will automatically be recorded as a panelist attribute for future surveys. This helps with survey targeting and means you don’t have to ask your panelists the same questions in the future. (Default on. Can be turned off. Exclusion filters available).
  • Reverse survey data updates. You can now enable reverse data updates, which allow past survey data to be automatically updated to reflect changes made to panelist attributes. For example, if you add an attribute to a panelist that indicates college graduation, their responses in past surveys can now be categorized as coming from a college graduate. As you add more panelist attributes, past survey results will become more robust. (Default off)
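The two sync directions above can be sketched as follows. This is an illustrative model, not Voxco’s implementation; the question-to-attribute mapping and all field names are assumptions for the example.

```python
# Hypothetical mapping from survey question IDs to panelist attribute names.
QUESTION_TO_ATTRIBUTE = {"Q_LANG": "language", "Q_GENDER": "gender"}

def sync_attributes(panelist, survey_responses):
    """Forward sync: copy mapped survey answers onto the panelist record."""
    for question_id, answer in survey_responses.items():
        attr = QUESTION_TO_ATTRIBUTE.get(question_id)
        if attr is not None:
            panelist[attr] = answer
    return panelist

def reverse_update(past_surveys, panelist_id, attribute, value):
    """Reverse sync: tag a panelist's past survey records with a new attribute."""
    for record in past_surveys:
        if record["panelist_id"] == panelist_id:
            record.setdefault("attributes", {})[attribute] = value
    return past_surveys

# Forward: an answered language question becomes a reusable attribute.
panelist = sync_attributes({"panelist_id": 7}, {"Q_LANG": "fr"})
print(panelist)  # {'panelist_id': 7, 'language': 'fr'}

# Reverse: a newly added attribute enriches past survey records.
surveys = reverse_update([{"panelist_id": 7}, {"panelist_id": 8}],
                         7, "college_grad", True)
print(surveys[0])  # {'panelist_id': 7, 'attributes': {'college_grad': True}}
```

The payoff of the reverse direction is that every attribute you add retroactively deepens the segmentation of data you’ve already collected.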

SMS Distribution

New SMS distribution gives you one more way to reach an increasingly mobile-dependent population. The feature allows you to invite respondents to answer surveys via SMS text message. This distribution method adds to the diverse survey invitation channels already offered via Voxco Online.

The process is easy to integrate:

  • Phone capturing & assignment. Capture, import, and validate respondent phone numbers via completed surveys. Assign phone numbers to panelists.
  • Invite via SMS Text Messages. Once phone numbers are added as attributes, you can send or schedule SMS text message invitations.
  • Flexible distribution settings. You can target respondents based on their phone attribute status (e.g. only invite respondents who clicked on the last SMS invitation).
  • Automated list updates. Lists are auto-updated with SMS unsubscribes.

Voxco Online will be automatically upgraded to v5.5.1 for SaaS clients on Friday, March 11. Clients hosting the platform on-premise can schedule their upgrade at any time. Once the platform is upgraded, all of the features above will be in place, and users can activate SMS invitations or Voxco Panel Manager by contacting a Voxco rep.

If you have any further requests for our next scheduled software update, let us know! We’re always listening and improving the platform!

The risks of mass-recruited online panels

Using Online Panels for Market Research?

A great write-up from professional survey-creator Scott Weinberg has shone some light on the risks of using mass-recruiting panels for your online surveys. We’ve edited down an excerpt below. A reminder to be sure to maintain panel health and quality as strictly as you monitor your survey quality!

“An MR leader in a huge tech company said something interesting on a call I remember vividly. He asked: ‘when is the last time you washed your rental car?’ The context here pertained to online sample. The problem is this: why would you ever wash your rental car? Why change the oil? Why care for it at all? You use it for a day, or a week, and you return it. Mass-recruited online panelists are no different. You use them for 5 minutes, or 20, and return them to the provider.

If we actually cared about them, the surveys we offer them wouldn’t be so stupefyingly, poorly written. Half of the surveys that mass-recruited panelists complete are flat out laughable. Filled with errors and illogical questions. Around a quarter consist of nothing but pages of matrices. If you’re an online panelist, they’re simply boring. Are the semi-professional survey-takers in a mass-recruited panel really going to thoughtfully answer 30 questions about laundry detergent for a dollar? Ever think about who is really taking these surveys?

One of the saddest observations is the lazy mechanics of sample purchasing. In the last couple of years, sample quality is simply assumed. When buying access to mass-recruited panels, MR firms assume suppliers check their own panel quality (like the rental car analogy!). Quality is not winning sample provider orders. Quantity and margin are.

Now that I’ve thoroughly depressed you, one may wonder, is there any good news? Yes – ‘invite-only’ panels give the best shot at good, clean sample. When you open your front door to anyone with a web connection, and tell them there’s money to be made, well, see above. As a quality check, design surveys with an open-end comment capture, to avoid fraudulent panelist activity. It’s much more difficult to fake/auto-complete good, in-context open ended verbiage. Yes it takes a bit more work on the back end, and there are many solutions that can assist with this.”

In many cases, it’s best to bite the bullet and manage your own panels rather than dive into the discount bin of mass sample providers. Let us know if you have any questions about getting started!

Read the source article by Scott Weinberg on LinkedIn.

The commoditization of respondents


Panelists are valuable assets who require attention, engagement and regular nurturing. But to sample buyers, panelists are becoming more attribute-laden commodity than real-person respondent. Can suppliers still stay relevant with this POV, and is this commoditization even a bad thing?

Maybe it’s an indication that insights are now valued by a wider market, which could mean more research revenue. It could breed a ‘survival of the fittest’ industry environment – which isn’t always a bad thing, as the strongest offerings remain.

Commoditization of respondents is definitely a worry for MR. Nobody wants to feel like just another seat at a crowded table. Technologies and methodologies are quickly evolving, which keeps things exciting; but commoditization is an inevitable side effect of the increasing supply.

Sample procurement efficiency has long needed an overhaul (think of the fragmented processes that lead to diminished quality, transparency and control). There are only a few stubborn researchers left who would claim that the past decade’s worth of technologies have done anything but improve the industry landscape and allow for evolution.

Now that sample supply, panel management tools and interactive results reporting dashboards are more accessible, the industry expects quicker insights that keep pace with the evolving world around us. And this desire for quick, representative survey projects is what allows for and encourages respondent commoditization.

So respondent commoditization is a double-edged sword that contributes both positively and negatively to industry innovation. Like any new trend, the hype around sample commoditization is quickly running away from us, leaving some researchers underserved and frustrated.

Target groups should have different values; outside the general consumer profile, there are many target profiles being bought and sold that are quite scarce, almost to the point of becoming non-representative. There’s a very real need for B2B samples, niche audiences and specific markets, and, as with everything else nowadays, we require access to these groups within tight timelines.

The truth is that fewer and less varied types of people are joining panels, granting express contact consent, and completing surveys. So as the supply of willing participants shrinks, demand grows – especially for niche, harder-to-reach audiences. Economics 101 tells us this may quickly price niche markets at a point higher than we’re willing to pay. Respondents are, however, the livelihood of our industry, and so this could be a source of great worry.

Ultimately, there should always be a real focus on sample quality, and so market researchers need to invest wisely in samples and sample maintenance. Without respondents, what are we?

Read the source article at Research

Let us help you.

Discover how we can help YOUR organization solve its current survey needs.