6 ways technology can help you tackle sensitive survey questions

Members of the market research industry find themselves fighting tooth and nail for decent response rates and representative sample. So when your study is about drug use, sexual behaviors, political and religious beliefs, or other sensitive topics, you’ll need to ensure that you’re doing everything possible to keep response rates steady.

Asking sensitive questions via surveys or interviews can be considered intrusive or offensive, regardless of what the respondent’s answer may be. And if the respondent feels that their response could be contrary to popular societal norms, they may be more inclined to provide a dishonest answer that conforms to their perception of normal (this is called ‘social desirability bias’). Compounding the issues of question intrusiveness and social desirability bias, respondents may not trust that their responses will remain anonymous, or that the data will be kept secure.

Sensitive questions can negatively affect three important survey measurements: overall response rates, item nonresponse rates on single questions, and response accuracy, since a higher percentage of respondents answer sensitive questions dishonestly. That means the inclusion of sensitive questions in a survey needs to be handled in an intelligent and delicate way. Fortunately, technology can help.

The chosen survey channel can compound the issues around sensitive questions, significantly increasing respondents’ hesitation to answer them. Three common channels are:

  1. Self-completion surveys (e.g. Voxco Online): in which respondents complete surveys on their own time and in privacy;
  2. Telephone interviews (e.g. Voxco CATI): in which survey call center interviewers ask respondents questions over the phone;
  3. Personal interviews (e.g. Voxco Mobile Offline): in which interviewers use mobile devices to ask respondents questions directly in a face-to-face setting.

In general, respondents are more likely to answer sensitive questions honestly via self-completion surveys. But there’s tremendous value in choosing personal interviews over self-completion surveys, or in a complementary multichannel approach, so interviews should not be ruled out immediately.

Here are six ways to use technology to boost response rates and response accuracy on sensitive questions:

1. Channel selection

As we mentioned above, self-completion surveys that respondents can take in privacy offer a greater sense of anonymity, which helps respondents feel more comfortable answering sensitive questions. Eliminating the interviewer can also reduce the social desirability bias introduced by the presence of another person.

2. Multichannel studies

Your studies don’t need to stick to only one survey method. Using an integrated survey platform, you can target a single respondent database and follow sample across multiple channels. For example, after a phone interview is completed, invite a segment of the respondents to complete an online survey. The logic of the online survey can trigger follow-up questions based on their CATI responses and probe into more sensitive topics that respondents would be less likely to confide to an interviewer.
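
To make the mechanics concrete, here is a minimal sketch of that CATI-to-online trigger logic in Python. The field names, consent flag, and example question are all hypothetical, invented for illustration; this is not Voxco’s actual API.

```python
# Hypothetical sketch: invite completed CATI respondents to an online
# follow-up and carry their phone answers forward for branching logic.

def build_online_followups(cati_completes):
    """Build online invites for a segment of completed CATI respondents."""
    invites = []
    for resp in cati_completes:
        # Only follow up with respondents who agreed to be recontacted.
        if not resp.get("recontact_consent"):
            continue
        invites.append({
            "respondent_id": resp["respondent_id"],
            "email": resp["email"],
            # Prefill prior answers so the web survey can branch on them.
            "prefill": {"q_alcohol_use": resp.get("q_alcohol_use")},
        })
    return invites

def show_sensitive_block(prefill):
    """Online survey logic: probe deeper only where the CATI answer warrants it."""
    return prefill.get("q_alcohol_use") in ("weekly", "daily")

cati_completes = [
    {"respondent_id": 1, "email": "a@example.com",
     "recontact_consent": True, "q_alcohol_use": "weekly"},
    {"respondent_id": 2, "email": "b@example.com",
     "recontact_consent": False, "q_alcohol_use": "never"},
]
print(build_online_followups(cati_completes))  # only respondent 1 is invited
```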

3. Live channel switching

Consider redirecting interviewees mid-interview to a self-completion channel for sensitive questions. This temporarily gives respondents the anonymity and privacy that sensitive topics require. Redirect CATI respondents to an IVR system to answer a few sensitive questions using their phone keypad, then return them to the interviewer to finish the survey. Or pass the CAPI tablet directly to respondents and let them enter their answers into the questionnaire without having to say the words aloud to an interviewer.
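
As a rough illustration, the routing decision can be thought of as a per-question channel assignment. The sketch below is schematic, with invented question and channel names; real platforms handle the hand-off and return through their own mechanisms.

```python
# Schematic sketch of mid-interview channel switching: sensitive questions
# are routed to a self-completion channel (here, IVR keypad entry), then
# control returns to the interviewer. All names are illustrative.

QUESTIONNAIRE = [
    {"id": "q1_intro",    "sensitive": False},
    {"id": "q2_drug_use", "sensitive": True},
    {"id": "q3_wrapup",   "sensitive": False},
]

def route_question(question, default_channel="CATI"):
    """Send sensitive questions to IVR; everything else stays with CATI."""
    return "IVR" if question["sensitive"] else default_channel

for q in QUESTIONNAIRE:
    print(q["id"], "->", route_question(q))
# q1_intro -> CATI, q2_drug_use -> IVR, q3_wrapup -> CATI
```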

4. Live question wording changes

As your survey progresses, keep an eye on real-time response analytics. If you notice significant drop-off at a specific question, or higher-than-average item nonresponse rates on others, take action! Alter the question wording and instantly push the updates live. Add language that emphasizes survey anonymity and data security when the survey starts veering into sensitive territory. That’s the benefit of live survey updating.
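
The kind of check an analytics dashboard performs can be sketched roughly as below. The data shape is an assumption made for illustration: one dict of answers per respondent, where None marks a refusal and a missing key means the respondent dropped off before reaching the question.

```python
# Illustrative sketch: flag questions with high drop-off or refusal rates.
# None = item nonresponse (refusal); a missing key = dropped off earlier.

def flag_problem_questions(responses, question_ids, threshold=0.15):
    """Return (question, drop-off rate, refusal rate) for flagged questions."""
    flagged = []
    for qid in question_ids:
        reached = [r for r in responses if qid in r]
        refused = [r for r in reached if r[qid] is None]
        dropoff_rate = 1 - len(reached) / len(responses)
        refusal_rate = len(refused) / len(reached) if reached else 0.0
        if dropoff_rate > threshold or refusal_rate > threshold:
            flagged.append((qid, round(dropoff_rate, 2), round(refusal_rate, 2)))
    return flagged

responses = [
    {"q1": "yes", "q2": None},           # refused q2
    {"q1": "no"},                        # dropped off before q2
    {"q1": "yes", "q2": "sometimes"},
]
print(flag_problem_questions(responses, ["q1", "q2"]))  # q2 gets flagged
```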

5. Survey logic/flow changes

If early results show that respondents are being turned off by sensitive questions, you can also adjust the logic and flow of the survey. Move sensitive questions further into the survey to build trust and rapport with respondents before they arrive. If a sensitive question appears early in the survey, trigger ease-in questions and reassuring language only for those respondents who chose ‘refuse’ on it. Or go the other way: preserve the trust of respondents who did answer the sensitive question by suppressing demographic or identifying questions at the end of the survey that could make them feel their responses will be connected back to them.
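
A minimal sketch of that refusal-triggered branching, with invented question IDs and reassurance text, might look like this:

```python
# Hypothetical sketch: respondents who refused the first sensitive question
# get reassurance language and ease-in questions before the next one.

REASSURANCE = ("Reminder: your answers are anonymous and stored securely. "
               "You may skip any question.")

def next_block(answers):
    """Choose the next survey block based on the first sensitive answer."""
    if answers.get("q_sensitive_1") == "refuse":
        return {"preamble": REASSURANCE,
                "questions": ["q_easein_a", "q_easein_b", "q_sensitive_2"]}
    # Respondents who answered proceed directly; identifying questions at
    # the end of the survey can also be suppressed for them.
    return {"preamble": None, "questions": ["q_sensitive_2"]}

print(next_block({"q_sensitive_1": "refuse"}))
print(next_block({"q_sensitive_1": "weekly"}))
```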

6. Data hosting

Sometimes it’s enough for respondents to know that the survey is anonymous. But other respondents may want assurances that their data is stored with the highest possible security and encryption, or on internal servers rather than “out there” in the cloud. So give them those assurances.

Use technology to maintain response rates!

In a period of declining response rates, it is likely that respondents will be even more reluctant to take part in surveys that tackle sensitive topics. And in an age of growing concerns over data security, those who do respond to your surveys could be less inclined to reveal potentially embarrassing information about themselves.

Even in self-administered surveys, respondents may still misreport or refuse answers on sensitive questions. But there are a number of ways to use survey technology to your advantage when asking the tough questions. Get in touch with the team at Voxco if you’d like to discuss more unique ways to use a flexible survey platform to collect sensitive respondent data.

4 Reasons Survey Organizations Choose On-Site Hosting

Most organizations across the market research industry have chosen cloud hosting for their survey data storage. For them, it’s easier to manage, easier to budget for, and secure enough for their needs.

But a core group is not prepared to jump to the cloud. They choose instead to physically control survey data centers located on company property. And we’re very familiar with their rationale, since Voxco offers one of the few professional survey software platforms available on-premise.

So why is this portion of the industry sticking with in-house data hosting? Here are the four reasons we hear over and over again:

1. Complete Data Control

Market research organizations manage the kind of sensitive data that is commonly protected by strict privacy regulations. That means they want to know exactly where their data is at all times, so they choose total control of storage and avoid third-party suppliers.

Many of our healthcare and financial services clients need to prove conclusively that their data is protected to the letter of the law. In some situations, Canadian and European clients need to prove that data is stored within their own borders. It’s not always clear how offsite data is being stored, who maintains ownership, and who else might have access to it. That can be a real worry.

On-premise set-ups often make it easier to prove total compliance. Even when cloud companies can guarantee compliance, some IT managers feel more comfortable absorbing the risk and controlling the data storage themselves. 

2. Infrastructure Costs

Cost is always a deciding factor. It often boils down to prioritizing fixed capital expenditures over monthly operational expenditures. Monthly hosting fees can be significant for governments and large organizations with huge data requirements and numerous users. At some point, the economics tip in favor of a fixed capital investment.

This is especially true for organizations with existing infrastructure in place for data storage in-house. It’s a very easy decision for them to select on-premise hosting for their survey software.

3. Physical Server Customization

Cloud hosting providers work from pre-existing server structures, whereas in-house hardware is custom-tailored to an organization’s particular needs. This offers levels of local control, visibility, and auditability that are unattainable from cloud providers.

Retaining infrastructure control internally also allows instant fixes and improvements to how data storage is structured. The larger the cloud provider, the harder it is to request fixes or customization.

4. Internet/Bandwidth Restrictions

We’re spoiled in most of the western world with high bandwidth and uninterrupted internet connectivity. But many parts of the world are still catching up; internet and bandwidth can be slow or spotty. For these situations, hardwired internal databases are often the most productive and efficient solution available.

Sound familiar?

We get it. We work with many survey organizations who rely solely on their in-house infrastructure, and we talk with hundreds more. Some organizations we have talked with are actually returning to an in-house solution, which seems to be becoming increasingly common.

Whatever works for you. We have always believed in the flexibility of choice. Whether SaaS, on-premise or a hybrid data storage solution, Voxco is here to help. Reach out to us and let us know your situation – we’ll make it work.

4 ways survey call centers are adapting to new TCPA changes

Change is sweeping across the decades-old phone survey industry, and large survey call centers across the US are reacting in a variety of ways to the new TCPA regulations that we summarized last week.

As one of the industry’s leading suppliers of phone survey research systems and dialers in the US, the Voxco team has a unique point of view on the 2016 reality of the dialing landscape. We have a direct connection to many of the largest and most advanced survey call centers in the US, and we have talked numerous times with most of them since July 2015, when the TCPA changes were first announced.

We’ve seen first-hand how most of them initially reacted to the updated regulations, and how they have adapted in the 18 months since. Here are four distinct groups that we have observed:

1. Integrated Manual Dialing Solution

These survey call centers have absorbed the full cost of adapting and complying, and it’s already paying off. They have fully adapted by setting up a second, distinct dialing environment where they have integrated a manual dialing solution with their CATI survey software.

Based on the respondent list, these survey call centers split projects across the two environments:

  • They use an autodialing environment for dialing known landlines or for dialing mobile numbers where they have received consent to call (usually respondent panels). This environment can now also be used for some government projects.
  • They use a manual dialing environment for dialing unknown numbers and known mobile numbers.

The benefit of this decision is the ability to retain interviewer productivity by having CATI systems feed phone numbers directly to interviewers who then manually activate the dialing hardware with as little as one click. Because the dialer is integrated with a CATI survey system, project accuracy and analytical consistency are retained via native call recording, live monitoring, and dialing analytics pulled directly from the dialer.
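
In rough terms, the compliance-relevant difference is that the system presents the number but a human click initiates the dial, and that human action is logged. The sketch below is entirely hypothetical and is not Voxco’s actual interface:

```python
# Hypothetical sketch of a one-click, human-initiated dial with a logged
# audit trail. The ManualDialer class is a stand-in for integrated dialing
# hardware that has no autodial, predictive, or random-number capability.

import time

class ManualDialer:
    def dial(self, number):
        print(f"dialing {number}")

def present_case(interviewer_id, phone_number, dialer):
    """CATI feeds the number; dialing happens only on the interviewer's click."""
    clicked_at = time.time()        # timestamp of the human click
    dialer.dial(phone_number)       # hardware dials only after that click
    return {                        # audit record: evidence the dial was manual
        "interviewer_id": interviewer_id,
        "number": phone_number,
        "human_initiated": True,
        "clicked_at": clicked_at,
    }

log = present_case("int-042", "555-0100", ManualDialer())
print(log)
```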

2. Manual Dialing on Detached Phones (Analog or PBX)

This second group has their interviewers using physical telephones to manually dial mobile and unknown respondents in a separate environment from autodialing projects. When a new case is presented to an interviewer via their CATI system, they switch their focus to the physical phone, and manually dial the 10-digit number.

While the process is technically compliant, it negatively affects call productivity and accuracy, which is hurting bottom lines. Using real phones (vs. a fully integrated manual dialing system) to dial causes a huge drop in interviewer productivity – some project managers we have spoken with have told us it can add 30-50% more time per call.

With no CATI-integrated dialer to assist interviewers, project managers are seeing lower calling accuracy due to interviewer misdials, and losing the built-in call recording, live monitoring, and integrated call analysis that come with dialing hardware.

3. Manual Dialing via an Existing Autodialing Solution

For various reasons, this third group is unwilling to make sweeping changes to their internal dialing environment. Yes, they are aware of the TCPA changes, and they try to comply with the manual dialing rules by having their autodialer prompt interviewers to physically dial mobile and unknown numbers. But they are still manually dialing from within an autodialing environment, which undermines the ‘evidence’ that those calls were manually dialed.

When projects have razor-thin profit margins, it’s hard to justify making huge internal changes. But this solution really is the worst of both worlds: lost productivity via interviewer manual dialing, paired with the inability to prove compliance, since projects are completed from within an autodialing environment.

4. Laying low & observing

Yes, there are still some survey call centers who, when we first reach out to them, admit that they have been taking a wait-and-see approach. The definitions swirling around the TCPA are still fluctuating, so some survey call centers are continuing with no major changes, waiting for more concrete definitions of compliance.

Revenue is dropping as they avoid major new projects, and active projects are being completed using the same processes that were in place 18 months ago, including the use of autodialers. Retaining maximum productivity while accepting maximum risk. And it’s probably only a matter of time before a respondent who is aware of the new regulations pushes back.

Next steps

The new TCPA regulations are here to stay. Yet there may be survey call centers out there who are not fully compliant with the new rules, and they remain at risk. And many of those who are compliant are doing so at greatly reduced productivity, using detached phones to manually dial mobile and unknown respondents.

The TCPA regulations will remain an obstacle for many survey research studies, so it is becoming more essential by the day to move towards compliance sooner rather than later. Consider Voxco TCPA Connect, which allows a distinct manual dialing environment while retaining maximum project productivity, consistency and accuracy. Contact us to review how the process works.

Still shifting: An update on TCPA and survey dialing in 2016

Despite the FCC labeling their sweeping new TCPA changes as final back in July 2015 when they were announced, the actual dialing reality for survey call centers in the US is still shifting and settling.

The initial July 2015 ruling was a set of strict, sweeping changes that affected everyone in the phone survey research industry. In summary, it stated:

  • An autodialer may not be used to contact any mobile number in the US unless explicit consent was received from the number’s owner.
  • Dialing hardware could not be used if it so much as had the capability to function as an autodialer, regardless of whether those features were actually being used.
  • The burden of knowing which numbers were mobile and which were landlines rested on the shoulders of the dialing party. If a number had been reassigned, a call center was responsible for tracking the change, and was only given a single erroneous dial to figure it out.

The ruling itself aimed to counteract an outdated reality: it seemed to be a response to the cost of incoming calls for mobile phone owners. It ignored the fact that major carriers across the United States have all but abolished per-minute charging for talk and text in favor of unlimited anytime calling.

Another thing that seemed clear to industry analysts was that the changes were designed to dissuade fully robotic telemarketing and automated random-digit dialing. But because the changes were so sweeping and said nothing about the content of the calls, they also unfairly targeted real interviewers at survey call centers who use automated dialers to contact numbers from authentic respondent lists.

So naturally, the changes were not taken lightly by valid survey call centers across the nation. Industry leaders from CASRO and the MRA filed a motion to intervene. They demanded clarity around the definition of an autodialer, and sought relief from the risk of dialing reassigned numbers.

Possibly due in part to the pressure from legitimate researchers, the FCC has since announced an exemption for the federal government and its contractors, indicating a shift back towards exemptions for legitimate survey call centers working on legitimate research projects.

Shortly thereafter, a closely followed TCPA court case came down on the side of the defendant, finally bringing some clarity to the hotly debated definition of an autodialer. Because a real human had activated the dialing within the CATI system, the call was not considered an autodial: the hardware used had no capacity to dial without human intervention, and no predictive or random-number capabilities.

So it’s clear that the initially sweeping TCPA regulations are being relaxed a little to account for human-staffed survey call centers conducting valid research. More changes are still expected and being pushed for, and far more concrete definitions should soon help survey call centers better adapt.

We’re keeping a close eye on the shifting definition of the ruling, and ensuring that Voxco TCPA Connect always adapts to the legal reality. TCPA Connect is a manual dialer with no autodialing or predictive dialing capability. It can be used as a single-button dialer that connects directly to Voxco CATI and lets human interviewers contact respondents far more efficiently than hand-dialing on detached phones. We offer multiple deployment scenarios that fit varying needs.

Designed and managed by phone survey experts, Voxco TCPA Connect will always offer the maximum productivity while allowing call centers to adapt to the 2016 dialing reality. Give us a shout to see how it could work for you.

Canada Privacy Act-Compliance in Survey Hosting & Access

Survey Hosting & Data Access that’s Canada Privacy Act-Compliant

Market Researchers who have been following the news about data privacy and protection in the past few years know that there are major concerns with any data that is hosted in the US, or accessed by US parties.

Recently, we’ve seen US survey software providers come under fire from provincial governments, even providers with operations in Canada. The major problem is that US employees of these firms often have access to their Canadian clients’ personal information even when it is stored on Canadian servers. This is in direct conflict with many Canadian Privacy Acts.

We’re proudly Canadian, and have been for over 25 years. So we understand Federal and Provincial Privacy Acts and how they relate to safeguarding our Canadian clients’ data, including the secure access of personal respondent data. Voxco’s secure servers host our Canadian clients’ data here in Canada, and with Voxco your data will never be accessed by any party in the United States.

We’re currently providing survey tools and secure-access data hosting to Federal and Provincial governments, universities, and Canada’s largest research firms. Many of them have chosen us because Voxco offers industry-leading survey software with the added confidence of knowing that your data is securely stored in Canada, only accessed by Canadian parties.

Let us know if we can help you keep your Canadian survey data in Canada.

5 Reasons to support the 2016 Canadian Census

Why Marketers Should Support the 2016 Canadian Census

When Justin Trudeau’s newly formed majority government announced the reinstatement of the mandatory long-form census at its very first Liberal caucus meeting, researchers and businesses across Canada celebrated. The return of the mandatory long-form census was a huge step in the right direction for lovers of data across the nation that Voxco calls home.

Here’s a great article by Jan Kestle, Chair of CMA’s Customer Insights & Analytics Council, first published on the Canadian Marketing Association website:

Starting at the beginning of May, Canadians are being asked to complete the 2016 Census. And there is good news for businesses, consumers and citizens alike.

As you’ll recall, for the 2011 Census the mandatory long-form census was replaced by a voluntary survey. This move resulted in a non-response bias that meant a lot of small-area data for 2011 could not be released—including data on income, education, labour force activity and special populations like immigrants and aboriginal peoples. The loss of the long-form census was a significant blow to policy makers, researchers, data scientists and marketing professionals.

But today we can breathe a sigh of relief because the mandatory long-form census is back, and the complexity of the Canadian population will once again be documented in a comprehensive and statistically sound way. Support for the restoration of a “proper” census came from all corners of Canadian society and has reinforced the fact that the Census of Canada is a national treasure to be protected and supported.

In this era of Big Data, some people wonder why we need the census. There are many reasons, but here are my top five for marketers:

  1. Key metrics like market share and untapped potential use census-based data to benchmark an organization’s users against total population.
  2. Census-derived demographics are used to weight surveys to make them applicable to the general population—and more actionable.
  3. Segments derived from small-area census data are used by marketers to link disparate data sources together—connecting demographics to lifestyle and media preferences and thereby helping marketers get the right message to the right people using their preferred media.
  4. Census data for neighbourhoods can help harness unstructured data from digital and social media. This allows marketers to link internal offline data to online sources—getting closer to a single view of the customer.
  5. Using census-derived data to leverage existing surveys and administrative databases results in businesses asking consumers fewer questions. Businesses don’t have to collect data if it already exists in another form—and consumers benefit.

Because of these and many other reasons, we urge Canadians to fill out the Census—whether they get the short or long form. The Census is essential to the fundamental operation of the country—from planning the provision of social services to managing economic growth.

We live in a time when busy consumers expect brands to know and connect with them. The resulting good data will also help us be better marketers.

Read the source article at CMA

This Survey Is Unwieldy And Intrusive. And Invaluable.

This Survey Is Unwieldy And Intrusive — And Invaluable To Understanding Americans’ Health

As the length of the NHIS grows, respondents’ attention spans shrink, which means declining response rates. Maggie Koerth-Baker at FiveThirtyEight has put together an excellent article on the impact that decreasing the length of the NHIS could have in the US:

“Every year since 1957, tens of thousands of Americans have opened their homes to government survey takers who poke and prod their way through a list of intimate and occasionally uncomfortable questions. That process is part of the National Health Interview Survey, the gold standard of health data in the United States.

Experts say it is unique in its sample size, the scope of its questions, and how long it has existed. The NHIS is crucial to our ability to track the prevalence of diseases and health-related behaviors, and to answer complex questions involving health, income and family demographics. And now it’s about to change. The National Center for Health Statistics, the branch of the Centers for Disease Control and Prevention that conducts the survey, is planning a big shift in how the NHIS works, one that some scientists fear will impair their ability to learn about things like how stepmothers can subtly disadvantage children — investigations that end up shaping everything from the way your money gets spent to the policies your legislators vote on.

At issue is the question of how long anybody can reasonably expect Americans to put up with sharing intimate details of their lives with taxpayer-funded bureaucrats.

The 35,000 households — more than 87,000 individuals — who will be interviewed for the NHIS this year will have to devote an average of 90 minutes of their time to the survey. They aren’t paid. There’s no prize at the end. “That’s a lot to ask of people,” said Stephen Blumberg, associate director for science at the National Center for Health Statistics.

The thing about a gold standard is that every researcher wants to be a part of it. Twenty years ago, the NHIS lasted about an hour. But over time, more government agencies and scientists wanted to add more and more questions to make it more and more useful. Meanwhile, participation has been dropping. In 1997, the survey had a 91.8 percent response rate. “The response rate today is about 70-73 percent,” Blumberg said.

So scientists and the government are caught in a tug of war over the survey. Pull too hard one way, and they risk losing access to valuable information. Yank it back the other, and Blumberg worries that the participation rate could fall below 60 percent.

[…]

But nobody knows whether making it shorter will actually increase the response rate. The NHIS is not the only survey that’s losing respondents over time. Across the board, fewer and fewer Americans are choosing to participate in surveys of all kinds. This trend has been documented by the National Research Council and the Pew Research Center. Scientists I spoke to repeatedly referred to this problem as “the elephant in the room.”

It’s not clear why this is, but the length of the survey is probably not the primary driver, said James Lepkowski, director of the University of Michigan’s program in survey methodology. “If length is an issue, what you should see is that people break off in the middle. That’s actually very rare.”

Read the full article at FiveThirtyEight

This article is a perfect illustration of a primary problem facing researchers today. Is shorter and simpler better? Not always.

Online Polling: More or less accurate than phone polls?

Surveys Say: Online Polls Are Rising. So Are Concerns About Their Results.

The New York Times recently shone a light on the polling industry and how two of the most important trend lines in the industry may be about to cross.

As online polling rises, and the use of telephone surveys declines due mostly to rising costs and diminishing response rates, there are concerns among public opinion researchers that web polling will overtake phone polling before online polling is truly ready to handle the responsibility.

Ready or not, online polling is here and political analysts are facing a deluge of data from new firms employing promising (but not always proven) methodologies. The wide margin of difference in poll results for Donald Trump is noticeable – he fares better in online polls than in traditional polls. But it’s still not clear which method is capturing the true public opinion.

The abundance of web-based polling reflects how much easier it has become over the past decade. But big challenges remain. Random sampling is at the heart of scientific polling, and there is not yet any way to randomly reach people on the Internet the way auto-dialers allow for telephone polls. Internet pollsters obtain their samples through other, decidedly non-random, means. To compensate, they sometimes rely on extensive statistical modeling.
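
One common form of that modeling is post-stratification weighting: respondents in under-represented demographic cells are weighted up to match known population shares, such as census figures. A minimal sketch, with invented numbers:

```python
# Minimal post-stratification sketch: weight = population share / sample share.
# All shares below are invented for illustration.

population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

sample = ["18-34"] * 20 + ["35-54"] * 35 + ["55+"] * 45   # 100 respondents

def poststratify(sample, population_share):
    """Compute a weight per demographic cell to match population shares."""
    n = len(sample)
    sample_share = {g: sample.count(g) / n for g in population_share}
    return {g: population_share[g] / sample_share[g] for g in population_share}

print(poststratify(sample, population_share))
# {'18-34': 1.5, '35-54': 1.0, '55+': 0.78}: young respondents are weighted up
```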

The online pollsters may get grouped together because they’re using the same channel, but they’re using a far greater diversity of methodology than phone pollsters. The statistical techniques that the Internet pollsters then use to adjust these data vary nearly as much. And because no one is yet sure of the ‘right’ methods, that variance in methodology is a risk.

There is one big reason why online polling (and other non-personal polling methods like IVR) numbers are so varied: Anonymity. Voters are likelier to acknowledge their support of a given candidate in an anonymous online survey than in an interview with a real person. Research suggests that the social acceptability of an opinion shapes a respondent’s willingness to divulge it.

This social acceptability bias may be making online surveys more accurate. But we don’t know. Possibly adding to the problem is that online polls draw on biased, non-random, non-representative samples. The fact that online pollsters have sometimes relied on extensive modeling to adjust their results could itself be a sign of the limitations of online polling.

The current web-based approaches to polling are high-profile experiments in the next generation of opinion research, and the industry is watching closely.

Read the source article at The New York Times
