4 ways survey call centers are adapting to new TCPA changes

Change is sweeping across the decades-old phone survey industry, and large survey call centers across the US are reacting in a variety of ways to the new TCPA regulations that we summarized last week.

As one of the industry’s leading suppliers of phone survey research systems and dialers in the US, the Voxco team has a unique point of view on the 2016 reality of the dialing landscape. We have a direct connection to many of the largest and most advanced survey call centers in the US, and we have talked numerous times with most of them since July 2015, when the TCPA changes were first announced.

We’ve seen first-hand how most of them initially reacted to the updated regulations, and how they have adapted in the 18 months since. Here are four distinct groups that we have observed:

1. Integrated Manual Dialing Solution

These survey call centers have absorbed the full cost of complying, and it’s already paying off. They have fully adapted by setting up a second, distinct dialing environment where a manual dialing solution is integrated with their CATI survey software.

Based on the respondent list, these survey call centers split projects across the two environments:

  • They use an autodialing environment for dialing known landlines or for dialing mobile numbers where they have received consent to call (usually respondent panels). This environment can now also be used for some government projects.
  • They use a manual dialing environment for dialing unknown numbers and known mobile numbers.

The benefit of this decision is the ability to retain interviewer productivity by having CATI systems feed phone numbers directly to interviewers who then manually activate the dialing hardware with as little as one click. Because the dialer is integrated with a CATI survey system, project accuracy and analytical consistency are retained via native call recording, live monitoring, and dialing analytics pulled directly from the dialer.

2. Manual Dialing on Detached Phones (Analog or PBX)

This second group has their interviewers using physical telephones to manually dial mobile and unknown respondents in a separate environment from autodialing projects. When a new case is presented to an interviewer via their CATI system, they switch their focus to the physical phone, and manually dial the 10-digit number.

While the process is technically compliant, it reduces call productivity and accuracy, and that is hurting bottom lines. Using real phones (vs. a fully integrated manual dialing system) causes a huge drop in interviewer productivity – project managers have told us it can add 30-50% more time per call.

With no CATI-integrated dialer to assist interviewers, project managers are seeing lower calling accuracy due to interviewer misdials, and they lose the built-in call recording, live monitoring, and integrated call analysis that come with dialing hardware.

3. Manual Dialing via an Existing Autodialing Solution

For various reasons, this third group is unwilling to make sweeping changes to their internal dialing environment. Yes, they are aware of the TCPA changes, and they try to comply with the manual dialing rules by having their autodialer prompt interviewers to physically dial mobile and unknown numbers. But they are still dialing manually from within an autodialing environment, which undermines the ‘evidence’ that those calls were manually dialed.

When projects have razor-thin profit margins, it’s hard to justify making huge internal changes, but this solution really is the worst of both worlds: lost productivity from interviewer manual dialing, paired with the inability to prove compliance, since projects are completed from within an autodialing environment.

4. Laying low & observing

Yes, there are still some survey call centers who, when we first reach out to them, admit that they have been taking a wait-and-see approach. The definitions swirling around the TCPA are still fluctuating, so some survey call centers are carrying on with no major changes, waiting for more concrete definitions of compliance.

Revenue is dropping as they avoid major new projects, and active projects are being completed using the same processes that were in place 18 months ago, including the use of autodialers. They are retaining maximum productivity while accepting maximum risk. And it’s probably only a matter of time before a respondent who is aware of the new regulations pushes back.

Next steps

The new TCPA regulations are here to stay. Yet there may be survey call centers out there who are not fully compliant with the new rules, and they remain at risk. And many of those who are compliant are doing so at greatly reduced productivity by using detached phones to manually dial mobile and unknown respondents.

The TCPA regulations will remain an obstacle for many survey research studies, so it is becoming more essential by the day to move towards compliance sooner rather than later. Consider Voxco TCPA Connect, which allows a distinct manual dialing environment while retaining maximum project productivity, consistency and accuracy. Contact us to review how the process works.

Still shifting: An update on TCPA and survey dialing in 2016

Despite the FCC labeling their sweeping new TCPA changes as final back in July 2015 when they were announced, the actual dialing reality for survey call centers in the US is still shifting and settling.

The initial July 2015 ruling was a set of strict, sweeping changes that affected everyone in the phone survey research industry. In summary, it stated:

  • An autodialer may not be used to contact any mobile number in the US unless explicit consent was received from the number’s owner.
  • Dialing hardware could not be used if it merely had the capability to function as an autodialer, regardless of whether those features were actually in use.
  • The burden of knowing which numbers were mobile and which were landlines rested on the shoulders of the dialing party. If a number had been reassigned, a call center was responsible for tracking the change, and was only given a single erroneous dial to figure it out.

The ruling itself aimed to counteract an outdated reality – it seemed to be a response to the cost of incoming calls for mobile phone owners. It ignored the fact that major carriers across the United States have all but abolished charging for talk and text by the minute in favor of unlimited anytime calls.

Another thing that seemed clear to industry analysts was that the changes were designed to dissuade fully robotic telemarketing and automated random-digit dialing. But because the changes were so sweeping and said nothing about the content of the calls, they also unfairly targeted real interviewers at survey call centers who use automated dialers to contact numbers from authentic lists of respondents.

So naturally, the changes were not taken lightly by valid survey call centers across the nation. Industry leaders from CASRO and the MRA filed a motion to intervene. They demanded clarity around the definition of an autodialer, and sought relief from the risk of dialing reassigned numbers.

Possibly due in part to the pressure from legitimate researchers, the FCC has since announced an exemption for the federal government and its contractors, signaling a shift back towards exemptions for legitimate survey call centers working on legitimate research projects.

Shortly thereafter, a closely followed TCPA court case came down on the side of the defendant. The ruling finally brought some clarity to the hotly debated definition of an autodialer, and shed more light on the FCC’s interpretation. Because a real human had activated the dialing within their CATI system, the call was not considered an autodial: the hardware used had no capacity to dial without human intervention, and no predictive or random-number capabilities.

So it’s clear that the initially sweeping TCPA regulations are being relaxed a little to account for human-staffed survey call centers conducting valid research. More changes are still expected and being pushed for, and far more concrete definitions should soon help survey call centers better adapt.

We’re keeping a close eye on the shifting definition of the ruling, and ensuring that Voxco TCPA Connect always adapts to the legal reality. TCPA Connect is a manual dialer with no autodialing or predictive dialing capability. It can be used as a single-button dialer that connects directly to Voxco CATI and lets human interviewers contact respondents far more efficiently than dialing by hand. We offer multiple deployment scenarios that fit varying needs.

Designed and managed by phone survey experts, Voxco TCPA Connect will always offer the maximum productivity while allowing call centers to adapt to the 2016 dialing reality. Give us a shout to see how it could work for you.

Canada Privacy Act-Compliance in Survey Hosting & Access

Survey Hosting & Data Access that’s Canada Privacy Act-Compliant

Market Researchers who have been following the news about data privacy and protection in the past few years know that there are major concerns with any data that is hosted in the US, or accessed by US parties.

Recently, we’ve seen US survey software providers come under fire from provincial governments, even providers with operations in Canada. The major problem is that US employees of these firms often have access to their Canadian clients’ personal information even when it is stored on Canadian servers. This is in direct conflict with many Canadian Privacy Acts.

We’re proudly Canadian, and have been for over 25 years. So we understand Federal and Provincial Privacy Acts and how they relate to safeguarding our Canadian clients’ data, including the secure access of personal respondent data. Voxco’s secure servers host our Canadian clients’ data here in Canada, and your data will never be accessed by any party in the United States.

We’re currently providing survey tools and secure-access data hosting to Federal and Provincial governments, universities, and Canada’s largest research firms. Many of them have chosen us because Voxco offers industry-leading survey software with the added confidence of knowing that your data is securely stored in Canada, only accessed by Canadian parties.

Let us know if we can help you keep your Canadian survey data in Canada.

5 Reasons to support the 2016 Canadian Census

Why Marketers Should Support the 2016 Canadian Census

When Justin Trudeau’s newly formed majority government announced the reinstatement of the mandatory long-form census in its very first Liberal caucus meeting, researchers and businesses across Canada celebrated. Bringing back the mandatory long-form census was a huge step in the right direction for lovers of data across the nation that Voxco calls home.

Here’s a great article by Jan Kestle, Chair of CMA’s Customer Insights & Analytics Council, first published on the Canadian Marketing Association website here:

Starting at the beginning of May, Canadians are being asked to complete the 2016 Census. And there is good news for businesses, consumers and citizens alike.

As you’ll recall, for the 2011 Census the mandatory long-form census was replaced by a voluntary survey. This move resulted in a non-response bias that meant a lot of small-area data for 2011 could not be released—including data on income, education, labour force activity and special populations like immigrants and aboriginal peoples. The loss of the long-form census was a significant blow to policy makers, researchers, data scientists and marketing professionals.

But today we can breathe a sigh of relief because the mandatory long-form census is back, and the complexity of the Canadian population will once again be documented in a comprehensive and statistically sound way. Support for the restoration of a “proper” census came from all corners of Canadian society and has reinforced the fact that the Census of Canada is a national treasure to be protected and supported.

In this era of Big Data, some people wonder why we need the census. There are many reasons, but here are my top five for marketers:

  1. Key metrics like market share and untapped potential use census-based data to benchmark an organization’s users against total population.
  2. Census-derived demographics are used to weight surveys to make them applicable to the general population—and more actionable.
  3. Segments derived from small-area census data are used by marketers to link disparate data sources together—connecting demographics to lifestyle and media preferences and thereby helping marketers get the right message to the right people using their preferred media.
  4. Census data for neighbourhoods can help harness unstructured data from digital and social media. This allows marketers to link internal offline data to online sources—getting closer to a single view of the customer.
  5. Using census-derived data to leverage existing surveys and administrative databases results in businesses asking consumers fewer questions. Businesses don’t have to collect data if it already exists in another form—and consumers benefit.
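
The weighting idea in point 2 can be sketched in a few lines of Python. This is a hypothetical illustration: the age groups, population shares, and survey responses are all invented, and production weighting schemes are far more elaborate.

```python
from collections import Counter

# Census-derived population shares by age group (invented figures)
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Raw survey respondents as (age_group, favourable_response) pairs;
# the sample over-represents the 18-34 group
sample = ([("18-34", True)] * 50 + [("18-34", False)] * 10 +
          [("35-54", True)] * 15 + [("35-54", False)] * 15 +
          [("55+", True)] * 5 + [("55+", False)] * 5)

counts = Counter(group for group, _ in sample)
n = len(sample)

# Post-stratification weight: population share / sample share, per group
weights = {g: population_share[g] / (counts[g] / n) for g in counts}

unweighted = sum(resp for _, resp in sample) / n
weighted = (sum(weights[g] * resp for g, resp in sample)
            / sum(weights[g] for g, _ in sample))

print(f"unweighted: {unweighted:.2f}, weighted: {weighted:.2f}")
```

In this made-up sample, the raw estimate (0.70) overstates support because the youngest, most favourable group is over-sampled; reweighting to the census shares pulls it down to 0.60, closer to what the general population would say.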

Because of these and many other reasons, we urge Canadians to fill out the Census—whether they get the short or long form. The Census is essential to the fundamental operation of the country—from planning the provision of social services to managing economic growth.

We live in a time when busy consumers expect brands to know and connect with them. The resulting good data will also help us be better marketers.

Read the source article at CMA

This Survey Is Unwieldy And Intrusive. And Invaluable.

This Survey Is Unwieldy And Intrusive — And Invaluable To Understanding Americans’ Health

As the length of the NHIS grows, the attention spans of respondents shrink, which means declining response rates. Maggie Koerth-Baker at FiveThirtyEight has put together an excellent article on the impact that decreasing the length of the NHIS could have in the US:

“Every year since 1957, tens of thousands of Americans have opened their homes to government survey takers who poke and prod their way through a list of intimate and occasionally uncomfortable questions. That process is part of the National Health Interview Survey, the gold standard of health data in the United States.

Experts say it is unique in its sample size, the scope of its questions, and how long it has existed. The NHIS is crucial to our ability to track the prevalence of diseases and health-related behaviors, and to answer complex questions involving health, income and family demographics. And now it’s about to change. The National Center for Health Statistics, the branch of the Centers for Disease Control and Prevention that conducts the survey, is planning a big shift in how the NHIS works, one that some scientists fear will impair their ability to learn about things like how stepmothers can subtly disadvantage children — investigations that end up shaping everything from the way your money gets spent to the policies your legislators vote on.

At issue is the question of how long anybody can reasonably expect Americans to put up with sharing intimate details of their lives with taxpayer-funded bureaucrats.

The 35,000 households — more than 87,000 individuals — who will be interviewed for the NHIS this year will have to devote an average of 90 minutes of their time to the survey. They aren’t paid. There’s no prize at the end. “That’s a lot to ask of people,” said Stephen Blumberg, associate director for science at the National Center for Health Statistics.

The thing about a gold standard is that every researcher wants to be a part of it. Twenty years ago, the NHIS lasted about an hour. But over time, more government agencies and scientists wanted to add more and more questions to make it more and more useful. Meanwhile, participation has been dropping. In 1997, the survey had a 91.8 percent response rate. “The response rate today is about 70-73 percent,” Blumberg said.

So scientists and the government are caught in a tug of war over the survey. Pull too hard one way, and they risk losing access to valuable information. Yank it back the other, and Blumberg worries that the participation rate could fall below 60 percent.


But nobody knows whether making it shorter will actually increase the response rate. The NHIS is not the only survey that’s losing respondents over time. Across the board, fewer and fewer Americans are choosing to participate in surveys of all kinds. This trend has been documented by the National Research Council and the Pew Research Center. Scientists I spoke to repeatedly referred to this problem as “the elephant in the room.”

It’s not clear why this is, but the length of the survey is probably not the primary driver, said James Lepkowski, director of the University of Michigan’s program in survey methodology. “If length is an issue, what you should see is that people break off in the middle. That’s actually very rare.”

Read the full article at FiveThirtyEight

This article is a perfect illustration of a primary problem facing researchers today. Is shorter and simpler better? Not always.

Online Polling: More or less accurate than phone polls?

Surveys Say: Online Polls Are Rising. So Are Concerns About Their Results.

The New York Times recently shone a light on the polling industry and how two of the most important trend lines in the industry may be about to cross.

As online polling rises, and the use of telephone surveys declines due mostly to rising costs and diminishing response rates, there are concerns among public opinion researchers that web polling will overtake phone polling before online polling is truly ready to handle the responsibility.

Ready or not, online polling is here and political analysts are facing a deluge of data from new firms employing promising (but not always proven) methodologies. The wide margin of difference in poll results for Donald Trump is noticeable – he fares better in online polls than in traditional polls. But it’s still not clear which method is capturing the true public opinion.

The abundance of web-based polling reflects how much easier it has become over the past decade. But big challenges remain. Random sampling is at the heart of scientific polling, and there’s not yet any way to randomly reach people on the Internet the way auto-dialers allow for telephone polls. Internet pollsters obtain their samples through other, decidedly non-random, means. To compensate, they sometimes rely on extensive statistical modeling.
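
One common form of that statistical modeling is raking (iterative proportional fitting), which nudges a non-random sample toward known population margins. Here is a minimal sketch, with all figures invented for illustration:

```python
# Known population margins (assumed for illustration)
population_age = {"young": 0.4, "old": 0.6}
population_sex = {"m": 0.5, "f": 0.5}

# Cell counts from a skewed online panel: (age, sex) -> respondents
cells = {("young", "m"): 40, ("young", "f"): 30,
         ("old", "m"): 20, ("old", "f"): 10}

n = sum(cells.values())
weights = {cell: 1.0 for cell in cells}

# Alternately rescale weights so each margin matches its target,
# repeating until the adjustments converge
for _ in range(50):
    for margins, axis in ((population_age, 0), (population_sex, 1)):
        for level, target in margins.items():
            current = sum(weights[c] * cells[c]
                          for c in cells if c[axis] == level) / n
            for c in cells:
                if c[axis] == level:
                    weights[c] *= target / current

weighted_young = sum(weights[c] * cells[c]
                     for c in cells if c[0] == "young") / n
print(round(weighted_young, 3))
```

The invented panel is 70% “young,” yet after raking the weighted sample reproduces the assumed 40/60 age split and the 50/50 sex split simultaneously. What raking cannot fix is bias that isn’t captured by the chosen margins, which is exactly the concern raised above.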

The online pollsters may get grouped together because they’re using the same channel, but they’re using a far greater diversity of methodology than phone pollsters. The statistical techniques that the Internet pollsters then use to adjust these data vary nearly as much. And because no one is yet sure of the ‘right’ methods, that variance in methodology is a risk.

There is one big reason why online polling (and other non-personal polling methods like IVR) numbers are so varied: Anonymity. Voters are likelier to acknowledge their support of a given candidate in an anonymous online survey than in an interview with a real person. Research suggests that the social acceptability of an opinion shapes a respondent’s willingness to divulge it.

This social acceptability bias may be making online surveys more accurate. But we don’t know. What is also possibly adding to the problem is that the online polls have biased, non-random and non-representative samples. The fact that online pollsters have sometimes relied on extensive modeling to adjust their results could be a sign of the limitations of online polling.

The current web-based approaches to polling are high-profile experiments in the next generation of opinion research, and the industry is watching closely.

Read the source article at The New York Times

Bad Survey Data and how to avoid it

The Illusion of Technology and Decision Making

When the technology of the internet and mass acceptance of desktop computers arrived in the late ’90s, things were supposed to change. And they have. But better access to customer and market information should have led to smarter management decisions across the board. Faster access to customer and market information was supposed to give companies the ability to make production or inventory adjustments on the fly.

But if either of those things were true, then in theory the 2008 crash shouldn’t have happened, and we wouldn’t be in a situation where U.S. consumers have slipping confidence and almost half feel that their country’s economy is still in a recession. Why didn’t our incredible access to technology and consumer data help us avoid these results? It’s not a simple problem with a simple answer, but one cause was highlighted recently by Vic Crain via the MRA blog. The cause? Bad Data.

Technology gives decision makers the nearly limitless access to data that they crave. But if the data that is collected is wrong, the decisions based on it will be too. So where does bad data come from?

  1. Lazy data entry
  2. Programming errors
  3. Managers and sales people who fudge or hide numbers to meet monthly or quarterly targets
  4. Flawed statistical analysis (for example, segmentation)
  5. Wrong customer data
  6. Bad customer satisfaction and market research data
  7. Changes in government data collection

Bad research data itself comes from several common sources, including (a) asking the wrong questions, (b) response bias, unqualified respondents and fakery, and (c) biased data collectors who really don’t want to collect negative information.

Government adds to the problem too. Public health inspectors who call ahead to make appointments for “surprise inspections” compromise the resulting data, and the elimination of data from the U.S. Census reduces the accuracy of the information with which businesses work.

Blind acceptance of data as fact is a problem. Sophisticated statistical models of business operations or consumer demand built on bad data will be of questionable reliability. “The Smell Test” — a subjective tool based on experience and judgment — remains one of the best tools for assessing the quality of data and of models based on the data.

Read more in the source article at MRA

Will Market Research still be using passive data in 2 years?

Two-thirds of Market Research Clients Do Not Expect to Be Using Passive Data in Two Years

Clients and suppliers of Market Research are aware that passive data measurement is growing as a source of insight, yet about 2 out of 3 say they still won’t be using it two years from now.

30% of clients and 27% of suppliers identified “consumer-specific data collected passively” as the source of insights data that would be most important in two years, roughly even with the share who selected “custom surveys in any mode.”

But more interestingly, when asked about their passive measurement today, and how much they foresee doing in the future, almost 70% of the respondents said they’re doing zero today and expect they’ll still be doing none in two years. Roughly 25% say they’re doing no passive measurement today but expect to be doing ‘some’ in two years.

Budget limitations are, and are expected to remain, the leading reason why researchers don’t expect to increase passive data collection; but a variety of other concerns are worth noting: data integration and regulatory concerns ranked nearly equally as rationales for not beginning passive data measurement.

The majority of researchers (~60%) expect to be doing most of their research on mobile devices within two years. The ability to answer specific questions ranks as more important to researchers looking to tell a data story.

Read the source article at Market Research

The Dissolution of a U.S.-EU Personal Information Safe Harbor

European Court of Justice Strikes Down U.S.-EU Safe Harbor

On Tuesday, October 6th, the European Court of Justice (ECJ) struck down the U.S.-EU Safe Harbor Agreement. Under the Safe Harbor Agreement, US companies were allowed to collect and/or transfer data from the EU by “self-certifying” via the U.S. Department of Commerce that they complied with 7 Privacy Principles, instead of adhering to the specific privacy laws of numerous EU countries. The agreement was designed to make it easier for US organizations to move the personal data of European subjects to the U.S.

But as of Tuesday, according to the ECJ, the Safe Harbor Agreement (initially approved in 2000) violates the EU’s Data Protection Directive, which forbids the transfer of personal data outside the EU to a country with inadequate privacy protections. The Directive also requires each EU member state to designate at least one Data Protection Authority (DPA) to monitor the application of the Directive within its territory. The Safe Harbor Agreement was becoming a shortcut for American companies to avoid scrutiny by individual DPAs.

The case in which the ECJ made its ruling was sparked by Edward Snowden releasing documents that showed US intelligence agencies mining personal information from the data of U.S. companies. Once the case was referred to the ECJ, it ruled that US companies removing data from the EU can no longer hide behind the Safe Harbor Agreement.

Market Researcher Compliance

Tuesday’s ruling invalidated the Safe Harbor effective immediately, but it’s generally understood that companies will need some time to assess their options and achieve compliance through other means, such as avoiding any US-based data hosting or required data access by American organizations.

U.S. companies — including survey, opinion and market researchers — are subject to a number of different DPAs across the EU, each of which will be individually responsible for determining whether the United States’ data protections are “adequate.” This makes the removal, access or hosting of EU data in the US far more complicated and perilous. Theoretically, U.S. companies could now be subject to fines from which Safe Harbor previously shielded them.

We are monitoring news about the story as it advances, but for the short-term, it’s important that all research firms that deal in personal information consider the implications of US access to international personal data.

Read the source article at MRA

Europe plans General Data Protection Regulation

In an effort to centralize data protection across its 28 Member States, the EU wishes to mandate the General Data Protection Regulation (GDPR) as early as 2017. The GDPR is a proposed law that will apply to personally identifiable data collected from EU residents.

The GDPR will regulate what data can be collected, how it can be collected, and what can be done with it. It will also determine what organizations must do to protect personal data and set legal/financial penalties for organizations that fail to comply.

It’s expected that MR companies in the EU will be affected if the law passes, despite the feeling that it targets organizations that are less responsible with data collection and handling. Financial effects on MR firms will likely include the need to hire new staff to monitor data protection and to implement regular processes for obtaining legal authorization to collect data.

We’ll keep an eye on this news and clearly advise our European clients as the story progresses.

Read the source article at esomar.org

Let us help you.

Discover how we can help YOUR organization solve its current survey needs.