Language Proficiency and Survey Response Rate

Just when it started to feel as if it was going to be a boring summer, Annie Pettit of Peanut Labs stepped up to the microphone at this year’s MRA Insights and Strategy Conference and stole the show. Okay, so it might not have been that dramatic, but Dr. Pettit did share some fascinating insights on the impact of language on survey response rate and data quality.

As someone who has analyzed several data sets from surveys fielded to acculturated and unacculturated Hispanics, I can testify to the need to keep things simple when working with populations whose native language is not English. Even if your survey is translated, keeping it simple is the best way to go.

According to the US Census Bureau, one out of five US residents does not speak English at home. Among those who typically speak Spanish at home, fewer than 44% reported speaking English “very well”. So the language we offer our surveys in does make a difference, and several findings from a survey Peanut Labs conducted bear directly on data quality.

As can be seen below, respondents whose native language was English selected far more options from the category lists in multiple-response questions than respondents with less fluency did. The inverse was true for those who spoke English less well: they were more likely to select only the minimum number of responses in these multi-punch questions.

[Image 1: options selected in multiple-response questions, by English fluency]

Other data quality issues include ESL speakers’ greater likelihood of choosing “red herring” responses from category lists and of not following explicit instructions, e.g. “select three items”. ESL speakers were more likely to choose the “none of the above” option when it was offered, and their verbatim responses were shorter on average than those of more fluent English speakers.

Response time analysis also yielded significant results. The ESL group tended to fall in either the slowest or the fastest deciles, suggesting they either struggled with the instructions or simply sped their way through the exercise.
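A minimal sketch of this kind of response-time decile check, in Python. The respondent IDs, timings, and flagging logic below are illustrative assumptions for this post, not the analysis Peanut Labs actually ran:

```python
# Illustrative sketch (not the actual Peanut Labs method): bucket survey
# completion times into deciles and flag respondents in the extremes.
# Respondent IDs and times (in seconds) are made up for the example.

def decile(value, sorted_values):
    """Return the 1-10 decile bucket of `value` within `sorted_values`."""
    rank = sum(1 for v in sorted_values if v <= value)
    return min(10, 1 + (rank - 1) * 10 // len(sorted_values))

times = {"r1": 95, "r2": 310, "r3": 880, "r4": 140, "r5": 600,
         "r6": 1250, "r7": 420, "r8": 75, "r9": 510, "r10": 1900}

ordered = sorted(times.values())
deciles = {rid: decile(t, ordered) for rid, t in times.items()}

# Flag the fastest and slowest deciles for manual review rather than
# automatic removal, since ESL respondents cluster at both extremes.
suspect = [rid for rid, d in deciles.items() if d in (1, 10)]
```

Routing the extreme deciles to review instead of an automatic cut is one way to avoid disqualifying ESL respondents who were merely struggling with the instructions.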

The downside to this phenomenon can be seen in the graph below. A greater percentage of ESL respondents were cut from the study for failing data quality checks. We have to ask ourselves whether these respondents truly deserve to be cut just because English is not their primary language.

[Image 2: percentage of respondents removed by data quality checks, by English fluency]

Some key takeaways: Annie suggested, and I completely agree, keeping your data quality focus on questions and behaviors that are less likely to rule out ESL respondents. These include over-clicking on multiple-response questions, straightlining, red herrings, and not following instructions. I would also add keeping your surveys as simple as the project allows and offering multi-language formats where applicable.
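To make those checks concrete, here is a hypothetical Python sketch of the four behaviors named above. The field names, thresholds, and sample respondent are assumptions for illustration, not any survey platform's actual API:

```python
# Hypothetical sketch of the quality checks named above; thresholds and
# field names are illustrative assumptions, not the conference method.

def quality_flags(resp, red_herrings, max_select=10):
    """Return the list of quality checks a single respondent trips."""
    flags = []
    # Over-clicking: selecting nearly every option in a multi-select.
    if len(resp["selections"]) >= max_select:
        flags.append("over_clicking")
    # Straightlining: the identical answer to every grid item.
    if len(set(resp["grid_answers"])) == 1:
        flags.append("straightlining")
    # Red herrings: fictitious items planted in the category list.
    if set(resp["selections"]) & set(red_herrings):
        flags.append("red_herring")
    # Not following instructions, e.g. "select exactly three items".
    if resp.get("required_picks") and len(resp["selections"]) != resp["required_picks"]:
        flags.append("ignored_instructions")
    return flags

respondent = {"selections": ["Brand A", "Brand X"],   # Brand X is a plant
              "grid_answers": [3, 3, 3, 3, 3],
              "required_picks": 3}
print(quality_flags(respondent, red_herrings=["Brand X"]))
# prints ['straightlining', 'red_herring', 'ignored_instructions']
```

Checks like these look at response behavior rather than language fluency, so a respondent who simply reads English slowly is less likely to be cut than one who is genuinely not paying attention.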

Greg Timpany directs the research efforts for Global Knowledge in Cary, North Carolina, and runs Anova Market Research. You can follow him on Twitter @DataDudeGreg.

About the Author

Greg Timpany. Since 1988 Greg has delved into the world of marketing, analytics and strategy. His expertise bridges the space between the structured world of IT and the creative, customer-centric needs of marketing. His thought leadership is sought out by executives in B2B, B2C and the public sector. Currently he directs the research efforts for Cary, NC-based Global Knowledge. In addition, he is a contributing author to the Cvent Survey blog and an instructor for Meredith College and Research Rockstar. As an entrepreneur he is co-founder of Anova Market Research. Past engagements have included working with The Los Angeles Times, Guitar Center and Wilkin Guge Marketing. He can be followed at www.anovamarketresearch.com and http://survey.cvent.com/
