Sites like SurveyJunkie and Swagbucks offer money in exchange for taking surveys, which doesn't always produce the best quality data.
Researchers use online opt-in surveys all the time. Should they?

Search around for ways to make a little extra money online, and you might find yourself at one of many sites that offer to pay you to take surveys. There's Swagbucks, SurveyJunkie, InboxDollars, and KashKick, for instance. On each of these sites, users are paid small amounts of money for completing surveys, playing games, or making purchases. The surveys on these sites are "opt-in" surveys, meaning that participants are actively choosing to take them, rather than researchers pulling a random sample of a population to poll, as professional pollsters do.

Unsurprisingly, opt-in surveys can lead to some skewed results: Earlier this week, Pew Research Center wrote about its analysis of one such opt-in survey that found 20 percent of US adults under 30 believe that "The Holocaust is a myth." Pew's attempt to replicate this result via a random sampling of Americans found that just 3 percent of Americans under 30 agreed with an identically worded statement about the Holocaust, a percentage that was more or less the same across all age groups.

The analysis also included this incredible tidbit: "In a February 2022 survey experiment, we asked opt-in respondents if they were licensed to operate a class SSGN (nuclear) submarine. In the opt-in survey, 12% of adults under 30 claimed this qualification, significantly higher than the share among older respondents. In reality, the share of Americans with this type of submarine license rounds to 0%."

Oof, right?

The Google results for survey sites are filled with reviews from people who are mainly concerned with whether these sites are "legitimate" or scams. But the Pew analysis points to another question: Just how good is the data collected for a survey when its participants are incentivized to speed through as many surveys as possible in order to earn cash?

The problems with opt-in surveys, explained

I dug around and, surprise! It's complicated.
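One way to get intuition for that 20-percent-versus-3-percent gap: even a modest share of insincere respondents can badly inflate an estimate of a rare belief. Here is a toy back-of-the-envelope calculation; every number in it is invented for illustration, not drawn from Pew's data.

```python
def observed_rate(true_rate, bogus_share, bogus_agree_rate):
    """Mix genuine respondents (who answer truthfully) with bogus ones
    (who click 'agree' at some rate regardless of the question), and
    return the rate a pollster would observe in the raw data."""
    genuine = (1 - bogus_share) * true_rate
    bogus = bogus_share * bogus_agree_rate
    return genuine + bogus

# Hypothetical numbers: 2% genuinely hold a fringe belief, and 15% of
# respondents are speeders who hit "agree" half the time.
print(round(observed_rate(0.02, 0.15, 0.5), 3))  # 0.092 -> nearly 5x the true 2%
```

The exaggeration is worst for beliefs that are actually rare, which matches Pew's observation that opt-in polls tend to overstate fringe positions in particular.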
"Errors are introduced (and remediated) in the survey process at every step," noted David Rothschild, an economist at Microsoft Research. The fact that a survey was conducted online for a small reward isn't necessarily enough information to analyze data quality in a meaningful way. As Pew noted in its analysis, the Holocaust denial survey used an agree/disagree format that can lead to "acquiescence bias," a tendency for respondents to give an affirmative reply. This means that while the survey collection method might have been part of the problem, the question itself may have also led to inaccurate results.

"There are many types of opt-in online audiences; some have strong vetting to ensure the respondents are who they say they are and produce high quality responses, while others just accept whomever without any pre-response quality control," Rothschild added.

Here's what you need to know.

How do online survey sites work?

Although there are a couple of different models, the online survey sites we are talking about offer small rewards in exchange for survey participation. Most say they try to "match" users to relevant surveys based on the data they collect about their users, and generally speaking, you only get paid if you qualify to take the survey and complete each required question. Typically, these sites pay users in points, which translate to small dollar amounts per survey, if they pass a set of screening questions and complete the entire survey.

These points often do not translate to very much money: I created an account on Swagbucks and checked a list of available surveys. They included a 20-minute survey for 119 "Swagbucks," which translates to ... $1.19. Longer surveys may offer more, while some surveys with a 10-minute time estimate offer less than a dollar. These are similar to the rates I saw on SurveyJunkie. On Amazon's Mechanical Turk, a marketplace for work that includes survey taking, a survey might pay less than 10 cents.
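To put those rewards in perspective as an hourly wage, here's a quick conversion sketch (the helper function is my own, assuming the 1-point-equals-1-cent rate implied by 119 Swagbucks paying $1.19):

```python
def effective_hourly_rate(points: int, cents_per_point: float, minutes: float) -> float:
    """Convert a survey's point reward into an effective hourly wage in dollars."""
    dollars = points * cents_per_point / 100
    return dollars / (minutes / 60)

# The 20-minute, 119-Swagbuck survey mentioned above:
rate = effective_hourly_rate(points=119, cents_per_point=1, minutes=20)
print(f"${rate:.2f}/hour")  # $3.57/hour
```

At well under minimum wage, the only rational strategy for earners is volume, which is exactly the incentive problem the rest of this piece is about.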
Why would pollsters and researchers use sites like these to collect responses?

In some applications, like election polls, opt-in surveys can perform similarly to random probability-based surveys, as Pew noted. Which is great, because they are generally much cheaper to conduct. "Lower cost survey pools are great for exploration" and for when you don't need a very precise outcome, said Rothschild. The results are generally faster, cheaper, and more convenient. "Especially for research that's being done on a close-to-shoestring budget, opt-in online surveys are a natural choice for scholars trying to study diverse aspects of social behavior," added Thomas Gift, an associate professor of political science at University College London.

Gift and another researcher studied the potential for fraudulent responses in online opt-in studies after using an opt-in study themselves to investigate a separate question. "It was only during the fielding of the experiment that large cohorts of respondents seemed to be giving suspicious answers about their backgrounds," he said. "So we investigated further."

Why, and when, are online surveys prone to bogus respondents?

Researchers can use a lot of tools, including screening questions, to weed out bad responses and end up with a set of usable data. But there are some instances, such as surveys about obscure beliefs or surveys where you need really precise data, where opt-in online surveys are going to be a problem. Pew noted a few considerations here: Based on its research over the years, online opt-in polls have a tendency to overestimate fringe beliefs (it gave the example of belief in conspiracy theories). That overrepresentation is more severe among younger respondents and among Hispanic adults, Pew noted.

Gift and his research partner hired a "nationally recognized" marketing firm, which they left unnamed in their paper for liability reasons, to conduct a survey for them that collected respondents with experience in the Army.
This firm, they said, distributed the survey to a number of sub-vendors that provided financial incentives for responses (these sub-vendors were also left anonymous). To detect whether respondents really did have experience in the Army, Gift used screening questions embedded in the survey: Respondents were asked about saluting protocol and for specific information on their military background. Based on their analysis of those screeners, nearly 82 percent of respondents may have pretended to be associated with the Army in order to take the survey and get paid for it. About 36 percent of those respondents passed the knowledge screening test but were identified as probably misrepresenting themselves based on their answers to the survey questions themselves. There was also evidence in the survey results that some respondents were taking the survey multiple times, giving nearly identical answers and tweaking their demographic data just enough to pass as different people, presumably to get paid more than once for the same survey.

How can researchers minimize bogus responses and end up with useful data from an online survey?

Essentially, by testing the respondent. Online surveys use attention checks, IP tracking, anti-bot software, and monitoring of the time it takes someone to complete a survey in order to try to mitigate fraud. Asking respondents questions like the one Pew flagged, about having a license to drive a submarine, is a pretty good way to tell whether someone is just cruising through and answering questions as quickly as possible, or if they're actually reading the questions. Nothing is going to catch every single bogus response, and, as Rothschild noted, some low-quality responses will slip through attention checks. There are also other models for collecting data online, Gift noted.
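The checks described here, trap questions, completion-time monitoring, and duplicate-answer detection, can be sketched roughly as follows. This is my own illustrative logic, not code from any of the studies mentioned, and every field name and threshold is hypothetical.

```python
from collections import Counter

def flag_suspicious(responses, trap_question="submarine_license",
                    min_seconds=120, max_duplicate_answers=2):
    """Flag respondents who claim an implausible qualification, finish
    implausibly fast, or submit an answer set identical to too many
    others (a sign of one person retaking the survey)."""
    # Fingerprint each respondent by their full answer set.
    fingerprints = Counter(
        tuple(sorted(r["answers"].items())) for r in responses
    )
    flagged = []
    for r in responses:
        reasons = []
        if r["answers"].get(trap_question) == "yes":
            reasons.append("claimed implausible qualification")
        if r["seconds"] < min_seconds:
            reasons.append("completed too quickly")
        if fingerprints[tuple(sorted(r["answers"].items()))] > max_duplicate_answers:
            reasons.append("duplicate answer pattern")
        if reasons:
            flagged.append((r["id"], reasons))
    return flagged
```

In practice researchers layer several such filters, precisely because (as Rothschild notes) no single check catches everything: a careful fraudster can pass a knowledge screener, and an honest respondent can occasionally trip a speed check.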
Opt-in volunteer surveys "aren't without their limitations," but they create a different set of incentives for participants that don't rely on a financial reward. Gift highlighted the work of the Harvard Digital Lab for Social Sciences, an online platform that allows people to volunteer to participate in social science research.

While researchers might not be able to catch every single bad response, they can be transparent about how they collected their data, Rothschild noted. And it's worth looking for that information the next time you see a shocking headline about a shocking belief held by The Youth.

A.W. Ohlheiser, senior technology writer