Panel companies are a valuable source of respondents for custom research projects that use online surveys. They give us, as researchers, access to a group of respondents we would not otherwise reach: individuals who have voluntarily signed up to take surveys in exchange for money.

We want to ensure that we receive quality answers from each panel respondent. Consequently, here at Turnkey we have implemented guidelines for identifying respondents whose records should be deleted from the dataset and excluded from the analysis. The exact quality control measures vary by client and project, but outlined below are some of the guidelines we use to flag respondents whose data may not be legitimate. Panel respondents receiving two or three red flags should be removed from the final dataset.

1. Timing: Using the median completion time across all respondents as a reference, flag the respondents who fall within the fastest third of completes.
2. Open Ends: Read through the open-ended comments and highlight respondents who wrote gibberish, profanity, or a response that doesn't answer the question (e.g., "good" when asked to name their favorite player).
3. Straight Lining: Look for respondents who selected the same answer option for all matrix questions (e.g., Completely Satisfied).
4. Quality Assurance Questions: When programming the survey, add a few quality assurance questions to help flag respondents who might be rushing through or not reading the survey questions. For example:

QUESTION: Have you done any of the following in the last 12 months? (Please select all that apply)

Attended a live sporting event
Graduated from college
Got engaged
Purchased a boat
Visited Finland
Attended a music festival
None of these


QUESTION: For quality assurance, please select “3” from the list below.

Once a project is closed, Turnkey sends the panel company a list of completed-survey IDs, highlighting the respondents who have been flagged and should be removed from the final dataset, along with the reason(s) why each of those respondents will not be used in the final analysis. The panel provider marks these respondents as "survey offenders"; after multiple offenses these individuals are removed from its database so they will not be able to provide "bad" data to another research company in the future.

Although this process is inherently subjective, having these basic guidelines in place adds some objectivity to the procedure. Eliminating the "junk" respondents ensures we are working with the best data possible when performing our analysis.