Respondent fatigue has always been an issue in market research, and as online surveys become more common and a growing share of respondents take them on mobile phones, avoiding it matters more than ever. If a survey is too long, respondents will eventually give up or answer questions as quickly as possible without paying attention. Market research depends on getting valid data from engaged respondents, so respondent fatigue should be avoided as much as possible. However, keeping surveys short is sometimes easier said than done.

Before a survey is written, determining the objectives of the research is essential. The organization should consider what specific topic they’d like to learn more about, exactly what pieces of information they’d like to receive, and what they will do with the information once it’s collected. Keep in mind that if a topic is too broad, a survey will not be able to get very detailed within a reasonable number of questions. If more detail is needed, the research can be split into more narrowly focused studies. For example, in addition to evaluating fans’ general satisfaction with the overall gameday experience, perhaps the concessionaire needs more detail about exactly what can be improved with the food and drinks. It’s tempting to just add a few more questions to the general survey, but using a separate survey allows an organization to get into as much detail as they need.

Once a topic has been decided upon, every question added to a survey should be evaluated based on the usefulness of the data it will collect. In general, there are two types of questions in a survey: informational questions that provide data on the survey topic (e.g., How satisfied were you? Why?), and “cut variables,” which define the key groups you’d like to compare (season ticket members vs. single game buyers, demographics, etc.). For each informational question, make sure it’s clear how the information can and will be used. For example, asking which types of giveaways are most appealing can inform which promotions are implemented. By contrast, asking people what type of music they prefer is likely to produce data that varies widely across fans; teams shouldn’t play only one type of music and risk alienating some of their fans, so it’s less likely that a change will be made. If the data is unlikely to be acted on, the question shouldn’t be included.

Cut variables should go through the same type of analysis. It’s common for organizations to use a default set of demographic questions without carefully considering which ones are necessary. But would it help to know that well-educated fans have different gameday preferences than less-educated fans, if any differences even exist? Would the organization make changes based on differences between levels of education? If the answer is no, then an education question is not needed in the demographics section. On the other hand, it would be helpful to learn that people with young children are less satisfied with the gameday experience and fan behavior; an organization could then make an effort to make the experience more kid-friendly or offer more kid-friendly sections of the stadium.
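To make the cut-variable idea concrete, here is a minimal sketch in Python with pandas (hypothetical column names and made-up example data, not any particular survey tool’s export format) of how responses might be compared across the groups a cut variable defines. If a comparison like this wouldn’t change a decision, the cut variable doesn’t need to be in the survey.

```python
import pandas as pd

# Hypothetical example data standing in for exported survey responses;
# satisfaction is rated 1-5 (1 = very dissatisfied, 5 = very satisfied).
responses = pd.DataFrame({
    "ticket_type":          ["season", "single", "single", "season", "single", "season"],
    "young_children":       ["yes", "no", "yes", "no", "no", "yes"],
    "gameday_satisfaction": [3, 5, 2, 4, 4, 3],
})

# Compare average satisfaction across the groups each cut variable defines.
# A cut variable earns its place only if a gap here would prompt a change.
for cut in ["ticket_type", "young_children"]:
    print(responses.groupby(cut)["gameday_satisfaction"].agg(["mean", "count"]))
```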

For simpler surveys, it’s easier to identify the informational questions and cut variables and determine which ones are important to keep. If a more complex analysis will be used, some questions may be required even if they are not directly related to the objectives. Even then, each question in the survey should serve a specific purpose, and any question that doesn’t should be removed. The result is a quick survey where every question is relevant and respondents stay engaged.