Have a Surveyor question? We’ve got your answer!

Check out our FAQs.

Clients often ask us how to know when they’ve collected enough responses to a particular survey. Our answer? It depends. What does it depend on? Multiple factors, some related to statistics and some related to business.

How to Calculate Your Target Sample Size

To begin, determine the population size. In sports, this may be the number of Season Ticket Holders your club has, how many people attended last night’s Draft Party, how many people live in your market, etc.

Sometimes, this estimation can be less straightforward – for example, if your population is the number of people in your market who are fans of your team, or how many people shop for your merchandise, etc. In these cases, just make an educated guess – that’s your best option.

Then, calculate the ideal sample size. Specifically, calculate the sample size needed to achieve representative results for your survey based on your population size. Many Internet-based calculators will do this for you, including this one from Raosoft.

The calculator will ask you for margin of error (we recommend 5%), confidence level (95% is the norm), population size (see above) and response distribution (50% is often the default; leave this as is).

For those looking for a shortcut: in general, it’s safe to use 375 as a good estimate of how many responses to shoot for when considering large populations (over 20K). For small populations (under 200), it’s very difficult to get a response rate large enough to result in a 5% margin of error. In such instances, we suggest aiming for a 25% response rate – that should leave you with a reliable data set to work with.
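The calculator math above is easy to reproduce yourself. Below is a short Python sketch of the standard sample-size calculation (Cochran’s formula with a finite population correction), using the same defaults we recommend: 5% margin of error, 95% confidence, and a 50% response distribution. The function name is ours, not part of any survey tool:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Target sample size for a given population (Cochran's formula).

    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative response distribution.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # ~384 for an infinite population
    n = n0 / (1 + (n0 - 1) / population)                # finite population correction
    return math.ceil(n)

print(sample_size(20000))  # large population -> 377, close to the 375 rule of thumb
print(sample_size(200))    # small population -> 132, i.e. a ~66% response rate needed
```

Note how a population of 200 would require roughly two-thirds of everyone to respond – which is why we suggest the more realistic 25% response-rate target for small populations.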

Remember, when cutting your data into smaller groups (ex.: viewing responses by gender, seat location, etc.), each of these smaller groups must also include enough responses to exceed an acceptable margin of error threshold. This generally necessitates a larger number of total responses to help ensure enough responses in each of your key subgroups.

In the field of statistics, the following elements are all part of the margin-of-error puzzle:

Sample size
Level of confidence
Margin of error
Statistical power (i.e. the likelihood that your data analysis will correctly lead to a significant conclusion)

Therefore, when a client asks us “How much sample do I need?”, the true answer should be “It depends on your confidence level, margin of error tolerance, and how much power you want.” If you want a high level of confidence, a small margin of error, and a lot of power, you will require more sample than if you’re content with less confidence, a wider margin of error, and/or less power.

Before settling on a target sample size, consider the following questions:

How confident do I need to be in the conclusions I’m drawing from my results? There are times when you really need to be sure, like super-sure. On the other hand, there are times when you can be less sure. You might want to be 99.999% sure before changing the brand of hot dogs served at games, but only 90% when deciding to try a new hot dog and soda combination. The lower your required level of confidence, the smaller the sample you’ll need.

How far down the line should I go? Consider the law of diminishing returns – it applies perfectly to sample size. As sample size grows, the result is a smaller margin of error – but once the sample is past n=400-500, the margin of error doesn’t reduce much, even when the sample size is dramatically increased (for example: if n=600, margin of error=3.9%; when n=1,200, margin of error=2.7%). In other words, you need a lot of sample to achieve even a slight improvement in margin of error.
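To see the diminishing returns for yourself, you can compute the margin of error directly. The sketch below uses the standard formula with a finite population correction; assuming a population of roughly 20,000 (our assumption, for illustration) and 95% confidence, it reproduces the 3.9% and 2.7% figures above:

```python
import math

def margin_of_error(n, population=None, z=1.96, p=0.5):
    """Margin of error for a sample of size n at 95% confidence (z=1.96)."""
    moe = z * math.sqrt(p * (1 - p) / n)
    if population:  # finite population correction
        moe *= math.sqrt((population - n) / (population - 1))
    return moe

print(round(100 * margin_of_error(600, 20000), 1))   # 3.9
print(round(100 * margin_of_error(1200, 20000), 1))  # 2.7
```

Doubling the sample from 600 to 1,200 only shaves about a point off the margin of error – the law of diminishing returns in action.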

Turnkey recommends thinking about these questions when calculating your ideal sample size. If you are getting 1,000 or even 750 responses per subgroup of interest, it’s likely that you are actually oversampling. Oversampling can lead to survey fatigue among respondents and low response rates for future studies, so avoid that approach when possible. Instead, consider getting fewer responses – and doing more surveys!


When designing a survey, you may wish to keep the following hints in mind:

  1. Questions are numbered automatically according to when you add them into the system. Question numbers appear in the “Question Heading”. If you need to add a new question into the middle of your survey after having already programmed the survey, the questions will not automatically re-number. The latest question added, regardless of location, will have the last number.
  2. Answer options automatically receive a numeric value when you add them to the survey. The first answer option will be “1”, the second will be “2”, and so forth. If you re-order the answer options, the numeric values will adjust accordingly. These numeric values are used to calculate the mean and other statistics in the Analyze Mode. Note: Surveyor includes numeric values for all questions with discrete answer choices, regardless of whether the answer choices have any inherent ordinal meaning.
  3. When editing a survey in Design Mode, prior to “Opening a Survey” (i.e. allowing the survey to collect responses), be sure to hit the “Test” button to apply the changes. If the “Test” button is not selected prior to “Opening a Survey” for distribution, the last changes will not be applied.
  4. Before “Opening a Survey”, make sure all survey questions are finalized. The process of editing a live survey differs from what you may be accustomed to. Once a survey is live, you can reword a question, add new questions, apply skip logic, and make other minor formatting changes easily. However, adding, deleting, or modifying answer options requires creation of a new survey question, in tandem with data merging once the survey is closed. If you do need to edit a survey after it’s gone live, be sure to select “Apply Changes” to publish the new changes in the already open survey.
  5. Draft your surveys in Word or some other text editor outside of Surveyor. Do not attempt to draft a survey from scratch and program as you go. Once the survey is finalized in Word, copy and paste all questions and notes into Notepad, or save the survey as a plain text file. This streamlines the process of copying and pasting each question into Surveyor by stripping any formatting. Once in Surveyor, any bolding, italics, underlining, etc. from the questionnaire will need to be added manually.
  6. When using a scale for answers such as “Excellent, Good, Average, Fair, Poor”, Turnkey recommends starting with the lowest value (in this case, “Poor”) at the top. This enables Surveyor’s assigned values to match up correctly. If you’d prefer to have “Excellent” be the top value, check the “Reverse Order of Choices” checkbox in the “Appearance” section of the question editor.
  7. If you have an “N/A” answer option, in the “Options” tab for the answer, select the applicable “Coded Value”. This will keep the value of the answer option from interfering with the mean and other statistical calculations.
  8. Change the “Report Value” as necessary for each individual answer in the “Options” tab. For example, if you have a question such as “How many children 18 years of age and younger currently live in your house?” with answer options “None, 1, 2, 3, 4 or more, Prefer not to answer”, change the “Report Value” and “Code Value” for the “Prefer not to answer” choice.
  9. Add in a “Question Heading” for every question. This will help you stay organized when using advanced branching, and when you add a question to a survey at any point besides the end. Focus your heading on the content of the question (i.e., “satisfaction”, “renewal intent”, etc.).
  10. Delete the %Q from all applicable labels, located in the “Labels” tab. If you do not delete the %Q, the “Question Heading” will appear in validation messages. That may confuse your respondents, depending on what you put in the “Question Heading”.
  11. Mark all questions as required, except Essay and Fill in the Blank questions. Surveyor offers you the option of making each individual question required as you program.  However, you may find it easier to complete this task in bulk (located in the “Bulk Edit” at the bottom of the “Content” tab for each individual question) upon programming completion.
  12. Add in the ability for respondents to go “Back” (i.e. to the previous page) within the survey. This feature is located in the “Formatting” tab under “Formatting”. Check the “Back” checkbox to add the feature.
  13. If you anticipate respondents accessing the survey on their mobile phone (which is very likely) and you are utilizing a “Classic” survey theme, make sure you go into “Properties” and check the “Generate Mobile HTML” box to optimize the survey for mobile devices.

Do you have a tip or trick we missed in this article? Let us know!


Rating and ranking questions are often confused by those conducting survey research.

Rating questions ask respondents to rate multiple elements using a scale (Ex.: “How important are the following service aspects using a 1-5 scale, with 5 being ‘extremely important’ and 1 being ‘not at all important’?”). In these instances, the same rating (1, 2, 3, etc.) can be used to describe multiple elements.

On the other hand, ranking questions ask respondents to rank a series of aspects against one another (Ex.: “Rank the following service aspects with 1 being ‘most important’ and 5 being ‘least important’.”). In a ranking question, each “rank” can only be used once.

Each of these question types has its own advantages and disadvantages:

Rating Questions

Advantages:
  • Easy to understand.
  • Allow the respondent to select the same value for different elements if they are equally important, equally good, etc.

Disadvantages:
  • Respondents often rate items similarly, rather than providing a wide range of ratings, and cluster the majority of ratings on the positive/higher end of the scale.
  • Respondents react to scales in a variety of ways (for example, one particular respondent may opt to never utilize the highest rating on the scale, etc.).
  • Matrix rating questions are somewhat long and tiresome to complete, which may drive the respondent to take shortcuts when answering the questions.

Ranking Questions

Advantages:
  • Each element receives a unique ranking (i.e., respondents cannot assign the same value to each element).
  • The question format forces discrimination between choices, which provides more statistical “power” (allowing analysts to more easily discern real differences where they exist).

Disadvantages:
  • Ranking questions force respondents to choose between two items they may wish to rank equally (e.g., “cost” and “seat location” may be the two most important renewal drivers for a season ticket holder, yet s/he is forced to choose one over the other in a ranking question).
  • Response options can be confusing for respondents if rating questions previously asked in the survey use scales with “1” corresponding with the most “negative” response option. In these situations, the respondents may (incorrectly) think of a “1” ranking as the lowest-available ranking.
  • It typically takes longer to answer ranking questions than rating questions, often because respondents need to compare items against one another.

When should you use each question type?

Follow Turnkey’s guiding research principle and “let your objectives drive your methodology”. When determining whether to use a rating question or a ranking question, ask yourself whether the elements you are asking about should be able to receive the same score or whether they need to be differentiated.

For example, when asking a respondent how important a set of benefits are to renewing his/her season ticket package, a rating scale is more appropriate since a few of the benefits may be extremely (and equally) important to a particular respondent. On the other hand, when asking respondents which elements should be addressed first when stadium renovations begin, a ranking question is more appropriate since renovation elements will need to be prioritized.

The content of this article is based in part on “Voice of Vovici Blog: Ranking Questions vs. Rating Questions”.


Using skip logic is vital to ensuring the right respondents are answering the right survey questions. This article will provide a background to creating basic skip logic in Surveyor. For directions on how to utilize more advanced skip logic/branching, please click here.

Turnkey suggests programming your entire survey before adding skip logic.  Doing so makes the process of programming skip logic much easier.

There are two parts to this article. The first discusses basic skip logic (i.e., skipping a respondent to a specific question based on his/her response to a previous question). The second explains the process for skipping all respondents from one question to another question.

Skipping Based on Specific Responses

1) Go to the question your skip logic will originate from.

2) Ensure that there is a page break before and after the question(s) involved in “triggering” the skip logic (the question you will skip respondents to must be on a different page than your “origin” question).

3) Click into the answer choice that will trigger the skip. A selection of “Response Options” will appear to the left.

4) Click the drop-down menu under “Destination”; then, click on the question respondents should be skipped to if they select this answer.

5) Once the skip logic has been programmed, the destination question will be listed below the origin question.

TurnkeyAcademy_photo4 (SkipLogic2)

6) To create more skip logic originating from another response to this question, click into that answer option (Step 3) and begin the process again.

7) Test the skip logic by using the “TEST” button located in the upper right corner. Choose the answer(s) that will trigger the skip logic and confirm the correct question appears after you click “Next”. Then, test again by choosing the answer(s) that should NOT trigger the skip logic, and confirm the correct question still appears next.

Skipping All Respondents From One Question to Another Question

1) Ensure that there are page breaks before and after the question that will be skipped.

2) Click into the question that will trigger your skip.

3) Under the “Destination” heading, click the drop-down list and select the question all respondents will be skipped to, regardless of the answer(s) they select on the current question. Once a destination has been set, a notification box will appear above the question.

4) Test the skip logic by using the “TEST” button located in the upper right corner. Since this skip should apply to anyone who answers this question (regardless of the answer they select), answer the question and confirm that the correct question appears next.

TurnkeyAcademy_photo3 (SkipLogic)


After completing the drafting and programming of a survey, the next step is distribution. If you’re distributing your survey via email, it’s imperative that you create an effective email invitation – a strong invitation will increase your response rate. To create one, follow these steps:

Step 1: Abide by CAN-SPAM guidelines.

To help avoid being marked as spam when sending out your survey, avoid spam triggers such as “free”, “$”, “discount”, “offer”, and “act now” in your subject line and email body. Also, make sure your emails contain an opt-out link for respondents, and your organization’s physical address.

Step 2: Create an attractive subject line. 

Your subject line must be compelling enough to motivate respondents to click in to your email. To encourage this, keep the subject line short, be straightforward, and employ a “call to action”. Don’t be afraid to use the word “survey” in the subject line. According to recent research, invitations with the word “survey” in the subject line received higher-than-average click-through rates. Along these lines, examples of appealing subject lines include “Help Us Improve Your Experience” and “Share Your Opinion”.

Step 3: Provide detail on the survey right away. 

Once a respondent clicks into your email, you have fewer than eight seconds to impress him or her. As such, your email content must be precise and direct. Know your audience, and tailor your email content appropriately, being sure to incorporate:

  1. What you want respondents to do (the “call to action”)
  2. The nature of the survey
  3. How long the survey will take
  4. What you will do with the data
  5. How long the survey will be open

Here are a few examples of email content that follow the rules above:


Dear Jack –

The Turnkey Titans are inviting you to share your opinion by completing a survey. This survey is strictly for research purposes, and your responses will be kept completely anonymous. 

Please take a moment to complete this 5-7 minute survey by Tuesday, January 24th. Your opinion will help the Titans offer the best-possible services and fan experiences. To access the survey, please click the link below:


Thank you for your participation!


The Turnkey Titans

Gameday Experience Survey

Dear John-

The Turnkey Titans strive to provide the ultimate fan experience both on and off the field. In an effort to continue doing so, we would greatly appreciate your feedback on your most recent gameday experience at Turnkey Stadium. 


Results from this survey, which should take approximately 5 minutes to complete, will allow us to further improve the Titans gameday experience. Please provide your input by Friday, January 27th.

Thank you for your support and feedback.

-Turnkey Titans

The content of this article is based in part on “7 Steps to Highly Successful Surveys”, a webinar presented by Verint, and a Verint blog.


Once a survey is opened for distribution, the table below shows which options and elements can and cannot be modified:

TurnkeyAcademy_photo20 (LiveEdits)


According to its inventors, Fred Reichheld, Bain & Company and Satmetrix, the Net Promoter Score (NPS) is a customer loyalty measure showing the highest correlation with repeat purchases and referrals, and one of the best predictors of future growth. When utilizing the NPS question with sports teams and/or entertainment properties, we often tweak it a bit. Instead of asking respondents about the TEAM/EVENT itself, we ask how likely the respondent is to recommend attending TEAM games, or EVENT. Ex.:

How likely are you to recommend attending TEAM games to friends and colleagues?

0 – Not at all likely  1  2  3  4  5  6  7  8  9  10 – Extremely likely

How likely are you to recommend attending EVENT to friends and colleagues?

0 – Not at all likely  1  2  3  4  5  6  7  8  9  10 – Extremely likely

How do you calculate the Net Promoter Score?

NPS is calculated by subtracting the total percentage of respondents who answered “0” through “6” (these respondents are considered ‘detractors’) from the combined percentage of respondents who answered “9” or “10” (‘promoters’).
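In code, the calculation looks like this (a minimal sketch; the function name is ours):

```python
def net_promoter_score(scores):
    """NPS from a list of 0-10 responses: % promoters minus % detractors."""
    promoters = sum(1 for s in scores if s >= 9)   # answered 9 or 10
    detractors = sum(1 for s in scores if s <= 6)  # answered 0 through 6
    return 100.0 * (promoters - detractors) / len(scores)

# 50 promoters, 30 passives (7s and 8s), 20 detractors out of 100 responses:
responses = [10] * 50 + [8] * 30 + [5] * 20
print(net_promoter_score(responses))  # 50% - 20% = 30.0
```

Note that the passives (7s and 8s) count toward the total but affect neither side of the subtraction.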

TurnkeyAcademy_photo5 (nps1)

How do you calculate the Net Promoter Score in Surveyor?

Before you can calculate the NPS in Surveyor, double check that your Net Promoter question was programmed properly. Ensure the Report Value for each answer option is correct (i.e., the “0” answer option should have a Report Value of “0”, the “1” answer option should have a Report Value of “1”, etc.).

TurnkeyAcademy_photo6 (nps2)

If you need to bulk edit your report values, click the gear to the top right of the question and select “Edit Report Options…”. Then, enter the proper values (0, 1, etc.) in the “Report Value” column.

TurnkeyAcademy_photo7 (nps3)

Now, you can move forward with calculating your property’s NPS. There are a few different ways to calculate/display NPS within Surveyor: Gauge/Dial, Trend Table, Top Box, and Top Box Cross Tab.

Gauge/Dial Metric

  1. Create a report in the “Analyze” tab. This can be a stock report or a report designed from scratch.
  2. Add a “Gauge” report element.

TurnkeyAcademy_photo8 (nps4)

  3. Select your NPS question.
  4. Under “Data”, make sure the “Reverse Order” checkbox is selected (it should already be checked if you programmed your answer options from low to high). Turnkey also recommends changing the decimal place to “0” if analyzing percentages, or “1” if analyzing means.

TurnkeyAcademy_photo9 (nps5)

  5. Under “Displayed Values”, select the “Net Score” checkbox.
  6. Then, you can customize the gauge if you’d like. For example:
  •  In the Report Element menu, you may choose to show, hide, or edit the question text and description or add a caption.
  •  In Appearance, you may edit the background or border colors, the border width or radius, or the size of the chart.
  •  In Segments, you may choose to change the colors or the length of the segments (i.e., the red, grey, and blue segments in the above gauge).

Trend Table

  1. Create a report in the “Analyze” tab. This can be a stock report or a report designed from scratch.
  2. Add a “Trend Table” report element.
  3. Select your NPS question.
  4. Under “Data”, make sure the “Reverse Order” checkbox is selected. Set the “Range of Top Group” to “2” for the “promoters” (the 9 and 10 answer options) and the “Range of Bottom Group” to “7” for the “detractors” (those who responded 0-6). If you wish to look at the NPS across all responses and not within a defined time period, change the “Time Period” dropdown to “Entire”.

TurnkeyAcademy_photo10 (nps6)

  5. Under “Displayed Values”, select the “Net Score” checkbox. The “Count” checkbox will always remain checked.


Top Box Element

  1. Create a report in the “Analyze” tab. This can be a stock report or a report designed from scratch.
  2. Add a “Top Box” report element.
  3. Select your NPS question.
  4. Under “Data”, make sure the “Reverse Order” checkbox is selected. Confirm the “Range of Top Group” is “2” for the “promoters” (the 9 and 10 answer options) and the “Range of Bottom Group” is “7” for the “detractors” (those who responded 0-6).

TurnkeyAcademy_photo11 (nps7)

  5. Under “Displayed Values”, select the “Net Score” checkbox. You may choose to leave “Mean” selected if you’d like. Turnkey recommends leaving the “Bar Chart Labels” selected so you can see that the top box is comprised of the “10” and “9” answers (promoters) and the bottom box is comprised of the “0-6” answer options (detractors).

Top Box Cross Tab Element

  1. Create a report in the “Analyze” tab. This can be a stock report or a report designed from scratch.
  2. Add a “Top Box Cross Tab” report element.
  3. Select your NPS question, and the question you’d like to break NPS down by.
  4. Under “Data”, make sure the “Reverse Order” checkbox is selected. This should be selected if you programmed your answer options from low to high. Confirm the “Range of Top Group” is “2” for the “promoters” (the 9 and 10 answer options) and the “Range of Bottom Group” is “7” for the “detractors” (those who responded 0-6).

TurnkeyAcademy_photo13 (nps9)

  5. Under “Displayed Values”, select the “Net Score” checkbox. You may choose to leave “Mean” selected if you’d like. Turnkey recommends leaving the “Bar Chart Labels” selected so you can see that the top box is comprised of the “10” and “9” answers (promoters) and the bottom box is comprised of the “0-6” answer options (detractors).

Want to educate your colleagues about NPS? Share this graphic with them:

TurnkeyAcademy_photo14 (nps10)


You may wish to edit the Report Element Labels within a chart or table as highlighted below:

TurnkeyAcademy_photo21 (ChartEdits1)

To do so, follow these simple steps:

  1. Click the widget icon located in the lower right corner of the chart or table.

TurnkeyAcademy_photo22 (ChartEdits2)

  2. Within the “Row Labels” tab, edit the labels by typing into the Custom Label text box. You may also wish to edit the Column Labels by switching to the “Column Labels” tab.

TurnkeyAcademy_photo23 (ChartEdits3)



Surveyor has the ability to show or hide the following based on certain conditions:

  • Answer choices in a Choose One question
  • Answer choices in a Choose Many question
  • Rows in a Matrix question

(Please note: Surveyor can also hide questions and pages in a similar way, but these are not the focus of this article.)

When deciding whether or not to hide a particular answer choice/matrix row, it’s important to understand the visibility options available within Surveyor. Each answer choice or row has three visibility options: Shown, Hidden, or Conditional.

  • Shown: The default selection.
  • Hidden: An option if, say, an answer choice or row refers to an event that hasn’t occurred yet.
  • Conditional: Best used to hide answer choices or rows that don’t apply to certain respondents. For example, choices referring to season ticket holder benefits can be hidden from respondents who do not have season tickets.

Another use for the “conditional” option is to pipe a selected answer from one question into the answer choices for another question. This is helpful in cases when Surveyor doesn’t enable piping.

How to Hide Answer Choices

Create your question as you normally would by adding the question and answer choices. Click into the answer that you want to conditionally hide/show. Near the top of the menu under the answer choice, there are the three visibility options. To show or hide the answer choice, simply click “Shown” or “Hidden.”

If you select “Conditional”, the word “Criteria” will then appear, enabling you to identify which answer selection(s) should trigger the answer choice to be hidden/shown.

TurnkeyAcademy_photo24 (HideShow1)

Click “Criteria”, which will open a menu providing the option to either “Hide this choice” or “Show this choice”. Choose “Hide” if the people specified by the criteria should not see the answer choice you’re working with. Choose “Show” if you are specifying those who should see the answer choice. Then, click “Add Criteria” to add the specifications.

TurnkeyAcademy_photo25 (HideShow2)


Frequently Asked Questions

  1. I tried using the “Conditional” Visibility, but the answer choice is still showing. Why?

If the criteria specifies which respondents should see the answer choice, make sure that “Show this choice” is selected at the top of the Criteria menu. Alternatively, make sure to select “Hide this choice” if the criteria specifies who should not see the answer choice.

  2. Can I change the visibility of an answer choice or matrix row after a survey is open?

Yes, the visibility can be changed at any time without affecting the rest of the survey. However, keep in mind that any changes made after a survey is open will not appear until you click “Apply Changes.”

  3. Can the “Conditional” criteria include dates or times?

No. Criteria can only be based on questions in the survey; they cannot include dates or times. If certain answer choices should only appear after certain dates, this will have to be done manually by having the answer choice be “Hidden” at first and then changing it to “Shown” when appropriate.

Remember, answer choices cannot be added once a survey is open, so enter all answers before your survey goes live and hide those that will be revealed later. Then, make sure to “Apply Changes” when changing the visibility after the survey is open.

  4. How do I pipe selected answers from a checkbox question to another choose one or choose many question using this method?

Type in the same list of answers for both questions. On the second question (which should only include answers selected in the first question), Conditional Criteria will need to be added for each answer choice. To do this, click in to the second question and choose “Conditional” visibility; then, in the Criteria menu, select “Show this choice” and click “Add Another Criteria”. In the three dropdowns, choose:

(1) The corresponding answer choice from the first question
(2) “=”
(3) “Selected”

(Alternatively, one can select “Hide this choice” and “Not selected” if desired.)

Be careful if the first question (the one you’re “piping” from) has a “None of the above” choice – anyone who selects this should be skipped over the second question; otherwise, their only choice for the second question will be “None of the above.”

  5. Can I conditionally show/hide questions that are located on the same page?

Yes. Check the “Same Page Conditional Visibility” checkbox on the question visibility pop-up.

(Note: This feature requires that the JavaScript Required property under Survey Properties be turned ON.)

Most web browsers support JavaScript, but this feature will not work for the few respondents whose browsers do not. To combat this, we suggest making sure the questions involved in your condition reside on separate pages from the start.

  6. Can I show or hide questions depending on whether or not the respondent is on a mobile device?

Yes. Please contact Turnkey for details on how to do this.

  7. When piping from one checkbox question to another multiple choice question, what will happen with my “Other (Please specify)” choices?

These can be set up in almost the same way as all the other answer choices are. Under Conditional Criteria, select “Show this choice” and add the criteria: “Other (Please specify)” = “Selected” from the original question. However, instead of typing “Other (Please specify),” the answer choice should be piped in. To do this, click the piping button above the answer choice (this button will appear after clicking “Options”).

TurnkeyAcademy_photo26 (HideShow3)

In the pop-up menu, select the “Other (Please specify)” choice from the first question.

TurnkeyAcademy_photo27 (HideShow4)

Now, instead of “Other (Please specify)” appearing in the second question, the answer choice will be whatever the respondent typed into the first question’s “Other” field.

To avoid a blank answer choice being piped into your second question, make sure to check “Answer Required for Please Specify Options” under the validation & logic tab in the first question.

TurnkeyAcademy_photo28 (HideShow5)


Before you can create unique links, you first must complete two tasks:

  1. Create the necessary appended fields in your survey; and
  2. Create an Excel file containing a list of the records you want to create unique links for. Turnkey recommends including the following columns in your Excel file (in order to personalize the email content); however, of these, only email is required.
    • First Name
    • Last Name
    • Email
    • Account ID (or any other unique identifier)

TurnkeyAcademy_photo15 (UniqueLinks1)

Then, to create your unique links, follow these steps:

  1. Open your Excel file.
  2. Copy and paste your survey URL and appended field(s) into the Excel file. The appended field must be added to the survey URL using the following format: “&identifier=”.
    i.    Example: If the identifier is “cid”, “&cid=” must be added to the end of the link.
    ii.    Example: If the identifier is “account”, “&account=” must be added to the end of the link.

The identifier is case sensitive. Confirm the identifier is written in the same case as in the hidden question and preselected field.

  3. Use the concatenate function to add the account ID (or any other appended field/unique identifier) to the end of the link for each record.
    • Formula: =CONCATENATE($F$1,D2). Note: the actual cell references may change depending on the layout of your Excel file.
      • F1 = the cell where the survey URL is located. This cell should be an absolute reference, meaning that when the formula is copied to other cells it will always reference cell F1 (that is why the “$” is located before the F and the 1).
      • D2 = the field that will be appended to the link. Always start with the first field in the list.

TurnkeyAcademy_photo16 (UniqueLinks2)

  4. Copy the formula down for the remaining respondents.

TurnkeyAcademy_photo17 (UniqueLinks3)

  5. At this moment, the links are still formulas. To convert them into actual links, copy the unique link cells; then, right click and paste as values (i.e., the “123” clipboard icon) in the same location.

TurnkeyAcademy_photo18 (UniqueLinks4)

  6. Check that the unique link is working properly by taking the survey. Copy a link from the Excel file and paste it into an internet browser, then take the survey.
  7. Within Surveyor, go to the survey’s Analyze tab.
  8. Confirm the unique identifier (in this case, the account ID) is tracked. The unique identifier should be in the first column.

TurnkeyAcademy_photo19 (UniqueLinks5)

  9. If everything worked as intended during your test, open the survey for distribution.
  10. Load the list into Surveyor’s email system and send out the unique links accordingly.
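If you prefer scripting to spreadsheet formulas, the concatenation step can also be done in a few lines of Python. The base URL, identifier, and account IDs below are placeholders of our own, and the function simply mirrors the Excel =CONCATENATE($F$1,D2) step:

```python
def build_unique_links(survey_url, identifier, account_ids):
    """Append "&identifier=value" to the survey URL for each record,
    mirroring the Excel CONCATENATE step."""
    return [f"{survey_url}&{identifier}={acct}" for acct in account_ids]

# Hypothetical survey URL and account IDs:
links = build_unique_links("https://survey.example.com/s/abc123", "cid",
                           ["1001", "1002", "1003"])
for link in links:
    print(link)  # e.g. https://survey.example.com/s/abc123&cid=1001
```

Remember that the identifier is case sensitive, so it must match the hidden question and preselected field exactly, just as in the Excel workflow.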

Please click here to view a brief video demonstrating the above process.