Online Surveys

When conducting social science research via a survey, selecting a survey mode is an important part of the process. Survey mode refers to the means by which the survey is administered, such as mail, telephone, face-to-face, online, or a mix of these techniques. Online surveying has become very popular with the advent of SurveyMonkey™, Zoomerang™, and other survey software. Such software allows for the quick and easy creation and administration of online surveys, but online surveying is not appropriate for every situation. Regardless of mode, certain factors must be considered in any survey: the population of interest, sampling, survey instrument design, data analysis, response rates, and non-response bias. After considering all of these issues, social scientists at the USGS Fort Collins Science Center’s Social and Economic Analysis (SEA) Branch conducted the Landsat imagery user surveys entirely online, from developing the sample to administering the survey.

General Surveying Considerations

Study Population

The characteristics of the population of Landsat users were among the first factors we considered to determine if online surveys would be appropriate. In order to use an online survey, the entire population under study must have access to computers as well as the Internet and have some technological proficiency. People who use Landsat imagery for work purposes fulfill all of these qualifications, making them good candidates for an online survey.

Sampling

One of the goals of the 2009 survey was to reach the broadest possible array of users of Landsat imagery in the United States. Because no comprehensive list of users existed, a different means of obtaining contact information was necessary. We began with a Web search by state for email addresses of potential users, who were identified through their work with Landsat imagery, remote sensing, or GIS. We then used snowball sampling to refine and expand the sample. Email addresses were selected as the method of contact because these Landsat users were highly likely to have an email address for their work, and email addresses were relatively easy to find on the Web. For the 2012 survey, the number of users registered with USGS had grown significantly, so email addresses of these users were obtained from USGS instead of through a Web search.

Survey Design

Another consideration was the complexity of the survey instruments. While the majority of the sample in the 2009 survey consisted of Landsat users, it also included a variety of other imagery users, some of whom were not currently using or had never used Landsat imagery. We wanted to include these users because we were interested in why they were not using Landsat. Because of this diversity, different questions needed to be asked of each group, which created four different survey “paths” (Fig. 1). This structure ruled out a paper survey because of the branching and skipping patterns it would have required. Developing the survey online allowed each respondent to be asked only the questions most relevant to them and their use of satellite imagery. Respondents also did not have to follow instructions to arrive at the next appropriate question, but were taken there automatically based on their previous responses. A similar survey instrument was used for the 2012 survey to identify current and past Landsat users. The capabilities of the online survey software reduced the burden on respondents while allowing for a complex survey.

Figure 1. Paths for the 2009 Landsat satellite imagery user survey: flowchart showing how user groups were segmented into survey “paths.”
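The path assignment described above can be thought of as a simple routing function. The sketch below, written in Python purely for illustration, shows how screening responses might map a respondent to one of four paths; the function name, arguments, and path labels are assumptions for this example, not the wording or logic used in the actual survey instrument.

```python
# Hypothetical sketch (not Key Survey's actual logic) of routing respondents to
# one of four survey paths based on screening answers about imagery use.

def assign_path(uses_imagery: bool, uses_landsat: bool, used_landsat_before: bool) -> str:
    """Return an illustrative path label from screening responses."""
    if not uses_imagery:
        return "non-imagery user path"
    if uses_landsat:
        return "current Landsat user path"
    if used_landsat_before:
        return "past Landsat user path"
    return "other-imagery user path"

# Example: a respondent who uses imagery but has never used Landsat
print(assign_path(uses_imagery=True, uses_landsat=False, used_landsat_before=False))
```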

Online surveys also allowed us to limit the choices of the respondents and reduce potential response errors. For instance, for a question where we wanted respondents to check only one answer, the online survey software allowed us to restrict the response to a single choice. Other questions, such as those requiring responses to add up to a certain number (e.g., 100%), could be constructed to force respondents to answer within those parameters. We were also able to add pop-up boxes providing examples or definitions of terms. These were available if respondents wanted to read them, and keeping the text off the page resulted in a cleaner, less cluttered survey (see Fig. 2 for an example question with a link to a pop-up box).

Figure 2. Question from the survey of Landsat users as it appeared to respondents in Key Survey™. Note the link to a pop-up box listing moderate-resolution imagery products.
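As a rough illustration of the kinds of constraints described above, the sketch below (in Python, with hypothetical function names and values) shows a single-choice check and a check that allocated percentages sum to 100; in practice, the survey software itself enforced these rules.

```python
# Hypothetical sketch of the kinds of checks the online survey software enforced:
# a single-choice constraint and a "shares must sum to 100%" constraint.

def validate_single_choice(selected: list[str]) -> bool:
    # Accept the response only if exactly one option was checked.
    return len(selected) == 1

def validate_allocation(shares: dict[str, float], total: float = 100.0) -> bool:
    # Accept the response only if the allocated percentages sum to the required total.
    return abs(sum(shares.values()) - total) < 1e-6

print(validate_single_choice(["Landsat"]))                # True
print(validate_allocation({"Landsat": 60, "Other": 30}))  # False: sums to 90
print(validate_allocation({"Landsat": 60, "Other": 40}))  # True
```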

Data Analysis

The design features of the surveys ensured that the data would be ready for analysis with minimal quality control or format manipulation. We used SPSS® for analysis, so it was important that the raw data be exportable either directly into SPSS® or into a format that SPSS® accepts (e.g., Microsoft® Excel®). The survey software we used, Key Survey™, exports data directly into SPSS® and allows for very complex surveys, but a variety of other software may be used to develop and conduct an online survey (www.WebSM.org provides an extensive list of software along with thousands of references about online surveying).
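For readers using software without a direct SPSS® export, a generic fallback is to write the responses to a flat file that SPSS® or Excel® can read. The sketch below uses the Python pandas library with made-up column names and values to illustrate that step; it is not part of the Key Survey™ workflow we used.

```python
# Hypothetical sketch of preparing exported survey responses for analysis by
# writing a flat file (CSV) that SPSS or Excel can read.
import pandas as pd

# Assumed column names and values; a real export uses the survey's own variables.
responses = pd.DataFrame(
    {
        "respondent_id": [101, 102, 103],
        "user_group": ["current", "past", "never"],
        "satisfaction": [4, 3, None],
    }
)

# Light quality control before analysis: drop duplicate respondents, count missing values.
responses = responses.drop_duplicates(subset="respondent_id")
print(responses.isna().sum())

responses.to_csv("landsat_survey_2009.csv", index=False)  # readable by SPSS and Excel
```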

Response Rate

Online surveys are often criticized for having lower response rates than surveys conducted using more traditional modes. However, with appropriate sampling and by following accepted survey methodology (as outlined in Dillman, 2007), we achieved a response rate of over 50% in the 2009 survey, substantially higher than the 30-35% reported for online surveys in the literature (e.g., Lozar Manfreda and others, 2008; Shih and Fan, 2008). One factor that may increase online survey response rates is pre-recruitment: through the snowball sampling in the 2009 survey, participants had already indicated they were willing to participate in the study. The response rate for the 2012 survey was 30%; this was not unexpected given the lack of pre-recruitment.
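Response rate here is simply the number of completed surveys divided by the number of valid invitations. The short Python example below illustrates the arithmetic with placeholder numbers; the actual sample sizes are not given in this summary.

```python
# Illustrative response-rate arithmetic with placeholder numbers (not the
# actual counts from either survey).
invited = 2000     # assumed number of valid email invitations
completed = 1040   # assumed number of completed surveys

response_rate = completed / invited
print(f"Response rate: {response_rate:.0%}")  # 52%, i.e., over 50%
```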

Non-Response Bias

Non-response bias is an issue in any survey, including online surveys. Frequently, online surveys rely on a convenience sample by posting a link to the survey on a website where anyone can respond (and, in many cases, can respond more than once). This increases the risk of non-response bias considerably and prevents any generalization of the results to the population under study. Some non-response bias is present in every survey, partly because people who are interested in the survey topic are more likely to respond than those who are not. For both Landsat surveys, we conducted online non-response surveys and found that doing so via the Internet was extremely fast and effective.

Online Surveying Considerations

Technology: Software, Hardware, and Internet

While the factors above apply to any survey mode, there are some unique considerations for online surveying: software, hardware, and Internet connectivity, as well as the means of contacting respondents and the development of institutional capacity. Free or inexpensive survey software that might be appropriate for some surveys is available on the Internet, but other surveys may require that software be installed in-house for security or other reasons. While online surveys can frequently be administered at lower cost and in less time than other types of surveys, purchasing software, training someone to use it, and obtaining technical support can add substantially to the cost of the survey. Because the software, hardware, and technical support we used were shared by a number of projects, the initial costs of building the capacity to conduct online surveys were defrayed.

Email Contacts

Another unique consideration with online surveys is the option to contact participants by email. While email could be used with other survey modes, typically the means of contact is the same as the mode (i.e., postcards and survey packets for mail surveys, phone calls for telephone surveys). There is little guidance on the best way to use email contacts, though the challenges are well known: people often have multiple email addresses, spam filters may catch emails from unknown addresses, and there is no guarantee that recipients will even open an email after they see it. For the 2009 survey of Landsat users, during the snowball sampling, we gave participants the option of letting us know if we had sent our request to more than one of their email addresses. In this way, we eliminated several addresses and mitigated the possibility that someone would later receive more than one unique link to the survey. Spam filters were more challenging, but for the surveys we tried to follow basic guidelines so our emails would not get caught. For instance, using certain words (e.g., “help,” “free”), all capital letters, or exclamation marks in the subject line may trigger spam filters, so we avoided them.
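The sketch below illustrates, in Python and with assumed rules, the kind of informal subject-line screening described above (trigger words, all capitals, exclamation marks); it is not a description of how any particular spam filter works.

```python
# Hypothetical sketch of the basic checks applied informally to invitation
# subject lines so they would be less likely to trigger spam filters.
SPAM_TRIGGER_WORDS = {"help", "free"}  # example trigger words mentioned in the text

def subject_line_ok(subject: str) -> bool:
    words = subject.lower().split()
    if any(w.strip("!?.,") in SPAM_TRIGGER_WORDS for w in words):
        return False   # avoid known trigger words
    if "!" in subject:
        return False   # avoid exclamation marks
    if subject.isupper():
        return False   # avoid all-capital subject lines
    return True

print(subject_line_ok("USGS survey of satellite imagery users - Please respond"))  # True
print(subject_line_ok("FREE SURVEY!!!"))                                            # False
```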

Subject Lines

Additionally, we varied the content of the subject lines based on the available literature on the topic (e.g., Porter and Whitcomb, 2005). We initially used an imperative request (e.g., “USGS survey of satellite imagery users – Please respond”), an opportunity for self-expression or an offer (e.g., “Give your opinions about satellite imagery to USGS”), and an announcement (e.g., “USGS asks satellite imagery users their opinions”). Two different senders were used to further increase the chances of someone opening the email: one was a generic USGS address and the other was a specific person in USGS known to many Landsat users. Every combination of subject line and sender was tested with a small group of participants in the 2009 survey to discover which combination worked best. We found that the imperative request and the announcement worked well when sent from the generic email account, and the announcement also worked well when sent from the specific person. The self-expression subject line performed poorly regardless of sender. While we did not attempt to fully quantify these results, it was evident that certain combinations produced higher response rates, as well as faster responses, than others.
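For illustration, the sketch below enumerates the six subject-line and sender combinations described above using Python; assignment of pilot participants to conditions and tracking of responses were handled outside of any such script.

```python
# Illustrative enumeration of the subject-line x sender combinations piloted
# with a small group before the full 2009 mailing.
from itertools import product

subject_lines = [
    "USGS survey of satellite imagery users – Please respond",  # imperative request
    "Give your opinions about satellite imagery to USGS",       # self-expression/offer
    "USGS asks satellite imagery users their opinions",         # announcement
]
senders = ["generic USGS address", "known USGS contact"]

# Each pilot participant would be assigned one of the six conditions.
for i, (subject, sender) in enumerate(product(subject_lines, senders), start=1):
    print(f"Condition {i}: '{subject}' from {sender}")
```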

Managing Email Contacts

Managing email contacts can be difficult because of the short intervals between contacts. While mail survey contacts are usually at least a week apart, email contacts can be three to four days apart. Survey software that (1) sends emails automatically and (2) removes participants from the mailing list as soon as they complete the survey can be very useful and reduces the chance of participants being contacted needlessly (see Fig. 3 for some of the email management capabilities in Key Survey™).

Figure 3. Screen capture showing some of the email management capabilities available in Key Survey™ software.
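A minimal sketch of the two email-management behaviors described above, written in Python with hypothetical addresses and dates, is shown below; in practice, Key Survey™ performed this management automatically.

```python
# Hypothetical sketch of email management: reminders sent on a short interval,
# with completed respondents dropped from the reminder list.
from datetime import date, timedelta

# Placeholder contact list: email address -> has the person completed the survey?
contact_list = {"a@example.gov": False, "b@example.edu": True, "c@example.org": False}
last_contact = date(2009, 6, 1)
reminder_interval = timedelta(days=3)  # email contacts can be three to four days apart

def due_for_reminder(today: date) -> list[str]:
    # Only non-completers are reminded, and only after the interval has passed.
    if today - last_contact < reminder_interval:
        return []
    return [email for email, completed in contact_list.items() if not completed]

print(due_for_reminder(date(2009, 6, 4)))  # ['a@example.gov', 'c@example.org']
```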

For more information about online surveying, please see the references listed below; thousands of references are also available at www.WebSM.org.

References

Dillman, D.A., 2007, Mail and Internet surveys: Hoboken, NJ, John Wiley & Sons, Inc.

Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., and Vehovar, V., 2008, Web surveys versus other survey modes—A meta-analysis comparing response rates: International Journal of Market Research, v. 50, no. 1, p. 79-104.

Porter, S.R., and Whitcomb, M.E., 2005, E-mail subject lines and their effect on Web survey viewing and response: Social Science Computer Review, v. 23, no. 3, p. 380-387.

Sexton, N.R., Miller, H.M., and Dietsch, A.M., in press, Appropriate uses and considerations for online surveying in human dimensions research: Human Dimensions of Wildlife.

Shih, T.H., and Fan, X., 2008, Comparing response rates from Web and mail surveys—A meta-analysis: Field Methods, v. 20, no. 3, p. 249-271.