When conducting social science research via a survey, selecting a survey mode is an important part of the process. Survey mode refers to the means by which the survey is administered, such as mail, telephone, face-to-face, online, or a mix of any of these techniques. Online surveying has become very popular with the advent of Survey Monkey™, Zoomerang™, and other survey software1. This software allows for the quick and easy creation and administration of online surveys, but online surveying is not appropriate for every situation. Regardless of mode, certain factors must be considered when conducting any survey. They include the population of interest, sampling, survey instrument design, data analysis, response rates, and non-response bias. After considering all of these issues, social scientists at the USGS Fort Collins Science Center’s Policy Analysis and Science Assistance Branch (PASA) conducted the Landsat imagery user survey entirely online, from developing the sample to administering the survey (view the executive report).
The characteristics of the population of Landsat users were among the first factors we considered in determining whether an online survey would be appropriate. To use an online survey, the entire population under study must have access to computers and the Internet, as well as some technological proficiency. People who use Landsat imagery for work purposes fulfill these qualifications, making them good candidates for an online survey.
One of the goals of the survey was to reach the broadest possible array of users of Landsat imagery in the United States. Given that there was no existing comprehensive list of users, a different means of obtaining contact information was necessary. We began with a Web search by state for email addresses of potential users, who were identified through their work with Landsat imagery, remote sensing, or GIS. We followed the search with snowball sampling to refine and expand the sample. Email addresses were selected as the method of contact because these Landsat users were highly likely to have an email address for their work, and email addresses were relatively easy to find on the Web.
Another consideration was the complexity of the survey instrument. While the majority of the sample consisted of Landsat users, there were also a variety of other imagery users, some of whom were not currently using or had never used Landsat imagery. We wanted to include these users because we were interested in why they were not using Landsat. Because of the diversity of the users, different questions needed to be asked of each group, which created four different survey “paths” (Fig. 1). This structure ruled out a paper survey because of the branching and skip patterns it required. Developing the survey online allowed each respondent to be asked only the questions most relevant to them and their use of satellite imagery. Respondents also did not have to follow instructions to arrive at the next appropriate question, but were taken there automatically based on their previous responses. These capabilities reduced the burden on respondents while allowing for a complex survey.
An online survey also permitted us to limit the choices of the respondents and reduce potential response errors. For instance, for a question where we wanted respondents to check only one answer, the online survey software allowed us to enforce that limit. Other questions, such as those requiring responses to add up to a certain value (e.g., 100%), could be constructed to force respondents to answer within those parameters. We were also able to add pop-up boxes to provide examples or definitions of terms. These were available if respondents wished to read them, and keeping the text off the page resulted in a cleaner, less cluttered survey (see Fig. 2 for an example question with a link to a pop-up box).
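The sum-to-a-value constraint described above is straightforward to sketch. The check below is illustrative only (the field names and tolerance are our own assumptions, not Key Survey™ internals); an online form would simply refuse to advance until the check passes.

```python
def validate_allocation(responses, total=100.0, tol=0.5):
    """Check that a respondent's percentage allocations sum to the
    required total (within a small tolerance for rounding)."""
    entered = sum(v for v in responses.values() if v is not None)
    return abs(entered - total) <= tol

# Hypothetical respondent allocating imagery use across purposes:
answer = {"land cover mapping": 60, "change detection": 30, "other": 10}
validate_allocation(answer)  # True: allocations sum to 100
```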
The design features of this survey ensured that the data would be ready for analysis with minimal quality control or format manipulation. We used SPSS® for analysis, so it was important that the raw data be exportable either directly into SPSS® or into a format that SPSS® accepted (e.g., Microsoft® Excel®). The survey software we use, Key Survey™, exports data directly into SPSS® and allows for very complex surveys, but a variety of other software may be used to develop and conduct an online survey (www.WebSM.org provides an extensive list of software along with thousands of references about online surveying).
Online surveys are often criticized for having lower response rates compared to surveys conducted using more traditional modes. However, with appropriate sampling and by following accepted survey methodology (as outlined in Dillman, 2007), we achieved a response rate of over 50%, substantially higher than the 30-35% for online surveys described in the literature (e.g., Lozar Manfreda and others, 2008; Shih and Fan, 2008). One of the factors that may increase online survey response rates is pre-recruitment. Through the snowball sampling, participants had already indicated they were willing to participate in the study. We also were dealing with a technologically savvy group of respondents for whom an online survey most likely did not pose any particular challenges and, in fact, may have been preferred.
Non-response bias is an issue in any survey, including online surveys. Frequently, online surveys take a convenience sample by posting a link to the survey on a website where anyone can respond (and can respond more than once, in many cases). This increases the risk of non-response bias considerably and it prevents any generalization of the results to the population under study. However, non-response bias is present in every survey, partially due to the fact that people who are interested in the survey topic are more likely to respond to the survey than those who are not. For this survey, we conducted an online non-response survey and found differences in the sector distribution of respondents and non-respondents. Since the sample taken for this survey was not random, we did not weight the results based on these data. We did find that doing a non-response survey via the Internet was extremely fast and effective. Within three days, over 20% of those who were sent the non-response survey had responded.
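A sector-distribution comparison like the one described above can be sketched as a Pearson chi-square statistic over two groups. The counts below are invented for illustration and are not the survey's actual data; in practice a statistical package would also report the p-value.

```python
# Hypothetical sector counts for respondents vs. non-respondents.
respondents     = {"government": 120, "academic": 90, "private": 40}
non_respondents = {"government": 15,  "academic": 10, "private": 15}

def chi_square(obs_a, obs_b):
    """Pearson chi-square statistic comparing two groups' counts
    over the same set of categories."""
    n_a, n_b = sum(obs_a.values()), sum(obs_b.values())
    total = n_a + n_b
    stat = 0.0
    for cat in obs_a:
        col_total = obs_a[cat] + obs_b[cat]
        for observed, n in ((obs_a[cat], n_a), (obs_b[cat], n_b)):
            expected = col_total * n / total
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi_square(respondents, non_respondents)
# A large statistic suggests the sector mix differs between groups.
```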
While the factors above apply to any survey mode, some considerations are unique to online surveying. Software, hardware, and Internet connectivity, as well as the means of contacting respondents and the development of institutional capacity, all come into play. Free or inexpensive survey software available on the Internet might be appropriate for some surveys, but others may require that software be installed in-house for security or other reasons, as was the case with the Landsat user survey. While online surveys can frequently be conducted at lower cost and in less time than other types of surveys, purchasing software, training someone to use it, and obtaining technical support can add substantially to the cost. Because the software, hardware, and technical support we used were shared by a number of projects, the initial costs of building the capacity to conduct online surveys were defrayed.
Another unique consideration with online surveys is the option to contact participants by email. While email could be used with other survey modes, typically the means of contact is the same as the mode (i.e., postcards and survey packets for mail surveys, phone calls for telephone surveys). There is little guidance as to the best way to use email contacts, though the challenges are well known: people often have multiple email addresses, spam filters may catch emails from unknown senders, and there is no guarantee that recipients will even open an email once they see it. For the survey of Landsat users, during the snowball sampling we gave participants the option of letting us know if we had sent our request to more than one of their email addresses. In this way, we eliminated several addresses and reduced the possibility that someone would later receive more than one unique link to the survey. Spam filters were more challenging, but we tried to follow basic guidelines so our emails would not be blocked. Using certain words (e.g., help, free), all capital letters, or exclamation marks in the subject line may trigger spam filters, so we avoided them.
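The duplicate-address problem can also be reduced mechanically. In our survey the duplicates were self-reported by participants, but a minimal normalization pass (a sketch, with hypothetical addresses) illustrates the idea:

```python
def dedupe_contacts(addresses):
    """Collapse case and whitespace variants of the same address so that
    no one receives more than one unique survey link."""
    seen, unique = set(), []
    for addr in addresses:
        key = addr.strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(key)
    return unique

dedupe_contacts(["Jane.Doe@example.gov", " jane.doe@example.gov "])
# -> ["jane.doe@example.gov"]
```

Note that this only catches variants of a single address; the same person using two entirely different addresses still has to be identified some other way, such as the self-reporting we used.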
Additionally, we varied the content of the subject line based on available literature on the topic (e.g., Porter and Whitcomb, 2005). We used an imperative request (“USGS survey of satellite imagery users – Please respond”), an opportunity for self-expression or an offer (e.g., “Give your opinions about satellite imagery to USGS”), and an announcement (“USGS asks satellite imagery users their opinions”). Two different senders were used to further increase the chances of someone opening the email: one was a generic USGS address and the other was from a specific person in USGS known to many Landsat users. Every combination of subject line and email was tested with a small group of participants to discover which combination worked best. We found that the imperative request and the announcement worked well when sent from the generic email account. The announcement also worked well when sent from the specific person. The self-expression subject line performed poorly regardless of which sender was used. While we did not attempt to fully quantify these results, it was evident that certain combinations did produce higher response rates, as well as faster responses, than others.
Managing email contacts can be difficult because of the short intervals between contacts. While mail survey contacts are usually at least a week apart, email contacts can be three to four days apart. Having survey software that (1) sends emails automatically and (2) removes participants from the mailing list as soon as they complete the survey can be very useful and reduces the chance of participants being contacted needlessly (see Figure 3 for some of the email management capabilities in Key Survey™).
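The second capability above is essentially a set difference, which most survey platforms implement automatically. A minimal sketch with hypothetical addresses:

```python
def reminder_list(sample, completed):
    """Return the contacts still due a reminder: everyone in the sample
    who has not yet completed the survey."""
    done = set(completed)
    return [addr for addr in sample if addr not in done]

sample = ["a@x.gov", "b@y.edu", "c@z.org"]
reminder_list(sample, completed=["b@y.edu"])
# -> ["a@x.gov", "c@z.org"]
```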
For more information about online surveying, please see the references listed below. Thousands of references are also available at www.WebSM.org.
Dillman, D.A. 2007. Mail and internet surveys: The tailored design method, 2007 update. Hoboken, NJ: John Wiley & Sons, Inc.
Lozar Manfreda, K., M. Bosnjak, J. Berzelak, I. Haas, and V. Vehovar. 2008. Web surveys versus other survey modes: A meta-analysis comparing response rates. International Journal of Market Research 50(1): 79-104.
Porter, S.R., and M.E. Whitcomb. 2005. E-mail subject lines and their effect on Web survey viewing and response. Social Science Computer Review 23(3): 380-387.
Sexton, N.R., H.M. Miller, and A.M. Dietsch. [In press.] Appropriate uses and considerations for online surveying in human dimensions research. Human Dimensions of Wildlife.
Shih, T.H., and X. Fan. 2008. Comparing response rates from Web and mail surveys: A meta-analysis. Field Methods 20(3): 249-271.
1 Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.