This report summarizes the results of a series of field-based case studies conducted by the U.S. Geological Survey (USGS) to (1) evaluate the use of nonmarket values in Bureau of Land Management (BLM) planning and project assessments, (2) update existing technical resources for measuring those values, and (3) provide guidance to field staff on the use of nonmarket values. Four BLM pilot sites participated in this effort: Canyons of the Ancients National Monument in Colorado, Red Cliffs and Beaver Dam Wash National Conservation Areas in Utah, BLM’s Taos Field Office in New Mexico, and BLM's Tuscarora Field Office in Nevada. The focus of the case studies was on practical applications of nonmarket valuation. USGS worked directly with BLM field staff at the pilot sites to demonstrate the process of considering nonmarket values in BLM decisionmaking and document the questions, challenges, and opportunities that arise when tying economic language to projects.
As part of this effort, a Web-based toolkit, available at https://my.usgs.gov/benefit-transfer/, was updated and expanded to help facilitate benefit transfers (that is, the use of existing economic data to quantify nonmarket values) and qualitative discussions of nonmarket values. A total of 53 new or overlooked nonmarket valuation studies comprising 494 nonmarket value estimates for various recreational activities and the preservation of threatened, endangered, and rare species were added to existing databases within this Benefit Transfer Toolkit. In addition, four meta-regression functions focused on hunting, wildlife viewing, fishing, and trail use recreation were developed and added to the Benefit Transfer Toolkit.
Results of this effort demonstrate that there are two main roles for nonmarket valuation in BLM planning. The first is to improve the decisionmaking process by contributing to a more comprehensive comparison of economic benefits and costs when evaluating resource tradeoffs for National Environmental Policy Act analyses. The second is to use economic language and information on economic values, either qualitative or quantitative, to improve the ability to communicate the economic significance of the resources provided by BLM-managed lands.
Findings also indicate that the use of existing economic data to quantify nonmarket values (that is, benefit transfer) poses unique challenges because of the scarcity of both resource data and existing valuation studies focused on resources and sites managed by BLM. This highlights the need for improvements in the collection of resource data at BLM sites, especially visitor use data, as well as an opportunity for BLM’s Socioeconomics Program to strategically identify priority areas, in terms of both resources and geographic locations, where primary valuation studies could be conducted and the results used for future benefit transfers. Finally, although qualitative discussions of nonmarket values do not facilitate the comparison of monetized values, they can provide a manageable next step toward more comprehensive information on nonmarket values for BLM plans and project assessments.
Developing a USGS Legacy Data Inventory to Preserve and Release Historical USGS Data
Legacy data (n) - Information stored in an old or obsolete format or computer system that is, therefore, difficult to access or process. (Business Dictionary, 2016)
For over 135 years, the U.S. Geological Survey has collected diverse information about the natural world and how it interacts with society. Much of this legacy information is one-of-a-kind and in danger of being lost forever through decay of materials, obsolete technology, or staff changes. Several laws and orders require federal agencies to preserve and provide the public access to federally collected scientific information. The information is to be archived in a manner that allows others to examine the materials for new information or interpretations. Data-at-Risk is a systematic way for the USGS to meet the challenge of preserving, and making accessible, the enormous amount of information locked away in inaccessible formats. Data-at-Risk efforts inventory and prioritize inaccessible information and assist with the preservation and release of the information into the public domain. Much of the information the USGS collects has permanent or long-term value to the Nation and the world through its contributions to furthering scientific discovery, public policies, or decisions. These information collections represent observations and events that will never be repeated and warrant preservation for future generations to learn and benefit from them.
Goal: Expand the USGS contribution to scientific discovery and knowledge by demonstrating a long-term approach to inventorying, prioritizing, and releasing to the public the wealth of USGS legacy scientific data.
Implement a systematic workflow to create a USGS Legacy Data Inventory that catalogs and describes known USGS legacy data sets.
Develop a methodology to evaluate and prioritize USGS legacy data sets based on USGS mission and program objectives and potential of successful release within USGS records management and open data policies.
Preserve and release select, priority legacy data sets through the USGS IPDS data release workflow.
Analyze the time and resources required to preserve/release legacy data and develop estimates to inform future legacy data inventory efforts.
As one of the largest and oldest earth science organizations in the world, the USGS has a scientific legacy composed of its data (including, but not limited to, images, video, audio files, and physical samples) and the scientific knowledge derived from them, gathered over 130 years of research. However, it is widely understood that high-quality data collected and analyzed as part of now-completed projects are hidden away in case files, file cabinets, and hard drives housed in USGS facilities. Therefore, despite their potential significance to current USGS mission and program research objectives, these “legacy data” are unavailable. In addition, legacy data are by definition at risk of permanent loss or damage because they predate current open-data policies, standards, and formats. Risks to legacy data can be technical, such as obsolescence of the data’s storage media and format, or organizational, such as a lack of funding or facility storage. Fortunately, addressing legacy data risks such as these generally results in the science data becoming useable by modern data tools, as well as accessible to the broader scientific community.
Building on past USGS legacy data inventory and preservation projects
USGS has a long history of proactively researching and developing solutions to data management needs, including legacy data inventory and preservation. For example, in 1994 USGS was instrumental in establishing the FGDC-CSDGM metadata standard for geospatial scientific data, which is still part of the foundation of USGS data management. Today, USGS is a lead agency in establishing meaningful and actionable policies that facilitate data release to the greater public scientific community. In recent years, CDI has invested in several legacy data inventory and preservation projects, including the “Legacy Data Inventory” project (also known as “Data Mine,” 2013-present), which examined the time, resources, and workflows needed for science centers to inventory legacy data. Another CDI project, the “North American Bat Data Recovery and Integration” project (2014-present), is preserving previously unavailable bat banding data (1932-1972) and white-nose syndrome disease data and making them available via APIs. Both of these CDI projects were forward-thinking legacy data initiatives, several years ahead of Federal open data policies and mandates.
However, one of the most comprehensive, Bureau-level legacy data preservation efforts was the USGS Data Rescue project, which provided funding, tools, and support to USGS scientists to preserve legacy data sets at imminent risk of permanent loss or damage. A small sample of USGS science data rescued over those eight fiscal years included:
One-of-a-kind, hardcopy Famine Early Warning maps were inventoried, catalogued, indexed, and preserved.
Over 146,000 Landsat orphan scenes were retrieved and processed, allowing the land research community to access previously unavailable satellite records.
Through a partnership with the Alaska State Division of Geological and Geophysical Surveys, the Alaska Water Science Center scanned volcano imagery dating from the 1950s to 2004, added metadata, and included the imagery in a database.
20,000 original, historical streamflow measurements from Kentucky, dating from the early 1900s to the late 1980s, were scanned and entered into NWIS.
The Central Mineral and Environmental Resources Science Center completed the conversion of approximately 250,000 primary geochemical documents from paper to electronic format.
The California Water Science Center migrated paper well schedules and other groundwater records dating back more than 100 years. The records define historical climate variability, geologic conditions where natural hazards occur, and the extents of freshwater resources.
Over 100 projects were supported in the 8 years the Data Rescue project was in operation (2006-2013), while an additional 300 projects went unfunded, providing a glimpse of the potential trove of USGS legacy data at risk of damage or loss. The urgency of, and strategies for, preserving USGS legacy data were discussed at length at the 2014 CSAS&L Data Management Workshop and the 2015 CDI Workshop, further emphasizing a Bureau-wide recognition of the importance of legacy data preservation and release. During the 2015 CDI Workshop, the Data Management Working Group rated legacy data preservation a top FY16 priority, laying the groundwork for this proposal, which applies the legacy data inventory and evaluation methods developed through the CDI Legacy Data Inventory project to formalize and extend the inventory successfully started through the Data Rescue Program. By creating a formal method to submit, document, and evaluate legacy data known to be in need of preservation, USGS would gain a tool that USGS scientists, science centers, and mission areas can use to identify significant historical legacy data to inform new, data-intensive scientific efforts.
Challenges and improvements for USGS legacy data preservation and release
Based on our experiences managing and preserving USGS legacy data, we have seen two challenges that often undermine legacy data preservation and release:
The most scientifically significant legacy data aren’t always the most recoverable: Legacy data are by definition “dated” because some length of time has passed since the data were collected, the project was completed, and recovery efforts began. The more time that has passed, the more likely it is that project staff are unavailable and that supporting project and data documents have been lost. Without this knowledge and documentation, metadata may never be completed, resulting in preserved data that aren’t useable, and useability is a critical element of the USGS data release peer review and approval process. If data are not useable, they are more difficult to release. Critically evaluating legacy data for their “release potential,” not just their scientific significance, increases the likelihood of selecting legacy data that will be successfully released.
Research scientists may not have data science skills, expertise, or resources: Traditionally, legacy data efforts provide funding directly to the data owner, who is generally a principal investigator and knows the data intimately but may lack the data science experience, time, and tools to preserve and release data in an open format with complete, compliant metadata. In our experience, this can lead to delays in preserving and releasing legacy data. Data scientists cannot and should not replace data owners, but they can provide significant assistance to data owners by applying their data and metadata development experience and tools.
We believe that each of these challenges has a good solution that can improve the efficiency and predictability of preservation and release efforts:
Make “potential for successful release” a primary evaluation factor in prioritizing and selecting legacy data for preservation and release. By developing a method of estimating the feasibility and cost of preserving and releasing data and incorporating it into the evaluation and priority criteria, we can better select and prioritize data sets.
Provide funding to a USGS data scientist to collaborate with data owners and ensure preservation and releases are consistently produced and of the highest quality.
Each objective of this proposal will be addressed in a sequence of three phases:
Submission period for the Legacy Data Inventory.
Evaluation and prioritization of the Legacy Data Inventory; selection of data sets for preservation and release.
Preservation and release of selected data sets.
Phase I: Identification and inventory of USGS data at risk
Data owners will document their legacy data sets electronically, providing the primary project and data set metadata elements needed to score, evaluate, and prioritize the legacy data inventory. The core of these metadata elements will be derived from the established “USGS Metadata 20 Questions” form, which has proven effective at gathering metadata from research scientists with little or no data science experience. Narrative fields will be used to evaluate need. Categorical fields will be used to calculate feasibility scores that determine the level of effort required to successfully rescue the proposed data.
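The conversion of categorical fields into a feasibility score could work along the following lines. This is a minimal illustrative sketch only; the field names, categories, and weights below are hypothetical and do not reflect the actual submission form.

```python
# Hypothetical weights mapping each categorical response to a point value.
FEASIBILITY_WEIGHTS = {
    "media_condition": {"good": 3, "degraded": 2, "obsolete": 1},
    "documentation": {"complete": 3, "partial": 2, "none": 1},
    "staff_availability": {"available": 3, "reachable": 2, "unknown": 1},
}

def feasibility_score(submission):
    """Sum the weights of a submission's categorical responses into one score."""
    return sum(FEASIBILITY_WEIGHTS[field][submission[field]]
               for field in FEASIBILITY_WEIGHTS)

# Example submission: well documented, but stored on degraded media.
score = feasibility_score({
    "media_condition": "degraded",
    "documentation": "complete",
    "staff_availability": "available",
})
print(score)  # 2 + 3 + 3 = 8
```

A higher score would indicate lower effort required for a successful rescue; thresholds could then bin submissions into effort categories during evaluation.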
Phase II: Evaluation and prioritization of the USGS data at risk requests
The CDI Data Management Working Group’s Data at Risk sub-group will facilitate the evaluation and prioritization of the legacy data inventory. Mission Areas will be engaged to verify that inventory submissions are supported programmatically and meet mission objectives. The USGS Records Management Program, Enterprise Publishing Program, and Sciencebase will be consulted to verify that submitted legacy data sets can be released within Bureau records management and data release policies. Once these checkpoints have been verified, the Data at Risk sub-group and data scientist will score and prioritize the legacy data inventory based on the following criteria:
Scientific value/significance to USGS mission area and program objectives.
Potential for the data scientist to successfully preserve and release the data.
Severity/Imminence of loss or damage to data based on identified risk factors.
Phase III: Preservation and Release of Select, Priority Legacy Data
Working in order of priority as set in Phase II, the data scientist(s) will collaborate with the data owner to complete the process of preserving and releasing their legacy data. Through this data owner/scientist collaboration, the data scientist will create and validate the FGDC-CSDGM metadata and develop the data set in an open format as documented in the metadata. By process, the data scientist will act as an agent of the data owner, coordinating and completing all steps in each workflow until the IPDS record is approved and disseminated by the Bureau and the Sciencebase data release item(s) are approved, locked, and made public by the Sciencebase team. However, while the data scientist is responsible for ensuring all preservation and release tasks are completed consistently and within policies and best practices, the data owner retains all approval of final metadata attribution (e.g., title, authorship), as well as disposition of their legacy data (e.g., pre/post processing methods; derivative data architectures).
At the completion of Phase III, each legacy data release will have the following created by the data scientist:
complete, compliant FGDC-CSDGM metadata.
legacy data set(s) in an open format, publicly discoverable and available from Sciencebase.
a USGS highlight submitted through the SW Region to Reston.
a CDI update describing the data set(s) released and a summary of time and resources required to complete the release.
The Community for Data Integration (CDI) represents a dynamic community of practice focused on advancing science data and information management and integration capabilities across the U.S. Geological Survey and the CDI community. This annual report describes the various presentations, activities, and outcomes of the CDI monthly forums, working groups, virtual training series, and other CDI-sponsored events in fiscal year 2016. The report also describes the objectives and accomplishments of the 13 CDI-funded projects in fiscal year 2016.
First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful
Muths, E.L., R.D. Scherer, S.M. Amburgey, T. Matthews, A.W. Spencer, and P.S. Corn
In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information-theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.
Infectious disease is an important consideration when contemplating reintroduction of a species to an area from which it has been extirpated and is one risk that has escalated in recent decades as use of large-scale and hands-on conservation measures increases. Reintroduction (in essence, moving animals around) is a management tool considered when populations are failing or extirpations have occurred, yet it is obviously at odds with many of the tenets of disease management. We focus on extirpations attributed to disease and formulate a decision tree to guide managers considering reintroduction. If disease was not the original cause of extinction or decline, it still is important to consider, as inadvertent introduction of disease with reintroduced hosts may cause a reintroduction to fail or may threaten members of the recipient ecological community. If disease was an important agent of extinction or decline, then the disease threat must be addressed before reintroduction is contemplated, or the effort is highly likely to fail. If disease-resistant or -tolerant stock are available, then reintroducing these animals may succeed. If such stock are not available, then it is important to determine whether reservoirs are present, and if they are, to develop strategies to manage disease adequately in the reservoirs. If reservoirs are not present, then the biggest threat to a reintroduction is the presence of still-infected members of the species being reintroduced. We illustrate these principles with two case studies, the boreal toad (Anaxyrus (Bufo) boreas), threatened by the amphibian chytrid fungus (Batrachochytrium dendrobatidis), and the Tasmanian devil (Sarcophilus harrisii), threatened by a transmissible cancer.
A Multiscale Index of Landscape Intactness for the Western United States
Landscape intactness has been defined as a quantifiable estimate of naturalness measured on a gradient of anthropogenic influence. We developed a multiscale index of landscape intactness for the Bureau of Land Management’s (BLM) landscape approach, which requires multiple scales of information to quantify the cumulative effects of land use. The multiscale index of landscape intactness represents a gradient of anthropogenic influence as represented by development levels at two analysis scales.
To create the index, we first mapped the surface disturbance footprint of development for the western U.S. by compiling and combining spatial data for urban development, agriculture, energy and minerals, and transportation for 17 states. All linear features and points were buffered to create a surface disturbance footprint. Buffered footprints and polygonal data were rasterized at 15-meter (m) resolution, aggregated to 30 m, and then combined with the existing 30-m inputs for urban development and cultivated croplands. The footprint area was represented as a proportion of the cell and was summed using a raster calculator. To reduce processing time, the 30-m disturbance footprint was aggregated to 90 m. The 90-m-resolution surface disturbance footprint is retained as a separate raster data set in this data release (Surface Disturbance Footprint from Development for the Western United States). We used a circular moving window to create a terrestrial development index for two scales of analysis, 2.5-kilometer (km) and 20-km, by calculating the percent of the surface disturbance footprint at each scale. The terrestrial development indexes at both the 2.5-km (Terrestrial Development Index for the Western United States: 2.5-km moving window) and 20-km (Terrestrial Development Index for the Western United States: 20-km moving window) scales were retained as separate raster data sets in this data release. The terrestrial development indexes at the two analysis scales were ranked and combined to create the multiscale index of landscape intactness (retained as Landscape Intactness Index for the Western United States) in this data release. To identify intact areas, we focused on terrestrial development index scores less than or equal to 3 percent, which represented relatively low levels of development on multiple-use lands managed by the BLM and other land management agencies.
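The circular moving-window calculation can be sketched as follows. This is a simplified illustration only, not the GIS workflow used for the data release: it assumes the input is already a grid of per-cell disturbance fractions, and the window radius is specified in cells (at 90-m resolution, a 2.5-km window would correspond to a radius of roughly 28 cells).

```python
import numpy as np

def terrestrial_development_index(footprint_fraction, radius_cells):
    """Percent of the surface disturbance footprint within a circular
    moving window, over a grid of per-cell footprint fractions (0-1)."""
    r = radius_cells
    # Boolean disk marking the cells inside the circular window
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    disk = x**2 + y**2 <= r**2
    # Pad edges so every cell gets a full-sized window
    padded = np.pad(footprint_fraction, r, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (2 * r + 1, 2 * r + 1))
    # Mean footprint fraction within the disk, expressed as a percent
    return 100.0 * windows[..., disk].mean(axis=-1)

# A uniformly 3-percent-disturbed grid scores 3 everywhere, matching the
# intactness threshold of 3 percent used in the analysis.
tdi = terrestrial_development_index(np.full((40, 40), 0.03), radius_cells=5)
```

Running the function at two radii (for the 2.5-km and 20-km scales) would yield the two index rasters that are then ranked and combined into the multiscale index.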
The multiscale index of landscape intactness was designed to be flexible, transparent, defensible, and applicable across multiple spatial scales, ecological boundaries, and jurisdictions. To foster transparency and facilitate interpretation, the multiscale index of landscape intactness data release retains four component data sets to enable users to interpret the multiscale index of landscape intactness: the surface disturbance footprint, the terrestrial development index summarized at two scales (2.5-km and 20-km circular moving windows), and the overall landscape intactness index. The multiscale index is a proposed core indicator to quantify landscape integrity for the BLM Assessment, Inventory, and Monitoring program and is intended to be used in conjunction with additional regional- or local-level information not available at national levels (such as invasive species occurrence) necessary to evaluate ecological integrity for the BLM landscape approach.
Adapting to climate change and variability, and their associated impacts, requires integrating scientific information into complex decision making processes. Recognizing this challenge, there have been calls for federal climate change science to be designed and conducted in a way that ensures the research translates into effective decision support. Despite the existence of many decision support tools, however, the factors that influence which decision makers choose to use which decision support tools remain poorly understood. Using the Upper Colorado River Drought Early Warning System as a case study, this research will 1) examine how managers choose between many available tools and 2) consider how tool creators can better align their offerings to decision maker needs.
1. Improve understanding of:
The factors that influence decision makers’ choices to use decision support tools or not, and how they choose between available tools
How scientists creating decision support tools currently interface with decision makers and how their outreach efforts do or do not match information channels preferred by managers
The role that decision support tools play in drought decision making
2. Provide useful information to the National Integrated Drought Information System about the current use of the Upper Colorado River Basin Drought Early Warning System
Study Area and Scope
The Upper Colorado River Basin (UCRB) was one of the first pilot areas, beginning in 2008, for implementation of a regional drought early warning system (DEWS) under the National Integrated Drought Information System (NIDIS), which now supports ten regional DEWS. The selection of the UCRB for a DEWS reflects the regional importance of drought monitoring for managing water supply for agriculture and other uses, and the need for effective decision support related to drought. New drought-information tools have been developed specifically for the UCRB DEWS, and a number of others have been created since 2008, adding to the pre-existing toolkit for drought decision making. The various tools that are now available in the UCRB region can be expected to be more or less suitable for different decision makers’ needs. As a result, the broad decision context of this case study (managing drought) is fixed, but information needs vary. Thus decision makers will make varied choices about which of the available tools to use or not use.
The overall aim is to juxtapose understanding of the tool development process of tool creators with understanding of the choices made by prospective tool users to incorporate (or not) given decision support tools into their drought decision making. Document analysis will provide context and an official view of tool development or agency decision making. Conversations with scientists creating tools and drought decision makers will be used to understand motivations, priorities, concerns, and tacit influences on behavior.
Contrasting evolutionary histories of MHC class I and class II loci in grouse—Effects of selection and gene conversion
Minias, P., Z.W. Bateson, L.A. Whittingham, J.A. Johnson, S.J. Oyler-McCance, and P.O. Dunn
Genes of the major histocompatibility complex (MHC) encode receptor molecules that are responsible for recognition of intracellular and extracellular pathogens (class I and class II genes, respectively) in vertebrates. Given the different roles of class I and II MHC genes, one might expect the strength of selection to differ between these two classes. Different selective pressures may also promote different rates of gene conversion at each class. Despite these predictions, surprisingly few studies have looked at differences between class I and II genes in terms of both selection and gene conversion. Here, we investigated the molecular evolution of MHC class I and II genes in five closely related species of prairie grouse (Centrocercus and Tympanuchus) that possess one class I and two class II loci. We found striking differences in the strength of balancing selection acting on MHC class I versus class II genes. More than half of the putative antigen-binding sites (ABS) of class II were under positive or episodic diversifying selection, compared with only 10% at class I. We also found that gene conversion had a stronger role in shaping the evolution of MHC class II than class I. Overall, the combination of strong positive (balancing) selection and frequent gene conversion has maintained higher diversity of MHC class II than class I in prairie grouse. This is one of the first studies clearly demonstrating that macroevolutionary mechanisms can act differently on genes involved in the immune response against intracellular and extracellular pathogens.
Estimating the economic impacts of ecosystem restoration—Methods and case studies
Cullinane Thomas, Catherine, Christopher Huber, Kristin Skrabis, and Joshua Sidon
Federal investments in ecosystem restoration projects protect Federal trusts, ensure public health and safety, and preserve and enhance essential ecosystem services. These investments also generate business activity and create jobs. It is important for restoration practitioners to be able to quantify the economic impacts of individual restoration projects in order to communicate the contribution of these activities to local and national stakeholders. This report provides a detailed description of the methods used to estimate economic impacts of case study projects and also provides suggestions, lessons learned, and trade-offs between potential analysis methods.
This analysis estimates the economic impacts of a wide variety of ecosystem restoration projects associated with U.S. Department of the Interior (DOI) lands and programs. Specifically, the report provides estimated economic impacts for 21 DOI restoration projects associated with Natural Resource Damage Assessment and Restoration cases and Bureau of Land Management lands. The study indicates that ecosystem restoration projects provide meaningful economic contributions to local economies and to broader regional and national economies, and, based on the case studies, we estimate that between 13 and 32 job-years and between $2.2 and $3.4 million in total economic output are contributed to the U.S. economy for every $1 million invested in ecosystem restoration. These results highlight the magnitude and variability in the economic impacts associated with ecosystem restoration projects and demonstrate how investments in ecosystem restoration support jobs and livelihoods, small businesses, and rural economies. In addition to providing improved information on the economic impacts of restoration, the case studies included with this report highlight DOI restoration efforts and tell personalized stories about each project and the communities that are positively affected by restoration activities. Individual case studies are provided in appendix 1 of this report and are available from an online database at https://www.fort.usgs.gov/economic-impacts-restoration.
Tamarisk beetle (Diorhabda spp.) in the Colorado River basin: Synthesis of an expert panel forum
Bloodworth, Benjamin R.; Shafroth, Patrick B.; Sher, Anna A.; Manners, Rebecca B.; Bean, Daniel W.; Johnson, Matthew J.; Hinojosa-Huerta, Osvel
In 2001, the U.S. Department of Agriculture approved the release of a biological control agent, the tamarisk beetle (Diorhabda spp.), to naturally control tamarisk populations and provide a less costly, and potentially more effective, means of removal compared with mechanical and chemical methods. The invasive plant tamarisk (Tamarix spp.; saltcedar) occupies hundreds of thousands of acres of river floodplains and terraces across the western half of the North American continent. Its abundance varies, but can include dense monocultures, and can alter some physical and ecological processes associated with riparian ecosystems.
The tamarisk beetle now occupies hundreds of miles of rivers throughout the Upper Colorado River Basin (UCRB) and is spreading into the Lower Basin. The efficacy of the beetle is evident, with many areas repeatedly experiencing tamarisk defoliation. While many welcome the beetle as a management tool, others are concerned by the ecosystem implications of widespread defoliation of a dominant woody species. For example, defoliation may affect the nesting success of the endangered southwestern willow flycatcher (Empidonax traillii extimus).
In January 2015, the Tamarisk Coalition convened a panel of experts to discuss and present information on probable ecological trajectories in the face of widespread beetle presence and to consider opportunities for restoration and management of riparian systems in the Colorado River Basin (CRB). An in-depth description of the panel discussion follows.