
Computational Tools and Services

CDI projects tagged with Computational Tools and Services. Computational tools and services include applications, Web services, data discovery tools, models, semantic services and tools, infrastructure, data brokers, and visualization tools.

Total projects: 128

Birds and the Bakken: Integration of oil well, land cover, and species distribution data to inform conservation in areas of energy development

The goal of this project was to develop a novel methodology to combine the USGS Gap Analysis Program (GAP) national land cover and species distribution data with disturbance data to describe and predict how disturbance affects biodiversity. Specifically, the project team presented a case study examining how energy development in the Williston Basin can affect grassland birds; however, the methods...

Data Management Training Clearinghouse

The purpose of the Data Management Training (DMT) Clearinghouse project was twofold. First, the project aimed to increase discoverability and accessibility of the wealth of learning resources that have been developed to inform and train scientists about data management in the Earth sciences. Second, the project team wanted to facilitate the use of these learning resources by providing...

A data management and visualization framework for community vulnerability to hazards

USGS research in the Western Geographic Science Center has produced several geospatial datasets estimating the time required to evacuate on foot from a Cascadia subduction zone earthquake-generated tsunami in the U.S. Pacific Northwest. These data, created as a result of research performed under the Risk and Vulnerability to Natural Hazards project, are useful for emergency managers and community...

Facilitating the USGS Scientific Data Management Foundation by integrating the process into current scientific workflow systems

Increasing attention is being paid to the importance of proper scientific data management and implementing processes that ensure that products being released are properly documented. USGS policies have been established to properly document not only publications, but also the related data and software. This relatively recent expansion of documentation requirements for data and software may...

Development of Recommended Practices and Workflow for Publishing Digital Data through ScienceBase for Dynamic Visualization

The purpose of this project was to document processes for USGS scientists to organize and share data using ScienceBase, and to provide an example interactive mapping application to display those data. Data and maps from Chase and others (2016a, b) were used for the example interactive maps. Principal Investigators: Katherine J. Chase, Andy Bock, Thomas R. Sando.

Hunting Invasive Species with HTCondor: High Throughput Computing for Big Data and Next Generation Sequencing

Large amounts of data are being generated that require hours, days, or even weeks to analyze using traditional computing resources. Innovative solutions must be implemented to analyze the data in a reasonable timeframe. The program HTCondor (https://research.cs.wisc.edu/htcondor/) takes advantage of the processing capacity of individual desktop computers and dedicated computing resources as a...
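HTCondor itself schedules jobs across a pool of machines via submit files, but the core high-throughput pattern it enables — split a large analysis into many independent jobs, run them in parallel, and merge the results — can be sketched in plain Python. The sketch below is illustrative only: the chunk layout and the toy `count_matches` job are assumptions, not part of the project's actual sequencing pipeline, and a thread pool on one machine stands in for HTCondor's distributed pool.

```python
# Sketch of the high-throughput computing pattern: many independent
# jobs, each over its own slice of the data, with results gathered at
# the end. HTCondor would scatter these jobs across idle machines; here
# a local thread pool plays that role.
from concurrent.futures import ThreadPoolExecutor

def count_matches(chunk, motif):
    """Toy 'job': count occurrences of a sequence motif in one chunk."""
    return sum(seq.count(motif) for seq in chunk)

def run_jobs(sequences, motif, n_jobs=4):
    # Split the workload into independent chunks, one per job.
    chunks = [sequences[i::n_jobs] for i in range(n_jobs)]
    with ThreadPoolExecutor(max_workers=n_jobs) as pool:
        results = pool.map(count_matches, chunks, [motif] * n_jobs)
    # Gather step: combine the per-job results.
    return sum(results)

if __name__ == "__main__":
    seqs = ["ACGTACGT", "TTACGTT", "GGGG", "ACGACGT"]
    print(run_jobs(seqs, "ACGT"))  # 4 total motif occurrences
```

Because each job touches only its own chunk, jobs can run on any machine in any order — the property that lets HTCondor soak up idle desktop cycles.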

Evaluating a new open-source, standards-based framework for web portal development in the geosciences

Web portals are one of the principal ways geospatial information can be communicated to the public. A few prominent USGS examples are the Geo Data Portal (http://cida.usgs.gov/gdp/ [URL is accessible with Google Chrome]), EarthExplorer (http://earthexplorer.usgs.gov/), the former Derived Downscaled Climate Projection Portal, the Alaska Portal Map (http://alaska.usgs.gov/portal/), the Coastal...

Integration of Phenological Forecast Maps for Assessment of Biodiversity: An Enterprise Workflow

Recent open data policies of the Office of Science and Technology Policy (OSTP) and Office of Management and Budget (OMB), which were fully enforceable on October 1, 2016, require that federally funded information products (publications, etc.) be made freely available to the public, and that the underlying data on which the conclusions are based must be released. A key and relevant aspect of these...

Crowd-Sourced Earthquake Detections Integrated into Seismic Processing

The goal of this project is to improve the USGS National Earthquake Information Center's (NEIC) earthquake detection capabilities through direct integration of crowd-sourced earthquake detections with traditional, instrument-based seismic processing. During the past 6 years, the NEIC has run a crowd-sourced system, called Tweet Earthquake Dispatch (TED), which rapidly detects earthquakes worldwide...
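TED detects earthquakes by watching for sudden surges in the rate of earthquake-related tweets. The summary above does not describe NEIC's actual detector, so the sketch below is only a hedged illustration of the general idea — comparing a short-term tweet rate against the long-term background rate; the window lengths and the 5x threshold are assumptions, not TED's parameters.

```python
# Illustrative rate-spike detector over per-minute counts of
# earthquake-related tweets. The windows and threshold are made-up
# values for the sketch, not the parameters TED actually uses.

def detect_spikes(counts, short_win=2, long_win=10, threshold=5.0):
    """Return minute indices where the short-term mean rate exceeds
    the preceding long-term background mean by `threshold` times."""
    spikes = []
    for i in range(long_win + short_win - 1, len(counts)):
        recent = sum(counts[i - short_win + 1:i + 1]) / short_win
        # Background window ends before the recent window, so a burst
        # does not inflate its own baseline.
        background = sum(counts[i - short_win - long_win + 1:
                                i - short_win + 1]) / long_win
        # max() guards against division-level noise on a quiet feed.
        if recent > threshold * max(background, 0.1):
            spikes.append(i)
    return spikes

# Quiet chatter, then a burst of tweets after a widely felt earthquake.
minutely = [1, 0, 2, 1, 0, 1, 1, 0, 2, 1, 1, 40, 55, 30]
print(detect_spikes(minutely))  # flags minutes 11-13
```

The appeal of this kind of detection is speed: people tweet within seconds of feeling shaking, often before distant seismometers confirm the event instrumentally.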

Developing a USGS Legacy Data Inventory to Preserve and Release Historical USGS Data

Legacy data (n.): information stored in an old or obsolete format or computer system that is, therefore, difficult to access or process (Business Dictionary, 2016). For over 135 years, the U.S. Geological Survey has collected diverse information about the natural world and how it interacts with society. Much of this legacy information is one-of-a-kind and in danger of being lost forever through...

Making Unmanned Aircraft System (UAS) Data Available to USGS Scientists and the Public

Prior to this project, data acquired from USGS Unmanned Aircraft Systems (UAS) had been provided to requesting scientists but had not been made available to the broader USGS community, the U.S. Department of the Interior (DOI) bureaus, or the public at large. This project performed a pilot study and developed a strategy that is scalable to evolve into a permanent UAS data management...

The 'Digital Grain Size' Web and Mobile-Computing Application

This project team developed a Web-hosted application (that can also be used on mobile platforms) for automatic analysis of images of sediment for grain-size distribution, using the "Digital Grain Size" (DGS) algorithm of Buscombe (2013) ("DGS-Online," 2015). This is a free, browser-based application for accurately estimating the grain-size distribution of sediment in digital images without any...
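The published DGS algorithm (Buscombe, 2013) estimates the grain-size distribution statistically from image texture, without segmenting individual grains. As a much simpler stand-in, the sketch below takes the opposite, more intuitive route — threshold a grayscale image, label connected "grain" regions, and report their equivalent diameters — purely to illustrate what grain sizing from imagery means. The threshold value and the tiny synthetic image are assumptions, and this is explicitly not the DGS method.

```python
# Naive image-based grain sizing: threshold the image, flood-fill each
# connected bright region, and report the diameter of a circle with the
# same area. Illustrative only -- DGS (Buscombe, 2013) instead infers
# the size distribution from texture statistics, with no segmentation.
from collections import deque
from math import sqrt, pi

def grain_diameters(image, threshold=128):
    """Return equivalent diameters (pixels) of connected bright regions."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    diameters = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one grain (4-connectivity).
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Diameter of a circle with the same pixel area.
                diameters.append(2 * sqrt(area / pi))
    return sorted(diameters)

# Tiny synthetic "photo": two bright grains on a dark background.
img = [
    [0,   0, 200, 200, 0, 0],
    [0,   0, 200, 200, 0, 0],
    [0,   0,   0,   0, 0, 0],
    [255, 0,   0,   0, 0, 0],
]
print(grain_diameters(img))
```

Segmentation-based approaches like this sketch break down when grains touch or lighting varies — one reason texture-statistics methods like DGS are attractive for field photographs of sediment.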