Data Processing Pipeline



Maricopa Agricultural Center, Arizona

Automated controlled-environment phenotyping, Missouri

At two points in the processing pipeline, metadata derived from the collected data is inserted into BETYdb:

  • At the start of the transfer process, metadata collected and derived during the Danforth Center's initial processing will be pushed to BETYdb.

  • After transfer to NCSA, extractors running in Clowder will derive further metadata and push it to BETYdb as well. This is a subset of the metadata also stored in Clowder's database. The complete metadata definitions are still being determined, but will likely include the fields listed below (a sketch of such a push follows the list):

    • plant identifiers

    • experiment and experimenter

    • plant age, date, growth medium, and treatment

    • camera metadata
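The exact submission mechanism is not specified here. The sketch below shows one way an extractor might assemble these fields and post them to a BETYdb instance; the endpoint path, the payload shape, and the BETYDB_URL / BETYDB_KEY settings are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch: push plant-level metadata derived by a Clowder extractor
# into BETYdb. Endpoint, payload fields, and environment variables are
# illustrative assumptions, not the actual TERRA-REF implementation.
import os
import requests

BETYDB_URL = os.environ.get("BETYDB_URL", "https://terraref.ncsa.illinois.edu/bety")
BETYDB_KEY = os.environ["BETYDB_KEY"]  # API key issued by the BETYdb instance


def push_metadata(metadata: dict) -> None:
    """Submit one derived-metadata record to BETYdb (hypothetical endpoint)."""
    payload = {
        "plant_id": metadata["plant_id"],          # plant identifier
        "experiment": metadata["experiment"],      # experiment name
        "experimenter": metadata["experimenter"],  # responsible researcher
        "plant_age": metadata["plant_age"],        # e.g. days after planting
        "date": metadata["date"],
        "growth_medium": metadata["growth_medium"],
        "treatment": metadata["treatment"],
        "camera": metadata.get("camera", {}),      # camera metadata block
    }
    resp = requests.post(
        f"{BETYDB_URL}/api/v1/records",            # assumed endpoint path
        params={"key": BETYDB_KEY},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
```

In practice, a call like this would run once per transferred dataset, after the extractor has parsed the incoming Danforth metadata.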

Kansas State University

HudsonAlpha - Genomics