Photometry QA

Accessing summary and single-run QA information

The summaryQA tables and plots are served by the Data Archive Server (DAS).
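
As an illustration, a summaryQA file could be retrieved from the DAS with a short script like the one below. This is a minimal sketch only: the directory layout and file-name pattern in the URL are assumptions made for this example, not the actual DAS paths, which are given in the DAS documentation.

    # A minimal sketch of fetching a summaryQA product from the DAS over HTTP.
    # The path and file-name pattern below are assumptions for illustration;
    # consult the DAS documentation for the real layout.
    import urllib.request

    DAS_BASE = "http://das.sdss.org"   # DAS host
    run, rerun = 756, 44               # example imaging run and rerun

    # Hypothetical location of the summaryQA table for this run.
    url = f"{DAS_BASE}/imaging/{run}/{rerun}/summaryQA/summaryQA-{run:06d}.fits"

    with urllib.request.urlopen(url) as resp:
        payload = resp.read()

    with open(f"summaryQA-{run:06d}.fits", "wb") as f:
        f.write(payload)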

Overview of Quality Assurance

To assess the overall quality of the SDSS imaging data, each imaging run is put through a series of tests that check its photometric and astrometric quality. This document describes those tests in detail, in particular the web pages generated to summarize their results. The quality assurance tests have been developed over the course of the survey and have several goals:

  • To flag possible problems in the telescope or imaging camera;
  • To flag possible problems in the data themselves (caused by clouds, poor seeing, etc.);
  • To flag possible problems in the image processing or in the photometric or astrometric calibration;
  • To give quantitative measures of the accuracy of the SDSS photometry and astrometry, and of their errors.

These pages describing the QA output are not meant to be read end-to-end, but rather to serve as a reference for those using the QA outputs to assess the quality of a given set of SDSS data. A summary of the photometric quality of each SDSS image frame is provided; its file contents are described in detail in the summaryQA documentation. A description of the QA mechanisms may also be found in Ivezic et al. (2004).
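
For example, once a summaryQA file is in hand, its per-frame quality table can be inspected in a few lines of Python. This is a sketch only: the HDU index and the QUALITY column name are assumptions for illustration, and the real column layout is given in the summaryQA documentation.

    # A minimal sketch, assuming the per-frame summary is a FITS binary table
    # with one row per frame and a QUALITY column (an assumed column name).
    from collections import Counter
    from astropy.io import fits

    with fits.open("summaryQA-000756.fits") as hdul:
        frames = hdul[1].data                 # first extension holds the table
        counts = Counter(frames["QUALITY"])   # tally frames per quality class

    for quality, n in sorted(counts.items()):
        print(f"quality={quality}: {n} frames")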

Types of imaging QA

Two types of quality assurance are carried out for each SDSS imaging run. The first is produced by each pipeline (astrom, psp, frames, and nfcalib; see the EDR paper for descriptions of these pipelines, and Pier et al. 2003 for the astrometry) to diagnose immediate problems in running that pipeline. The second is summary quality assurance on the outputs, run after the data are completely reduced and calibrated. This final quality assurance comes in several flavors:

  • Quality assurance using internal checks within a given run: single-run QA;
  • Global per-run summaries of quantities derived by single-run QA: summary QA;
  • Quality assurance using the overlap between adjacent runs: overlap QA (the idea is sketched just after this list).
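
To make the last flavor concrete, the sketch below shows the basic idea behind overlap QA: detections of the same objects in two adjacent runs are matched on the sky, and the scatter of their magnitude differences measures the photometric repeatability. This is an illustration under assumed inputs (arrays of coordinates and one band of magnitudes), not the actual QA code.

    # A minimal sketch of the overlap-QA idea: match objects between two
    # adjacent runs and measure the rms of their magnitude differences.
    # The input arrays and the single-band magnitudes are assumptions for
    # illustration, not the actual QA inputs.
    import numpy as np
    import astropy.units as u
    from astropy.coordinates import SkyCoord

    def overlap_repeatability(ra1, dec1, mag1, ra2, dec2, mag2,
                              max_sep_arcsec=1.0):
        """RMS magnitude difference of objects matched between two runs."""
        c1 = SkyCoord(ra=ra1 * u.deg, dec=dec1 * u.deg)
        c2 = SkyCoord(ra=ra2 * u.deg, dec=dec2 * u.deg)
        idx, sep2d, _ = c1.match_to_catalog_sky(c2)
        good = sep2d < max_sep_arcsec * u.arcsec   # keep close matches only
        dmag = mag1[good] - mag2[idx[good]]
        return np.std(dmag)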

With DR4, we make the single-run and summary QA outputs available. Because overlap QA may involve data that are not yet public, we do not publish its outputs, although we may do so in the future. These pages give a detailed description of single-run QA together with an overall assessment of SDSS data quality.


Last modified: Fri Jun 24 16:46:57 CDT 2005