Large Scale Structure
Large scale structure catalogues
The Large Scale Structure catalogs combine the list of cosmological tracer targets (typically galaxies or QSOs) with the results of the spectroscopic data reduction to create data and random catalogs that allow the user to estimate the cosmological tracer density fluctuations at any point within the survey footprint. Weights are assigned to tracers to account for observational imperfections such as a failure of the spectroscopic pipeline to obtain a redshift, fiber collisions that preclude simultaneously assigning spectroscopic fibers to targets closer than 62″, and non-cosmological fluctuations imprinted on the target catalog, such as the correlation between targets and stellar density described in Ross et al. 2011, Ho et al. 2012, and Ross et al. 2012. The random catalogs are designed to randomly sample the survey footprint with a density proportional to the map of the survey completeness. Pairs of data and random catalogues are generated for samples with distinct selection functions.
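As a minimal illustration of how a paired data/random catalog is used (a sketch, not SDSS code), the density fluctuation in a sky cell can be estimated by comparing the weighted data counts to the normalized random counts:

```python
import numpy as np

# Illustrative sketch: estimate the tracer overdensity
# delta = n_data / (alpha * n_random) - 1 in coarse sky cells, where
# alpha = (sum of data weights) / N_random normalizes the random density.
rng = np.random.default_rng(42)

# Hypothetical data and random positions (RA in degrees, 1D for brevity).
ra_data = rng.uniform(0.0, 10.0, 5_000)
w_data = np.ones_like(ra_data)             # per-tracer weights (see text)
ra_rand = rng.uniform(0.0, 10.0, 50_000)   # randoms trace the footprint

edges = np.linspace(0.0, 10.0, 11)         # 10 cells of 1 degree in RA
n_data, _ = np.histogram(ra_data, bins=edges, weights=w_data)
n_rand, _ = np.histogram(ra_rand, bins=edges)

alpha = w_data.sum() / len(ra_rand)        # data-to-random normalization
delta = n_data / (alpha * n_rand) - 1.0    # density fluctuation per cell

print(delta.round(2))
```

With uniform synthetic positions the cell overdensities scatter around zero, as expected; with real catalogs the same ratio carries the clustering signal.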
This page describes the LSS catalogues for BOSS and eBOSS. In BOSS, we generate LSS catalogues for the LOWZ and CMASS samples separately. The LOWZ target class captures objects primarily at z < 0.4, while the CMASS class selects objects at 0.4 < z < 0.8. With DR12 we also provide a combined sample catalogue, where these two samples are optimally combined (details in Reid et al. 2016). In eBOSS, we generate LSS catalogues for the LRGs (described in Bautista et al. 2017), QSOs (described in Ata et al. 2018) and ELGs (coming in DR16) independently. Finally, we continue to generate catalogs in the north galactic cap and south galactic cap separately, for the reasons detailed in Ross et al. 2011, Ho et al. 2012, Ross et al. 2012.
The DR14 eBOSS LSS catalogues for LRGs and QSOs are available on the SAS.
- Data files are described in data_DR14_QSO_NS and data_DR14_LRG_NS
- Random files are described in random_DR14_QSO_NS and random_DR14_LRG_NS.
- Mask files are described in mask_DRX_SAMPLE_NS.
Here X stands for the data release, SAMPLE for QSO or LRG, and NS corresponds to the galactic cap.
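The per-object weights in the eBOSS catalogs are stored in separate columns (WEIGHT_SYSTOT, WEIGHT_CP, WEIGHT_NOZ in the DR14 data models) and are combined multiplicatively in the DR14 analyses. A sketch of that combination, with a hypothetical filename and the column names assumed as above (verify against the data model):

```python
import numpy as np

def eboss_total_weight(w_systot, w_cp, w_noz):
    """Combine eBOSS completeness weights multiplicatively
    (convention of the DR14 LRG/QSO analyses -- check the papers)."""
    return np.asarray(w_systot) * np.asarray(w_cp) * np.asarray(w_noz)

# In practice the columns come from the FITS catalog, e.g. with astropy:
#   from astropy.io import fits
#   with fits.open("eBOSS_QSO_NGC.dat.fits") as hdul:   # hypothetical filename
#       cat = hdul[1].data
#       w_tot = eboss_total_weight(cat["WEIGHT_SYSTOT"],
#                                  cat["WEIGHT_CP"], cat["WEIGHT_NOZ"])

# Demo with made-up values:
w = eboss_total_weight([1.0, 1.1], [1.0, 2.0], [1.0, 1.0])
print(w)  # [1.  2.2]
```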
Mock catalogues described in the two reference papers will be made available in DR16.
To work directly with the catalogs used in BOSS analyses, download the following files directly from the SAS for DR10, DR11, and DR12. Note that the DR12 catalogues are the final BOSS catalogues. The contents of each relevant file are described by a data model:
- Data files are described in galaxy_DRX_SAMPLE_NS
- Random files are described in randomN_DRX_SAMPLE_NS
where X, SAMPLE, and NS denote the data release, the sample, and the galactic cap, respectively.
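The BOSS galaxy catalogs use a different weight combination than eBOSS: per the Reid et al. 2016 convention, the close-pair and redshift-failure weights are combined additively before multiplying by the systematics weight. A hedged sketch (the FKP P0 value of 10^4 (Mpc/h)^3 is assumed from the BOSS galaxy analyses; verify against the papers and data model):

```python
import numpy as np

def boss_total_weight(w_systot, w_cp, w_noz):
    """BOSS galaxy weight combination (Reid et al. 2016 convention):
    w_tot = w_systot * (w_cp + w_noz - 1)."""
    return np.asarray(w_systot) * (np.asarray(w_cp) + np.asarray(w_noz) - 1.0)

def fkp_weight(nz, P0=1.0e4):
    """FKP weight 1/(1 + n(z) * P0); P0 ~ 1e4 (Mpc/h)^3 is the value
    assumed here for the BOSS galaxy samples."""
    return 1.0 / (1.0 + np.asarray(nz) * P0)

# Demo with made-up values: the second object has a close-pair weight of 2
# and a systematics weight of 1.2, giving 1.2 * (2 + 1 - 1) = 2.4.
w_tot = boss_total_weight(w_systot=[1.0, 1.2], w_cp=[1.0, 2.0], w_noz=[1.0, 1.0])
print(w_tot)  # [1.  2.4]
```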
Mock Galaxy Catalogs
We created a set of mock galaxy catalogs with the same survey footprint as the BOSS survey. We use these to validate our methodology and to estimate the covariance matrix associated with our observables, which is necessary to compute the uncertainty on any quantity derived from our galaxy catalogs, such as the BAO scale. We used two different methodologies, QPM and PATCHY, both described in Alam et al. 2017. Galaxy and random catalogs are available for each set of mocks and are described by a data model: mock_galaxy_DRX_SAMPLE_NS_QPM_IDNUMBER.fits.gz. The data model for QPM and PATCHY mocks is the same.
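The covariance estimate from the mocks is the standard sample covariance over the mock measurements. A sketch with stand-in data (the Hartlap correction of the inverse is a common convention when the covariance is estimated from a finite number of mocks, not necessarily the exact treatment in the BOSS papers):

```python
import numpy as np

# Sketch: estimate the covariance of a clustering statistic measured in
# nbins bins from nmocks mock realizations (stand-in random values here).
rng = np.random.default_rng(0)
nmocks, nbins = 1000, 20
xi_mocks = rng.normal(size=(nmocks, nbins))   # one row per mock measurement

mean = xi_mocks.mean(axis=0)
diff = xi_mocks - mean
cov = diff.T @ diff / (nmocks - 1)            # unbiased sample covariance

# The inverse covariance gets the Hartlap (2007) de-biasing factor:
hartlap = (nmocks - nbins - 2) / (nmocks - 1)
inv_cov = hartlap * np.linalg.inv(cov)

print(cov.shape, inv_cov.shape)
```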
More information on the PATCHY mock galaxy catalog is available from its description page on the Skies and Universes website.
LSS catalog creation code [mksample]
The code used by the BOSS galaxy clustering working group to produce the DR10, DR11, and DR12 catalogs is called mksample and is available to download on the SAS. The DR10 and DR11 catalogs were produced using the same algorithm but with different input files. The DR12 catalogs were produced with an updated version of mksample. For details of the DR10/DR11 algorithms, consult Anderson et al. 2014. The DR12 catalogs and updated mksample are described in Reid et al. 2016. The code can be used to create new sets of random catalogs or to generate new catalogs for a subsample of galaxies in the CMASS or LOWZ target classes. The necessary input files and algorithms are described in more detail in the tutorial.
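A toy sketch of the random-catalog step: draw points uniformly on the sphere (RA uniform, sin(Dec) uniform), then keep each point with probability equal to the local survey completeness. This is a simplified stand-in; mksample uses the actual Mangle survey masks, and the rectangular footprint and 90% completeness below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_completeness(ra, dec):
    """Hypothetical completeness map: 90% inside a rectangular patch,
    zero outside (a real run would query the Mangle mask instead)."""
    inside = (ra > 120) & (ra < 240) & (dec > 0) & (dec < 60)
    return np.where(inside, 0.9, 0.0)

n = 200_000
ra = rng.uniform(0.0, 360.0, n)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))  # uniform on sphere
keep = rng.uniform(0.0, 1.0, n) < toy_completeness(ra, dec)
ra_rand, dec_rand = ra[keep], dec[keep]

print(len(ra_rand), "randoms retained")
```

Thinning by completeness is what makes the random density proportional to the completeness map, as described in the first section.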