Knowing The Telescope Before It Is Built

May 5, 2015

By the end of the decade, the Large Synoptic Survey Telescope (LSST) will begin gazing at the sky, revolutionizing the study of dark energy and of astronomy more broadly. Today, however, scientists are already hard at work learning how to analyze the unprecedented volume and complexity of LSST data, using the Image Simulator.

Simulated image of a galaxy as seen by LSST, progressively including the effects of optical aberrations, optical misalignments, detector imperfections, random atmospheric distortion, distortion due to wind, and pixelization.

By systematically and repeatedly surveying the sky with deep, wide-field images in six optical color bands on a 3-billion-pixel camera over many years, the LSST project will provide unprecedented constraints on dark energy and knowledge of the optical Universe. The observing process will produce nearly 7000 terabytes - that's 7000 trillion bytes - of data per year, an amount more reminiscent of traditional high-energy physics experiments than of astronomical telescopes. Indeed, analyzing the LSST data will require a marriage of techniques from the two fields.
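To put that figure in perspective, a short back-of-the-envelope calculation shows what 7000 terabytes per year means night to night. The pixel count and yearly volume come from the numbers above; the 16-bit pixel depth and year-round nightly observing are illustrative assumptions, not official LSST specifications.

```python
# Back-of-the-envelope check of the LSST data rate quoted above.
# Assumes 16 bits (2 bytes) per raw pixel and observing every night
# of the year -- both illustrative assumptions.

PIXELS_PER_IMAGE = 3e9   # 3-billion-pixel camera (from the text)
BYTES_PER_PIXEL = 2      # assumed 16-bit raw pixel depth
TB = 1e12                # 1 terabyte = 10^12 bytes

image_size_tb = PIXELS_PER_IMAGE * BYTES_PER_PIXEL / TB
print(f"One raw exposure: ~{image_size_tb:.3f} TB")    # ~0.006 TB, i.e. ~6 GB

yearly_tb = 7000.0       # yearly data volume quoted in the text
nightly_tb = yearly_tb / 365
print(f"Nightly volume: ~{nightly_tb:.0f} TB")         # ~19 TB per night

exposures_per_night = nightly_tb * TB / (PIXELS_PER_IMAGE * BYTES_PER_PIXEL)
print(f"Raw-exposure equivalents per night: ~{exposures_per_night:.0f}")  # ~3200
```

The rough answer - on the order of twenty terabytes per night, equivalent to a few thousand raw exposures (the real total also includes processed data products) - is what makes the comparison to high-energy physics experiments apt.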

A collaboration of LSST scientists from several institutions - including KIPAC professor Steve Kahn, postdoctoral researcher Deborah Bard, graduate student Chihway Chang, and staff scientists Kirk Gilmore, Andrew Rasmussen, Marina Shmakova, and Stuart Marshall - is already in the process of understanding realistic mock LSST data with the Image Simulator, or 'ImSim'. The ImSim is like a virtual universe, atmosphere, and LSST telescope that exists entirely in computer bits. It takes a realistic - but 'fake' - sky full of galaxies, quasars, supernovae, stars, and solar system objects, placed according to the latest astronomical knowledge of these realms, and rides with photons of light as they travel through our atmosphere, through the mirrors and lenses of the telescope, and into the pixels of the camera. All the while, the subtle changes and imperfections in the atmosphere, telescope, and camera are represented in a realistic way, based on the current best knowledge of the instrument and site properties.
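The photon-by-photon approach described above can be sketched in a few lines of code. The toy simulation below is a minimal illustration of the idea, not the actual ImSim code: every parameter value (blur widths, pixel scale, photon count) is an invented placeholder, and each physical effect is reduced to a simple random displacement added to every photon.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_galaxy_image(n_photons=100_000, half_light_radius=0.8,
                          atm_sigma=0.6, optics_sigma=0.3,
                          pixel_scale=0.2, npix=32):
    """Trace photons from a circular Gaussian galaxy onto a pixel grid.

    A toy stand-in for the photon-by-photon simulation described in
    the text; all parameter values are illustrative placeholders.
    Distances are in arcseconds.
    """
    # Photons leave the source at positions drawn from the galaxy profile.
    x, y = rng.normal(0.0, half_light_radius, size=(2, n_photons))

    # Atmospheric turbulence: a random per-photon displacement.
    x += rng.normal(0.0, atm_sigma, n_photons)
    y += rng.normal(0.0, atm_sigma, n_photons)

    # Telescope optics: a further blur standing in for aberrations.
    x += rng.normal(0.0, optics_sigma, n_photons)
    y += rng.normal(0.0, optics_sigma, n_photons)

    # Pixelization: bin photon arrival positions into camera pixels.
    edges = (np.arange(npix + 1) - npix / 2) * pixel_scale
    image, _, _ = np.histogram2d(x, y, bins=[edges, edges])
    return image

image = simulate_galaxy_image()
print(image.shape, image.sum())  # (32, 32) grid, photons that landed on-chip
```

In the real simulator each of these one-line blurs is replaced by detailed physics - turbulent atmospheric layers, ray tracing through the actual optical design, and detector effects - but the pipeline structure is the same.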

A case study of why the ImSim is a crucial part of LSST development can be seen in weak lensing analyses. Probing dark energy's imprint on the history of the Universe through the subtle distortions that weak lensing imposes on the shapes of galaxies requires a deep understanding of instrumental and atmospheric effects on object shapes. Chang has led an effort to use the ImSim to study the subtle systematic errors introduced into weak lensing measurements by individual physical effects, such as the changing atmosphere or small optical misalignments in the telescope. This reveals how each effect may affect constraints on dark energy, and informs the development of science analysis algorithms that make the best use of the available data.
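To see how an instrumental effect can leak into a weak lensing measurement, consider a simplified shape estimator built from unweighted second moments - a standard textbook construction, not the LSST pipeline. In the sketch below, a perfectly round source is observed through a blur that is slightly elongated along one axis (as wind shake might produce), and the estimator reports a spurious, non-zero ellipticity; all numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def ellipticity(image, pixel_scale=0.2):
    """Shape estimate from unweighted second moments (textbook version).

    Returns the two ellipticity components (e1, e2) commonly used to
    quantify galaxy shapes in weak lensing.
    """
    npix = image.shape[0]
    coords = (np.arange(npix) - npix / 2 + 0.5) * pixel_scale
    xg, yg = np.meshgrid(coords, coords, indexing="ij")
    flux = image.sum()
    xbar, ybar = (image * xg).sum() / flux, (image * yg).sum() / flux
    qxx = (image * (xg - xbar) ** 2).sum() / flux
    qyy = (image * (yg - ybar) ** 2).sum() / flux
    qxy = (image * (xg - xbar) * (yg - ybar)).sum() / flux
    return (qxx - qyy) / (qxx + qyy), 2 * qxy / (qxx + qyy)

# A round source seen through an anisotropic blur: stronger along x
# than along y, as wind-driven telescope shake might produce.
n = 200_000
x = rng.normal(0, 0.8, n) + rng.normal(0, 0.7, n)  # extra blur along x
y = rng.normal(0, 0.8, n) + rng.normal(0, 0.5, n)  # weaker blur along y
edges = (np.arange(33) - 16) * 0.2
img, _, _ = np.histogram2d(x, y, bins=[edges, edges])

e1, e2 = ellipticity(img)
print(f"spurious e1 = {e1:+.3f}, e2 = {e2:+.3f}")  # e1 > 0: a fake 'shear'
```

The intrinsically round source comes out measurably elongated (e1 of roughly +0.1 for these made-up numbers), which is exactly the kind of false shear signal that ImSim studies help quantify and correct.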

The entire process of generating a simulated sky and following it through to a simulated image requires significant computing resources. KIPAC postdoc Bard, along with affiliate Garret Jernigan, has used SLAC's unique computing infrastructure to generate thousands of simulated LSST sky images, which are then fed to the LSST data management team as input for developing the data analysis algorithms. Active collaboration and interplay between the LSST instrument teams, science groups, and data management group are necessary to provide constant information and feedback, so that evolving instrument properties are reflected in the simulator, which in turn informs the developing science analysis and data management systems.
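Because each simulated visit is independent of the others, this kind of image generation parallelizes naturally across many processors. The sketch below shows that embarrassingly parallel structure in schematic form; simulate_one_visit is a hypothetical stand-in for a full image simulation, not part of any real LSST interface.

```python
from concurrent.futures import ProcessPoolExecutor

import numpy as np

def simulate_one_visit(seed):
    """Hypothetical stand-in for one full image simulation."""
    rng = np.random.default_rng(seed)
    # ... photon tracing, as sketched earlier, would happen here ...
    return rng.integers(1_000_000)  # placeholder for a finished image's ID

if __name__ == "__main__":
    # One independent random seed per visit keeps every run reproducible.
    seeds = range(1000)
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate_one_visit, seeds))
    print(f"simulated {len(results)} visits")
```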

This work is described further in a paper published in the Proceedings of SPIE (SPIE, 2010, vol. 7738).
Science Contact:
Chihway Chang
KIPAC
chihway@stanford.edu