PLANCK Facility Planning (LFI)

Introduction
Many of the institutional interfaces and facilities are called out in the
Science Implementation Plan (SIP). The details of the facility planning
remain to be worked out through the scoping of the tasks and the
definition of the actual scientific operations of interest to the various
parties.

Simulation Effort for Full Mission Operations
- data product transfer and interfaces in terms of hardware and protocol
- special software for monitoring facility, network, and mission progress and status
- verification and validation of results

Mission Operations:

Mission Planning and Data Receipt
- Geneva will be the facility for data receipt and basic mission planning
- data and results transfer and interfaces
- special software development for comparison to normal commanding modes and operations
- verification and validation of results

Instrument Performance Monitoring
- location and number of scientists involved in instrument performance monitoring
- data and results transfer and interfaces
- special software development
- verification and validation of results

Data Quality Checking
- location and number of scientists involved in data quality checking, and at which levels
- result library and summaries
- software development of increasing sophistication
- verification and validation of results
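The "increasing sophistication" can start very simply. As a hypothetical first-pass quality check (not part of any agreed pipeline; the threshold and MAD-based statistic are assumptions), a robust glitch flag over a chunk of time-ordered data might look like:

```python
import numpy as np

def flag_glitches(tod, threshold=5.0):
    """Flag samples deviating from the chunk median by more than
    `threshold` robust standard deviations (MAD-based)."""
    med = np.median(tod)
    # 1.4826 scales the median absolute deviation to a Gaussian sigma
    sigma = 1.4826 * np.median(np.abs(tod - med))
    return np.abs(tod - med) > threshold * sigma
```

A robust (median-based) statistic is used so that the glitches being searched for do not themselves inflate the noise estimate.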

Merge Data Streams
- first merge at Geneva
- second merge at Trieste (LFI & HFI) and at the corresponding HFI institution
- software and hardware for tracking merged products
- verification and validation of results
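The two merge stages amount to combining time-tagged streams into one time-ordered product. A minimal sketch, assuming each stream arrives as (timestamp, source, value) tuples already sorted in time (the tuple layout is illustrative, not a defined interface):

```python
import heapq

def merge_streams(*streams):
    """Merge several time-tagged streams into one time-ordered list.
    Each stream is an iterable of (timestamp, source, value) tuples,
    each assumed to be individually sorted by timestamp."""
    # heapq.merge performs an n-way merge lazily, without
    # materializing all streams in memory at once
    return list(heapq.merge(*streams, key=lambda sample: sample[0]))
```

The lazy n-way merge matters here because full-mission streams will not fit in memory; a production version would consume and write the merged stream incrementally rather than calling list().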

Calibrated Time Ordered Data
- effort centered at Trieste, but additional scientists at other institutions are required
- tagging of special data for ancillary sets such as beam maps, dipole calibration, etc.
- characterize and remove instrument signatures - development by an external group and transfer to Trieste
- filtering/baseline removal - development by an external group and transfer to Trieste
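As an illustration of the filtering/baseline-removal step, a running-median baseline subtraction is sketched below; this is a stand-in for whatever drift-removal method the external group delivers, and the window length and method are assumptions:

```python
import numpy as np

def remove_baseline(tod, window=201):
    """Subtract a slowly varying baseline estimated by a running
    median (window must be odd so output length matches input)."""
    pad = window // 2
    padded = np.pad(tod, pad, mode="edge")
    # view of every contiguous length-`window` segment (no copy)
    segments = np.lib.stride_tricks.sliding_window_view(padded, window)
    baseline = np.median(segments, axis=-1)
    return tod - baseline
```

A median (rather than mean) baseline is less disturbed by the glitches and point-source crossings the data will contain; the price is that any sky signal varying more slowly than the window is removed along with the drift.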

Generate Ancillary Data Sets
- original work to be done by specialized scientists at their local workplace and transferred to Trieste
- software standards and interfaces to provide modular, tested, and documented software to Trieste
- verification and validation

Map Generation
- data processing at a very demanding level - existing algorithms are insufficient
- development of new, satisfactory algorithms
- implementation in software, testing, and transfer to Trieste
- verify software to generate maps and the error covariance matrix
- verification and validation of map-making products
- separation of signals, e.g. CMB, foregrounds, etc., requires extra software, databases, and perhaps facilities
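For purely white noise, the least-squares map reduces to per-pixel averaging of the time-ordered data, with a diagonal error covariance; the Planck algorithms must go well beyond this baseline (destriping, correlated-noise maximum likelihood), which is exactly why new development is needed. A sketch of the trivial white-noise case (the flat pixel indexing and noise model are assumptions):

```python
import numpy as np

def bin_map(pixels, tod, npix, sigma=1.0):
    """Naive white-noise map-maker: the map is the per-pixel mean of
    the time-ordered data, with variance sigma^2 / hits per pixel."""
    hits = np.bincount(pixels, minlength=npix)
    sums = np.bincount(pixels, weights=tod, minlength=npix)
    seen = hits > 0
    sky_map = np.full(npix, np.nan)      # unobserved pixels stay NaN
    sky_map[seen] = sums[seen] / hits[seen]
    var = np.full(npix, np.inf)          # unobserved: infinite variance
    var[seen] = sigma**2 / hits[seen]
    return sky_map, var
```

With correlated (1/f) noise the covariance is no longer diagonal and the simple average is biased by stripes along the scan direction; that generalization is where the real algorithmic effort lies.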

Maximum Likelihood Estimation of the Power Spectrum
- data processing at a very demanding level - existing algorithms are insufficient
- development of new, satisfactory algorithms
- implementation in software, testing, and transfer to Trieste
- verify software to find the power spectrum
- verification and validation of power spectrum and covariance products
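For orientation, the textbook full-sky, noiseless Gaussian likelihood that any ML power spectrum estimator generalizes can be written down directly; the sketch below evaluates -2 ln L for a trial spectrum. This is the idealized form only - the operational estimator must additionally handle sky cuts, noise, and beams:

```python
import numpy as np

def neg2_log_like(cl_model, cl_hat, ells):
    """Full-sky, noiseless -2 ln L for a trial spectrum `cl_model`
    given the observed spectrum `cl_hat`:
        sum_l (2l + 1) * [ln C_l + Chat_l / C_l]
    (up to a model-independent constant); minimized at C_l = Chat_l."""
    return np.sum((2 * ells + 1) * (np.log(cl_model) + cl_hat / cl_model))
```

The (2l + 1) weight counts the independent m-modes per multipole; for realistic sky coverage the modes couple and evaluating the exact likelihood becomes the computationally demanding step the section refers to.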

Test Cosmological Models and Assumptions
- e.g. Gaussian versus non-Gaussian, goodness of fit, ...

Cosmological Parameter Estimation
- data processing at a very demanding level - existing algorithms are insufficient
- development of new, satisfactory algorithms
- implementation in software, testing, and transfer to Trieste
- verify software to estimate cosmological parameters
- verification and validation of the estimated cosmological parameters and covariance products
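Conceptually, parameter estimation compares model power spectra to the measured one through a likelihood or chi-squared statistic. A brute-force one-dimensional grid sketch is given below; the model function, covariance, and grid are all placeholders for the real machinery, which must search a many-dimensional parameter space:

```python
import numpy as np

def grid_estimate(param_grid, model, data, inv_cov):
    """Brute-force chi^2 over a 1-D parameter grid.  `model(p)` maps a
    parameter value to a predicted data vector; returns the grid point
    minimizing chi^2 together with the full chi^2 curve."""
    chi2 = np.array([(data - model(p)) @ inv_cov @ (data - model(p))
                     for p in param_grid])
    return param_grid[np.argmin(chi2)], chi2
```

Keeping the full chi^2 curve (not just the minimum) is deliberate: the curvature around the minimum is what yields the parameter covariance products called for above.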

CMB Polarization Analysis
- data processing at a very demanding level - existing algorithms are insufficient
- development of new, satisfactory algorithms
- implementation in software, testing, and transfer to Trieste
- verify software to generate polarization maps or cross-correlation and correlation functions and the error covariance matrix
- verification and validation of polarization map making or cross-correlation with temperature products
- separation of signals, e.g. CMB, foregrounds, etc., requires extra software, databases, and perhaps facilities
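At the core of polarization map-making is recovering each pixel's Stokes parameters from samples taken at different polarization angles, d = I + Q cos 2psi + U sin 2psi. A per-pixel least-squares sketch follows; the noiseless, single-pixel setup is purely illustrative:

```python
import numpy as np

def solve_iqu(psi, data):
    """Least-squares Stokes I, Q, U for one pixel from samples
    d = I + Q*cos(2*psi) + U*sin(2*psi) at polarization angles psi."""
    design = np.column_stack([np.ones_like(psi),
                              np.cos(2 * psi),
                              np.sin(2 * psi)])
    iqu, *_ = np.linalg.lstsq(design, data, rcond=None)
    return iqu
```

The conditioning of this 3x3 system depends on the spread of angles at which a pixel is revisited, which ties polarization map quality directly to the scanning strategy.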

Additional Scientific Goals & Processing
- data processing at a very demanding level - goals not sufficiently well-defined at present
- development of approaches, algorithms, and expected databases
- verify software to generate catalogs, sky maps, etc.
- verification and validation of products
- separation of signals, e.g. CMB, foregrounds, etc., requires extra software, databases, and perhaps facilities