PLANCK LFI Data Quality Checking
Data quality checking is a critical part of mission operations and of ensuring the long-term performance of the experiment. The data quality checking
tasks during operations are likely to be divided into the following categories:
Data Quality Checking During Operations
- Instrument Science Performance
  - Does the instrument produce a high-quality raw data stream?
  - Is the instrument within reasonable operating ranges?
  - Are the trends indicating an incipient problem?
  - Are the data being transmitted in proper condition?
  - Are all science channels in nominal mode?
  - Is the on-board software handling the data properly?
- Ground Data Quality
  - Are the data being received properly?
  - Are the data being unpacked and archived properly?
  - Do the data exhibit the appropriate behavior?
  - Does the noise appear to be at the right level?
  - Power spectra of signals?
  - Integration and signal-average performance?
  - Calibration signals at correct times and levels?
  - Planet and other source signals at appropriate levels?
  - Are calibration and ancillary data in place and appropriate?
  - Are all calibrations and baselines at appropriate levels and showing reasonable trends?
  - Is the necessary ancillary and housekeeping data in place?
  - Verification and validation of results
- Data Quality Software Development
  - Does the software repository contain the correct source code and database files?
  - Are data and results transfer and interfaces working properly?
  - Development and testing of new on-board software and processing
  - Verification and validation of data quality checking results
- Verification and Validation of Data Quality Checking
  - This is foremost a management issue but requires additional resources
  - Focus on continued scientific data quality monitoring processes
  - Special software development for comparison and testing of data quality
  - Verification and validation of instrument performance monitoring processes
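Several of the operational checks above (operating-range limits, trend monitoring) reduce to simple numerical tests on housekeeping timestreams. A minimal Python sketch of that idea follows; the channel name, limit table, and units are illustrative placeholders, not actual LFI values, which would come from the instrument database.

```python
import numpy as np

# Hypothetical limit table for housekeeping channels; real limits would
# be loaded from the instrument database, not hard-coded.
LIMITS = {"focal_plane_temp_K": (19.0, 21.0)}

def check_limits(channel, samples, limits=LIMITS):
    """Count samples outside the channel's allowed operating range."""
    lo, hi = limits[channel]
    samples = np.asarray(samples, dtype=float)
    return int(((samples < lo) | (samples > hi)).sum())

def trend_slope(samples, dt=1.0):
    """Least-squares slope (units per sample interval) as a crude
    incipient-problem indicator, to be compared against a per-channel
    threshold set from pre-launch trending data."""
    samples = np.asarray(samples, dtype=float)
    t = np.arange(samples.size) * dt
    slope, _intercept = np.polyfit(t, samples, 1)
    return slope
```

In routine monitoring these tests would run automatically on each telemetry dump, with only excursions and out-of-family slopes escalated for human review.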
For PLANCK LFI, data quality checking will require substantial
preparation and a reservoir of experienced scientists and engineers.
Work will need to be done prior to launch to develop the data quality checking
guidelines, and more effort after launch to refine and further develop
those procedures. We can also anticipate that a significant effort will
be needed to understand the science data quality checking in much greater
detail than the simple filtering, status checking, sieving, and trending done for
routine monitoring. This will require both personnel
with the expertise to understand the detectors and instrument at a deep
level (e.g., thermal performance, sensor behavior) and those who understand
the next level of data processing leading ultimately to the map-making
and scientific results. These tasks are called out roughly in the following list.
Data Quality Checking Preparation and In-Depth Analysis
- Definition and Development of Data Quality Checking
  - Specify and define the required checking
  - Define normal modes and operations and the expected data quality
  - Define nominal operating status, conditions, and data stream
  - Develop and define an appropriate database of comparison, calibration, and trending data from the I&T phase
  - Outline nominal data quality checking processes
  - How do the on-board software and data processing affect data quality?
  - Are there special diagnostic modes of operation or data that can be utilized, if necessary?
  - Specify software development for data quality checking
  - Software and instrument database definition
  - Data formats for input and output
  - Specify instrument performance monitoring results for review and analysis
  - Define software and instrument monitoring results reporting and exchange
  - Outline future special software development for in-flight data quality checking in special modes
  - Development of on-board software and processing for instrument performance monitoring and data quality checking
  - How is software verified, validated, and delivered?
- In-Depth Analysis of Data Quality During Operations
  - Expert personnel and scientists investigate the detailed data quality related to instrument performance
  - Investigation of data taken during special operations or on sky targets
  - Investigation of data taken for calibration, beam mapping, sources, etc., as a test and check of instrument performance
  - Follow-up of anomalous signals and effects
  - Development and testing of new software for data quality checking as a result of operations
  - In-depth reports of data quality checking
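The noise-level and power-spectrum checks called out earlier can be prototyped with a simple periodogram estimate: for a well-behaved radiometer timestream, the high-frequency part of the spectrum should flatten to the white-noise level, away from any 1/f drifts. A hedged sketch follows; the sampling frequency and tolerance used here are illustrative, not LFI specifications.

```python
import numpy as np

def white_noise_level(tod, fsamp):
    """Estimate the white-noise amplitude (signal units per sqrt(Hz))
    from the upper half of the one-sided periodogram, where 1/f
    drifts should be negligible."""
    tod = np.asarray(tod, dtype=float)
    tod = tod - tod.mean()                       # remove DC offset
    psd = 2.0 * np.abs(np.fft.rfft(tod)) ** 2 / (fsamp * tod.size)
    hi = psd[psd.size // 2:]                     # high-frequency half
    return float(np.sqrt(hi.mean()))

def noise_within_tolerance(tod, fsamp, expected, tol=0.2):
    """Pass/fail check: measured white-noise level within a fractional
    tolerance of the expected (pre-launch or trended) level."""
    level = white_noise_level(tod, fsamp)
    return abs(level - expected) / expected <= tol
```

For pure white noise of standard deviation sigma sampled at fsamp, this estimator should return roughly sigma * sqrt(2 / fsamp); a measured level drifting outside tolerance would be flagged for the in-depth follow-up described above.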
However, not all the expertise and knowledge about the instrument
and its various subsystems will reside at the institution with primary
responsibility for data quality checking. So the knowledgeable and responsible
personnel will have to provide support in several ways: in the development of the data
quality checking processes, in running those processes remotely, through verification
and validation of the data quality checking (e.g., review of reports and
results), and through in-depth monitoring of the instrument performance
and long-term health. This is one of the areas in which the US and NASA-supported
staff can be very valuable to the Planck mission.
Tasks and Roles for Scientists in Data Quality Checking
- Definition and development of data quality checking
- Software specification and development for data quality checking
- Provide expertise and support for data quality checking
- Daily shifts reviewing data early in the mission until baselines and performance norms are well established
- Interpretation and review of the results of data quality checks
- Verification and validation of data quality
- Suggest and develop data quality checks, software, and processes based on in-flight results
- In-depth analysis of data quality during operations
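The daily review shifts and results reporting above lend themselves to an automated summary that collects each check's outcome into a single exchangeable record for the reviewing scientist. A minimal sketch follows; the check names and report fields are hypothetical, not an agreed LFI interface.

```python
import datetime

def make_report(checks, day=None):
    """Run a registry of data quality checks (name -> callable returning
    a (passed, detail) pair) and collect the results into one daily
    summary suitable for review, exchange, and archiving."""
    day = day or datetime.date.today().isoformat()
    results = {}
    for name, fn in checks.items():
        passed, detail = fn()
        results[name] = {"passed": passed, "detail": detail}
    return {
        "date": day,
        "all_passed": all(r["passed"] for r in results.values()),
        "checks": results,
    }
```

A report built this way serializes directly to JSON, so the same record can drive the daily shift review and be archived for the verification and validation of the data quality checking itself.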