
Preprint typeset in JINST style - HYPER VERSION

A systematic approach to the PLANCK LFI end-to-end test and its application to the DPC Level 1 pipeline∗

M. Frailis^a†, M. Maris^a, A. Zacchei^a, N. Morisset^b, R. Rohlfs^b, M. Meharga^b, P. Binko^b, M. Türler^b, S. Galeotta^a, F. Gasparo^a, E. Franceschi^c, R. C. Butler^c, O. D’Arcangelo^h, S. Fogliani^a, A. Gregorio^f, S.R. Lowe^g, G. Maggio^a, M. Malaspina^c, N. Mandolesi^c, P. Manzato^a, F. Pasian^a, F. Perrotta^d, M. Sandri^c, L. Terenzi^c, M. Tomasi^e, A. Zonca^e

^a Osservatorio Astronomico di Trieste, INAF, Via Tiepolo 11, 34143 Trieste, Italy

^b ISDC, University of Geneva, ch. d’Ecogia 16, 1290 Versoix, Switzerland

^c IASF - Sezione di Bologna, INAF, Via Gobetti, 101, 40129 Bologna, Italy

^d SISSA - ISAS, via Beirut 2-4, 34151 Trieste, Italy

^e Dipartimento di Fisica, Università degli Studi di Milano, Via Celoria 16, 20133 Milano, Italy

^f Dipartimento di Fisica, Università degli Studi di Trieste, Via Valerio 2, 34127 Trieste, Italy

^g Jodrell Bank Centre for Astrophysics, The University of Manchester, Manchester, M60 1QD, U.K.

^h Istituto di Fisica del Plasma, CNR, Via Cozzi 53, Milano, Italy

arXiv:1001.4838v1 [astro-ph.IM] 27 Jan 2010

ABSTRACT: The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment which must strictly adhere to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck/LFI raw data is correct and meets the project requirements.

KEYWORDS: Software Engineering; Data Handling; Space instrumentation; Instruments for CMB observations.

∗Submitted to JINST: 24 June 2009, Accepted: 23 November 2009, Published: 29 December 2009. Reference: 2009 JINST 4 T12021, DOI: 10.1088/1748-0221/4/12/T12021

†Corresponding author, e-mail: [email protected]


Contents

1. Introduction

2. The LFI software verification and validation

3. The LFI acquisition chain and the L1 pipeline

4. End-to-end test objectives and methods
   4.1 Scientific data processing: testing method and setup

5. Data analysis procedures
   5.1 Software
   5.2 On-line and off-line analysis
   5.3 Signal reconstruction
   5.4 PType comparison
   5.5 OBT reconstruction
       5.5.1 Anomalies in the sampling rate
       5.5.2 Packet-Peak correlation
       5.5.3 Period determination
   5.6 ADU conversion
   5.7 Quantization errors
   5.8 Decompression

6. Housekeeping processing validation using a known pattern
   6.1 The HVS system
   6.2 HK tests results

7. Conclusions

1. Introduction

The design and development of the LFI-DPC software follows the guidelines provided by the ESA Software Engineering Standards [1]. The purpose of these standards is to guarantee as much as possible the quality of the software produced, in terms of compliance with the user and software requirements, stability, accuracy, efficiency and commitment to the project schedule.

A fundamental stage in the software life cycle is the software verification and validation. According to the ESA standard [2], “verification” means the act of reviewing, inspecting, testing, checking, auditing, or otherwise establishing and documenting whether items, processes, services


or documents conform to specified requirements. “Validation” is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements. Hence, validation consists of an “end-to-end” verification, where known inputs are provided to the system and, after correct processing, the outputs are analyzed and cross-checked with the expected results.

In this paper we present the approach adopted by the LFI DPC for the end-to-end tests of the LFI Level 1 software, describing the methods, test procedures and results of the validation. The Level 1 (L1) of the LFI processing is formed by three subsystems:

• A Real-time Assessment system (RTA): based on SCOS 2000¹, the ESA generic mission control system and Electrical Ground Segment Equipment (EGSE), it is a system for the real-time monitoring of the housekeeping telemetry, in order to verify the overall health of the instrument and detect possible anomalies. It receives telemetry packets directly from the instrument and provides TCP/IP and CORBA interfaces to communicate with external devices and software.

• A TeleMetry Handler system (TMH): developed by the ISDC (Integral Science Data Center) team on the basis of the user requirements defined by the LFI DPC, the TMH system is the core of the L1 pipeline. It receives in real time scientific (SCI) and housekeeping (HK) telemetry packets from the RTA system, then it decodes their content, reconstructs the time of the SCI and HK samples and creates time series (also called Time Ordered Information, TOIs) for each data source and acquisition mode.

• A Telemetry Quick-Look system (TQL): mainly developed to perform an interactive quick analysis of the LFI data, it provides a set of graphical tools to display the LFI scientific data and the satellite HK data, and calculates quick statistics and fast Fourier transforms.

A more detailed description of the LFI-DPC Level 1 software can be found in [6].

The development and release of the Level 1 software has followed the model philosophy adopted for the development stages of the LFI instrument (and in general for space projects [3]). Each model of the software implemented the additional functionalities necessary to support the tests, calibration and operations of the corresponding model of the instrument. First, a Bread-Board Model of the Level 1 was released in 2002, followed by a Demonstration Model [4]. For the Qualification Model of the instrument, providing a prototype fully representative of the instrument functionality [5], a Qualification Model version of the Level 1 was released in 2005. A Flight Model release followed to support the calibration and integration tests of Planck/LFI. The last Level 1 software model, the Operations Model, integrating all the Planck Science Ground Segment interfaces, was finally released to support the Planck nominal flight operations.

The tests have been focused on the validation of the TMH/TQL system and the customizations applied to the standard distribution of SCOS 2000. The tailoring of SCOS concerns the task handling the reception of the telemetry from an external source, the task sending the telemetry to the TMH and the definition of the Mission Information Base. A first run of the tests has been performed on the Qualification Model of the TMH/TQL. Subsequently, the tests have been repeated on the Flight Model, after solving the bugs identified in the previous release.

¹ www.egos.esa.int/portal/egos-web/products/MCS/SCOS2000/


The main testing scheme used is black-box testing, where known inputs have been provided to the pipeline and the results of the processing have been compared with the expected outputs. To test the scientific telemetry processing, a digital signal generator has been connected directly to the analog gates of the LFI acquisition chain in order to sample a signal with known properties. Metrics have been applied to the output of the TMH/TQL to assess the proper reconstruction of the input signal. For the housekeeping telemetry processing, software has been developed to insert known parameter values into an archive of real LFI HK packets [7].

2. The LFI software verification and validation

Verification & Validation (V&V), within the Planck/LFI DPC context, is to be understood as Scientific Software Verification and Validation. This means that the main subject of V&V is the quality of the scientific product which may be obtained with the software delivered and installed at the DPC. The validation process assures that all scientific software meets its functional and performance requirements according to a given set of validity criteria.

First, a Software Integration, Verification and Validation Plan (SIVVP) has been developed by the LFI DPC to define the basic principles, methods and procedures that shall be followed for the software verification and validation at the acceptance point [8], in order to assure

• the validation of each software block taking into account its functional and performance requirements;

• the verification of each software component at specific points of its life cycle;

• the accuracy of the software against the accuracy stated by the software procedures;

• that the software satisfies all the requirements defined in the corresponding User Requirement Document (URD).

Acceptance testing should require no knowledge of the internal workings of the software. Hence, the general procedure for the verification and validation of the software at the acceptance point is a black-box test. In this type of test, the software is treated as a “black-box” whose internals cannot be seen [11]. The validation process consists of the comparison of the results of each software component with the results expected by the user for a given predefined input dataset, through the application of predefined validity criteria. When an incremental delivery approach is used, acceptance tests only address the user requirements of the new release, since regression tests are performed in system testing. For each test, validity criteria are given as a list of Test Pass/Fail Criteria (TP/FCs).

3. The LFI acquisition chain and the L1 pipeline

A detailed description of the LFI on-board acquisition chain and the LFI L1 processing is given in [6]. In this work, we use a simplified model of the L1 pipeline in order to present the test methods and procedures applied for the end-to-end tests of the L1 software.


Figure 1: Schematic representation of the scientific on-board processing, processing parameters and processing types for the LFI. The diagram shows the sequence of operations leading to each processing type: coadding, mixing, requantization (Requant) and compression (CMP). A detailed description of the operations is given in [6].

In this model, data in the form of time series are generated from various SCI and HK sources (i.e. analog detectors, sensors, registers) on board the spacecraft and read by the Digital Acquisition Electronics (DAE), which samples the analog signals at discrete times. The analog output of the instrument read by each scientific ADC is either a sequence of interleaved sky and reference-load measures, or a sequence of measures of only sky or only reference-load, depending on the status of the phase switches associated with each radiometer of the LFI instrument.

The DAE is polled at regular intervals by the on-board processor, the REBA, which runs a pre-processing pipeline, reducing the amount of scientific data generated by the instrument and storing the data in telemetry packets to be sent to ground. A set of 7 extraction points is defined along the REBA pipeline, in order to recover valuable diagnostic data. Each point corresponds to a Processing Type (PType) applied by the REBA to the scientific data (see figure 1). The REBA software can apply two different PTypes concurrently to a single LFI channel. So, in nominal conditions, the instrument will generate just PType 5 data from all of its 44 detectors and short chunks of PType 1 data from a single detector in turn.

In particular, in PType 5 three main processing steps are applied [6]. First, a double difference is performed between two averaged sky and reference-load samples, $S_{\text{sky}}$ and $S_{\text{load}}$:

$P_1 = S_{\text{sky}} - \text{GMF}_1 \cdot S_{\text{load}}$ (3.1)

$P_2 = S_{\text{sky}} - \text{GMF}_2 \cdot S_{\text{load}}$ (3.2)

where $\text{GMF}_1$ and $\text{GMF}_2$ are two different gain modulation factors. Then, the two values obtained


are requantized, converting them into two 16-bit signed integers:

$Q_i = \text{round}\left[q \cdot (P_i + \text{Offset})\right]$ (3.3)

Finally, the REBA performs a loss-less adaptive arithmetic compression of the data obtained in the previous step. Hence, the REBA pipeline is tunable through a set of processing parameters, or REBA parameters: $N_{\text{aver}}$ (the number of consecutive sky or load samples that are averaged), $q$, Offset, $\text{GMF}_1$ and $\text{GMF}_2$ [6].

The L1 ground software receives the telemetry packets generated by the instrument. In particular, the scientific telemetry is processed by the TMH/TQL system, which has to handle each PType properly in order to regenerate the time series acquired on board. The exact steps of the L1 pipeline processing depend on the data source and the PType of each telemetry packet.

4. End-to-end test objectives and methods

To validate the Demonstration Model of the L1 software, developed before the integration and delivery of the LFI instrument, software was developed to generate simulated HK telemetry [13] and scientific telemetry [14, 15]. The scope of end-to-end (E2E) tests, instead, is to assess the proper coverage of the requirements by the L1 software and to validate its operations by using, as much as possible, the data generated by the testing campaign of the instrument, in particular data gathered from the Qualification Model and the Flight Model of the instrument. The reason is the need to test the software in the most realistic conditions, by performing real instrument operations that allow testing various operative aspects of the pipeline design. The usage of simulated data has been limited to cases in which it was not possible to perform a complete E2E test.

Tests have been ordered according to a functional classification of the requirements, consisting of:

1. Handling of raw TM packets:

(a) communication with SCOS,

(b) storing of raw packets,

(c) proper interpretation of primary and secondary headers,

(d) proper commutation of packets according to their purpose, source, service, size and timestamp,

(e) hexadecimal dump of packet content,

(f) other packet related services;

2. Handling of time information;

3. Decompression, decoding and reconstruction of the scientific packet content;

4. Decoding and reconstruction of the HK packet content;

5. TOI generation;


Figure 2: Hardware setup during the end-to-end tests.

6. Graphical displays;

7. On-the-fly software analysis;

8. Other services.

There are overlaps between some of the classes. In particular, the first 4 classes are strictly connected, allowing a minimization of the number of tests. Classes 6 to 8 can be tested by comparing their output directly with the TOIs rather than with the input signals. Test procedures have been defined to cover one or more test cases of the L1 E2E test plan and formalized into a set of test control sheets.

4.1 Scientific data processing: testing method and setup

Figure 1 suggests a possible testing strategy, i.e. comparison testing. This is based on the comparison of data taken with different PTypes. In the figure, a small gauge is connected to PType 0 and PType 4 data. The gauge represents the L1 software receiving PType 0 and PType 4 data from the same data source (detector), processing and comparing them. Given that the compression/decompression is a completely loss-less process, a correct result requires the PType 4 data, after decompression, to be identical to the corresponding PType 0 data.

More generally, an absolute test is also required to assure that data are properly acquired, processed, received, reconstructed, displayed and stored by the pipeline as a whole. This is obtained by acquiring a signal of well known properties (period, shape, amplitude, voltage levels, duty cycles) at the DAE analog gates, processing it through the whole system and comparing it with the output by using a suitable set of metrics. Depending on the type of test, either identity between input and output or statistical agreement among them has been used as the criterion to assess the success or failure of the test. As a by-product, this kind of testing is able to detect potential problems in the on-board acquisition electronics and software.


Figure 2 represents the complete acquisition chain of LFI during the E2E tests. The signal is injected through an oscilloscope probe at the input of a DAE pin corresponding to the detector to be tested. The signal is generated by a stabilized digital function generator and monitored by a digital oscilloscope connected to the signal generator through a high impedance probe. Monitoring is required to assess that the signal produced by the function generator matches the required characteristics. In some tests, the same signal has been distributed to different inputs by a signal distributor.

To analyze the test data, it would have sufficed to correlate the output signal with the input signal, but in practice it was difficult to record the input signal separately. So, the statistical properties of the output have been compared to the properties of the input measured by the oscilloscope and characterized by the following parameters: the shape, the period $T$, the duration of the high state $T_{\text{up}}$, the duration of the low state $T_{\text{down}}$, the duty cycle $\delta_{\text{cycle}} = T_{\text{up}}/T$, the low state voltage $V_{\text{low}}$, the high state voltage $V_{\text{up}}$, and the peak-to-peak amplitude $A = V_{\text{up}} - V_{\text{low}}$. The source of the signal shall ensure that all of these parameters are known accurately; that the signal has a good S/N (at least 20); that it fits the range of voltages acceptable to the input gate at which it is connected; that it is stable, within the quantization step of the ADC converter; and that $V_{\text{low}}$, $V_{\text{up}}$, $T_{\text{up}}$ and $T_{\text{down}}$ are adjustable within a wide range of values.

For each test, the general procedure has been the following:

i. the signal generator has been plugged into the DAE input corresponding to the channel to be tested;

ii. the signal generator has been set up for the type of signal needed for the specific test. The oscilloscope has been used to check the signal properties;

iii. telecommands have been sent to the DAE to set the proper acquisition setup;

iv. the DAE acquisition has been started keeping the generator off;

v. after about 5 seconds the generator has been switched on for the time required by the test;

vi. the generator has been switched off and, after about 5 seconds, the acquisition has been stopped.

In this procedure, the steps in which the signal generator is switched on and off have been used to allow a semiautomatic identification of the start and end time of each test within the timelines generated by the L1 software. The setup of each test has been recorded by documenting the cabling and the signal generator setup, keeping screenshots of the SCOS displays and archiving the log generated through the TQL, where each test phase has been annotated. The subsequent data analysis has been carried out off-line.
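A minimal sketch of such a semiautomatic identification, assuming the generator-on window can be located by thresholding the deviation from the quiescent level (the threshold, names and parameters are illustrative, not the actual implementation):

```python
import numpy as np

def find_signal_window(toi, noise_sigma, k=5.0, min_run=50):
    """Return the (first, last) sample indices of the generator-on
    region, flagging samples that deviate from the quiescent level
    by more than k times the noise rms."""
    quiescent = np.median(toi)                 # level with generator off
    active = np.abs(toi - quiescent) > k * noise_sigma
    idx = np.flatnonzero(active)
    if idx.size < min_run:                     # too few samples: no test signal
        return None
    return idx[0], idx[-1]
```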

5. Data analysis procedures

5.1 Software

An analysis toolkit, OCA (On-board Computing Analysis), implemented in the Interactive Data Language (IDL) and in C++, with additional off-the-shelf freeware libraries and applications, has been developed to analyze the test data. OCA provides simple analysis methods and automated reporting. The use of this toolkit assures the repeatability of the test analysis.

In particular, OCA allows the processing of the raw telemetry archive generated by the TMH together with the intermediate products of the pipeline and the corresponding log files. It provides functions to: decode the packet content, in particular the On-Board Time (OBT) and the acquisition parameters stored in the tertiary header of the scientific packets; scan the TOI archive and split tests into frames; detect pulses of known shapes within packets or TOIs; identify the source packet from which a given sample was extracted; and perform comparisons between the TOIs and corresponding scientific packets. Moreover, it simulates the conversion of PType 1 data into PType 2 data, to verify the correct processing of these data types [12].

5.2 On-line and off-line analysis

Depending on the scope of the test, data analysis was performed on-line or off-line. On-line analysis has been carried out for all of the live operations of the L1 software. For instance, we tested on-line the TQL function which generates a hexadecimal dump of the packets, by comparing it with the same function performed by SCOS, the function which monitors the REBA parameters currently in use, the graphical displays and the FFT computation. Most of the other functionalities, such as OBT and TOI reconstruction, have been tested off-line.

The first group of tests has covered the basic functionalities. In particular, we checked that all the TM packets are received and stored, that each packet header is properly handled in the packet archive, the proper interpretation of primary, secondary and tertiary headers, and that packets are properly registered, i.e. grouped according to the data source and processing type and converted into TOIs. In this group of tests, the checking of the TOI creation was focused on testing the proper mapping between the TM packets and the raw samples contained in the TOIs.

The test checking for proper registration of scientific packets is illustrated in figure 3. In that test, the acquisition was kept running while the signal generator was connected in turn to each analog input of the DAE. The data analysis looks for erroneous activation of unplugged channels and/or lack of activation of the plugged ones. It is evident that the presence of a signal in one of the TOIs excludes signals in the others (apart from short transients, corresponding to the plugging/unplugging operations) and that there are no holes in the data.

5.3 Signal reconstruction

A more advanced test procedure and analysis was required to test the OBT reconstruction and the conversion to physical units. Ideally, those two elements of the processing should have been tested separately. In practice, time handling is connected to the identification of features in the signal, such as the period between subsequent peaks or the reconstruction of the phase information, which are obviously connected to a proper signal reconstruction.

After having identified the region where the signal generator is “on”, a Lomb-Scargle Periodogram (LSP) is applied to assess the period of the signal as measured by the OBT time scale. The LSP has the advantage, over the FFT, of being robust against perturbations introduced by erroneous inclusions of short no-signal zones or interruptions in the data. The LSP is automatically scanned looking for the peak with the highest power, giving a first approximation of the period.



Figure 3: Simplified scheme of registration for 16 of the 44 LFI channels. Note that the vertical scale is in volts, but each plot is vertically shifted for graphical purposes. For the same reason the timeline is split and shifted back at the level of the vertical red line.

The period is further refined by fitting the LSP peak with a sinc(x) function with two free parameters: the difference $\delta p$ of the peak period with respect to the approximate period, and the peak normalization. The fitting procedure also estimates the error in the two free parameters. The error in $\delta p$ is taken as the error in the period estimate.
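A sketch of the first-approximation step, using the Lomb-Scargle implementation available in SciPy (the sinc-peak refinement is omitted and the period grid is illustrative):

```python
import numpy as np
from scipy.signal import lombscargle

def estimate_period(t_obt, toi, p_min=0.1, p_max=10.0, n_grid=20000):
    """First approximation of the signal period from the LSP; works on
    irregularly sampled OBT, hence robust to gaps in the data."""
    periods = np.linspace(p_min, p_max, n_grid)
    omega = 2.0 * np.pi / periods          # angular frequencies to scan
    power = lombscargle(t_obt, toi - toi.mean(), omega)
    return periods[np.argmax(power)]       # period of the highest peak
```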

The determination of phases and amplitudes depends on the wave shape. For square waves, $V_{\text{low}}$, $V_{\text{high}}$, amplitudes and phases are determined by using a $\chi^2$ fitting combined with a phase-folded diagram. The $\chi^2$ is computed between the model calculated for a given tentative phase, $\Phi$, and the signal. A fine-grained grid of phases is explored and the minimum $\chi^2$ is used to assess the best phase.

The model for the triangular wave assumes that the transition between the growing and the decreasing ramp is instantaneous. The peak-to-peak amplitude for the triangular wave is determined by the moments of the cumulative distribution function of the samples. In particular,

$A = \sigma_{\text{triangular}} \sqrt{12}$. (5.1)

In this way, the determination of the peak-to-peak amplitude is decoupled from the determination of period and phase. The phase is again determined by the $\chi^2$ method used for the square waves.
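Equation (5.1) follows from the fact that an ideal triangular wave spends equal time at every level, so its samples are uniformly distributed over the peak-to-peak range and their standard deviation is $A/\sqrt{12}$. A one-line sketch of the corresponding estimator:

```python
import numpy as np

def triangular_amplitude(toi):
    """Peak-to-peak amplitude of an ideal triangular wave from the
    sample dispersion, eq. (5.1): A = sigma * sqrt(12)."""
    return np.std(toi) * np.sqrt(12.0)
```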



Figure 4: Frame a) Comparison of two different processing types (PType 5 and PType 1) applied to the same data stream. Points are the original data, the yellow line is the ideal relation. Frame b) Processing error introduced by PType 5. The red lines represent the average expected processing error and the ±1 band. Note that the distribution of this error is visibly not Gaussian. Moreover, the effect of the digitization of the data is very tiny, and the distribution of noise in the region where the signal generator is off is not uniform. Frame c) Cross-correlation test between PType 5 and PType 1.

5.4 PType comparison

Since the REBA is able to process data from the same detector with two different processing types in parallel, this functionality was used to perform a comparison between the TOIs of the same signal obtained from two PTypes. An example is given in figure 4, where data acquired with PType 1 have been compared with the same data acquired with PType 5. The example concerns a test where a square wave with a period of 1 second, a peak-to-peak amplitude of about 0.84 V and a duty cycle $\delta_{\text{cycle}} = 0.25$ has been used; phase switching was left off, so each resulting TOI contains only sky samples. The figure covers the analysis performed on data from feed-horn 28, radiometer 0 and detector 0 only. The processing parameters used are: $N_{\text{aver}} = 126$, $\text{GMF}_1 = 1$, $\text{GMF}_2 = -1$, Offset = 0, $q = 1$.

In particular, the first frame in figure 4 shows a correlation plot of PType 5 (COM) versus PType 1 (AVR) data. As can be seen, the correlation is perfectly linear. A more refined test is the evaluation of the processing error (or quantization error) introduced by the PType 5 transformations. It is defined as:

$\varepsilon_q = V_{\text{PType5}} - V_{\text{PType1}}$ (5.2)

where $V_{\text{PType1}}$ and $V_{\text{PType5}}$ are, respectively, the measures obtained acquiring the data at the same time by using both the 1 and 5 processing modes. The error, compared to the expectation from the processing parameters, is plotted in the second frame of figure 4. It shows the effect of the digitization which, in this case, is very tiny. Some structure in the noise is clearly present, especially at the left and right sides where the signal was off. Moreover, the noise does not follow a Gaussian distribution.


An additional test performed is the cross-correlation index, $\mathcal{C}I_{1,5} = \rho_{1,5}(0)$, between PType 1 and PType 5. We evaluated the cross-correlation for a range of time lags. The cross-correlation is calculated by shifting the TOI obtained from PType 5 data with respect to the one obtained from PType 1 data by a lag of $\Delta t$, taking $\Delta t = 0$ as the case of overlapping PType 5 with PType 1.

The results obtained always showed $\rho_{1,5}(\Delta t) \leq \rho_{1,5}(0)$. The third frame of figure 4 shows an example of such a case for one of the tests, where we obtained $1 - \mathcal{C}I_{1,5} = 1.81 \cdot 10^{-8}$.
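A sketch of the lagged cross-correlation used here, assuming two equally long, equally sampled TOIs (names are illustrative):

```python
import numpy as np

def lagged_cross_correlation(toi1, toi5, max_lag=100):
    """Normalized cross-correlation rho_{1,5}(lag) between the PType 1
    and PType 5 timelines; the zero-lag value is the index CI_{1,5},
    expected to be the global maximum."""
    n = min(toi1.size, toi5.size)
    x = (toi1[:n] - toi1[:n].mean()) / toi1[:n].std()
    y = (toi5[:n] - toi5[:n].mean()) / toi5[:n].std()
    lags = np.arange(-max_lag, max_lag + 1)
    rho = np.array([np.mean(x[max(0, -l):n - max(0, l)] *
                            y[max(0, l):n - max(0, -l)]) for l in lags])
    return lags, rho
```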

5.5 OBT reconstruction

Testing for proper handling of time information is critical since time series are matched by comparing samples with the same OBT. In particular, the matching of the OBTs is used to correlate scientific time series with HK time series. OBTs are also used to correlate scientific data acquired by using different PTypes from the same detector and to estimate the quantization error $\varepsilon_q$.

In particular, when testing the OBT reconstruction, we consider the following types of anomalies: gaps between the last sample of a packet and the first sample of the subsequent one; overlaps between the last samples of a packet and the first samples of the subsequent one; the presence of duplicated packets; and possible errors in estimating the time scale (sampling step) and the zero point of the OBT.

The test procedures for the OBT reconstruction consist of:

i. checking whether the sampling rate derived from the OBT is consistent with $N_{\text{aver}}$;

ii. checking whether there are defects (holes) in the OBT;

iii. verifying that holes (if present) in the OBT are just due to acquisition stops;

iv. measuring periods and duty cycles of signals and checking whether they are consistent with the periods imposed by the signal generator;

v. checking that the correlation between the OBT of a packet and the OBT in the corresponding TOIs is correct;

vi. correlating two different PTypes applied to the same signal.

5.5.1 Anomalies in the sampling rate

The sampling period for PType 1, 2, 3, 5 and 6 data shall be

$\tau_{\text{sampling}} = N_{\text{aver}}/8192$ seconds. (5.3)

For any pair of consecutive samples $[i, i+1]$, where $i = 0$ denotes the first sample in the TOI, the OBT should be a monotonically increasing quantity; so, denoting the OBT with $t^{\text{ob}}$ and having

$\tau'_{\text{sampling},i} = t^{\text{ob}}_{i+1} - t^{\text{ob}}_i$ (5.4)

then for any $i$, $\tau'_{\text{sampling},i} \equiv \tau_{\text{sampling}}$. Hence, a simple histogram of $\tau'_{\text{sampling},i}$ allows discovering the presence of anomalies, i.e. cases in which $\tau'_{\text{sampling},i} \neq \tau_{\text{sampling}}$. Table 1 gives a summary of the possible anomalies and their most likely source. Gaps in the timelines are allowed when the acquisition is stopped.


Table 1: Diagnostics of $\tau_{\text{sampling}}$.

Case                                                        | Diagnostic
$\tau'_{\text{sampling},i} \leq 0$                          | repeated packets or reset of the on-board clock
$\tau'_{\text{sampling},i} < \tau_{\text{sampling}}$        | overlapping packets or erroneous decrease in $N_{\text{aver}}$
$\tau'_{\text{sampling},i} = \tau_{\text{sampling}}$        | the nominal condition
$\tau'_{\text{sampling},i} > \tau_{\text{sampling}}$        | gap in the acquisition or erroneous increase in $N_{\text{aver}}$

The test is complemented by cross-checking the OBT in the packet secondary header and the corresponding OBTs in the TOIs.
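A sketch of this histogram-based diagnostic, counting consecutive OBT differences in each of the classes of table 1 (the tolerance is an illustrative choice):

```python
import numpy as np

def classify_obt_steps(t_obt, n_aver, tol=1e-6):
    """Classify consecutive OBT differences against the nominal
    sampling period of eq. (5.3), following the cases of table 1."""
    tau_nom = n_aver / 8192.0                  # eq. (5.3), in seconds
    dt = np.diff(t_obt)
    return {
        "repeated packet / clock reset": int(np.sum(dt <= 0)),
        "overlap / Naver decrease":      int(np.sum((dt > 0) & (dt < tau_nom - tol))),
        "nominal":                       int(np.sum(np.abs(dt - tau_nom) <= tol)),
        "gap / Naver increase":          int(np.sum(dt > tau_nom + tol)),
    }
```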

In an early test, a problem discovered by this method was an inconsistent distribution of $\tau_{\text{sampling}}$ for PTypes 1 and 2 when the phase switch was off: the values were a factor of 2 too large. This occurred because of an incorrect handling of the OBTs by the TMH in this instrument configuration. After fixing the problem, the test was successfully repeated. No defects or unexplained gaps in the OBT have been found when the phase switch was on.

Another example of a problem discovered and corrected by looking at an erroneous OBT reconstruction was the case of an early attempt to carry out acquisition of PType 2 and 5 together. In that case, the TMH did not register the PType 5 data properly, since the data of both PTypes were stored in the same TOI instead of in separate timelines. Since PType 5 packets are generated at a rate of about 1/6 of PType 1, the OBT apparently jumped back about every 6 PType 1 packets.

5.5.2 Packet-Peak correlation

Packet-Peak correlation is a method to assess the use of the OBT as a way to correlate packets to events occurring in different timelines. The Packet-Peak correlation is assessed by measuring the time interval between the first peak contained in the packet, $t^{\text{ob}}_{\text{peak}}$, and the time stamp of the packet, $t^{\text{ob}}_{\text{packet}}$.

For square waves, the peak position is defined as the OBT for which $\text{folding}(t^{\text{ob}}_{\text{peak}}) = \delta_{\text{cycle}}/2$; then a square wave “belongs” to a packet if $t^{\text{ob}}_{\text{packet}} \leq t^{\text{ob}}_{\text{peak}}$.

Then, the packet-to-peak correlation

$\Delta T_{\text{pck-peak}} = t^{\text{ob}}_{\text{peak}} - t^{\text{ob}}_{\text{packet}}$

can be easily predicted when the period of production of packets and the period of the square wave are known. In particular, for uncompressed data, packets are generated with period

$\tau_{\text{pck}} = N_{\text{pck}} N_{\text{aver}} / f_{\text{sampling}}$ seconds, (5.5)

where $N_{\text{pck}}$ is the number of samples in the packet: 490 for any uncompressed PType, except PType 1 for which $N_{\text{pck}} = 245$. The deviation of $\Delta T_{\text{pck-peak}}$ from the predicted value is measured by

$\mathcal{C}T_{\text{pck-peak}} = \left| \Delta T_{\text{pck-peak}} - \Delta T^{\text{Calc}}_{\text{pck-peak}} \right| / \tau_{\text{pck}}$, (5.6)


where $\Delta T^{\text{Calc}}_{\text{pck-peak}}$ is the predicted $\Delta T_{\text{pck-peak}}$. The test fails if $\mathcal{C}T_{\text{pck-peak}} > 1/(10^3 N_{\text{aver}}) \approx 10^{-6}$. In no case has a packet-peak correlation worse than $\mathcal{C}T_{\text{pck-peak}} = 10^{-11}$ been detected.
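A sketch of the pass/fail check of eqs. (5.5)–(5.6); the predicted interval is assumed to be computed elsewhere from the generator settings, and all names are ours:

```python
def packet_peak_test(t_peak, t_packet, dt_calc, n_pck, n_aver,
                     f_sampling=8192.0):
    """Return the deviation of eq. (5.6) and whether the test passes,
    i.e. whether the deviation stays below 1/(1e3 * Naver)."""
    tau_pck = n_pck * n_aver / f_sampling                # eq. (5.5)
    c_t = abs((t_peak - t_packet) - dt_calc) / tau_pck   # eq. (5.6)
    return c_t, c_t <= 1.0 / (1e3 * n_aver)
```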

5.5.3 Period determination

An erroneous reconstruction of the OBT would result in an erroneous estimate of the period of the signal as measured by using the LSP. Given that the OBT of the packets is given in seconds, with 16-bit accuracy for the fractional part, the maximum error in determining the period should be of the order of $10^{-4}$ seconds.

To check for this case we performed various tests with different combinations of signal periods, amplitudes and $N_{\text{aver}}$. Periods in the measured waveforms have been fitted by using the LSP together with phase folding, as explained in Sect. 5.3. Typically, accuracies better than $9 \times 10^{-5}$ s have been obtained.

5.6 ADU conversion

The use of a signal generator allows the conversion from ADU (Analog to Digital Units) to physical units (Volts) to be tested. A linear fit between the fitted $V_{\text{fit}}$ and the generator values $V_{\text{generator}}$,

$V_{\text{fit}} = \text{Intercept} + \text{Slope} \cdot V_{\text{generator}}$, (5.7)

has been performed. Pearson statistics have been used as a way to assess linearity. The fit has been performed comparing either the high levels, the low levels or both. The ideal case would be:

$\text{Intercept} = 0, \quad \text{Slope} = 1, \quad \text{Pearson} = 1.$ (5.8)

Typically, we obtained $|\text{Intercept}| < 5 \times 10^{-3}$, $|1 - \text{Slope}| < 10^{-2}$ and $|1 - \text{Pearson}| < 10^{-4}$.
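A sketch of this linearity check, with the thresholds quoted above, using the standard least-squares regression from SciPy:

```python
from scipy.stats import linregress

def adu_linearity(v_generator, v_fit):
    """Fit eq. (5.7) and check the fitted values against the typical
    bounds reported in the text; rvalue is Pearson's coefficient."""
    res = linregress(v_generator, v_fit)
    ok = (abs(res.intercept) < 5e-3 and
          abs(1.0 - res.slope) < 1e-2 and
          abs(1.0 - res.rvalue) < 1e-4)
    return res.slope, res.intercept, res.rvalue, ok
```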


Figure 5: Frame a) Distribution of quantization errors for sky vs. quantization errors for load for true PType 5 data (black crosses) and the simulated processing (orange crosses). Frame b) The same test performed on another data set after the bug in the on-board software was corrected.


5.7 Quantization errors

Comparing PType 5 and PType 1 data allowed the discovery of anomalies in the quantization error. Tests have shown that the differences between PType 1 and PType 5 are small and compatible with the expected quantization error (which depends on the REBA processing parameters applied). However, in a first run of these tests, it was noted that in some cases the distribution of quantization errors did not follow the expectation exactly, since they were not symmetrically distributed around zero.

A more detailed analysis is shown in figure 5a, where the processing leading to PType 5 generation from PType 1 data has been carried out according to the documented on-board algorithm. The figure shows a scatter plot of sky vs. load processing error. It is possible to see that the expected processing error is not distributed as the real one. After the analysis, it was concluded that the discrepancy was due to a bug in the on-board software in the rounding of negative values in the quantization step. The bug was subsequently corrected and a second run of the tests showed that the quantization error now follows the expectation (figure 5b).
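The paper does not spell out the faulty on-board code; purely as an illustration, a classic way to obtain a quantization error that is asymmetric around zero is a floor-based rounding, which handles positive values correctly but biases negative ones upward:

```python
import math

# Illustrative only: floor(x + 0.5) rounds positive values correctly but
# shifts negative half-integers upward, whereas a sign-symmetric rounding
# treats both signs alike. An asymmetry of this kind is the signature
# visible in figure 5a.
for p in (1.5, 0.5, -0.5, -1.5):
    floor_round = math.floor(p + 0.5)                            # biased
    sym_round = int(math.copysign(math.floor(abs(p) + 0.5), p))  # symmetric
    print(f"{p:+.1f} -> floor-based {floor_round:+d}, symmetric {sym_round:+d}")
```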

5.8 Decompression

PType 2 and PType 5 data were acquired together, and the data from PType 5 processing were compared with the data from PType 2 processing. No differences were revealed between PType 2 and PType 5 data. The compression/decompression procedure passed this test.

6. Housekeeping processing validation using a known pattern

Unlike the case of the scientific telemetry, it was not possible to inject known signals into the HK packets directly through the DAE/REBA chain. The HK parameters are heterogeneous and include temperature sensor values, current consumptions, register values and switch statuses. Different parameters are grouped by hardware unit, sampling frequency and service purpose and stored in the same HK packet.

The structure of the HK telemetry packets strictly follows the rules of the ESA Packet Utilization Standard (PUS). According to the PUS, each field in a telemetry HK packet is either a parameter field containing a parameter value or a structured field containing several parameter fields organized according to a set of rules. The PUS permits the structure of a HK packet to be defined by specifying, for each parameter within the packet, the offset with respect to the packet primary header, the parameter type code and format code, and the offset of the subsequent samples of the parameter.

The HK packet structure definitions are provided in an Interface Control Document (ICD) of the instrument and then mapped into the SCOS Mission Information Base (MIB), a set of ASCII tables containing the characteristics and location within each telemetry packet of all the monitoring parameters, together with the telecommand characteristics. Both the SCOS system and the TMH/TQL system of the LFI-DPC import the packet structure definitions from the MIB tables.

In order to test the decoding and reconstruction of the HK packet content, a Housekeeping Validation System (HVS) has been developed to manipulate the binary representation of an HK packet, with the goal of generating packets with known parameter values starting from a set of real HK packets.


6.1 The HVS system

The HVS system has been developed to test the correct processing of each HK packet type. Packets of the same type have an identical structure and are identified by 4 fields: the Application ID (APID), the service Type and Subtype, contained in the packet primary and secondary headers, and the Structure ID (SID), contained in the first two bytes of the data segment.

For each HK packet type and each parameter within it, the HVS system creates a predefined pattern of values. It iterates over all the samples of the parameter, setting one bit at a time to 1. Hence, for each sample, only a single bit of a single parameter has value 1 while all the other bits have value 0. This implies that, for a given HK packet type, each parameter in turn takes increasing powers of 2 as values. One of the purposes of this pattern is to verify that in the TMH/TQL system the offset and the length of each parameter have been correctly defined.
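The resulting test values for a single parameter are simply the powers of 2 up to its bit width, as in this sketch (the helper name is ours):

```python
def walking_bit_values(n_bits):
    """Walking-bit test values for a parameter of width n_bits: each
    sample sets exactly one bit, so any offset or length error in the
    packet definition shifts or clips the recovered values."""
    return [1 << i for i in range(n_bits)]

# e.g. a 12-bit parameter is exercised with 1, 2, 4, ..., 2048
```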

Figure 6: Class diagram of the core components of the HVS system.

Figure 6 shows the class diagram of the components of the HVS system which build an HK packet structure. The data segment (source data) of a packet starts with the SID field, used to identify the packet type. Then a sequence of samples follows. Each sample is composed of a sequence of different parameters. A parameter is represented by a bit field (the BitField template class) specifying the size in bits of the parameter, its raw value type and the offset of the parameter within the sample. The BitField class inherits from a pure abstract class, the IBitField class, which is used to iterate over all parameters, independently of their concrete type, to set each bit value.
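A compact Python rendition of this design (the original is in C++, where BitField is a template over the raw value type; method names beyond those of figure 6 and the MSB-first bit numbering are our assumptions):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

class IBitField(ABC):
    """Pure abstract interface, mirroring IBitField in figure 6."""
    @abstractmethod
    def set_bit(self, data: bytearray, sample_offset_bits: int, bit: int) -> None:
        ...

@dataclass
class BitField(IBitField):
    """One HK parameter: its size in bits and its bit offset within a
    sample of the packet data segment."""
    bit_size: int
    bit_offset: int

    def set_bit(self, data, sample_offset_bits, bit):
        # Absolute bit position within the data segment, assuming
        # MSB-first bit numbering as in PUS telemetry packets.
        pos = sample_offset_bits + self.bit_offset + bit
        data[pos // 8] |= 0x80 >> (pos % 8)
```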

The HVS system is able to increase the number of packets in the input dataset, introducing new packets with coherent OBT, sample time and source sequence count values, and keeping the proportion of each packet type within the dataset constant. With this feature, we have produced a dataset corresponding to an acquisition lasting 6 days. This was necessary to apply the pattern of values to packets generated at a low frequency (every 64 seconds) and containing a large number of HK parameters.

The dataset generated by the HVS system is then processed by the TMH in order to generate the HK timelines.


Figure 7: Comparison between the HK parameter values in the HVS-generated dataset (black) and the values in the corresponding TOIs (red) for the REBA HK telemetry packet.

For each packet type, the parameter values of the input dataset are compared with the corresponding TOIs generated by the TMH in order to highlight possible discrepancies.

6.2 HK tests results

Tests performed with the HVS system have highlighted an error in the handling of two HK telemetry packet types, the REBA HK packet and the REBA Diagnostic HK packet, which have a common structure. The problem is shown in figure 7 for the REBA HK packet: it presents a plot of the HK parameter values in the dataset generated by the HVS system (black) and the corresponding TOIs generated by the TMH system (red). The parameters are ordered according to their position in the packet (denoted by the number on top of each peak). The “sample” axis is the location in the test sequence where that sample is tested with the given value. The problem with the registration of the HK parameters is evidenced by the anomalous distribution of red points around sample 700. This test proved that there was a discrepancy between the ICD describing the packet structure for LFI and its mapping in the SCOS 2000 MIB [10]. After this analysis, the offsets of the involved parameters were corrected in the MIB tables by the instrument team and successfully checked with a rerun of this test.


7. Conclusions

The Level 1 software system of Planck/LFI, which is a critical component of the Planck science ground segment, has been validated by applying the ESA standards for software verification and validation. The tests have been designed to recreate the most realistic operational conditions, using telemetry directly generated by the instrument whenever possible. An additional effort was needed to develop testing and analysis tools devoted to the end-to-end tests. Some relevant bugs were identified in the first run of the tests, performed on the Qualification Model of the software, and rapidly fixed. The Flight Model of the Level 1 has successfully passed the end-to-end validation tests and has been adopted for the entire LFI calibration test campaign and the subsequent Planck integration tests.

Acknowledgments

Planck is a project of the European Space Agency with instruments funded by ESA member states, and with special contributions from Denmark and NASA (USA). The Planck-LFI project is developed by an International Consortium led by Italy and involving Canada, Finland, Germany, Norway, Spain, Switzerland, UK and USA. The Italian contribution to Planck is supported by the Italian Space Agency (ASI). The US Planck Project is supported by the NASA Science Mission Directorate.

References

[1] ESA Publications, ESA Software Engineering Standards, ESA Procedures, Standards and Specifications, PSS-05-0 (1991), ftp://ftp.estec.esa.nl/pub/wm/wme/bssc/PSS050.pdf.

[2] ESA Publications, Guide to software verification and validation, ESA Procedures, Standards and Specifications, PSS-05-10 (1995), ftp://ftp.estec.esa.nl/pub/wm/wme/bssc/PSS0510.pdf.

[3] ECSS Standard, Space product assurance – Quality assurance, European Cooperation for Space Standardization, ECSS-Q-ST-20C (2008).

[4] F. Pasian et al., Planck/LFI Pipeline – The Demonstration Model, Astron. Data Anal. Soft. Sys. XIV 347 (2005) 609.

[5] A. Mennella et al., Calibration and testing of the Planck-LFI QM instrument, Space Telescopes and Instrumentation I: Optical, Infrared, and Millimeter, Proc. SPIE 6265 (2006) 62650G.

[6] A. Zacchei et al., Level 1 on ground telemetry handling in PLANCK LFI, 2009 JINST 4 T12019.

[7] M. Frailis et al., Level 1 core software testing for Planck/LFI, in Proceedings of CMB and Physics of the Early Universe, April 22-26, 2006, Ischia, Italy, PoS(CMB2006)035.

[8] L. Popa, M. Maris, F. Bottega and F. Pasian, Planck LFI - DPC Software Integration and Testing Plan, Astronomical Observatory of Trieste, Tech. Report, PL-LFI-OAT-PL-006 (2006).

[9] M. Maris, M. Frailis, Planck LFI - Test Plan for the TMH/TQL software, Astronomical Observatory of Trieste, Tech. Report, PL-LFI-OAT-PL-009 (2006).

[10] M. Maris, M. Frailis, Planck LFI - Test Report on the TMH/TQL (QM and FM) by Using a Known Signal Test Data, Astronomical Observatory of Trieste, Tech. Report, PL-LFI-OAT-RP-17 (2008).


[11] I. Sommerville, Software Engineering, Addison Wesley, 8th edition (2006).

[12] M. Maris et al., Optimization of PLANCK/LFI on-board data handling, 2009 JINST 4 T12018.

[13] A. Zacchei et al., HouseKeeping and Science Telemetry: the case of Planck/LFI, Mem. S.A.It. Suppl. 3 (2003) 331, http://sait.oat.ts.astro.it/MSAIS/3/PDF/331.pdf.

[14] M. Maris et al., A scientific Telemetry generator for the Planck/LFI flight simulator, Astronomical Observatory of Trieste, Tech. Report, PL-LFI-OAT-TN-17 (2001).

[15] S. Fogliani, M. Malaspina et al., Planck/LFI: management of telemetry, Mem. S.A.It. 74 (2003) 470, http://sait.oat.ts.astro.it/MSAIt740203/PDF/2003MmSAI..74..470F.pdf.
