
Page 1: POOL v1 QA

Massimo Lamanna, Jakub Moscicki
CERN LCG/SPI

10 September 2003

Page 2: Contents

• Regular QA assessment
  – QA reviews for 0.4.0, 1.0.0, 1.1.0, 1.2.0

• The final report
  – Available at http://spi.cern.ch/qa/QAPOOL1.pdf
  – User feedback (ATLAS, CMS)
  – General comments and recommendations

• Conclusions

Page 3: Regular QA assessment

• QA reviews for 0.4.0, 1.0.0, 1.1.0, 1.2.0
  – April to August 2003
  – Guidelines:
    • Select "indicators" of "quality"
      – Examples:
        » Does the code compile?
        » Are there tests for each component?
        » Is the documentation usable?
        » Select the most complex file/method…
    • Improve on uniformity between SPI and POOL
      – LCG compliance
      – Uniformity
  – Reports attached as appendices to the final document

Page 4: Regular QA assessments

• Main observations
  – Enormous emphasis on getting the first fundamental functionalities implemented
    • With good results!
    • But some other activities suffered; examples:
      – Test effort limited in the first months (now much better)
      – Documentation effort limited (the dedicated effort was reduced after a few months…)
  – The group showed good dynamics
    • Personnel changes
    • New people becoming effective within a short time

Page 5: Regular QA assessment (cont'd)

  – It was sometimes difficult to convey the value of having uniformity across all LCG projects
    • Nevertheless, fundamental input and discussions from POOL are acknowledged
  – POOL (like possibly other LCG projects, SPI included) seems to treat the other projects as separate entities
    • My understanding is a bit different…
    • Constructive collaboration should be applied here; the attitude of considering allowed everything which is not forbidden will lead to committee-based development (with the net results everyone can imagine…)

Page 6: POOL in the experiments

• This section is the cornerstone of the report.
  – Since the final judgement on the quality of a package is whether it meets the requirements of its users, the early adopters have been interviewed
  – To provide more material and some independent confirmation, some outstanding open issues (from the Savannah Bug Tracker) were identified and examined in some detail, to ascertain whether they pointed to "normal" bugs or to missing features
  – But who are the POOL users?

Page 7: POOL users

• Core software teams
  – To date, ATLAS and CMS have started the integration in their software frameworks
  – LHCb (as planned) will start in September
  – ALICE does not foresee using POOL
• ATLAS and CMS
  – Two persons each
  – Very expert persons
• Other early adopters?
  – No non-LHC experiments
  – No "final" or "non-expert" users yet

Page 8: CMS interview

• Bill and Vincenzo were contacted. Most of the input came from Bill.
• The experience was globally positive.
  – The CMS developers pointed out that the overall structure is adequate and has the appropriate level of modularity.
  – The documentation is still found to be weak. Although a manual would be highly desirable, CMS used the tests and examples as a replacement for proper documentation.
    • The RTAG document, on the other hand, is too high-level to be helpful as a description of the POOL architecture.
  – The developer we interviewed used Doxygen a little to browse the code (and never LXR); most of the time the CVS browser is used for this purpose.
  – Savannah (and in particular the BugTracker) is found to be an excellent tool.
  – The CMS developers acknowledge POOL's responsiveness in providing explanations, fixes, and workarounds.

Page 9: CMS effort from QA point of view

• The effort was concentrated between mid-June and the end of July.
  – Testified by the very high bug tracker activity
  – The two CMS persons and the full POOL team provided an enormous effort
  – So far, so good

[Plot: POOL BugTracker activity, number of bugs vs. project week]

Page 10: ATLAS interview

• ATLAS has started a similar activity to integrate POOL within ATHENA. Again, two developers form the integration task force (R. D. and Valeri), and they were interviewed.
  – At the time of the interview, ATLAS was somewhat less advanced than CMS, but the developers could provide interesting and complementary information.
  – The integration is less advanced also because of some problems between ATHENA and the SEAL services (mainly connected to the plug-in manager).
  – Most of the problems and instabilities are felt to be normal infancy problems, which are expected to disappear (some instability in some public interfaces, for example).
  – The documentation is probably too limited, and the need for an overview page giving the general vision is advocated.

Page 11: ATLAS interview (2)

• Some concerns have been expressed about the impact the new system (using POOL) will have on the final users. In particular, sharing private files among colleagues could be complicated by the fact that each set of files comes together with a file catalog. This could be confusing for many end users performing essential tasks like prototyping, test-beam analysis, detector debugging or commissioning.
• ATLAS does not use SCRAM as its configuration and build tool.
  – POOL and SEAL are just two more external packages for CMT.
  – ATLAS developers did manage to recompile POOL (using SCRAM) just using the POOL documentation (this was needed for some tests).
• First positive impression, but the first solid confirmation that POOL does satisfy the essential requirements will probably come in a month's time.
  – i.e. basically now…

Page 12: ATLAS effort from QA point of view

• The effort is less concentrated (still going on).
  – Complementary vision and very useful comments to be taken into account
  – Again, so far, so good!

Page 13: Is POOL providing what the experiments want?

• The inspection of the open bugs in the Savannah Bug Tracker confirms the picture obtained from the developer interviews.
  – Please note the great value of channelling all communication between the users and POOL through a single repository
• The open bugs (beginning of August) showed that these issues are mainly plain bugs (5 out of 7)
  – Although sometimes very dangerous (possible corruption of files; one CINT problem), and possibly slowing down the integration effort, they are "just" bugs NOT SUGGESTING MAJOR DEFECTS

Page 14: Is POOL providing what the experiments want?

• One out of seven concerns bad behaviour of a POOL subsystem under specific circumstances: clearing the cache fails when a set of objects reference each other via a circular reference (bug report 1281). POOL managed to provide a mechanism to circumvent the problem; it might even be the final solution (the kind of cycle involved is illustrated in the first sketch below).

• Only one (out of seven) points to missing functionality, namely the fact that it is not possible to selectively load the dictionaries of the objects one would potentially use: the system seems to force the user to load all possible dictionaries before loading a file (bug report 1290; see the second sketch below). This is a serious problem that could possibly be fixed (leveraging some new functionality of SEAL and the necessary support from back-end implementations, e.g. ROOT); a general solution does not seem completely trivial.
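To make the circular-reference problem above concrete, here is a minimal, hypothetical C++ sketch (it does not use POOL's actual cache classes; Node and the cache container are invented for illustration): two cached objects that hold owning references to each other keep each other alive even after the cache drops its own handles, which is exactly the situation in which a simple "clear the cache" step fails to free anything.

    #include <memory>
    #include <vector>

    // Illustrative only: not a POOL class.
    struct Node {
        std::shared_ptr<Node> other;   // owning reference: two Nodes pointing at each
                                       // other form a cycle that is never released
    };

    int main() {
        std::vector<std::shared_ptr<Node>> cache;   // stand-in for an object cache
        auto a = std::make_shared<Node>();
        auto b = std::make_shared<Node>();
        a->other = b;                   // a and b reference each other circularly
        b->other = a;
        cache.push_back(a);
        cache.push_back(b);

        cache.clear();                  // the cache releases its handles...
        a.reset();
        b.reset();                      // ...but a and b still keep each other alive,
                                        // so neither is destroyed: the "clear" had no effect
        return 0;
    }

A workaround of the kind hinted at in the bug report (only a guess on my side, not POOL's documented fix) is to break one direction of the cycle, e.g. by making one of the two references non-owning or by explicitly resetting the links before clearing the cache.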
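For the dictionary-loading issue, the contrast described in bug report 1290 can be sketched as follows (again purely illustrative: DictionaryRegistry and its functions are invented names, not the POOL/SEAL interface): today the client effectively has to load every dictionary up front before opening a file, whereas the desired behaviour would be to resolve a dictionary on demand when an object of that type is first encountered.

    #include <functional>
    #include <map>
    #include <stdexcept>
    #include <string>

    // Invented for illustration: maps a type name to a callable that loads its dictionary.
    using DictLoader = std::function<void()>;

    struct DictionaryRegistry {
        std::map<std::string, DictLoader> loaders;

        // Current situation as reported: everything must be loaded before the file is opened.
        void loadAll() const {
            for (const auto& entry : loaders) entry.second();
        }

        // Desired behaviour: load only the dictionary actually needed, when it is needed
        // (e.g. triggered by a plug-in lookup at read time).
        void loadFor(const std::string& typeName) const {
            auto it = loaders.find(typeName);
            if (it == loaders.end())
                throw std::runtime_error("no dictionary registered for " + typeName);
            it->second();
        }
    };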

Page 15: What did we learn so far?

• Regular reports:
  – Impressive effort over the whole project life so far
  – POOL is getting better and better in terms of adherence to the LCG-AA policies (as maintained by SPI), but some effort is still needed

• Interviews: POOL matches the experiments’ expectations in terms of usability

• BugTracker: POOL matches the experiments’ expectations in terms of functionality

• To be done: the Data Challenges should confirm that POOL matches the experiments' expectations in terms of performance (not only speed!)

Page 16: Recommendations

1. Complete the automation of the test procedures (using the existing SPI tools) and provide initial test coverage.

2. Make available and allocate enough effort to provide the required documentation. This should include an overview description of the full system and a workbook-like set of "getting started" instructions (installation, more examples). The effort can be estimated at ~0.5 FTE.

3. Improve on SPI compliance and keep POOL aligned with SPI policies and usage of tools and services.

4. Make available and allocate enough effort for "support" persons dedicated to creating permanent links with the experiments' core software teams (~1 person) and the production managers (~1 person).

Page 17: Complete the automation of the test procedures (using the existing SPI tools) and provide initial test coverage

• POOL will keep a major fraction of the LHC experimental data
  – The system validation and the associated risk call for a very deep, effective, and continuous test effort
• Multiple sites, multiple versions and multiple platforms require it already at the development stage
• It will be a key factor for distribution and deployment outside CERN (LCG-1 sites and other installations, like development machines)

Page 18: Make available and allocate enough effort to provide the required documentation. This should include an overview description of the full system and a workbook-like set of "getting started" instructions. The effort can be estimated at ~0.5 FTE.

• I do not think it is necessary to elaborate on this self-evident item...

• A plan should be agreed with the experiments, to provide a priority list

• 0.5 FTE is a wild guess, based on previous experience, but it should be provided by a few dedicated persons, not as 5% of everybody in the group!

Page 19: Improve on SPI compliance and keep POOL aligned with SPI policies and usage of tools and services

• Again, self-explanatory
• Please remember that uniformity has great value
  – For developers:
    • There are already many examples of people contributing to different teams
  – For users:
    • A typical experiment person deals with many packages
  – For LCG support:
    • It is difficult to factorize support activities if they are not uniform

Page 20: Make available and allocate enough effort for "support" persons dedicated to creating permanent links with the experiments' core software teams (~1 person) and the production managers (~1 person)

• The project will (very soon) spawn a set of services.

• The embryos of these services should be started now, given the very tight data-challenge schedule ahead

• For this particular item, physical persons (not FTEs) are essential!

Page 21: Conclusions

• The POOL project is definitely on track
• POOL played (and is playing) an important role as a pilot project in LCG-AA
  – Interaction with SPI to define policies and services
    • Such as the CVS structure, tests, documentation, installation, …
  – A start-up "example" for the others
• Encouraging interaction with the experiments
• We look forward to the demonstration of its capabilities and possibilities in the upcoming data challenges.

Page 22: Recommendations

1. Complete the automation of the test procedures (using the existing SPI tools) and provide initial test coverage.

2. Make available and allocate enough effort to provide the required documentation. This should include an overview description of the full system and a workbook-like set of "getting started" instructions (installation, more examples). The effort can be estimated at ~0.5 FTE.

3. Improve on SPI compliance and keep POOL aligned with SPI policies and usage of tools and services.

4. Make available and allocate enough effort for "support" persons dedicated to creating permanent links with the experiments' core software teams (~1 person) and the production managers (~1 person).