
  • The LHC Computing Project

    Common Solutions for the LHC
    ACAT 2002

    Presented by Matthias Kasemann, FNAL and CERN

  • Outline
    The LCG Project: goal and organization
    Common solutions:
      Why common solutions
      How to …
      The Run 2 common projects
    The LCG Project: status of planning
      Results of the LCG workshop in March 02
      Planning in the Applications Area

    For the LCG Grid see: Les Robertson (Thursday), "The LHC Computing Grid Project - Creating a Global Virtual Computing Center for Particle Physics"

  • From Raw Data to Physics: what happens during analysis
    [Figure: the analysis chain from basic physics (e+e- -> Z0 -> f fbar) through fragmentation and decay, interaction with detector material, and detector response, and back via applied calibration/alignment, pattern recognition and particle identification to physics analysis results. Simulation (Monte Carlo) runs along the chain in one direction; reconstruction and analysis run in the other. Typical per-event data sizes along the chain: 250 kB, 1 MB, 100 kB, 25 kB, 5 kB, 500 B.]

  • HEP analysis chain: common to LHC experiments

  • Developing Software for LHC experiments
    Challenges in big collaborations:
      Long and careful planning process
      More formal procedure required to commit resources
      Long lifetime: need flexible solutions which allow for change
      Any state of the experiment lasts longer than a typical Ph.D. or postdoc appointment
      Need for professional IT participation and support
      New development, maintenance and support model required
    Challenges in smaller collaborations:
      Limited resources
      Adapt and implement available solutions (b-b-s)

  • CMS - CCS schedule (V33): the bottom line
    Milestones about a year away: delays of ~9 months
    Milestones a few years away: delays of ~15 months

  • CMS - CCS Software Baseline: L2 milestones
    DDD ready for OSCAR, ORCA, IGUANA
      Data model defined
      Persistent and transient representations
      Demonstrably as correct as the existing CMS description
    Switch from Geant3 to Geant4: date not decided (just my estimate)
      E.g. it needs the new persistency
    Software infrastructure deployed and working
    User analysis components
      Framework with coherent user interface
      Event display / interactive visualisation
      Tools for browsing / manipulating data sets
      Data presentation: histograms, numerical, …
    Framework for processing CMS data
      Working for simulation, reconstruction, analysis
      Supporting persistency and data management
    Strongly dependent on LCG success
    CCS baseline software for the TDRs

  • The LHC Computing Grid Project (LCG)
    Work areas:
      Applications Support & Coordination
      Computing Systems
      Grid Technology
      Grid Deployment
      Common Solutions
    Experiments and regional centres agree on requirements for common projects
    LCG was approved in fall 2001; resources contributed from some member states
    1st workshop in March 02

  • LCG - Fundamental goal:
    The experiments have to get the best, most reliable and accurate physics results from the data provided by their detectors

    Their computing projects are fundamental to the achievement of this goal

    The LCG project at CERN was set up to help them all in this task

    Corollary: the success of LCG is fundamental to the success of LHC computing

  • Fulfilling LCG Project Goals
    Prepare and deploy the LHC computing environment
      Applications - provide the common components, tools and infrastructure for the physics application software
      Computing system - fabric, grid, global analysis system
      Deployment - foster collaboration and coherence
      Not just another grid technology project
    Validate the software by participating in Data Challenges using the progressively more complex Grid Prototype
      Phase 1 - 50% model production grid in 2004
    Produce a TDR for the full system to be built in Phase 2
      Software performance impacts the size and cost of the production facility
      Analysis models impact the exploitation of the production grid
    Maintain opportunities for reuse of deliverables outside the LHC experimental programme

  • Applications Activity Areas
    Application software infrastructure
      Physics software development environment, standard libraries, development tools
    Common frameworks for simulation and analysis
      Development and integration of toolkits & components
    Support for physics applications
      Development and support of common software tools & frameworks
      Adaptation of physics applications to the Grid environment
    Object persistency and data management tools
      Event data, metadata, conditions data, analysis objects, …

  • Goals for Applications Area
    Many software production teams:
      LHC experiments
      CERN IT groups, ROOT team, …
      HEP software collaborations: CLHEP, Geant4, …
      External software: python, Qt, XML, …
    Strive to work together to develop and use software in common
      Will involve identifying and packaging existing HEP software for reuse, as well as developing new components
    Each unit has its own approach to design and to supporting the development
      Sharing in the development and deployment of software will be greatly facilitated if units follow a common approach
    Recognise that there will be start-up costs associated with adapting to new common products and development tools

  • Why common, and when?
    Why not:
      Experiments have independent detectors and analysis tools to verify physics results
      Competition for the best physics results
      Coordination of common software development is significant overhead
    Why common solutions:
      Need mature, engineered software
      Resources are scarce, in particular manpower
      Effort: common projects are a good way to become more efficient
      Lessons need to be learnt from past experience
    For the LHC experiments: everything non-experiment-specific is a potential candidate for a common project

  • FNAL: CDF/D0/CD - Run 2 Joint Project Organization
    [Organization chart: Directorate, CDF and D0 Collaborations, R2JOP Steering Committee, Task Coordinators, Run II Committee, Run II Computing Project Office, External Review Committee. Joint project areas and projects: Basic Infrastructure (Fermilab Class Library, Configuration Management, Support Databases, Networking hardware), Mass Storage & Data Access (Storage Management, Serial Media Working Group, MSS Hardware, Data Access), Reconstruction Systems (Reconstruction farm hardware, Production Management, Reconstruction input pipeline, Simulation), Physics Analysis Support (Physics analysis hardware, Physics Analysis Software, Visualization).]
    15 joint projects defined, 4 years before the start of data taking

  • Perceptions of Common Projects
    Experiments:
      While they may be very enthusiastic about long-term advantages …
      … they have to deliver on short-term milestones
      Devoting resources to both will be difficult
      They already experience an out-flux of effort into common projects
      Hosting projects in experiments is an excellent way of integrating effort
        For the initial phase and prototyping

    Technology groups:
      Great motivation to use expertise to produce useful solutions
      Need the involvement of the experiments

  • Common solutions - how to do it?
    Requirements are set by the experiments in the SC2 + Requirements Technical Assessment Groups (RTAGs)
    Planning and implementation is done by the LCG together with the experiments
    Monitoring of progress and adherence by the SC2

    Frequent releases and testing
    Guaranteed lifetime maintenance and support
    Issues:
      How will the applications area cooperate with the other areas?
      It is not feasible to have a single LCG architect covering all areas
      Need mechanisms to bring coherence to the project

  • Workflow around the organisation chart
    [Diagram: the SC2 gives a mandate to an RTAG, which returns requirements after ~2 months; the SC2 passes prioritised requirements to the PEB, which returns a project plan and receives workplan feedback; the PEB then drives the work packages (WPs) through Release 1, a status report with review feedback, and Release 2 in ~4-month cycles, with an updated workplan going back to the SC2.]
    SC2 sets the requirements, approves the workplan, and reviews the status
    PEB develops the workplan, manages LCG resources, and tracks progress

  • Issues related to partitioning the work
    How do you go from the present to the future without dismantling existing projects?
    Have to be careful that we don't partition the work into chunks so small that we lose the coherence of the overall software
    We are not starting afresh; we have a good knowledge of what the broad categories are going to be
    Experiment architectures help to ensure coherency

  • Coherent Architecture
    Applications common projects must follow a coherent overall architecture
    The software needs to be broken down into manageable pieces, i.e. down to the component level
    Component-based, but not a bag of disjoint components
      Components designed for interoperability through clean interfaces
      Does not preclude a common implementation foundation, such as ROOT, for different components
    The contract in the architecture is to respect the interfaces
      No hidden communication among components
    The starting point is existing products, not a clean slate
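
    To make the interface contract concrete, here is a minimal C++ sketch (all names are invented for illustration; this is not LCG code): clients depend only on a pure abstract interface, so implementations can be swapped without any hidden coupling.

      #include <iostream>
      #include <memory>
      #include <string>

      // Pure abstract interface: the only contract a client may rely on.
      class IDetectorDescription {
      public:
          virtual ~IDetectorDescription() = default;
          virtual double materialDensity(const std::string& volume) const = 0;
      };

      // One possible implementation; it could be backed by ROOT, XML, a DB, ...
      class SimpleDescription : public IDetectorDescription {
      public:
          double materialDensity(const std::string&) const override { return 2.33; }
      };

      // A component in another domain (e.g. simulation) sees only the
      // interface: no hidden communication, implementations are swappable.
      void simulateStep(const IDetectorDescription& dd) {
          std::cout << "density = " << dd.materialDensity("tracker") << "\n";
      }

      int main() {
          std::unique_ptr<IDetectorDescription> dd =
              std::make_unique<SimpleDescription>();
          simulateStep(*dd);
      }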

  • Approach to making the workplan
    Develop a global workplan from which the RTAGs can be derived
    Considerations for the workplan:
      Experiment need and priority
      Is it suitable for a common project?
      Is it a key component of the architecture, e.g. the object dictionary?
      Timing: when will the conditions be right to initiate a common project?
        Do established solutions exist in the experiments? Are they open to review, or are they entrenched?
      Availability of resources and allocation of effort
        Is there existing effort which would be better spent doing something else?
      Availability and maturity of associated third-party software, e.g. grid software
    Pragmatism and seizing opportunity: a workplan derived from a grand design does not fit the reality of this project

  • RTAG: blueprint of the LCG application architecture
    Mandate: define the architectural blueprint for LCG applications:
      Define the main architectural domains (collaborating frameworks) of the LHC experiments and identify their principal components. (For example: simulation is such an architectural domain; detector description is a component which figures in several domains.)
      Define the architectural relationships between these frameworks and components, including Grid aspects, identify the main requirements for their inter-communication, and suggest possible first implementations. (The focus here is on the architecture of how the major domains fit together, not on the detailed architecture within a domain.)
      Identify the high-level milestones for each domain and provide a first estimate of the effort needed. (Here the architecture within a domain could be considered.)
      Derive a set of requirements for the LCG
    Time-scale: started in June 02, draft report in July, final report in August 02

  • RTAG status
    Requirements Technical Assessments (RTAGs) identified and started so far:
    In the application software area:
      Data persistency - finished
      Software support process and tools - finished
      Mathematical libraries - finished
      Detector geometry & materials description - started
      Blueprint architecture of applications - started
      Monte Carlo event generators - started
    In the compute fabric area:
      Mass storage requirements - finished
    In the Grid technology and deployment area:
      Grid technology use cases - finished
      Regional centre category and services definition - finished

  • Software Process RTAG
    Mandate: define a process for managing LCG software. Specific tasks include:
      Establish a structure for organizing software, for managing versions and coherent subsets for distribution
      Identify external software packages to be supported
      Identify recommended tools for use within the project, including configuration and release management
      Estimate the resources (person power) needed to run an LCG support activity
    Guidance: procedures and tools
      Will be specified
      Will be used within the project
      Can be packaged and supported for general use
      Will evolve with time
    The RTAG does not make any recommendations on how experiment-internal software should be developed and managed. However, if an experiment-specific program becomes an LCG product, it should adhere to the development practices proposed by this RTAG

  • Process RTAG Recommendations (1)
    All LCG projects must adopt the same set of tools, standards and procedures. The tools must be centrally installed, maintained and supported.
    Adopt commonly used open-source or commercial software where available. Try to avoid "do it yourself" solutions in areas where we don't have core competency.
    Concerning commercial software: avoid commercial software that has to be installed on individual machines, as this will cause well-known problems with license agreements and management in our widely distributed environment. Commercial solutions for web portals or other centrally managed services would be fine.

  • Process RTAG Recommendations (2)
    "Release early, release often" implies:
      Major releases 2-3 times per year
      Development releases every 2-3 weeks
      Automated nightly builds, regression tests, benchmarks (a minimal example follows below)
      Test and quality assurance
    Support of external software installation and build-up of local expertise
    Effort needed for filling support roles:
      Librarian
      Release manager
      Toolsmith
      Quality assurance
      Technical writer
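
    As a minimal illustration of the automated regression tests called for above (the tested function and reference value are invented; any real suite would be far larger):

      #include <cassert>
      #include <cmath>
      #include <cstdio>

      // Toy physics function under test.
      double invariantMass(double e, double px, double py, double pz) {
          return std::sqrt(e * e - px * px - py * py - pz * pz);
      }

      int main() {
          // A nightly build would run many such checks against reference
          // values recorded from a previous release and report any drift.
          const double m = invariantMass(10.0, 3.0, 4.0, 5.0);
          assert(std::fabs(m - std::sqrt(50.0)) < 1e-12);
          std::puts("regression test passed");
          return 0;
      }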

  • Persistency RTAG
    Mandate:
      Write the product specification for the Persistency Framework for Physics Applications at LHC
      Construct a component breakdown for the management of all types of LHC data
      Identify the responsibilities of experiment frameworks, existing products (such as ROOT) and yet-to-be-developed products
      Develop requirements/use cases to specify (at least) the metadata/navigation component(s)
      Estimate the resources (manpower) needed to prototype missing components
    Guidance:
      The RTAG may decide to address all types of data, or may postpone some topics to other RTAGs once the components have been identified
      The RTAG should develop a detailed description at least for event data management
      Issues of schema evolution, dictionary construction and storage, and object and data models should be addressed

  • Persistency - near-term recommendations
    Develop a common object streaming layer and associated persistence infrastructure:
      A common object streaming layer based on ROOT-IO, plus several related components to support it, including a (currently lightweight) relational database layer (illustrated below)
      Dictionary services are included in the near-term project specification; dictionary services may have additional clients
    This is the first step towards a complete data management environment, one with enormous potential for commonality among the experiments
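
    Since the streaming layer is specified to be based on ROOT-IO, a minimal sketch of ROOT object streaming may help fix ideas. TFile, WriteObject and GetObject are standard ROOT calls; the Event class and file name are invented, and a real build needs a ROOT dictionary for Event (e.g. generated with rootcling).

      #include <TFile.h>
      #include <vector>

      struct Event {                    // needs a ROOT dictionary in real use
          int run = 1;
          std::vector<float> hits;
      };

      int main() {
          Event ev;
          ev.hits = {1.f, 2.f, 3.f};

          TFile out("events.root", "RECREATE");
          out.WriteObject(&ev, "event0");   // stream the object to the file
          out.Close();

          TFile in("events.root", "READ");
          Event* back = nullptr;
          in.GetObject("event0", back);     // read it back via the dictionary
          return back ? 0 : 1;
      }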

  • RTAG: math library review
    Mandate: review the current situation with math libraries and make recommendations
      Review the current usage of the various math libraries in the experiments (including, but not limited to, NAG C++, GSL, CLHEP, ROOT)
      Identify and recommend which ones should be adopted and which ones could be discontinued
      Suggest possible improvements to the existing ones
      Estimate the resources needed for this activity
    Guidance: the result of the RTAG should allow a clear program of work to be established to streamline the status of math libraries and find the maximum commonality between experiments, taking into account cost, maintenance and the projected evolution of the experiments' needs

  • Math Library: Recommendations
    Set up a support group to provide advice and information about the use of existing libraries, to assure their continued availability, to identify where new functionality is needed, and to develop that functionality itself or by coordinating with other HEP-specific library developers. The goal would be to have close contact with the experiments and provide expertise on mathematical methods, aiming at common solutions.
    The experiments should maintain a database of the mathematical libraries used in their software and, within each library, the individual modules used.
    A detailed study should be undertaken to determine whether there is any functionality needed by the experiments and available in the NAG library which is not covered equally well by a free library such as GSL (see the sketch below).
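
    As a flavour of what such a functionality comparison covers, here is a minimal example of evaluating a special function with the free GSL library; gsl_sf_bessel_J0 is a real GSL call, while the surrounding program is invented.

      #include <gsl/gsl_sf_bessel.h>
      #include <cstdio>

      int main() {
          double x = 5.0;
          double j0 = gsl_sf_bessel_J0(x);   // Bessel function of the first kind
          std::printf("J0(%.1f) = %.12f\n", x, j0);
          return 0;
      }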

  • RTAG: Detector Geometry & Materials Description
    Write the product specification for detector geometry and materials description services
    Specify scope: e.g. services to define, provide transient access to, and store the geometry and materials descriptions required by simulation, reconstruction, analysis, online and event display applications, with the various descriptions using the same information source (a toy sketch follows below)
    Identify requirements, including end-user needs such as ease and naturalness of use of the description tools, readability, and robustness against errors, e.g. provision for named constants and derived quantities
    Explore commonality of persistence requirements with conditions data management
      Interaction of the detector description with a conditions DB; in that context, versioning and configuration management of the detector description, coherence issues
    Identify where experiments have differing requirements and examine how to address them within common tools
    Address migration from current tools
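
    Purely to illustrate the "same information source" and "named constants" requirements, here is a toy C++ sketch; every name in it is hypothetical, and no existing geometry tool is implied.

      #include <map>
      #include <stdexcept>
      #include <string>

      class GeometrySource {
      public:
          void defineVolume(const std::string& name, double halfLengthCm) {
              volumes_[name] = halfLengthCm;
          }
          double halfLength(const std::string& name) const {
              auto it = volumes_.find(name);
              if (it == volumes_.end())
                  throw std::runtime_error("unknown volume: " + name);
              return it->second;   // named lookup guards against silent typos
          }
      private:
          std::map<std::string, double> volumes_;
      };

      int main() {
          GeometrySource geo;                  // filled once, e.g. from XML or a DB
          geo.defineVolume("barrel", 290.0);
          double forSim  = geo.halfLength("barrel");   // simulation client
          double forDisp = geo.halfLength("barrel");   // event display client
          return (forSim == forDisp) ? 0 : 1;          // both see one source
      }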

  • RTAG: Monte Carlo Event Generators
    Mandate: to best explore the common solutions needed, and how to engage the HEP community external to the LCG, it is proposed to study:
      How to maintain a common code repository for the generator code and related tools such as PDFLIB
      The development or adaptation of generator-related tools (e.g. HepMC) for LHC needs
      How to provide support for the tuning, evaluation and maintenance of the generators
      The integration of the Monte Carlo generators into the experimental software frameworks
      The structure of possible forums to facilitate interaction with the distributed external groups who provide the Monte Carlo generators

  • Possible Organisation of Activities
    [Diagram: activity areas sit under overall management, coordination, architecture, integration and support (an architect role); each activity area contains projects, each with a project leader, and each project is broken down into work packages (WPs).]
    Example activity area: physics data management
      Possible projects: hybrid event store, conditions DB, …
      Work packages: the component breakdown and work plan lead to work package definitions, ~1-3 FTEs per WP

  • Global Workplan - 1st priority level
    Establish process and infrastructure
      Nicely covered by the software process RTAG
    Address core areas essential to building a coherent architecture
      Object dictionary - essential piece
      Persistency - strategic
      Interactive frameworks - also driven by assigning personnel optimally
    Address priority common-project opportunities
      Driven by a combination of experiment need, appropriateness to a common project, and the right moment (existing but not entrenched solutions in some experiments):
        Detector description and geometry model
      Driven by need and available manpower:
        Simulation tools

  • Global Workplan - 2nd priority level
    Build outward from the core top-priority components
      Conditions database
      Statistical analysis
      Framework services, class libraries
    Address common-project areas of less immediate priority
      Math libraries
      Physics packages (scope?)
    Extend and elaborate the support infrastructure
      Software testing and distribution

  • Global Workplan - 3rd priority level
    The core components have been addressed, the architecture and component breakdown laid out, and work begun. Grid products have had another year to develop and mature. Now explicitly address the integration of physics applications into the grid applications layer:
      Distributed production systems - end-to-end grid application/framework for production
      Distributed analysis interfaces - grid-aware analysis environment and grid-enabled tools
    Some common software components are now available; build on them:
      Lightweight persistency, based on the persistency framework
      Release of the LCG benchmarking suite

  • Global Workplan - 4th priority level
    Longer-term items waiting for their moment
      Hard ones, perhaps made easier by a growing common software architecture:
        Event processing framework
      Address the evolution of how we write software:
        OO language usage
      Longer-term needs; capabilities emerging from R&D (more speculative):
        Advanced grid tools, online notebooks, …

  • Candidate RTAGs (1)

    Simulation tools - non-physics activity
    Detector description, model - description tools, geometry model
    Conditions database - if necessary after the existing RTAG
    Data dictionary - key need for a common service
    Interactive frameworks - what do we want, have, need
    Statistical analysis - tools, interfaces, integration
    Visualization - tools, interfaces, integration
    Physics packages - important area, but scope unclear
    Framework services - if a common framework is too optimistic
    C++ class libraries - standard foundation libraries

  • Candidate RTAGs (2)

    Event processing framework - hard, long term
    Distributed analysis - application layer over the grid
    Distributed production - application layer over the grid
    Small-scale persistency - simple persistency tools
    Software testing - may be covered by the process RTAG
    Software distribution - from a central Program Library to convenient broad distribution
    OO language usage - C++, Java (…?) roles in the future
    Benchmarking suite - comprehensive suite for LCG software
    Online notebooks - long term; low priority

  • Common Solutions: Conclusions
    Common solutions for LHC software are required for success
      Common solutions are agreed upon by the experiments
      The requirements are set by the experiments
      The development is done jointly by the LCG project and the LHC experiments
      All LCG software is centrally supported and maintained
    What makes us believe that we will succeed? What is key to success?
      The process in the LCG organization
      The collaboration between players
      Common technology
      Central resources, jointly steerable by experiments and management
      Participants have prototyping experience!

  • Backup & Additional slides

  • Post-RTAG Participation of Architects - Draft Proposal (1)
    Monthly open meeting (expanded weekly meeting)
      Accumulated issues to be taken up with architects
      Architects in attendance; coordinators invited
      Information has gone out beforehand, so architects are primed
      Meeting is informational, and decision-making (for the easier decisions)
    An issue is either:
      Resolved (the easy ones), or
      Flagged for addressing in the architects committee

  • Post-RTAG Participation of Architects - Draft Proposal (2)
    Architects committee:
      Members: experiment architects + applications manager (chair)
      Invited: computing coordinators, LCG project manager and CTO
      Others invited at the discretion of members, e.g. the project leader of the project at issue
      Meets shortly after the open meeting (also bi-weekly?)
    Decides the difficult issues
      Most of the time, the committee will converge on a decision
      If not, try harder; if still not, the applications manager takes the decision
      Such decisions can be accepted or challenged
      Challenged decisions go to the full PEB, then if necessary to the SC2
        PEB role of raising issues to be taken up by the SC2
      We all abide happily by an SC2 decision
    Committee meetings also cover general current issues and exchange of views
    Committee decisions and actions are documented in public minutes

  • Distributed Character of Components (1)
    Persistency framework
      Naming based on logical filenames
      Replica catalog and management
      Cost estimators; policy modules
    Conditions database
      Inherently distributed (but configurable for local use)
    Interactive frameworks
      Grid-aware environment; transparent access to grid-enabled tools and services
    Statistical analysis, visualization
      Integral parts of the distributed analysis environment
    Framework services
      Grid-aware message and error reporting, error handling, grid-related framework services

  • Distributed Character of Components (2)
    Event processing framework
      Cf. framework services, persistency framework, interactive frameworks
    Distributed analysis
    Distributed production
    Software distribution
      Should use the grid
    OO language usage
      Distributed computing considerations
    Online notebook
      Grid-aware tool

  • RTAG?: Simulation tools
    Geant4 is establishing a HEP physics requirements body within the collaboration, accepted by the SC2 as a mechanism for addressing G4 physics performance issues
    However, there are important simulation needs to which LCG resources could be applied in the near term
      By the design of the LCG, this requires the SC2 delivering requirements to the PEB
    John Apostolakis has recently assembled G4 requests and requirements from the LHC collaborations
    Proposal: use these requirements as the groundwork for a quick 1-month RTAG to guide near-term simulation activity in the project, leaving the addressing of physics performance requirements to the separate process within Geant4

  • RTAG?: Simulation tools (2)
    Some possible activity areas in simulation, from the Geant4 requests/requirements received from the experiments, which would be input to the RTAG:
      Error propagation tool for reconstruction (GEANE)
      Assembly and documentation of standard physics lists
      Python interface
      Documentation, tutorials, communication
      Geant4 CVS server access issues

    The RTAG could also address FLUKA support
      Requested by ALICE as an immediate priority
      Strong interest expressed by other experiments as well

  • RTAG?: Detector geometry & materials description and modeling services
    Write the product specification for detector geometry and materials description and modeling services
    Specify scope: e.g. services to define, provide transient access to, and store the geometry and materials descriptions required by simulation, reconstruction, analysis, online and event display applications, with the various descriptions using the same information source
    Identify requirements, including end-user needs such as ease and naturalness of use of the description tools, readability, and robustness against errors, e.g. provision for named constants and derived quantities
    Explore commonality of persistence requirements with conditions data management
    Identify where experiments have differing requirements and examine how to address them within common tools
    Address migration from current tools

  • RTAG?: Conditions database
    Will depend on the outcome of the persistency RTAG
    Refine the requirements and product specification of a conditions database serving the needs of the LHC experiments, using the existing requirements and products as a reference point
      Give due consideration to effective distributed/remote usage
    Identify the extent to which the persistency framework (hybrid store) can be directly used at the lower levels of a conditions database implementation
    Identify the component(s) and interfaces atop a common persistency foundation that complete the conditions database (one possible shape is sketched below)
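
    To fix ideas, here is a toy C++ sketch of the interval-of-validity lookup such a component provides; the interface and all names are hypothetical, not a proposal for the actual product.

      #include <map>
      #include <optional>
      #include <utility>

      struct Calibration { double pedestal; };

      class ConditionsDB {
      public:
          // Store a payload valid for times in [since, until).
          void store(long since, long until, const Calibration& c) {
              iovs_[since] = {until, c};
          }
          // Find the payload whose interval of validity contains t, if any.
          std::optional<Calibration> get(long t) const {
              auto it = iovs_.upper_bound(t);
              if (it == iovs_.begin()) return std::nullopt;
              --it;                                // latest interval starting <= t
              if (t < it->second.first) return it->second.second;
              return std::nullopt;
          }
      private:
          std::map<long, std::pair<long, Calibration>> iovs_;
      };

      int main() {
          ConditionsDB db;
          db.store(0, 1000, {3.2});
          db.store(1000, 2000, {3.5});
          return db.get(1500) ? 0 : 1;   // finds the second calibration
      }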

  • RTAG?: Data dictionary service
    Can the experiments converge on common data definition and dictionary tools in the near term?
    Even if the answer is no, it should be possible to establish a standard dictionary service (generic API) by which common tools can interact, while leaving the experiments free to choose how their class models are defined and implemented (a sketch follows below)
    Develop a product specification for a generic high-level data dictionary service able to accommodate distinct data definition and dictionary tools and present a common, generic interface to the dictionary
    Review the current data definition and dictionary approaches and seek to expand commonality among the experiments. Write the product specifications for common (even if N…
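
    A toy C++ sketch of what such a generic dictionary API could look like: common tools (streamers, browsers) see only abstract type and member descriptions, however each experiment defines its class models. All types and names here are invented.

      #include <cstddef>
      #include <cstdio>
      #include <string>
      #include <vector>

      struct MemberInfo {
          std::string name;
          std::string typeName;
          std::size_t offset;        // byte offset within the object
      };

      class TypeInfo {
      public:
          TypeInfo(std::string name, std::vector<MemberInfo> members)
              : name_(std::move(name)), members_(std::move(members)) {}
          const std::string& name() const { return name_; }
          const std::vector<MemberInfo>& members() const { return members_; }
      private:
          std::string name_;
          std::vector<MemberInfo> members_;
      };

      // A generic client works for any registered type without knowing
      // its C++ definition.
      void describe(const TypeInfo& t) {
          std::printf("class %s\n", t.name().c_str());
          for (const auto& m : t.members())
              std::printf("  %s %s @%zu\n", m.typeName.c_str(),
                          m.name.c_str(), m.offset);
      }

      int main() {
          TypeInfo track("Track", {{"pt", "double", 0}, {"charge", "int", 8}});
          describe(track);
      }
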
  • RTAG?: Interactive frameworks
    Frameworks providing interactivity for various environments, including physics analysis and event processing control (simulation and reconstruction), are critical
      They serve end users directly and must match end-user requirements extremely well
      They can be a powerful and flexible glue in a modular environment, providing interconnectivity between widely distinct components and making the whole offered by such an environment much greater than the sum of its parts
    Develop the requirements for an interactive framework common across the various application environments
    Relate the requirements to existing tools and approaches (e.g. ROOT/CINT, Python-based tools)
    Write a product specification, with specific recommendations on tools and technologies to employ
    Address both command-line and GUI interactivity

  • RTAG?: Statistical analysis interfaces & tools
    Address requirements on analysis tools
      What data analysis services and tools are required
      What is and is not provided by existing tools
    Address which existing tools should be supported and what further development is needed
      Including long-term maintenance issues
    Address the role of abstract interfaces to statistical analysis services
      Are they to be used? If so, what tools should be interfaced to a common abstract interface to meet LHC needs (and how, when, etc.) - see the sketch below
    Address requirements and approaches to persistency and data interchange
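
    To illustrate what an abstract interface to statistical analysis services means in practice, here is a toy C++ sketch in the spirit of such interfaces; the names and the trivial backend are invented.

      #include <cstdio>
      #include <vector>

      class IHistogram1D {                   // abstract interface
      public:
          virtual ~IHistogram1D() = default;
          virtual void fill(double x, double weight = 1.0) = 0;
          virtual double binContent(int i) const = 0;
      };

      class SimpleHistogram : public IHistogram1D {   // one possible backend
      public:
          SimpleHistogram(int nbins, double lo, double hi)
              : bins_(nbins, 0.0), lo_(lo), hi_(hi) {}
          void fill(double x, double w) override {
              if (x < lo_ || x >= hi_) return;        // ignore over/underflow
              int i = static_cast<int>((x - lo_) / (hi_ - lo_) * bins_.size());
              bins_[i] += w;
          }
          double binContent(int i) const override { return bins_[i]; }
      private:
          std::vector<double> bins_;
          double lo_, hi_;
      };

      int main() {
          SimpleHistogram impl(10, 0.0, 100.0);
          IHistogram1D& h = impl;         // user code sees only the interface
          h.fill(12.5);
          h.fill(15.0, 2.0);
          std::printf("bin 1 = %.1f\n", h.binContent(1));   // prints 3.0
      }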

  • RTAG?: Detector and event visualization
    Examine the range of tools available and identify those which should be developed as common components within the LCG applications architecture
    Address requirements, recommendations and needed/desired implementations in such areas as:
      Existing and planned standard interfaces and their applicability
      GUI integration
      Interactivity requirements (picking)
      Interface to visualizing objects (e.g. a Draw() method)
      Use of standard 3D graphics libraries
    Very dependent on other RTAG outcomes

  • RTAG?: Physics packages
    Needs and requirements in event generators and their interfaces & persistency, particle property services, …
    The scope of the LCG in this area needs to be made clearer before a well-defined candidate RTAG can be developed

  • RTAG?: Framework services
    While converging on a common event processing framework among the LHC experiments may be impractical, at least in the near term, this does not preclude adopting common approaches and tools for framework services
      Examples: message handling and error reporting; execution monitoring and state management; exception handling and recovery; job state persistence and recording of history information; dynamic component loading; interface definition, versioning, etc. (a message-service sketch follows below)
    Seek to identify framework services and tools which can be developed in common, possibly starting from existing products
    Develop requirements on their functionality and interfaces
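
    As one concrete example, a toy C++ sketch of a common message-handling and error-reporting service; the interface and severity levels are invented, and a grid-aware backend could replace the console one.

      #include <cstdio>
      #include <string>

      enum class Severity { Debug, Info, Warning, Error, Fatal };

      class IMessageSvc {                    // common framework service
      public:
          virtual ~IMessageSvc() = default;
          virtual void report(Severity s, const std::string& source,
                              const std::string& text) = 0;
      };

      // One trivial backend; a grid-aware one could forward to a remote logger.
      class ConsoleMessageSvc : public IMessageSvc {
      public:
          void report(Severity s, const std::string& src,
                      const std::string& text) override {
              static const char* tag[] = {"DEBUG", "INFO", "WARNING",
                                          "ERROR", "FATAL"};
              std::printf("[%s] %s: %s\n", tag[static_cast<int>(s)],
                          src.c_str(), text.c_str());
          }
      };

      int main() {
          ConsoleMessageSvc svc;
          svc.report(Severity::Warning, "TrackFitter",
                     "chi2 cut rejected 12% of tracks");
      }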

  • RTAG?: C++ class libraries
    Address needs and requirements in standard C++ class libraries, with recommendations on specific tools
    Provide recommendations on the application and evolution of community libraries such as ROOT, CLHEP, HepUtilities, …
    Survey third-party libraries and provide recommendations on which should be adopted and what should be used from them
    Merge with the Framework Services candidate RTAG?

  • RTAG?: Event processing framework
    There is no consensus to pursue a common event processing framework in the near term. There is perhaps more agreement that this should be pursued in the long term (but there's no consensus on a likely candidate for a common framework in the long term)
    This looks at best to be a long-term RTAG
    Two experiments do use a common event processing framework kernel (Gaudi)
    Many difficult issues in growing N past 2, whether with Gaudi, AliRoot, COBRA or something else!

  • RTAG?: Interfaces to distributed analysis
    Develop requirements on end-user interfaces to distributed analysis, layered over grid middleware services, and write a product specification
      Grid portals, but not only; e.g. PROOF and JAS fall into this category
      A grid portal for analysis is presumably an evolution of tools like these
    Focus on the analysis interface; address the distinct requirements of production separately
      The production interface should probably be addressed first, as it is simpler and will probably have components usable as parts of the analysis interface

  • RTAG?: Distributed production systems
    Distributed production systems will have much common ground at the grid middleware level. How much can be done in common at the higher level of end-to-end distributed production applications layered over the grid middleware?
      Recognizing that the grid projects are active at this level too, and coordination is needed
    Survey existing and planned production components and end-to-end systems at the application level (AliEn, MOP, etc.) and identify tools and approaches to develop in common
    Write product specifications for common components, and/or explicitly identify specific tools to be adapted and developed as common components
    Include the end-user (production operations) interface
      Grid portal for production

  • RTAG?: Small-scale persistency & databases
    If not covered by the existing persistency RTAG, and if there is agreement this is needed:
    Write the product specification for a simple, self-contained, low-overhead object persistency service for small-scale persistency in C++ applications
      Marshal objects to a byte stream which may be stored in a file, in an RDBMS record, etc. (sketched below)
      In implementation, very likely a simplified derivative of the object streamer of the hybrid store
      For small-scale persistence applications, e.g. saving state, saving configuration information
    Examine the utility of and requirements on a simple, standard, easily installed and managed database service complementing the persistency service for small-scale applications
      MySQL, PostgreSQL etc. are casually adopted for simple applications with increasing frequency. Is it possible and worthwhile to converge on a common database service?
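
    A toy C++ sketch of the byte-stream marshalling idea; the helper and the JobConfig type are invented, and a real service would also handle endianness, versioning and non-POD types.

      #include <cstdint>
      #include <cstring>
      #include <vector>

      using ByteStream = std::vector<std::uint8_t>;

      template <typename T>
      void append(ByteStream& s, const T& v) {        // POD-only helper
          const auto* p = reinterpret_cast<const std::uint8_t*>(&v);
          s.insert(s.end(), p, p + sizeof(T));
      }

      struct JobConfig {                              // small state to save
          std::int32_t nEvents;
          double       bFieldTesla;

          ByteStream marshal() const {
              ByteStream s;
              append(s, nEvents);
              append(s, bFieldTesla);
              return s;
          }
          static JobConfig unmarshal(const ByteStream& s) {
              JobConfig c{};
              std::memcpy(&c.nEvents, s.data(), sizeof c.nEvents);
              std::memcpy(&c.bFieldTesla, s.data() + sizeof c.nEvents,
                          sizeof c.bFieldTesla);
              return c;
          }
      };

      int main() {
          JobConfig cfg{1000, 4.0};
          ByteStream bytes = cfg.marshal();   // could go to a file or DB record
          JobConfig back = JobConfig::unmarshal(bytes);
          return (back.nEvents == 1000) ? 0 : 1;
      }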

  • RTAG?: Software testing tools & services
    How much commonality can be achieved in the infrastructure and tools used?
      Memory checking, unit tests, regression tests, validation tests, performance tests
    A large part of this has been covered by the process RTAG

  • RTAG?: Software distribution
    May or may not be adequately addressed in the process RTAG
    Requirements for a central distribution point at CERN
      A CERN LHC Program Library Office
    Requirements on software distribution taking into account all tiers
    Survey and recommend on the various approaches, their utility and complementarity:
      Tarballs (DAR)
      RPMs and other standard open software tools
      Role of AFS, asis
      Higher-level automated distribution tools (pacman)

  • RTAG?: Evolution of OO language usage
    Long-term evolution of C++
    Role for other language(s), e.g. Java?
    Near-, medium- and (to the extent possible) long-term application of other languages among the LHC experiments
    Implications for tools and support requirements
    Identify any requirements arising:
      Applications and services to be developed in common
      Third-party tools to be integrated and supported
      Compilers and other infrastructure to be supported
      Libraries required

  • RTAG?: LCG benchmarking suite
    Below threshold for an RTAG? Every LCG application should come with a benchmarking suite, made available and readily usable as part of a comprehensive benchmarking suite
    Develop requirements for a comprehensive benchmarking suite of LCG applications for use in performance evaluation, testing, platform validation, performance measurement, etc. (a minimal example follows below)
      Tools which should be represented
      Tests which should be included
      Packaging and distribution requirements
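
    A minimal C++ sketch of what one entry in such a suite might look like, timing an invented kernel and printing a figure of merit.

      #include <chrono>
      #include <cstdio>
      #include <vector>

      int main() {
          std::vector<double> v(1000000, 1.0);
          auto t0 = std::chrono::steady_clock::now();
          double sum = 0.0;
          for (double x : v) sum += x;                // the benchmarked kernel
          auto t1 = std::chrono::steady_clock::now();
          auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                        t1 - t0).count();
          std::printf("sum=%.0f in %lld us\n", sum,
                      static_cast<long long>(us));
          return 0;
      }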

  • RTAG?: Online notebooks and other remote control / collaborative tools
    Identify near-term and long-term needs and requirements common across the experiments
      Teams not on a specific experiment
    Survey existing and planned tools and approaches
    Develop recommendations for common development/adaptation and support of tools for LHC