ATLAS Data Management Status
					   U.S. ATLAS Software
          WBS 2.2

         S. Rajagopalan
          July 8, 2003
DOE/NSF Review of LHC Computing
                                 Outline

 Organizational Issues
         ATLAS & U.S. ATLAS software

 Current Affairs
         Current resource allocation including LCG contributions
         Major milestones met

 FY04 Planning
         Planning, coordination with international ATLAS
         Near term milestones
         Priorities and request for FY04

 Conclusions

S. Rajagopalan                              DOE/NSF Review of LHC Computing, 8 July 2003   2
Organizational Issues
             New Computing Organization

[Organization chart omitted: the extracted slide retains only placeholder marks.]
           Computing Management Board

 Coordinate & Manage computing activities
         Set priorities and take executive decisions

 Computing Coordinator (chair)
         Software Project Leader (D. Quarrie, LBNL)
         TDAQ Liaison
         Physics Coordinator
         International Computing Board Chair
         GRID, Data Challenge and Operations Coordinator
         Planning & Resources Coordinator (T. LeCompte, ANL)
         Data Management Coordinator (D. Malon, ANL)

 Meets bi-weekly

    Software Project Management Board

 Coordinate the coherent development of software
         (core, applications and software support)
 Software Project Leader (chair) D. Quarrie
         Simulation coordinator
         Event Selection, Reconstruction & Analysis Tools coordinator
         Core Services Coordinator (D. Quarrie)
         Software Infrastructure Team Coordinator
         LCG Applications Liaison (T. Wenaus, BNL)
         Physics Liaison
         TDAQ Liaison
         Sub-System: Inner Detector, Liquid Argon, Tile, Muon coordinators
                Liquid Argon: S. Rajagopalan (BNL), Muon: S. Goldfarb (U Mich)

 Meets bi-weekly
        US ATLAS Software Organization
Software Project (WBS 2.2): S. Rajagopalan
        Coordination (WBS 2.2.1)
        Core Services (WBS 2.2.2): D. Quarrie
        Data Management (WBS 2.2.3): D. Malon
        Application Software (WBS 2.2.4): F. Luehring
        Software Support (WBS 2.2.5): A. Undrus
 US ATLAS software WBS scrubbed, consistent with ATLAS
     Resource Loading and Reporting established at Level 4
 Major change compared to previous WBS:
    Production and Grid Tools & Services moved under Facilities
Current Affairs
                 WBS 2.2.1 Coordination

 David Quarrie (LBNL) :
         ATLAS Software Project Manager
         ATLAS Chief Architect
         U.S. ATLAS Core Services Level 3 Manager

 David Malon (ANL) :
         ATLAS Data Management Coordinator
         U.S. ATLAS Data Management Level 3 Manager

 Other U.S. ATLAS personnel playing leading roles in ATLAS:
         S. Goldfarb (Muon), T. LeCompte (Planning),
         S. Rajagopalan (LAr), T. Wenaus (LCG Liaison)


                 WBS 2.2.2 Core Services
                               (D. Quarrie)

 P. Calafiura (LBNL) :
         Framework support, Event Merging, EDM infrastructure
 M. Marino (LBNL) :
         SEAL plug-in and component support
 W. Lavrijsen (LBNL) :
         User interfaces, Python scripting, binding to dictionary, integration
          with GANGA.
 C. Leggett (LBNL) :
         Conditions infrastructure, G4 Service integration in Athena,
          Histogramming support. Redirected to other tasks in FY04
 H. Ma, S. Rajagopalan (BNL) (Base Program) : EDM infrastructure
 C. Tull (LBNL) (PPDG) : Athena Grid Integration coordination
        WBS 2.2.2 Key Accomplishments

 Python based user interfaces to CMT, Athena, and ROOT
 Interval of Validity Service to allow time-based retrieval of
    conditions data into transient memory
 Support for plug-in manager in LCG/SEAL
 gcc-3.2 support, multithreading support, pile-up support.
 Services to upload persistent addresses for on-demand
    retrieval of data objects
 Common Material Definition across sub-systems, creation
    of G4 geometries from this description demonstrated
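The interval-of-validity idea above (time-based retrieval of conditions data) can be sketched in a few lines of Python. This is an illustrative toy only, not the Athena IOVSvc API: the class and method names below are invented for this sketch.

```python
from bisect import bisect_right

class IOVService:
    """Toy interval-of-validity store: each conditions payload is valid
    from its start time until the start of the next recorded entry.
    (Hypothetical sketch; not the real Athena IOVSvc interface.)"""

    def __init__(self):
        self._starts = []    # sorted validity-start times
        self._payloads = []  # payload valid from the matching start time

    def record(self, start_time, payload):
        # Insert keeping the start times sorted.
        i = bisect_right(self._starts, start_time)
        self._starts.insert(i, start_time)
        self._payloads.insert(i, payload)

    def retrieve(self, event_time):
        """Return the conditions payload valid at event_time."""
        i = bisect_right(self._starts, event_time) - 1
        if i < 0:
            raise LookupError("no conditions valid at %r" % event_time)
        return self._payloads[i]

iov = IOVService()
iov.record(0,    {"hv": 1500})   # calibration valid from t = 0
iov.record(1000, {"hv": 1480})   # updated calibration from t = 1000
print(iov.retrieve(500)["hv"])   # -> 1500
print(iov.retrieve(1200)["hv"])  # -> 1480
```

An event at any time is thus served the calibration whose validity interval covers it, without the client needing to know when the conditions changed.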

                 WBS 2.2.3 Data Management
                               (D. Malon)

 S. Vanyachine (ANL) :
         Database Services & Servers, NOVA database
 Kristo Karr (ANL) :
         New Hire, replaces S. Eckmann
         Collections, Catalogs and Metadata
 Valeri Fine (BNL) :
         Integration of Pool with Athena
 David Adams (BNL) :
         Event datasets
 Victor Perevoztchikov (BNL) :
         POOL evaluation, foreign object persistence in ROOT.
          WBS 2.2.3 Key Accomplishments
 ATLAS specific
         Athena-Pool conversion service prototype
                 Will be available to end users in July (tied to the POOL release)
         Support for NOVA database
                (primary source for detector description for simulation)
                Support for interval of validity
                NOVA automatic object generation
                 Data additions, embedded MySQL support for G4
                Authentications, access to databases behind firewalls

 LCG contributions
         Delivered POOL collections/metadata WP interface, doc & unit tests
         Delivered relational implementation of POOL explicit collections
         Delivered MySQL and related package support
         Foreign object persistence
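For illustration, an "explicit collection" of the kind delivered above can be modeled as a queryable list of event references, each carrying metadata attributes. The classes and names below are invented for this sketch and are not the POOL collections API:

```python
class EventCollection:
    """Toy model of an explicit event collection: event references
    tagged with queryable metadata attributes, so events can be
    selected without reading the event data itself.
    (Illustrative sketch only; not the POOL API.)"""

    def __init__(self):
        self._rows = []  # list of (event_ref, attributes) pairs

    def add(self, event_ref, **attributes):
        self._rows.append((event_ref, attributes))

    def select(self, predicate):
        """Return event references whose metadata satisfies predicate."""
        return [ref for ref, attrs in self._rows if predicate(attrs)]

coll = EventCollection()
coll.add("run1_evt001", n_jets=4, missing_et=35.0)
coll.add("run1_evt002", n_jets=1, missing_et=80.0)
coll.add("run1_evt003", n_jets=5, missing_et=12.0)

# Query: events with at least 2 jets
refs = coll.select(lambda a: a["n_jets"] >= 2)
print(refs)  # -> ['run1_evt001', 'run1_evt003']
```

A relational implementation, as delivered to POOL, would back the same idea with database tables and SQL predicates instead of an in-memory list.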
          WBS 2.2.4 Application Software
                               (F. Luehring)

 Geant3 simulation support
    BNL
 Calorimeter (LAr & Tile) software incl. calibration
    ANL, BNL, Nevis Labs, U. Arizona, U. Chicago, U. Pittsburgh, SMU
 Pixel, TRT detector simulation & digitization
    Indiana U., LBNL
 Muon reconstruction and database
    BNL, Boston U., LBNL, U. Michigan
 Hadronic calibration, tau and jet reconstruction
    U. Arizona, U. Chicago, ANL, BNL, LBNL
 electron-gamma reconstruction
    BNL, Nevis Labs, SMU
 High Level Trigger software
    U. Wisconsin
 Physics analysis with new software
    U. S. ATLAS

                 WBS 2.2.5 Software Support
                          (A. Undrus)

 Release and maintenance of ATLAS and all associated
  external software (including LCG software, LHCb Gaudi
  builds) at the Tier 1 Facility.

 Deployment of a nightly build system at BNL and CERN, now
   used by LCG as well.

 Testing releases with new compilers (gcc-3.2, SUN 5.2).

 Software Infrastructure Team : Forum for discussions of
  issues related to support of ATLAS software and associated
  tools. A. Undrus is a member of this body.

US FY03 contribution in international context

  Category            US (FTE)        Non-US (FTE)    Total (FTE)    LCG (FTE)
  Framework           3.25            0.75            4.0            1.3
  EDM                 0.5P + 0.5B     0               1.0            0.0
  Det. Description    0.0             1.0             1.0            0.0
  Data Management     4.6             4.0             8.6            1.2
  Graphics            0.0             0.25            0.25           0.0
  SW Infrastructure   0.7             2.95            3.45           0.1
  Total               9.05P + 0.5B    8.95            18.5           2.6

  (P: Project-funded, B: Base Program. Excludes the coordination-role
  contributions of David Quarrie and Torre Wenaus.)
                 LCG Application Component

 US effort in SEAL : 1.0 FTE (FY03)
         Plug-in manager (M. Marino, 0.75 FTE, LBNL)
                Internal use by POOL now; full integration into Athena in Q3 2003
         Scripting Services (W. Lavrijsen, 0.25 FTE, LBNL)
                Python support and integration

 US effort in POOL : 1.2 FTE (FY03)
         Principal responsibility in POOL collections and metadata WP
                D. Malon, K. Karr, S. Vanyachine (0.5 FTE) [ANL]
         POOL Datasets (D. Adams, 0.2 FTE, BNL)
         Common Data Management Software
                V. Perevoztchikov, ROOT I/O foreign object persistence (0.3 FTE, BNL)
         POOL MySQL package and server configurations (ANL, 0.2 FTE)

           US ATLAS contribution in LCG

     • Contribution to Application Area only
     • Snapshot (June 2003) contribution

[Contribution chart omitted in the extracted slides.]
             ATLAS interactions with LCG
 Lack of manpower has made ATLAS participation weaker
  than we would like
         Little or no effort available to :
                Participate in design discussions of POOL & SEAL components for
                 which we are not directly responsible
                Evaluate and test new features
                Write ATLAS acceptance tests for POOL releases and for specifically
                 requested features
                Ensure that ATLAS priorities are kept prominent in LCG plans (ATLAS
                 does this, but our voice has at times seemed not as loud as that of
                 our sister experiments)
         Less development than we would have liked has been contributed to the
          collections/metadata work package (for which we are responsible),
          though this should improve soon with the recent hire at ANL


FY04 Plans
             International ATLAS Planning

 ATLAS has a planning officer: T. LeCompte (ANL)

 The current focus is on defining the WBS and establishing
    coherent short term plans.
         US WBS used as a starting point!

 Responsible for monitoring all deliverables, including non-ATLAS
    components (such as LCG), and for assessing the impact of any delays.

 Responsible for establishing software agreements and scope with
    international ATLAS institutions.
                 ATLAS Computing Timeline
                    • Jul 03: POOL/SEAL release
                    • Jul 03: ATLAS release 7 (with POOL persistency) [now]
                    • Aug 03: LCG-1 deployment
                    • Dec 03: ATLAS completes Geant4 validation
                    • Mar 04: ATLAS release 8
                    • Apr 04: DC2 Phase 1: simulation production
                    • Jun 04: DC2 Phase 2: reconstruction (the real challenge!)
                    • Jun 04: Combined test beams (barrel wedge)
                    • Dec 04: Computing Model paper
                    • Jul 05: ATLAS Computing TDR and LCG TDR
                    • Oct 05: DC3: produce data for PRR and test LCG-n
                    • Nov 05: Computing Memorandum of Understanding
                    • Jul 06: Physics Readiness Report
                    • Oct 06: Start of commissioning run
                    • Jul 07: GO!
                 Major near term milestones

 July to Dec 2003: SEAL/POOL/PI deployment by LCG

 Sept. 2003: Geant 4 based simulation release

 Dec. 2003: Validate Geant4 release for DC2 and test-beam

 Dec. 2003: First release of full ATLAS software chain using LCG
    components and Geant4 for use in DC2 and combined test-beam.

 Spring 2004: Combined Test-Beam runs.

 Spring 2004: Data Challenge 2
         Principal means by which ATLAS will test and validate its proposed
          Computing Model
 Dec. 2004: Computing Model Document released


                        U.S. scope issues

 2003-2004: Develop sufficient core software infrastructure to deploy
    and exercise a reasonable prototype of the ATLAS Computing Model
         ATLAS is quite far from being able to do this

 Now is not the time to sacrifice core software development
         Doing so puts the TDR and hence the readiness for LHC turn-on at risk.

 The U.S. was asked to lead the effort in coordinating, developing and
    deploying the ATLAS architecture (from ground zero in 1999).
         Leadership roles in Software Project, Architecture and Data Management:
          major responsibilities, but minimal resources to work with.
         We are responsible for ensuring the success of the ATLAS architecture.
         Efforts continue to encourage and recruit non-US institutions and US
          universities to contribute to the core, and to leverage LCG effort.

                 Core Software & Physicists

 The presence of a strong core team in the U.S. has helped U.S.
    physicists make significant contributions to reconstruction, simulation
    and physics analysis, in turn allowing them to play an influential role
    in the overall ATLAS software program.
         Examples from LAr, InDet simulation and Calo, Muon reconstruction, event
          generation infrastructure, egamma, tau, jet reconstruction, calibration, …

 Conversely, this has also allowed U.S. physicists to provide valuable
    feedback to core software and in some cases contribute to the core
    development
         Examples are the Event Data Model and the Detector Description efforts.

       This harmony is necessary for the U.S. to develop the necessary
       expertise and to contribute effectively to the physics at turn-on.
          Incremental Effort: Core Services

 Redirections:
         C. Leggett (0.5 FTE from Calibration Infrastructure to EDM)
         M. Marino (0.25 FTE from Training to SEAL/Framework)

 Additions (prioritized):
         + 1.0 FTE in Detector Description, WBS 2.2.2.3 (U. Pittsburgh)
                New Hire to work with J. Boudreau
         + 0.5 FTE in Analysis Tools support, WBS 2.2.2.5
                New Hire or redirection of effort
         + 1.0 FTE in Graphics, WBS 2.2.2.4 (UC Santa Cruz)
                Existing person (G. Taylor) who is currently making significant
                 contributions to ATLANTIS (Atlas Graphics Package).



                        Detector Description

 ATLAS lacked a Detector Description Model
         Numbers hardwired in reconstruction, no commonality with simulation.

 Along came Joe Boudreau (U. Pittsburgh), bringing CDF experience
         Successfully designed, developed and deployed a prototype model
          for both material and readout geometry. We encouraged this!
                Automatically handles alignments; optimized for memory (5 MB to
                 describe the ATLAS geometry); not coupled to visualization software.
         Currently resident at Oxford, helping sub-systems migrate.
 Not surprisingly, the workload on Joe has increased
         Critical items include Material Integration Service, Configuration
          Utility, Identifiers and Transient Model for readout geometry

     It is important to support such university-based initiatives in core software
   Incremental Effort: Data management

 Our plan has always been to sustain a 6.5 FTE effort.
 Recent cuts in 2002 …
         Ed Frank, U. Chicago
         BNL hire: job offered but retracted due to last-minute budget cuts
 … have impacted our ability to deliver as promised:
         Unable to save and restore objects from the persistent event store
         No ATLAS interfaces to event collections, catalogs and metadata
    Approximate allocation of new effort:
         + 1.0 FTE Collections, Catalogs, and Metadata (WBS 2.2.3.5)
         + 1.0 FTE Common Data Management Software (WBS 2.2.3.2)
         + 0.5 FTE Event Store (WBS 2.2.3.3)
                Redirect from WBS 2.2.3.1 & 2.2.3.5 (0.5 each) if no funds available.

                 Impact of Insufficient Funds
         -1.0 FTE in Graphics (Model 6)
                Impacts our ability to have any reasonable visualization software
                 for the test-beam or Data Challenge 2.
         -0.5 FTE in Analysis Tools
                Impacts our ability to deliver a framework for analysis.
         -1.0 FTE in Data Management
                0.5 FTE for supporting non-event data management.
                0.5 FTE for supporting basic database services.
         -1.0 FTE in Detector Description (Model 5)
                Jeopardizes our ability to deliver key components, including the
                 Material Integration Service and a common geometry for simulation
                 and reconstruction.
         -1.0 FTE in Common Data Management Software (Model 4)
                Impacts contributions to POOL and integration aspects, including
                 schema management.
         -0.5 FTE in Event Store
                Impacts support for a persistent EDM and Event Selection.

                          FY04 Ramp-Up Cost

[Cost chart omitted in the extracted slides: cost (FY04 k$) versus the
prioritized incremental ramp-up in FTE (+0, +0.5, +1.5, +2.5, +3.5, +4, +5),
with an annotation marking the FY04 guidance level from J. Shank.]
                 WBS-Personnel Summary

[WBS-personnel summary table omitted in the extracted slides.]
                           Conclusions

 Request for +5 FTE in FY04:
         2.5 FTE to bring Data Management to its intended Level of Effort
         1 FTE, university-based, for Detector Description
         0.5 FTE for contribution to Analysis Tools
         1 FTE, university-based, for Graphics support

 Guidance given for FY04 can handle only 1.5 FTE

 U.S. ATLAS LCG contribution will be 4.0 FTE in FY04
         2.0 FTE each in Core Services and Data Management WP





				