NGS - GridPP

Document Sample
NGS - GridPP Powered By Docstoc
Grid Operations Support Centre
and
UK National Grid Service

What Next?

Neil Geddes
GridPP15, January 2006
Outline
• Current Status
• Plans for the coming year
• NGS-2
• Working with …
Introduction
• The “Science and Innovation Investment Framework 2004–2014” calls for
  – a national e-Infrastructure (hardware, networks, communications technology)
    to provide “ready and efficient access to information of all kinds – such as
    experimental data sets, journals, theses, conference proceedings and patents”.
• Critical to successful collaborative, multi-disciplinary research and innovation
  – “Over the decade many of the grand challenges in research will occupy the
    interfaces between the separate research disciplines developed in the 19th
    and 20th centuries… much more needs to be done, and by more players, if
    the UK is to achieve a global edge.”
• The NGS and GOSC explicitly address exactly these issues
  – Common interfaces and operational procedures provide the basis of
    • efficient sharing of resources
    • simple user access to an increasing range of resources
    • a key step towards a service economy for data and computation.
The National Grid Service
• Launched April 2004
• Full production – September 2004
• Focus on deployment/operations
  – Do not do development
• Responsive to users’ needs
+ Belfast, Westminster …
NGS Users
[Chart: Number of Registered NGS Users by date, 14 January 2004 – 14 December 2005,
scale 0–300 users, with a linear trend line fitted to the registration series.]
Usage Statistics (Total Hours for all 4 Core Nodes)
[Chart: total CPU hours per anonymised user DN, roughly 140 users, scale 0–250,000 hours.]
SRB Storage history for month prior to 31/08/05
[Chart: SRB storage usage over the month.]
Detailed information: https://www.ngs.ac.uk/ops/gits/srb/srbreport.txt
Users by institution
[Chart: count of registered users per OU= in their certificate DN, scale 0–50 per institution.
Institutions: OU=BBSRC, Birmingham, Bristol, Cambridge, Cardiff, CLRC, CPPM, DLS, DMPHB,
Edinburgh, Glasgow, Imperial, Lancaster, Leeds, Liverpool, Manchester, Newcastle, Nottingham,
OASIS, Oxford, Portsmouth, QUB, QueenMaryLondon, Reading, Sheffield, Southampton, UCL,
Warwick, Westminster and York, plus non-UK entries such as O=universiteit-utrecht and INRIA.]
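These per-institution counts are simply a grouping of registered users by the OU= component
of their X.509 certificate distinguished names. A minimal sketch of that grouping, with invented
DN strings standing in for the real NGS registrations:

# Count users per OU= component of their certificate DNs.
# The dn_strings below are invented examples, not real NGS registrations.
from collections import Counter

dn_strings = [
    "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=jane doe",
    "/C=UK/O=eScience/OU=Manchester/L=MC/CN=john smith",
    "/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=a n other",
]

def organisational_unit(dn: str) -> str:
    # Return the first OU= value in a slash-separated DN, or "unknown".
    for part in dn.strip("/").split("/"):
        if part.startswith("OU="):
            return part[len("OU="):]
    return "unknown"

counts = Counter(organisational_unit(dn) for dn in dn_strings)
for ou, n in counts.most_common():
    print(f"OU={ou}: {n}")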
Users by “Research Council”
[Chart: count of users per Research Council, scale 0–160: BBSRC, CCLRC, EPSRC, NERC,
PPARC, AHRC, MRC, ESRC.]
[Chart: CPU time (hours, log scale 1–1,000,000) versus storage (GB, log scale 1–1,000).]
P-GRADE NGS Portal
http://www.cpc.wmin.ac.uk/ngsportal
The P-GRADE NGS portal, operated by the University of Westminster, offers an alternative
to the NGS Portal for executing and monitoring computational jobs on the UK National Grid
Service. In addition, it enables the graphical development, execution and visualisation of
workflows – composed of sequential and parallel jobs – on the NGS resources (see the
sketch below).
[Screenshots: portal workflow – log in, create workflow, map execution, execute workflow,
visualise execution.]
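The portal itself is graphical, but the idea it implements – a workflow whose independent
jobs run in parallel and whose dependent jobs run in sequence – can be sketched as a small
dependency graph. This is a generic illustration, not P-GRADE’s actual workflow format; the
job names and the run_job stub are invented:

# Generic sketch of a workflow of sequential and parallel jobs as a DAG.
# Job names and run_job are illustrative stand-ins, not P-GRADE's format.
from concurrent.futures import ThreadPoolExecutor

# Each job maps to the set of jobs it depends on.
workflow = {
    "prepare": set(),
    "sim_a": {"prepare"},   # sim_a and sim_b can run in parallel
    "sim_b": {"prepare"},
    "collate": {"sim_a", "sim_b"},
}

def run_job(name: str) -> None:
    print(f"running {name}")  # a real portal would submit to an NGS node here

def execute(dag: dict[str, set[str]]) -> None:
    done: set[str] = set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(dag):
            # All jobs whose dependencies are satisfied run in parallel.
            ready = [j for j, deps in dag.items() if j not in done and deps <= done]
            list(pool.map(run_job, ready))
            done.update(ready)

execute(workflow)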
GEMLCA – Legacy Code Support for NGS Users
http://www.cpc.wmin.ac.uk/gemlca
• If you have a legacy application that you would like to make accessible to other NGS
  users, utilise GEMLCA to:
  • upload your application into a central GEMLCA repository
  • make it available for authorised users through the P-GRADE NGS portal
[Screenshots: publish legacy code, browse repository, create workflow.]
• GODIVA – Diagnostics Study of Oceanography Data
• EVE – Excitations and Visualisation Project
• Integrative Biology – Simulation of Strain in Soft Tissue under Gravity
PDB2MD
• An automated pipeline performs molecular dynamics simulations on DNA crystal
  structures: a PDB database of DNA crystal structures (116d, 180d, 196d, 1d56, 1ilc)
  feeds AMBER running on NGS, producing an MD database for analysis.
• The results reveal biologically important patterns of sequence-dependent flexibility.
Charlie Laughton, School of Pharmacy, University of Nottingham
RealityGrid Example
BRIDGES GridBLAST Job Submission
[Architecture diagram:]
• The end-user machine runs a GridBLAST client, which sends the job request to the
  NESC Grid Server (Titania) and receives the result back.
• Titania hosts a GT3 core grid service in Apache Tomcat, fronting the BRIDGES
  Meta-Scheduler.
• The meta-scheduler dispatches work through three wrappers:
  – PBS wrapper → ScotGRID masternode (PBS server + BLAST); jobs are farmed out
    to the ScotGRID worker nodes
  – Condor wrapper → Condor Central Manager (Condor + BLAST) on the NESC Condor pool
  – GT2.4 wrapper → NGS headnodes at Leeds and Oxford (GT2.4 + BLAST) and their
    execution hosts
The BRIDGES project – Micha Bayer, NeSC-Glasgow
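The pattern in this diagram – one client request fanned out as many independent BLAST jobs,
with the results gathered back – can be sketched as below. This is an illustration only: it runs
the legacy NCBI blastall binary through a local thread pool, where the real BRIDGES
meta-scheduler dispatched through PBS, Condor and GT2.4 wrappers; the database name and
file layout are invented.

# Sketch of the GridBLAST fan-out: split a query set into chunks, run one
# BLAST job per chunk, then gather the outputs. Illustrative only - the real
# meta-scheduler dispatched to PBS/Condor/GT2.4 back ends, not a local pool.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

DATABASE = "nr"                                        # hypothetical BLAST database
CHUNKS = sorted(Path("chunks").glob("query_*.fasta"))  # pre-split query files

def run_blast(chunk: Path) -> Path:
    # Run one legacy-BLAST job on a single chunk of query sequences.
    out = chunk.with_suffix(".out")
    subprocess.run(
        ["blastall", "-p", "blastp", "-d", DATABASE,
         "-i", str(chunk), "-o", str(out)],
        check=True,
    )
    return out

# Farm the chunks out in parallel and concatenate the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_blast, CHUNKS))

Path("blast_results.out").write_text(
    "".join(r.read_text() for r in results)
)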
Roadmap
• Goals:
  1. Support the current service functionality and user base
  2. Expansion to include new partners
  3. Improve interoperability with EGEE, TeraGrid and DEISA
  4. Convergence with project/community/campus infrastructure
  5. Provision of value-added services on top of the basic NGS infrastructure
• Specific targets (services, operations, technology):
  1. Improved operational security and incident response procedures
  2. Deployment of a resource broker ?
  3. Virtual Organisation Management Service (VOMS)
  4. Incremental support for the LCG Baseline Services ☺
  5. A co-scheduling system ?
  6. Job submission to the NGS through the emerging JSDL standard (see the sketch
     after this list)
  7. Support for access to a range of data storage services ☺
  8. Resource accounting and improved grid account management
  9. Provision of a test system for dynamic service hosting
  10. Services from the OMII managed programme ☺
  11. Initial integration with Shibboleth authentication ?
  12. First examples of “authorisation aware” generic services ?
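Target 6 refers to JSDL, the GGF’s XML dialect for describing jobs: a JobDefinition wraps a
JobDescription, which carries the Application (and optionally Resources and DataStaging)
elements. A minimal sketch of constructing such a document – the executable, argument and
output file are invented for illustration:

# Minimal sketch: build a JSDL 1.0 job description with ElementTree.
# The executable and arguments are hypothetical; a real submission would
# hand this document to a JSDL-aware service such as GridSAM.
import xml.etree.ElementTree as ET

JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

job_def = ET.Element(f"{{{JSDL}}}JobDefinition")
job_desc = ET.SubElement(job_def, f"{{{JSDL}}}JobDescription")
app = ET.SubElement(job_desc, f"{{{JSDL}}}Application")
posix = ET.SubElement(app, f"{{{POSIX}}}POSIXApplication")
ET.SubElement(posix, f"{{{POSIX}}}Executable").text = "/bin/echo"
ET.SubElement(posix, f"{{{POSIX}}}Argument").text = "hello-ngs"
ET.SubElement(posix, f"{{{POSIX}}}Output").text = "stdout.txt"

print(ET.tostring(job_def, encoding="unicode"))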
What Next
UK e-Infrastructure
• Users get common access, tools, information and nationally supported services
  through the NGS
[Diagram: the NGS links HPCx + HECToR, regional and campus grids, community grids
and VREs/VLEs/IEs, integrated internationally and with facilities such as the LHC and
ISIS TS2.]
The Vision of the National Grid Service
• Integrated coherent electronic access for UK researchers to all resources and
  facilities that they require to carry out their research
  – From advanced real-time facilities (real-time instruments) to historical data
  – Supporting access to regional, national and international facilities
  – Integrating institutional or even departmental resources
• Examples of the services:
  – Location-independent data storage and replication
  – Location-independent access to institutional repositories and certified long-term archival
  – Location-independent access to local, regional, national and international data resources
  – Access to local, regional, national and international computational resources
  – Internationally recognised electronic identity and identity management tools
  – Tools for managing collaborative project or Virtual Organisation based authentication
    and authorisation
  – Co-scheduling and operation of a wide range of national and international resources
  – Tools to support distributed collaborative working
• In addition the GOSC will support:
  – 24-hour monitoring of the UK’s grid infrastructure
  – The policy framework for operations
  – Review of partner services
  – A central UK help desk and support centre
  – A repository of user and system documentation
Next 3 Years
• The GOSC and NGS development will lead to:
  – An expanded National Grid Service for UK research
    • more partners and affiliates
    • a wider range of services provided
  – Robust NGS operations and management procedures
  – Interoperability with other national and international e-infrastructures
  – Integration of key data sources and data services, including
    • data centres – EDINA, MIMAS, AHDS
    • facilities – LHC, DIAMOND and ISIS
  – Improved and measured NGS reliability
  – Supported scientific research over the next 3–5 years
  – Value-added services deployed on the basic NGS infrastructure
  – Detailed service definitions for a sustainable infrastructure
In more detail…
                           Support Centre
• Front line e-infrastructure support for UK academic community.
• Authentication framework required by national and international
  agreement.
• Infrastructure services
    – authentication, VO/authorisation, information, monitoring, resource
      brokering
• Support interfaces with partner infrastructures
    – within the UK: e.g. MIMAS, EDINA, AHDS, GridPP, e-Minerals and others
    – internationally: e.g. TeraGrid, EGEE
• Website, including documentation, training materials and related
  links
• Work with NeSC and other organisations to provide training
• Coordinate development and deployment programme for NGS.
    – Development expertise
• User management for the NGS
• Partner and affiliate management for the NGS
National and International Facilities
• UK gateway to:
  – Regional, national and international HPC facilities
    • HECToR, TeraGrid and DEISA
  – Advanced national experimental facilities
    • DIAMOND, ISIS and the National Crystallographic Service
  – National, regional and institutional data centres
  – Campus grid developments
  – Grid infrastructures
  – Grid Interoperability workshop at SC05
    • EGEE, TeraGrid, OSG, NGS, NAREGI, APAC
    • First of a regular series – next meeting at GGF in March
    • Some key proto-agreements
      – JSDL, SRM, GridFTP, GSI and VOMS, “GLUE” (and CIM), RU records
• GOSC will:
  – Integrate access to data from experimental facilities
  – Support collaborative activities spanning each (or all) of these infrastructures
  – Where possible, deploy a robust cross-service scheduling system
  – Provide common tools and infrastructures for data handling, organisation,
    manipulation and creation of user interfaces
Grid Interoperation
• Leaders from TeraGrid, OSG, EGEE, APAC, NAREGI, DEISA, PRAGMA, UK NGS and
  KISTI will lead an interoperation initiative in 2006.
• Six international teams will meet for the first time at GGF-16 in February 2006:
  – Application Use Cases (Bair/TeraGrid, Alessandrini/DEISA)
  – Authentication/Identity Mgmt (Skow/TeraGrid)
  – Job Description Language (Newhouse/UK-NGS)
  – Data Location/Movement (Pordes/OSG)
  – Information Schemas (Matsuoka/NAREGI)
  – Testbeds (Arzberger/PRAGMA)
Leaders from nine Grid initiatives met at SC05 to plan an application-driven “Interop
Challenge” in 2006.
Integration of Computational and Data Resources
• Work with major and representative data providers
  – Provide a common gateway as the basis for continued developments
  – Progressively more of the data providers should be incorporated
• To develop the necessary integration services the GOSC will work with research
  communities that require data integration, for example environmental, biomedical
  and socio-economic research.
• The NGS will support high-level tools, driven by metadata and abstractions that
  suit the research disciplines, working in conjunction with OMII-UK and international
  tool developers to promote and understand these developments.
• The MIMAS and EDINA data centres are already, or will soon be, funded explicitly
  to “grid enable” some of their resources. GOSC will work closely with these data
  centres to ensure that these developments are compatible with and integrated into
  the NGS services.
Core NGS Nodes
• Core NGS nodes:
  – Provide a controlled test environment for new services
  – Provide a high level of professionally managed central services
  – Facilitate the rapid technical development of the NGS infrastructure
  – Provide a common infrastructure and proving ground for new users
  – Drive a technical agenda, developing experience and best practice
  – Represent a neutral core around which partners can converge
  – Provide resources to guarantee access for new grid users
• Positioned between HPC and campus grids
• Upgrade existing core nodes in 2007
Accounting and Charging
• Accounting and metering are already important for the NGS
  – Monitoring of the infrastructure is important for operations
  – Monitoring of service availability/performance for partnership
  – Monitoring and accounting of service/resource usage
• Effective distributed accounting/metering provides the basis by which partners can
  have confidence in resource sharing and in additional resource provision and
  consumption (see the sketch after this list)
• This work will continue, and the infrastructure to support it will continue to be
  developed.
• NGS must, at the very least, provide a mechanism for partners to “trade”
• FEC (full economic costing) is currently a complication
  – Partners must recover their investment (loan)
  – Users have no “money”
  – NGS sustainability
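As an illustration of the kind of roll-up this implies – the CSV file and its columns are
invented for the example, not the NGS accounting schema – per-partner CPU-hour totals
might be computed like this:

# Sketch: roll up per-job usage records into per-partner CPU-hour totals,
# the kind of summary on which resource "trading" could be based.
# usage_records.csv and its columns are hypothetical, not the NGS schema.
import csv
from collections import defaultdict

totals: dict[str, float] = defaultdict(float)

with open("usage_records.csv", newline="") as f:
    for record in csv.DictReader(f):   # columns: user_dn, partner, cpu_hours
        totals[record["partner"]] += float(record["cpu_hours"])

for partner, hours in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{partner:20s} {hours:10.1f} CPU hours")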
Integration with Computing Service Providers
•   Factors for closer integration with computing service providers are:
     – Well defined interfaces and procedures
     – Clear understanding of the roles and commitments
     – A well understood core software stack with components of known provenance
     – Effective metering and auditing of services and users
     – Integration of the NGS AAA framework with the standard national and campus
       systems
     – Training, documentation and awareness raising targeted at computing services
•   The role of UKERNA is key here. UKERNA has extensive experience in
    dealing with the service providers in question and excellent connections into
    these institutions.
•   GOSC will work with UKERNA and other key groups such as UCISA and
    RUGIT
•   As the number and integration of partners grows, the NGS will need to integrate
    better with the UCISA community
Outreach and Training
• The UK needs training to provide:
  – Support for decision makers
  – Support for e-Infrastructure providers
  – Support for users
  – Support for application, portal and tool developers
  – Support for educators
• Essential to coordinate with and leverage the work of other bodies
  – JISC, RIN, NeSC, EGEE …
OMII
• The initial OMII releases were of limited value to NGS
  – Experimental deployments of OMII software have been installed on the NGS
    since November 2004.
• OMII 2.0 has more interesting components
  – OGSA-DAI WS-I, GridSAM
• OMII-UK brings in more
  – OGSA-DAI and myGrid
• The OMII-UK goal is to supply interoperable Web Services (Linux and Microsoft platforms)
  – GOSC fully supports these goals
• NGS will:
  – Continue its deployment of OGSA-DAI on the data nodes
  – Provide access to the NGS compute resources through GridSAM
  – Deploy the Resource Usage Service
• In addition, GOSC and OMII will work together to:
  – Provide services that can be integrated into workflows controlled through (OMII-UK) tools
  – Improve management of web services
  – Improve accounting for grid service use
  – Integrate Shibboleth and other VO tools used by GOSC
• Interoperability of OMII software with existing and likely future NGS infrastructure
  remains a key requirement.
Next 2–3 Years

Provisional Budget

  Service                         See note   Annual cost   3-year cost
  GOSC/NGS Management             1          £140k         £420k
  GOSC                            2          £985k         £2,955k
  Core NGS Nodes                  3          £240k         £3,120k
  Data Integration Development    4          £120k         £360k
  Community Support               5          £510k         £1,530k
  Total                           6          £1,995k       £8,385k

• Notes:
1. PI time (20%), Director (100%), Technical Director (50%), technical administration (100%)
   and secretarial support (100%).
2. All GOSC roles except the explicit support for core NGS nodes, data service integration,
   community gateways and management.
3. A full cost assuming 4 nodes @ £600k each plus 3 years of system administration effort
   for each (4 × £600k + 3 × £240k = £3,120k).
4. Data centre integration expertise, matching the computational expertise from the Core
   NGS nodes.
5. Funding for “application” or “community” gateway activities. It is essential that some form
   of activity like this be supported in some way.
6. UKERNA work on networking, security and operational best practice is not explicitly
   included above.
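As a cross-check of the table: the annual column sums to £140k + £985k + £240k + £120k +
£510k = £1,995k, and the 3-year column to £420k + £2,955k + £3,120k + £360k + £1,530k =
£8,385k, matching the stated totals.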
Relation to GridPP
• NGS and GridPP are partners in EGEE
• Joining NGS brings generic user and admin support
• “GridPP” is not a partner in NGS, but we expect individual GridPP sites to be
  partners or affiliates
  – Build and support broader communities locally
  – LCG(UK) eventually a logical view of the UK grid
• Partners can offer a wide range of services
				