

Research & Academic Computing @ IU

Bradley C. Wheeler
Associate Vice President & Dean
Office of the VP for Information Technology & CIO

A mission to support researchers and artists in co-creating the future

A foundation of sustainable production services

(All computing and research images are from Indiana University sites.)
Research & Academic Computing: Our Work

Our objectives, mapped to front-office and back-office roles:

   Reliable Production Services
      Front office: Researcher Consulting & Education
      Back office: Systems

   Co-Creating the Future
      Front office: Grant Initiation & Collaboration

[Photo: Dr. Kate Pilachowski, Professor of Astronomy]
Central Research Computing at IU

   $8.6M budget
   68 staff (9 Ph.D.s)
   New TeraGrid Site Lead appointment
   Associate VP role is university-wide (8 campuses)
      IUB & IUPUI are the two core research campuses
Of CSG Interest…

   Philosophy of activities and funding

   High-performance computing
      Leveraged facilities-management approach
      Central-edge partnering

   Rethinking the research front office

   Very high-speed optical fiber network
      Connects IUB, IUPUI, and Purdue
      Multiple strands of the most modern fiber
      First higher-ed-owned fiber network in the nation
      Provides enough network capacity for the next 10-20 years
       between the three main research campuses (IUB, IUPUI, Purdue)
      The networking infrastructure for collaboration of many sorts
IBM Research SP
(Aries/Orion Complex)

     1.005 TFLOPS peak theoretical capacity; the first
      university-owned supercomputer in the US to exceed 1 TFLOPS
     Geographically distributed between IUB and IUPUI
     Initially 50th, now 170th on the Top 500 supercomputer list
     An enabler of collaborative research at very large scale
AVIDD (Analysis and Visualization of Instrument-Driven Data)

     Distributed Linux cluster in three locations: IUN, IUPUI, IUB
     2.164 TFLOPS, 0.5 TB RAM, 10 TB disk
     First distributed Linux cluster to exceed 1 TFLOPS on the
      Linpack benchmark; currently 50th on the Top500
Massive Data Storage System

    Easy to use, no cost to researchers
    Reliable and robust
    HPSS (High Performance Storage System)
    Automatic replication of data between Indianapolis and
     Bloomington via I-Light
    180 TB capacity with existing tapes; total capacity of 2.4 PB
    100 TB currently in use; >5 TB for biomedical data
Commercialized R&D Disclosures

   7 inventions disclosed since 1997
      6 open-source software

   John-E-Box design licensed and now in production by an
    Indiana firm
Digital Libraries…Co-creating & Production

   "Mellon Foundation grant helps create digital video archive
    of world music

    BLOOMINGTON, Ind. -- A new world of music from around the
    globe will soon be available to students and scholars. A
    research team from Indiana University and the University of
    Michigan has received an $875,000 grant from the Andrew W.
    Mellon Foundation…" (5 May 2003)
Production Digital Library Services
Centralization vs. Decentralization

      Leveraged centralized resources
        3 HPC systems
        Mass storage
        Visualization
        Networking (TeraGrid, etc.)

      Leveraged services
        Stat/Math consulting and software
        Linux/Unix support
        Markup language services

      Funding mix
        • One-time cash
        • Recurring base
        • Many grants
        • Some fee for service
Centrally Funded…Broad Access

      No chargeback
        High-performance computation
        Massive data storage (online, nearline)
          » Possible negotiated chargeback above 50 TB
      Lots of partnerships
        With faculty on vendor equipment grants
        With faculty on major NSF equipment grants
        Partnerships supporting faculty research
Partnering Example:
Facilities Management Agreement

   Offer to researchers:
      Buy equipment to our specifications
      Put your equipment in our machine room
      You have priority on the job queue for that equipment
      We leverage it for broader use when you aren't using it
      You get extra "hero runs" against the entire cluster
       as needed
      We handle the system administration work
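The queue policy in this offer (owner jobs first, community jobs backfill when the owner is idle) can be sketched as a tiny two-tier scheduler. The class and names below are hypothetical; real HPC schedulers add preemption, fair share, and backfill on top of this idea.

```python
import heapq

class FacilityQueue:
    """Toy two-tier job queue: the equipment owner's jobs run before
    community jobs; within a tier, submission order is preserved."""
    OWNER, COMMUNITY = 0, 1  # lower value = higher priority

    def __init__(self, owner):
        self.owner = owner
        self._heap = []
        self._seq = 0  # tie-breaker: preserves submission order

    def submit(self, user, job):
        tier = self.OWNER if user == self.owner else self.COMMUNITY
        heapq.heappush(self._heap, (tier, self._seq, user, job))
        self._seq += 1

    def next_job(self):
        """Pop the highest-priority job, or None if the queue is empty."""
        if not self._heap:
            return None
        _, _, user, job = heapq.heappop(self._heap)
        return (user, job)

q = FacilityQueue(owner="lab_pi")
q.submit("grad_student", "community-sim")
q.submit("lab_pi", "owner-run")
assert q.next_job() == ("lab_pi", "owner-run")  # owner jumps the queue
```

The tuple ordering `(tier, seq, …)` is what enforces the agreement: an owner job submitted late still runs before any waiting community job.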
Partnering Example:
Centralized Life Science Data Service

    "Any researcher within the IU School of Medicine should be able to
     transparently query all relevant public external data sources and all
     sources internal to the IU School of Medicine to which the researcher
     has read privileges"

   Based on use of IBM DiscoveryLink
   BLAST is accessible via DiscoveryLink's wrappers
   Implemented in partnership with IBM Life Sciences via the
    IU-IBM strategic relationship in the life sciences
   IU contributed the writing of data parsers
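The federation idea in the quote above (one query, fanned out to every source the researcher may read) can be sketched as a small dispatch function. This is an illustration of the pattern only, not the DiscoveryLink API; the source names, data, and result format are invented.

```python
def federated_query(query, sources, user_privileges):
    """Dispatch one query to every registered source the researcher
    has read privileges for; return per-source results."""
    results = {}
    for name, search_fn in sources.items():
        if name not in user_privileges:
            continue  # transparently skip sources without read access
        results[name] = search_fn(query)
    return results

# Toy "sources": a public sequence store plus an internal med-school store.
sources = {
    "public_sequences": lambda q: [s for s in ["BRCA1", "BRCA2", "TP53"] if q in s],
    "internal_trials":  lambda q: [r for r in ["BRCA1-cohort"] if q in r],
}

# A researcher with access only to the public source:
hits = federated_query("BRCA", sources, user_privileges={"public_sequences"})
assert hits == {"public_sequences": ["BRCA1", "BRCA2"]}
```

In the real service the wrappers translate the one query into each source's native interface (including BLAST searches), which is what makes the federation "transparent" to the researcher.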
Demonstrating New Capabilities

   6 continents, 641 processors
   Global analysis of evolutionary relationships
   1st-place HPC Challenge Award winner at the SC03 conference
   Demonstrates new capabilities in grid computing while
    advancing research in evolutionary biology
Partnering & Collaborations

   AVIDD: 20 faculty, dozens of staff, $1.8M in NSF funding
   Research in Indiana: 3 universities, dozens of faculty
   Simulation of a 747 crashing into the Pentagon: dozens of
    engineers, 1 network, 2 supercomputers
   IP-Grid: 2 universities, dozens of faculty, $3M in NSF funding
   INGEN: 100+ faculty, hundreds of staff, $105M funding from the
    Lilly Endowment, Inc.
   Global NOC: dozens of staff supporting thousands of
    researchers worldwide
New Front Offices

   Consolidation of research support services into
    "one-stop shopping"
   All research technology consulting (HPC, storage,
    Linux, visualization, statistics, markup languages…)
   Reference librarians, Institutional Review Board,
    Technology Transfer, Contracts & Grants, …
Miscellaneous Wins

   "The Least You Need to Know" series
   Leverage in licensing
      Fully subsidized
      Low per-user cost

   Ongoing work…
     Leveraging IU's general-support KnowledgeBase for
      research support
     More internal partnering for leverage in services
National and International Agenda

   NSF: National Cyberinfrastructure Report

      Recognition that IT infrastructure is essential for
       advancing scientific research
      Planning for leverage and scale is essential; the
       "one-off" project model is inadequate
      An unfunded roadmap…but influential

The "Business" of "eScience"

     Establishing effective organizational designs and shared
      organizational routines that achieve investment objectives
         » We will develop them, or they will be imposed by NSF

     Developing economies of scale beyond a "lab"
         » Service-level agreements from Resource Providers

     Maturing support mechanisms, security, and documentation
      for the masses
Dear Colleagues…

   "…NSF has identified a management model to support ETF
    management and operations. The model identified includes
    one System Management Group (SMG), nine Resource
    Providers (RPs) and an ETF Advisory Board (EAB). The
    respective roles and responsibilities of the SMG, the RPs
    and the EAB are defined…"
Economies of Scale?

   Are there economies of scale in activity X?
   Can universities capture these economies?

   [Chart: cost ($) falling as the number participating grows]

   At which level?
      • Faculty member & school?
      • School & campus?
      • Campus & university system?
      • Among universities?
      • Among nations?
      • Between domains?
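The intuition behind this slide's cost curve (average cost falls as more participants share a fixed investment, flattening toward the marginal cost) can be shown with a toy average-cost function. The dollar figures below are invented for illustration.

```python
def cost_per_participant(fixed, marginal, n):
    """Average cost when n participants share one fixed investment:
    each pays an equal share of the fixed cost plus their own
    marginal (per-participant) cost."""
    return fixed / n + marginal

# Hypothetical numbers: a $1M machine room plus $2k of support per researcher.
solo   = cost_per_participant(1_000_000, 2_000, 1)
shared = cost_per_participant(1_000_000, 2_000, 500)

assert solo == 1_002_000   # one researcher bears the whole fixed cost
assert shared == 4_000     # 1_000_000/500 + 2_000
```

Whether a university can actually capture these economies (the slide's second question) depends on whether the sharing can cross the school, campus, system, or national boundaries listed above.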