Welcome To The 34th
HPC User Forum Meeting
October 2009
      Thank You To:
HLRS/University of Stuttgart
 For Hosting The Meeting!
Thank You To Our Sponsors!

Altair Engineering
Bull
IBM
Microsoft
Introduction: Logistics

We have a very tight agenda (as usual)
   Please help us keep on time!

Review Agenda Times:
   Please take advantage of breaks and free
    time to network with attendees
   Note: We will post most of the
    presentations on the web site
 HPC User Forum Mission


To improve the health of the high-performance computing industry through open discussions, information-sharing and initiatives involving HPC users in industry, government and academia, along with HPC vendors and other interested parties.
 HPC User Forum Goals

Assist HPC users in solving their ongoing computing,
technical and business problems
Provide a forum for exchanging information, identifying
areas of common interest, and developing unified positions
on requirements
       By working with users in other sectors and vendors
       To help direct and push vendors to build better products
       Which should also help vendors become more successful
Provide members with a continual supply of information on:
         Uses of high end computers, new technologies, high end
          best practices, market dynamics, computer systems and
          tools, benchmark results, vendor activities and strategies
Provide members with a channel to present their
achievements and requirements to interested parties
1Q 2009 HPC
Market Update
Q109 HPC Market Result – Down 16.8%

HPC Servers (total): $2.1B
   Supercomputers (over $500K): $802M
   Divisional ($250K-$500K): $237M
   Departmental ($100K-$250K): $754M
   Workgroup (under $100K): $282M

Source: IDC, 2009
Q109 Vendor Share in Revenue

[Pie chart of Q1 2009 HPC vendor revenue shares; legible segments: HP 28.9%, NEC 9.4%, Sun 3.6%, Fujitsu 3.4%, Cray 2.9%, SGI 1.5%, Bull 0.6%, Appro 0.4%, Dawning 0.3%, Other 11.7%]
Q109 Cluster Vendor Shares

   HP: 30.8%
   Dell: 21.7%
   IBM: 12.3%
   Sun: 5.7%
   Fujitsu: 2.6%
   NEC: 1.8%
   SGI: 1.1%
   Bull: 1.1%
   Appro: 0.7%
   Dawning: 0.6%
   Other: 21.6%
HPC Compared
    To IDC
Server Numbers
HPC Qview Tie To Server Tracker:
1Q 2009 Data
Tracker QST Data Focus: Compute Nodes
HPC Qview Data Focus: The Complete System ("Everything needed to turn it on")

   All WW servers as reported in the IDC Server Tracker (QST): $9.9B
   HPC Qview compute node revenues: ~$1.05B*
   HPC special revenue recognition services: ~$474M
      Includes systems sold through custom engineering, R&D offsets, or paid
      for over multiple quarters
   HPC computer system revenues beyond the base compute nodes: ~$576M
      Includes interconnects and switches, inbuilt storage, scratch disks, OS,
      middleware, warranties, installation fees, service nodes, special cooling
      features, etc.

* This number ties the two data sets on an apples-to-apples basis.
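The reconciliation arithmetic can be checked directly from the figures above. The short Python sketch below is not part of the original deck; it is purely illustrative and uses only the numbers quoted on these slides, summing the Qview components and comparing the result with the $2.1B HPC server total from the market-update slide.

```python
# Illustrative check of the Qview-to-Tracker reconciliation, using only the
# figures quoted on these slides (all values in millions of US dollars).

qview_compute_nodes = 1050   # ~$1.05B, the piece that ties to the IDC Server Tracker
special_recognition = 474    # HPC special revenue recognition services
uplift_beyond_nodes = 576    # system revenue beyond the base compute nodes

qview_total = qview_compute_nodes + special_recognition + uplift_beyond_nodes
print(f"Qview complete-system total: ~${qview_total}M")   # ~$2,100M, i.e. the $2.1B headline

# The same headline should roughly equal the sum of the competitive segments
# from the Q109 market-update slide.
segments = {"Supercomputers": 802, "Divisional": 237,
            "Departmental": 754, "Workgroup": 282}
print(f"Segment total: ~${sum(segments.values())}M")      # ~$2,075M, consistent with ~$2.1B
```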
2010 IDC HPC Research Areas

• Quarterly HPC Forecast Updates
    Until the world economy recovers
• New HPC End-user Based Reports:
    Clusters, processors, accelerators, storage, interconnects,
     system software, and applications
    The evolution of government HPC budgets
    China and Russia HPC trends
• Power and Cooling Research
• Developing a Market Model For Middleware and
  Management Software
• Extreme Computing
• Data Center Assessment and Benchmarking
• Tracking Petascale and Exascale Initiatives
 Agenda: Day One

12:45   HPC User Forum Welcome/Introductions: Steve Finn (Chair,
        HPC User Forum) and Earl Joseph (IDC)
13:00   HLRS Welcome/Introductions: Michael Resch, HLRS
13:15   Michael Resch, HLRS, A European View of HPC
13:45   Robert Singleterry, NASA, HPC Directions, Issues and
        Concerns
14:15   Tom Sterling, Trends and New Directions in HPC
14:45   ISC Update
15:00   Break
15:45   Jim Kasdorf, Pittsburgh Supercomputing Center, National
        Science Foundation Directions
16:15   Erich Schelkle, ASCS/Porsche, End User HPC Site Update
16:45   Vijay K. Agarwala, Developing a Coherent Cyberinfrastructure
        from Local Campus to National Facilities
17:00   Thomas Eickermann, Juelich Research Center, PRACE
        Program Update
17:30   Networking Get-together
18:30   End of first day
   Welcome
To Day 2 Of The
HPC User Forum
    Meeting
 Agenda: Day Two

9:10    Welcome/Logistics – Earl Joseph and Steve Finn,
        BAE Systems
9:15    Jack Collins, National Cancer Institute Update,
        Directions and Concerns
9:45    Marie-Christine Sawley, ETH Zurich, CERN group, Data taking
        and analysis at unprecedented scale: the example of CMS
10:15   Paul Muzio, HPC Directions at the City University of
        New York
10:45   Bull Technology Update, Jean-Marc Denis
11:30   Break
11:45   Lutz Schubert, HLRS, Workflow Management
12:15   New Software Technology Directions at
        Microsoft, Wolfgang Dreyer
12:30   Wrap up and plans for future HPC User Forum
        meetings, Michael Resch, Earl Joseph and Steve Finn
12:35   Farewell and Lunch
 Important Dates For Your Calendar

FUTURE HPC USER FORUM MEETINGS:

October 2009 International HPC User Forum Meetings:
   HLRS/University of Stuttgart, October 5-6, 2009
    (midday to midday)
   EPFL, Lausanne, Switzerland, October 8-9, 2009
    (midday to midday)

US Meetings:
   April 12 to 14, 2010 Dearborn, Michigan at the
    Dearborn Inn
   September 13 to 15, 2010 Seattle, Washington
     Thank You
For Attending The 34th

  HPC User Forum
       Meeting
 Questions?


Please email:
hpc@idc.com



Or check out:
www.hpcuserforum.com
Welcome To The 35th
HPC User Forum Meeting
October 2009
        Thank You To:
Ecole Polytechnique Fédérale de
       Lausanne (EPFL)
   For Hosting The Meeting!
Thank You To Our Sponsors!

Altair Engineering
Bull
IBM
Microsoft
Introduction: Logistics

We have a very tight agenda (as usual)
   Please help us keep on time!

Review Agenda Times:
   Please take advantage of breaks and free
    time to network with attendees
   Note: We will post most of the
    presentations on the web site
 HPC User Forum Mission


To improve the health of the high-performance computing industry through open discussions, information-sharing and initiatives involving HPC users in industry, government and academia, along with HPC vendors and other interested parties.
 HPC User Forum Goals

Assist HPC users in solving their ongoing computing,
technical and business problems
Provide a forum for exchanging information, identifying
areas of common interest, and developing unified positions
on requirements
       By working with users in other sectors and vendors
       To help direct and push vendors to build better products
       Which should also help vendors become more successful
Provide members with a continual supply of information on:
         Uses of high end computers, new technologies, high end
          best practices, market dynamics, computer systems and
          tools, benchmark results, vendor activities and strategies
Provide members with a channel to present their
achievements and requirements to interested parties
1Q 2009 HPC
Market Update
Q109 HPC Market Result – Down 16.8%

HPC Servers (total): $2.1B
   Supercomputers (over $500K): $802M
   Divisional ($250K-$500K): $237M
   Departmental ($100K-$250K): $754M
   Workgroup (under $100K): $282M

Source: IDC, 2009
Q109 Vendor Share in Revenue

[Pie chart of Q1 2009 HPC vendor revenue shares; legible segments: HP 28.9%, NEC 9.4%, Sun 3.6%, Fujitsu 3.4%, Cray 2.9%, SGI 1.5%, Bull 0.6%, Appro 0.4%, Dawning 0.3%, Other 11.7%]
Q109 Cluster Vendor Shares

   HP: 30.8%
   Dell: 21.7%
   IBM: 12.3%
   Sun: 5.7%
   Fujitsu: 2.6%
   NEC: 1.8%
   SGI: 1.1%
   Bull: 1.1%
   Appro: 0.7%
   Dawning: 0.6%
   Other: 21.6%
HPC Compared
    To IDC
Server Numbers
HPC Qview Tie To Server Tracker:
1Q 2009 Data
Tracker QST Data Focus: Compute Nodes
HPC Qview Data Focus: The Complete System ("Everything needed to turn it on")

   All WW servers as reported in the IDC Server Tracker (QST): $9.9B
   HPC Qview compute node revenues: ~$1.05B*
   HPC special revenue recognition services: ~$474M
      Includes systems sold through custom engineering, R&D offsets, or paid
      for over multiple quarters
   HPC computer system revenues beyond the base compute nodes: ~$576M
      Includes interconnects and switches, inbuilt storage, scratch disks, OS,
      middleware, warranties, installation fees, service nodes, special cooling
      features, etc.

* This number ties the two data sets on an apples-to-apples basis.
2010 IDC HPC Research Areas

• Quarterly HPC Forecast Updates
    Until the world economy recovers
• New HPC End-user Based Reports:
    Clusters, processors, accelerators, storage, interconnects,
     system software, and applications
    The evolution of government HPC budgets
    China and Russia HPC trends
• Power and Cooling Research
• Developing a Market Model For Middleware and
  Management Software
• Extreme Computing
• Data Center Assessment and Benchmarking
• Tracking Petascale and Exascale Initiatives
 Agenda: Day One

14:00   HPC User Forum Welcome/Introductions, Steve Finn and
        Earl Joseph
14:15   EPFL Welcome/Introductions, Henry Markram, EPFL and
        Giorgio Margaritondo, VP, EPFL
14:30   Neil Stringfellow, CSCS/ETHZ, HPC Strategy in
        Switzerland, Swiss National Supercomputing Centre
15:00   Henry Markram, Felix Schuermann, EPFL, and David
        Turek, IBM, "Blue Brain Project Update"
15:30   IBM Research Partnerships, Dave Turek
15:45   Altair Technology Update, Paolo Masera
16:00   Jack Collins, National Cancer Institute Update, Directions
        and Concerns
16:30   Break
16:45   Markus Schulz, CERN, High-Throughput Computing
17:15   Robert Singleterry, NASA
18:00   End of First Day
   Welcome
To Day 2 Of The
HPC User Forum
    Meeting
 Agenda: Day Two

9:00    Welcome/Logistics – Earl Joseph and Steve Finn, BAE
        Systems, Summarizing the September '09 User Forum
9:00    Victor Reis, US Department of Energy
9:30    Alan Gray, EPCC End User Site Update, University of
        Edinburgh
10:00   Jim Kasdorf, Pittsburgh Supercomputing Center, "National
        Science Foundation Directions"
10:30   Thomas Eickermann, Juelich Supercomputing Centre,
        PRACE Project Update
11:00   Break
11:15   Panel on Using HPC to Advance Science-Based Simulation
        Panel Moderators: Henry Markram and Steve Finn
        Panel Members: Jack Collins, Thomas Eickermann, Victor
        Reis, Felix Schuermann, Markus Schulz and Neil
        Stringfellow
12:15   New Software Technology Directions at Microsoft
12:30   Wrap up and plans for future HPC User Forum meetings,
        Henry Markram, Earl Joseph and Steve Finn
12:45   Farewell and Lunch
 Important Dates For Your Calendar

FUTURE HPC USER FORUM MEETINGS:

October 2009 International HPC User Forum Meetings:
   HLRS/University of Stuttgart, October 5-6, 2009
    (midday to midday)
   EPFL, Lausanne, Switzerland, October 8-9, 2009
    (midday to midday)

US Meetings:
   April 12 to 14, 2010 Dearborn, Michigan at the
    Dearborn Inn
   September 13 to 15, 2010 Seattle, Washington
     Thank You
For Attending The 35th

  HPC User Forum
       Meeting
 Questions?


Please email:
hpc@idc.com



Or check out:
www.hpcuserforum.com
OEM Mix Of HPC Special Revenue Recognition Services

Non-SEC reported product revenues = $474M
   IBM: 43%
   HP: 30%
   Dell: 12%
   Sun: 4%
   Other: 11%

Notes:
• Includes product sales that are not reported by OEMs as product revenue in a given quarter
     Sometimes HPC systems are paid for across a number of quarters or even years
• Includes NRE – if required for a specific system
• Includes custom engineering sales
• Some examples – Earth Simulator, ASCI Red, ASCI Red Storm, DARPA systems, and many small
  and medium HPC systems that are sold through a custom engineering or services group
  because they need extra things added
Areas Of HPC “Uplift” Revenues

How the $576M “uplift” revenues are distributed:
   Computer hardware (in cabinet): 45%
   Software: 16%
   External storage: 12%
   External interconnects: 12%
   Bundled warranties: 8%
   Misc. items: 7%
     Areas Of HPC “Uplift” Revenues

Notes:

* Computer hardware (in cabinet) -- hybrid nodes, service nodes,
  accelerators, GPGPUs, FPGAs, internal interconnects, in-built
  disks, in-built switches, special cabinet doors, special signal
  processing parts, etc.
* External interconnects -- switches, cables, extra cabinets to hold
  them, etc.
* External storage -- scratch disks, interconnects to them, cabinets to
  hold them, etc. (This excludes user file storage devices)
* Software -- includes both bundled and separately charged software
  if sold by the OEM, or on the purchase contract -- includes the
  operating system, license fees, the entire middleware stack,
  compilers, job schedulers, etc. (it excludes all ISV applications
  unless sold by the OEM and in the purchase contract)
* Bundled warranties
* Misc. items -- since the HPC taxonomy includes everything
  required to turn on the system and make it operational, this covers
  items like bundled installation services, special features and other
  add-on hardware, and even a special paint job if required
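As a rough illustration (not part of the original deck), the percentage split above can be converted into approximate dollar amounts of the ~$576M uplift total. The small Python sketch below assumes only the rounded percentages from the preceding slide.

```python
# Approximate dollar breakdown of the ~$576M "uplift" revenue, applying the
# rounded percentages from the preceding slide (illustrative only).

uplift_total_m = 576  # ~$576M of revenue beyond the base compute nodes
shares = {
    "Computer hardware (in cabinet)": 0.45,
    "Software": 0.16,
    "External storage": 0.12,
    "External interconnects": 0.12,
    "Bundled warranties": 0.08,
    "Misc. items": 0.07,
}
for category, share in shares.items():
    print(f"{category}: ~${uplift_total_m * share:.0f}M")
```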
Special Paint Jobs Are Back …




         http://www.afrl.hpc.mil/consolidated/hardware.php

				