To 30kW and Beyond: High-Density Infrastructure Strategies that Improve Efficiencies and Cut Costs

Emerson Network Power: The global leader in enabling Business-Critical Continuity

Emerson Technologies

• Uninterruptible Power
• Power Distribution
• Surge Protection
• Transfer Switching
• DC Power
• Precision Cooling
• High Density Cooling
• Racks
• Rack Monitoring
• Sensors and Controls
• KVM
• Real-Time Monitoring
• Data Center Software




Emerson Network Power –
An organization with established customers




Presentation topics


• Emerson Network Power overview

• “High Density Equals Lower Cost: High Density Design
  Strategies for Improving Efficiency and Performance,” Steve
  Madara, Vice President and General Manager, Liebert North
  America Precision Cooling, Emerson Network Power

• “Sandia National Laboratories’ Energy Efficient Red Sky
  Design,” David Martinez, Facilities Coordinator, Computing
  Infrastructure and Support Operations, Sandia National Laboratories

• Question and Answer session




High Density Equals Lower Cost:
High Density Design Strategies
for Improving Efficiency and
Performance
Steve Madara
Vice President and General Manager
Liebert North America Precision Cooling
Emerson Network Power



Agenda

• Industry trends and challenges
• Three strategies for enhancing cooling efficiency
• High density equals lower cost




Industry Trends and Challenges




Trends and challenges

• Server design
   – Higher ΔT across the server, raising server leaving-air temperatures

[Diagram: traditional cooling unit]




Trends and challenges

• Regulatory
   – ASHRAE 90.1 Standard – 2010 Revisions
   – Refrigerant legislation
• Discussion around lower dew point limits
• Raising cold aisle temperatures
   – Intended to improve cooling capacity and efficiency
   – Monitor the impact on server fans
• No cooling




Spring 2010 Data Center Users’ Group survey

[Stacked-bar charts: distribution of average and maximum kW per rack, now and in two years; response bands: 2 kW or less, >2-4 kW, >4-8 kW, >8-12 kW, >12-16 kW, >16-20 kW, >20-24 kW, greater than 24 kW, unsure]

• Average kW per rack: ~8 kW now; expected to reach 10-12 kW in two years
• Maximum kW per rack: >12 kW now; expected to reach 14-16 kW in two years
Spring 2010 Data Center Users’ Group survey:
Top data center issues

• Experienced hot spots – 40%
• Run out of power – 26%
• Experienced an outage – 23%
• Experienced a “water event” – 23%
• N/A – have not had any issues – 23%
• Run out of cooling – 18%
• Run out of floor space – 16%
• Excessive energy costs – 13%
• Other – 2%
Spring 2010 Data Center Users’ Group survey:
Implemented or considered technologies

                                      Already      Plan to    Still        Considered, but   Will not
Technology                            implemented  implement  considering  decided against   consider   Unsure
Fluid economizer on chiller plant     14%          5%         14%          10%               28%        29%
Fluid economizer using dry coolers    16%          4%         14%          7%                31%        28%
Air economizer                        16%          5%         23%          11%               22%        24%
Cold aisle containment                28%          11%        38%          14%               6%         3%
Containerized/modular data center     5%           2%         24%          18%               35%        16%
Wireless monitoring                   7%           5%         44%          9%                18%        18%
Solar array                           1%           3%         14%          10%               47%        25%
Rack based cooling                    11%          6%         38%          18%               13%        15%
DC power                              6%           3%         18%          18%               35%        20%
Impact of design trends on the cooling system

[Diagram: a 4 kW server with an 18 F ΔT moves 70% of the CRAC airflow, exhausting at 78 F; 30% by-pass air at 60 F mixes with it, returning 72 F air to the CRAC, which supplies 58 F air at 100% airflow. CRAC design ΔT: 21 F; operating ΔT: 16 F]

• Legacy data center designs with poor airflow management resulted in significant by-pass air
   – A typical CRAC unit is designed for a ΔT of around 21 F at 72 F return air
   – Low-ΔT servers plus significant by-pass air lowered the actual CRAC ΔT, and thus its efficiency
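
As a back-of-the-envelope check on the diagram (a minimal Python sketch, not from the deck), the ~72 F return temperature is simply the mass-weighted mix of server exhaust and by-pass air:

    # Return air temperature at the CRAC when some supply air by-passes the servers
    def crac_return_temp(t_server_out, t_bypass, bypass_fraction):
        return (1 - bypass_fraction) * t_server_out + bypass_fraction * t_bypass

    # Diagram values: servers exhaust at 78 F; 30% of the air by-passes at 60 F
    print(round(crac_return_temp(78, 60, 0.30), 1))   # 72.6 F, matching the ~72 F shown
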
Impact of higher temperature rise servers

[Diagram: an 8 kW server with a 35 F ΔT moves 60% of the CRAC airflow, exhausting 91 F air into a 75 F hot aisle; 40% by-pass air at 56 F mixes with it, returning 75 F air to the CRAC, which supplies 54 F air at 100% airflow. CRAC ΔT: 21 F]

• Even with improved best practices, newer servers can create challenges in existing data centers
   – A server ΔT greater than the CRAC unit’s ΔT capability requires more airflow to meet the server kW load
   – The high server exit temperatures demand even more careful management of the return air to the cooling unit
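
The 40% by-pass figure falls out of the ΔT mismatch. In an idealized sketch (assuming by-pass air returns at roughly supply temperature), the CRAC ΔT is the server ΔT diluted by the by-pass fraction:

    # Return mix gives dt_crac = (1 - b) * dt_server, so b = 1 - dt_crac / dt_server
    def bypass_fraction(dt_server, dt_crac):
        return 1 - dt_crac / dt_server

    print(f"{bypass_fraction(dt_server=35, dt_crac=21):.0%}")   # 40%, as in the diagram
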
Introducing Efficiency Without Compromise™

Improving performance of the IT infrastructure and environment:

• Balancing high levels of availability and efficiency
• Adapting to IT changes for continuous optimization and design flexibility
• Delivering architectures from 10-60 kW/rack to minimize space and cost
• Expertise & Support
Three Strategies for Enhancing Cooling Efficiency




Strategy 1: Getting the highest temperature to the
cooling unit

• Higher return air temperatures increase the cooling unit capacity and
  efficiency
• Increases the CRAC unit ΔT
• Increases the sensible heat ratio (SHR), so less capacity goes to dehumidification




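To see why return temperature drives capacity: at fixed airflow and supply temperature, a coil’s sensible capacity scales with the air-side ΔT. A minimal sketch using the standard-air approximation Q [BTU/hr] ≈ 1.08 × CFM × ΔT (illustrative numbers, not from the deck):

    def sensible_capacity_kw(cfm, t_return_f, t_supply_f):
        # Q [BTU/hr] = 1.08 * CFM * dT; 3412 BTU/hr = 1 kW
        return 1.08 * cfm * (t_return_f - t_supply_f) / 3412

    # Same unit, same 12,000 CFM, same 58 F supply -- only the return temperature changes
    print(sensible_capacity_kw(12000, 72, 58))   # ~53 kW
    print(sensible_capacity_kw(12000, 95, 58))   # ~140 kW
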
Raising the temperature at the room level

[Diagram: two room layouts, each with three CRAC units and rows of racks, contrasting uncontained and contained aisles]

Without containment, hot-air wrap-around will occur and will limit maximum return temperatures:
• Impacted by
   – The length of the rack rows (long rows are difficult to predict)
   – Floor tile placement
   – Rack consistency and load
• Improved with
   – Ducted returns
   – Local high density modules

With containment, hot-air wrap-around is eliminated and server supply temperatures are controlled:
• Containment can be at the room or zone level
• Supply air control provides consistency between containments
• Can be used in combination with high density modules or row based cooling
Strategy 2: Providing the right cooling and airflow

• Efficiency is gained when:
      Server Load (kW) = Cooling Unit Capacity (kW)
      Server Airflow = Cooling Unit Airflow
• Challenges
   – Rising server ΔT results in higher by-pass airflow to meet the cooling load
   – Requires variable speed fans and control of airflow independent of the cooling capacity
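
A rough sketch of the airflow side of that balance, using the same standard-air relation, CFM = kW × 3412 / (1.08 × ΔT) (illustrative values, not from the deck):

    def required_cfm(load_kw, delta_t_f):
        # Airflow needed to carry load_kw at a given air-side temperature rise
        return load_kw * 3412 / (1.08 * delta_t_f)

    server_cfm = required_cfm(8, 35)   # ~722 CFM through an 8 kW, 35 F-rise server
    crac_cfm   = required_cfm(8, 21)   # ~1,204 CFM if the CRAC only sustains a 21 F rise
    # The ~480 CFM difference (~40% of CRAC airflow) becomes by-pass air unless
    # fan speed and cooling capacity can be controlled independently.
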
Strategy 3: Provide the most efficient heat rejection
components




• Reduce cooling unit fan power
• Maximize use of economization mode to reduce compressor hours of
  operation (chiller / compressor)
• Improve condenser heat transfer and reduce power
• Direct cooling – eliminate fans and compressor hours

High Density Equals Lower Cost




High density equals lower cost

Adding 2,000 kW of IT at 20 kW/rack vs. 5 kW/rack:
• Smaller building for high density
• Fewer racks for high density
• Cooling capital equipment costs more
• Equipment installation costs are higher
• High density cooling is more efficient

                                                  Cost difference,
                                                  low density vs. high density
Building capital costs @ $250/sq. ft.             ($1,875,000)
Rack and PDU capital costs @ $2,500 each          ($750,000)
Cooling equipment capital costs                   $320,000
Installation costs                                $750,000
Capital cost total                                ($1,555,000)
Cooling operating costs (1 yr)                    ($420,480)
Total net savings of a high density design        ($1,975,480)
5 yr total net savings of high density            ($3,657,400)

(Figures in parentheses are costs avoided, i.e. savings from the high density design.)
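
A quick sketch reproducing the table’s arithmetic (negative values are savings for the high density design):

    building      = -1_875_000   # smaller building
    racks_pdus    =   -750_000   # fewer racks and PDUs
    cooling_equip =    320_000   # high density cooling gear costs more...
    installation  =    750_000   # ...and costs more to install
    capital_total = building + racks_pdus + cooling_equip + installation
    print(capital_total)                     # -1,555,000

    cooling_opex = -420_480                  # per year
    print(capital_total + cooling_opex)      # -1,975,480 net, year one
    print(capital_total + 5 * cooling_opex)  # -3,657,400 net over five years
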
Total cost of ownership benefits

Traditional cooling:
• Fan power: 8.5 kW per 100 kW of cooling
• Average entering air temperature of 72-80 F

Liebert XD:
• Fan power: 2 kW per 100 kW of cooling
• Average entering air temperature of 95-100 F

[Chart: kW of chiller capacity per kW of sensible heat load – latent load, fan load, and sensible load – for a traditional CW CRAC, a CW enclosed rack, and refrigerant modules]

• 65% less fan power
• Greater cooling coil effectiveness
• 100% sensible cooling
• 20% less chiller capacity required
• Overall 30% to 45% energy savings, yielding a payback down to 1 year
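
To put the fan figures in perspective, a small sketch annualizing the difference; the $0.10/kWh tariff is an assumed illustrative rate, not a deck figure:

    trad_fan_kw, xd_fan_kw = 8.5, 2.0      # per 100 kW of cooling, from the slide
    savings_kw = trad_fan_kw - xd_fan_kw   # 6.5 kW per 100 kW of cooling

    kwh_saved = savings_kw * 8760          # ~56,940 kWh/yr per 100 kW of cooling
    print(kwh_saved * 0.10)                # ~$5,694/yr at an assumed $0.10/kWh
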
Solutions transforming high density to high efficiency

• XDC or XDP base infrastructure (160 kW); future pumping units of larger capacity and N+1 capability
• XDV10, XDO20, XDH20/32 standard cooling modules, 10-35 kW
• Embedded cooling in supercomputers (microchannel intercoolers), 35-75 kW
• XDR10/40 rear door cooling, 10-40 kW
• XDS20 component cooling without server fans, 20-40 kW, capable of >100 kW

(The embedded, rear door, and component cooling offerings are new and future product configurations.)
Rear door cooling

• Passive rear door cooling module
   – No cooling unit fans – server air flow only
   – Optimal containment system
   – Allows data centers to be designed in a “cascading”
     mode
• Simplifies the cooling requirements
   – Solves problem for customers without hot and cold
     aisles
   – Room neutral – does not require airflow analysis
   – No electrical connections




Customer layout

[Floor plan: rear door cooling modules arranged so that each row’s exhaust cascades to the next – the “cascade effect”]
Direct server cooling without server fans

Liebert XDS configuration
• A cooling rack which uses the Clustered Systems cooling technology to move heat directly from the server to the Liebert XD pumped refrigerant system
• Heat is never transferred to the air

[Diagram: cold plates contacting CPU 0 and CPU 1]

• Provides cooling for 36 1U servers
• Available initially in 20 kW capacity, with a 40 kW rack in the future

Benefits
• No cooling module fans or server fans
• 8% to 10% lower IT power requirements
• Chill Off 2 testing at 80 F fluid temperature resulted in an effective PUE < 1
• Next opportunity: XDP connection to a cooling tower without a chiller
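
A hedged sketch of how an “effective” PUE lands below 1: the metric stays referenced to the original IT load, so if fanless servers draw ~9% less power and the cooling overhead is smaller than that fan saving, the ratio dips under 1. The 4% cooling overhead below is an assumed illustration, not a Chill Off 2 measurement:

    it_baseline_kw = 100.0                  # conventional servers, fans included
    it_fanless_kw  = 91.0                   # ~9% lower draw with server fans removed
    cooling_kw     = 0.04 * it_fanless_kw   # assumed pump/heat-rejection overhead

    print((it_fanless_kw + cooling_kw) / it_baseline_kw)   # ~0.95 -> effective PUE < 1
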
Energy consumption comparisons

[Chart: energy consumption across cooling approaches; the direct-cooled configuration achieves an equivalent PUE < 1]
Industry evolving technology to improve data center
cooling efficiency




Sandia National Laboratories’ Energy Efficient
              Red Sky Design

                   David J. Martinez
                   Facilities Coordinator
                   Corporate Computing Facilities
                   Sandia National Laboratories

Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
Why build Red Sky?

What happened to consolidation?

• Addresses critical national need for High Performance
  Computing (HPC)

• Replaces an aging HPC system




System comparison

                      THUNDERBIRD               RED SKY
Racks                 140 (large)               36 (small)
Compute               50 teraflops total        10 teraflops per rack
Power per rack        ~13 kW full load          ~32 kW full load
Cooling               518 tons                  328 tons
Water use             12.7M gal per yr          7.3M gal per yr
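
Taking the slide’s figures at face value (36 racks × 10 teraflops/rack ≈ 360 teraflops for Red Sky), a quick sketch of the implied efficiency ratios:

    tbird_tf,  redsky_tf  = 50, 36 * 10
    tbird_ton, redsky_ton = 518, 328
    tbird_h2o, redsky_h2o = 12.7e6, 7.3e6   # gallons per year

    print(tbird_ton / tbird_tf, redsky_ton / redsky_tf)   # ~10.4 vs ~0.9 tons per teraflop
    print(tbird_h2o / tbird_tf, redsky_h2o / redsky_tf)   # ~254,000 vs ~20,300 gal per teraflop-year
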
Red Sky and new technologies

System design implemented three new technologies for
power and cooling:

• Modular Power Distribution Units
• Liebert XDP Refrigerant Cooling Units
• Glacier Doors

The facility had to be prepared…




Facility preparation

   3.5 months
   Zero accidents
   0.5 miles copper
   650 brazed connections
   400 ft. carbon steel
   140 welded connections




[Facility preparation photos]
New cooling solutions

Glacier Doors
• The first rack-mounted, refrigerant-based passive cooling system on the market

XDP Refrigerant Units
• A pumping unit that serves as an isolating interface between the building chilled water system and the pumped refrigerant (R-134a) circuit
• Operates above dew point
• No compressor
• Power is used to cool the computer, not to dehumidify
• 100% sensible cooling at 0.13 kW per kW of cooling
The total cooling solution

How it works
• 90% of the heat load is removed by the Liebert XDP-Glacier Door combination
• Uses a laminar air flow concept
• Perforated tiles are only needed in the 1st row
Laminar air flow

[Diagram: laminar air flow pattern across the rack rows]
How it all comes together

[Diagram: the chiller plant supplies 45 F chilled water to the Liebert XDP, which feeds pumped refrigerant to the Glacier Doors]
How it all comes together

[Diagram: seasonal operation. The loop serving the Liebert XDP and a plug fan CRAC unit runs at roughly 52-53 F supply and 61 F return. From April to September the chiller plant carries the load at 0.46 kW of power per ton of cooling; from October to March a plate frame heat exchanger carries it at 0.2 kW per ton]
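
For context, a small sketch converting those figures to COP (1 ton of cooling = 3.517 thermal kW) and blending the two six-month seasons; the even 50/50 split is an assumption:

    TON_KW = 3.517                       # thermal kW per ton of cooling

    chiller_kw_per_ton = 0.46            # April to September
    hx_kw_per_ton      = 0.20            # October to March (plate frame heat exchanger)

    print(TON_KW / chiller_kw_per_ton)   # COP ~7.6 on the chiller
    print(TON_KW / hx_kw_per_ton)        # COP ~17.6 on the water-side economizer
    print((chiller_kw_per_ton + hx_kw_per_ton) / 2)   # ~0.33 kW/ton annual average
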
Comparison of compute power and footprint

[Chart]

Tons of cooling used

[Chart]

Annual water consumption

[Chart]
Carbon footprint

                    Red Sky    Thunderbird
CO2E (tonnes)       203        912
Footprint (gha)     46         205
Chiller plant power consumption and cost

                          Thunderbird                Red Sky
                          (518 tons of cooling)      (328 tons of cooling)
kWh cost per year         $1,324,222                 $838,504
Kilowatt hours per year   15,954,483                 10,102,452

A 37% reduction in chiller plant energy use.
Energy consumption

                          Thunderbird (21 CRACs)     Red Sky (12 XDPs & 3 CRACs)
kWh cost per year         $126,791                   $28,256
Kilowatt hours per year   1,527,604                  340,437

A 77% reduction in cooling unit energy use.
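
The stated reductions check out against the kWh figures in the last two charts:

    for old, new in [(15_954_483, 10_102_452),   # chiller plant kWh
                     (1_527_604,  340_437)]:     # cooling unit kWh
        print(f"{(old - new) / old:.0%}")        # 37% and 78% (the deck rounds to 77%)
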
Q&A


Steve Madara, Vice President and General Manager,
Liebert North America Precision Cooling, Emerson
Network Power

David J. Martinez, Facilities Coordinator, Corporate
Computing Facilities, Sandia National Laboratories

Thank you for joining us!
• Look for more webcasts coming this fall!
• Follow @EmrsnNPDataCntr on Twitter




                         © 2010 Emerson Network Power

				