
Stay cool

A climate solution for the new computer centre of the Deutscher Wetterdienst (German Weather Service)


22 degrees Celsius and 55 percent humidity, preferably without any deviation, around the clock, 365 days a year: the supercomputers in the brand-new computer centre of the German Weather Service have very clear ideas about their “comfortable climate”. If it gets just a few degrees too warm, the computing components stop working. To prevent this, Michael Jonas, head of the German Meteorological Computer Centre (DMRZ), relies on a cleverly devised climate concept. The visible crowning piece of this cooling system sits enthroned at the very top, on the roof of the newly built weather headquarters: eight mighty heat exchangers with a total capacity of 2,600 kilowatts absorb the concentrated thermal load from the computer centre. On them, 80 large EC axial fans move an air volume of nearly one million cubic metres per hour so that the weather computers stay cool and keep on computing. Neighbours in the surrounding residential area can nevertheless sleep soundly: thanks to the infinitely variable and whisper-quiet EC motors, the sound pressure level at a distance of 50 metres is only 19 dB(A), far below the legal limit for residential areas.
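A minimal back-of-the-envelope sketch shows how these two headline figures fit together: spreading 2,600 kilowatts over nearly one million cubic metres of air per hour warms the air by only about 8 Kelvin. The air density and specific heat below are standard textbook values, not figures from the article:

```python
# Plausibility check for the roof heat exchangers: how much does the air
# warm up when 2,600 kW are rejected into ~1,000,000 m3/h of airflow?

P_watts = 2_600_000        # total heat exchanger capacity (from the article)
flow_m3_per_h = 1_000_000  # air volume moved by the 80 EC axial fans (from the article)

rho_air = 1.2              # kg/m3, air density at ~20 °C (assumed)
cp_air = 1005.0            # J/(kg·K), specific heat of air (assumed)

mass_flow = rho_air * flow_m3_per_h / 3600.0   # kg/s
delta_t = P_watts / (mass_flow * cp_air)       # K

print(f"air mass flow: {mass_flow:.0f} kg/s")
print(f"temperature rise across the exchangers: {delta_t:.1f} K")
# -> roughly 8 K, a sensible spread for dry coolers of this size
```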

And now for the weather

For a good weather forecast, the modern meteorologist needs a lot of experience and a great deal of measured data, but above all one thing: computing power! Numeric weather models cover the globe with a finely meshed mathematical grid. The finer this digital fishnet stocking, the more realistic the results, and the greater the hunger for power. A hunger, however, that Michael Jonas cannot quench without limits. Since he is responsible not only for the technology but also for the budget, he must keep an eye on the cost-effectiveness of the computer centre. This also applies to the cooling of the two computer rooms. In fact, the computers would prefer to be a few degrees cooler; the value of 22 degrees is a compromise between power consumption and the availability and reliability of the systems. On an area of over 1,000 square metres, computers line up next to computers, racks next to racks, cabinets next to cabinets, and the floor space is continuously expanding. For comparison: in 2003 the DMRZ reached a computing capacity of 3,000 gigaflops, around 3,000 billion computing operations per second, roughly the power of 20,000 PCs. In the final construction stage in 2012, Michael Jonas will make 50 teraflops available to his users. That is around 50,000 billion computing operations per second, the performance of around 400,000 PCs. Whoever wants to compute that much also needs a lot of electricity: even today the systems, including the required cooling, draw a good 600 kilowatts. A value that will increase to roughly 2,000 kilowatts by 2012.
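How quickly that hunger grows can be sketched with simple arithmetic. Halving the mesh width quadruples the number of grid columns covering the globe, and because the model's time step must shrink along with the cell size (the CFL stability condition), the total compute cost rises roughly with the cube of the refinement. The grid spacings below are illustrative values, not the DWD's actual model resolutions:

```python
# Illustrative grid-refinement scaling for a global model (not the DWD's).
# Columns ~ 1/spacing**2; with the time step tied to the cell size,
# compute cost ~ 1/spacing**3.

EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of the Earth

def grid_columns(spacing_km: float) -> int:
    """Grid columns needed to tile the globe at a given spacing."""
    return round(EARTH_SURFACE_KM2 / spacing_km**2)

for spacing_km in (40.0, 20.0, 10.0):
    rel_cost = (40.0 / spacing_km) ** 3   # cost relative to the 40 km grid
    print(f"{spacing_km:>4.0f} km grid: {grid_columns(spacing_km):>11,} columns, "
          f"~{rel_cost:.0f}x compute")
```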

Current in, heat out

As a rule of thumb: what goes in as electrical current must come out again as heat. Various factors therefore have to be taken into account when planning the air conditioning for the computer centre: in addition to room size, power draw, redundancy and thermodynamics, energy efficiency plays a growing role. The DMRZ relies on a chilled-water pump circuit (with a 5,000-litre reservoir in the cellar) as well as forced cooling with Stulz CyberAir precision air-conditioning systems. In the cold months, external air is fed in for cooling; as the outside temperature sinks, so does the load on the compressors in the air-conditioning systems. Intelligent standby management reduces this load even further: it brings the reserve capacity into play and distributes the load equally across all systems, which then run in the partial-load range and thus very economically. Further savings potential comes from the fans in the CyberAir units. Just like their big “colleagues” on the roof, they are powered by electronically commutated EC motors and deliver precisely the airflow that is needed: no less, but also no more. They adapt steplessly to every load requirement and run very efficiently at partial load.
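The economy of the partial-load range follows from the fan affinity laws: airflow scales linearly with fan speed, while shaft power scales roughly with its cube. A minimal, idealised sketch (real duct systems deviate somewhat from the pure cube law):

```python
# Idealised fan affinity laws: flow ~ speed, power ~ speed**3.
# Sharing the same total airflow among more fans at lower speed
# cuts the combined power draw dramatically.

def fleet_power(n_fans: int, total_flow: float,
                rated_flow: float = 1.0, rated_power: float = 1.0) -> float:
    """Combined power of n identical fans sharing total_flow equally."""
    speed_fraction = (total_flow / n_fans) / rated_flow
    return n_fans * rated_power * speed_fraction ** 3

print(fleet_power(1, 1.0))  # 1.0  -> one fan at full speed: 100 % power
print(fleet_power(2, 1.0))  # 0.25 -> two fans at half speed, same airflow: 25 % power
```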

Controlled air traffic

The cold air generated by the air-conditioning systems is directed down into the raised floor and from there along to the racks and computer cabinets. The “cold-aisle/hot-aisle containment” principle provides an optimal cooling circuit: in a cold aisle, the cool air from the air conditioners passes through the perforated floor plates and is drawn in by the computer fans. The heated air then flows into the opposite hot aisle, rises to the ceiling and returns to the air conditioner. A water-glycol mixture absorbs the excess heat and transfers it to the heat exchangers on the building's roof. In the cool months, the heat exchangers have little to do; hardly any waste heat reaches them. The reason: a heat pump extracts the heat from the coolant and uses it to heat the German Weather Service's office areas, which total over 22,000 square metres. The computers therefore not only supply the meteorologists and scientists with the computing power they long for, they also keep the offices cosily warm.
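The coolant flow such a loop needs follows directly from Q = ṁ · cp · ΔT. A small sketch using the article's 2,600-kilowatt load; the temperature spread and the specific heat of the glycol mixture are assumed values for illustration:

```python
# How much water-glycol must circulate to carry the full heat load?
# Q = m_dot * cp * delta_T  ->  m_dot = Q / (cp * delta_T)

Q_kw = 2600.0     # thermal load (from the article)
cp_glycol = 3.6   # kJ/(kg·K), typical for a ~30 % glycol mix (assumed)
delta_t = 6.0     # K, supply/return temperature spread (assumed)

m_dot = Q_kw / (cp_glycol * delta_t)   # kg/s
print(f"required coolant flow: {m_dot:.0f} kg/s "
      f"(~{m_dot * 3.6:.0f} m³/h at ~1 t/m³)")
# -> roughly 120 kg/s, on the order of 430 m³/h
```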

