One of the lesser-known areas of the Barcelona Biomedical Research Park (PRBB) is its computing brain. The Data Centre (DC) located on floor 0 is the space that guarantees the high-performance computational research that takes place in the Park. A 120 m² area houses the servers and other electronic equipment for the computing service that facilitates data storage, processing and use.
In order to optimise its present and future performance, several improvement works have been carried out over the last 6 months. At el·lipse we have had the opportunity to talk to Oliver Blanco, head of the Infrastructure Department of the PRBB Consortium, to learn first-hand about the changes that have been made.
The Data Centre (DC)
Taking into account the space occupied by the 6 centres of the Park, their groups and services, and the Consortium, the Data Centre of the PRBB currently has only 4% of its space free, meaning it is at almost maximum capacity. It is estimated that the DC houses around 6,500 CPUs (Central Processing Units), storing about 10.4 Petabytes (or 10,400 Terabytes) of information.
To get a more visual idea of how much information this is, some estimates suggest that one Petabyte of storage, if printed out as pages of text, would fill 20 million large folders. Inside these folders would be over 500,000,000,000 (five hundred billion) pages of nothing but text. If we do the calculations, 10 PB would be 200,000,000 (two hundred million) folders and 5,000,000,000,000 (five trillion!) pages.
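The scaling behind these figures can be checked with a few lines of Python. The bytes-per-page and pages-per-folder values below are assumptions chosen to be consistent with the article's estimates (roughly 2 KB per page of plain text, about 25,000 pages per folder), not exact measurements:

```python
# Back-of-the-envelope check of the petabyte-to-pages estimate.
# Assumed constants (not from the article):
BYTES_PER_PAGE = 2_000      # ~2 KB of plain text per printed page
PAGES_PER_FOLDER = 25_000   # pages held by one "large folder"

def pages_and_folders(petabytes: float) -> tuple[int, int]:
    """Return (pages, folders) of plain text for a given storage size in PB."""
    total_bytes = petabytes * 1e15          # 1 PB = 10^15 bytes
    pages = int(total_bytes / BYTES_PER_PAGE)
    folders = int(pages / PAGES_PER_FOLDER)
    return pages, folders

# 1 PB  -> about 500 billion pages in about 20 million folders
print(pages_and_folders(1))   # (500000000000, 20000000)
# 10 PB -> about 5 trillion pages in about 200 million folders
print(pages_and_folders(10))  # (5000000000000, 200000000)
```

With these assumed constants, the numbers scale linearly: ten times the storage gives ten times the pages and ten times the folders.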
Improving cooling reliability
The processing and storage of information relies on equipment with very demanding cooling requirements. High temperatures cause electronic devices to lose performance and shorten their lifespan, which is why it is recommended that the room where these devices are housed should not exceed 24 °C. “To give you an idea, with the previous air conditioning system – which was 18 years old! – when a cooling machine broke down, we could go from 24 °C to 30 °C in just 10 minutes”, says Blanco.
Thus, a large part of the work consisted of replacing the old refrigerant-gas cooling machines with water-based ones, which are much more efficient. This change is not only more sustainable, thanks to the removal of the refrigerant gas, but also reduces the economic and energy demands of the Park. “Bearing in mind that the new generations of racks are high-density, with much higher power consumption than the current ones, we have recalculated the power dissipation of the DC with future equipment in mind”, adds Blanco.
In order to guarantee more efficient cooling of the racks, small cubicles have been installed inside the DC itself to house the different IT equipment, reducing the surface area and the volume of air to be cooled. In the centre of these closed spaces, ventilation grilles in the floor expel cold air. As shown in the image below, the racks draw in this cold air from inside the enclosure, pass it through their interiors to cool them, and expel the hot air outside the space where they are located.
In addition, taking advantage of the change of cooling machines, the building’s central cooling plants have been connected to the main generators in order to ensure the production of cooling water. This manoeuvre also interconnects the entire system, so that other machinery outside the DC can be supplied with cold water for climate-control purposes.
“The PRBB and its DC have made an enormous qualitative leap in terms of security and scalability of the facilities”
Oliver Blanco, head of the Infrastructure Department of the PRBB Consortium.
All of this, the six new cooling machines as well as their connection to the generators, has enabled the DC to work with redundant equipment and guarantee the cooling supply at all times. “Based on a very high level of knowledge of the facilities, and by making small – not very expensive – investments at the right time, we have succeeded in making a spectacular qualitative leap in terms of the security of the facilities and their continuity. We now have safe production of cooling water. In other words, we can guarantee that even if there is a power cut, we will be able to cool the DC. Such a level of security in the tertiary sector is only found in hospitals or very critical facilities”, Blanco assures.
Density and weight of servers
The speed at which servers and hard disks are evolving to become more powerful and store ever more information is a challenge for an infrastructure such as the Data Centre. “Over the last few years we have had a lot of problems with weight requirements. The density of hard drives, especially backup drives, has evolved a lot lately, and even though they are getting smaller, they are getting heavier. Where once you could fit one hard drive, now you can fit 10”, says Blanco. This is why, little by little, specific actions have been carried out in the installation to support the weight of the new equipment.
With the current refurbishment, interventions have been carried out throughout the entire DC. The floor, which previously supported 400 kg/m², can now bear a weight of up to 1,500 kg/m² over almost the entire surface of the room. To reinforce this space and ensure this resistance, two actions have been implemented:
- At the level of the room itself, a grid has been installed under the racks, which allows the weight to be better distributed over the entire floor slab.
- At the structural level, 9 steel beams have been installed under the Data Centre, on the parking ramp, which support the floor of the facility.
These works have brought the Data Centre up to the current energy and structural requirements of the space. They have also prepared it for the future requirements and extensions that, given the speed at which technological equipment evolves, it will sooner or later need.
In all these improvement works, the PRBB is committed to taking advantage of the moment to introduce small changes that make the Park a more sustainable building, always bearing in mind the needs that a scientific infrastructure of excellence may have in the future.