
As summer temperatures rise, data center operators are hard at work ensuring data center temperatures remain consistent without raising their energy bills. They are looking for innovative data center cooling systems, monitoring technology, architecture and more. With so many options on the market, and more entering each year, it can be hard to decide which one is best for your data center.
We hear this from many of our customers and have compiled a list of five options modern data centers employ to beat the summer heat.
1) Employ Regularly Scheduled Maintenance
2) Optimize Server Racks for Cooling Efficiency
3) Rethink Your Data Center Architecture
4) Increase Data Center Temperatures
5) Utilize Liquid Cooling
Employ Regularly Scheduled Maintenance
Regularly scheduled maintenance plays a significant role in sustaining data center uptime in the summer heat. A regular preventative maintenance schedule, covering everything from server racks to your facility’s cooling system, ensures everything is running as it should. Without it, by the time temperatures rise it may already be too late, and the result could be a data center disaster.
Optimize Server Racks for Cooling Efficiency
In a high-density data center, it’s critical to have an effective cable management structure to improve cooling at the rack level. With so much equipment in a modern rack, cables can take up a good deal of the available space. If you don’t dress, tie, and manage them properly, they will start to block airflow. We recommend streamlining cables so that hot air within the rack can flow easily to the back of the rack. A data center manager should also consider purchasing server racks with built-in cable management.
Another item that can be optimized within a server rack for cooling efficiency is the positioning of the power distribution units (PDUs). If not positioned correctly, PDUs can block airflow within a cabinet. Our engineers strongly recommend a recessed PDU cavity to alleviate this pain point. A recessed PDU cavity moves the PDU outward from the cabinet, away from the area where it would typically block hot air. Discuss these options with your current server rack supplier to confirm they offer a model with this design or, if they do not, the capability to manufacture a custom server rack to your needs. So, what do you do with the space in a rack that does not hold equipment?
Blanking panels inside server racks can be a cost-effective way to improve data center cooling. They cover open spaces without equipment and seal the face of the rack, so cool air enters only from the cold aisle and hot air exhausts into the hot aisle rather than recirculating through empty slots. The color of a server rack is also an important detail in improving cooling efficiency at the rack level.
Many data centers are adopting white hardware as their standard instead of black because of its reflective properties. Consider this: white server racks reflect about 80 percent of light, while black server racks reflect only about 5 percent. What does that mean from a cost-savings perspective? If you changed your all-black data center to an all-white data center, you could reduce your lighting requirements by up to 30 percent. That is a tangible saving on your monthly electric bill. The best part of this suggestion? Manufacturers typically charge the same price for a white cabinet as for a black cabinet with the same specs, so there is no additional investment.
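For readers who like to see the math, here is a minimal sketch of that savings estimate in Python. The 30 percent lighting reduction comes from the figure above; the baseline lighting load and electricity rate are hypothetical placeholders you would swap for your own facility’s numbers.

```python
# Rough illustration of the potential lighting savings from switching to white racks.
# The 30 percent reduction figure comes from the article; the lighting load and
# electricity rate below are hypothetical assumptions, not real facility data.

LIGHTING_LOAD_KW = 12.0      # assumed installed lighting load for the data hall
HOURS_PER_MONTH = 24 * 30    # lights running around the clock
RATE_PER_KWH = 0.10          # assumed utility rate, USD per kWh
REDUCTION = 0.30             # up to 30 percent fewer lighting requirements

baseline_cost = LIGHTING_LOAD_KW * HOURS_PER_MONTH * RATE_PER_KWH
savings = baseline_cost * REDUCTION

print(f"Baseline monthly lighting cost: ${baseline_cost:,.2f}")
print(f"Estimated monthly savings with white racks: ${savings:,.2f}")
```

With these placeholder values the savings come to roughly $260 per month; the point is simply that the reflectance difference translates into a line item you can actually estimate for your own facility.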
Rethink Your Data Center Architecture
Hot and cold aisle designs alternate rows of hot and cold aisles to improve cooling efficiency. The concept is simple: cold air enters the front of the racks in the cold aisle, and hot exhaust air exits the back of the racks into the hot aisle. Many companies are now taking this a step further and implementing hot and cold aisle containment systems to further lessen air mixing and reduce the operational costs associated with cooling.
The goal for cold aisle containment is to create a smaller area to cool. The cold row is capped at the top of the cabinets and across the aisle, and doors are installed at the ends of the rows to contain cold air.
With hot aisle containment systems, hot air is isolated with vertical panels to reduce energy use and costs. This barrier prevents hot and cold air mixing and directs exhaust airflow into an AC return, which will increase the capacity of computer room air conditioning (CRAC) units.
Deciding between the two systems depends on your data center’s specific needs. Still, our engineers encourage hot aisle containment solutions if your data center has recently increased in density or will be increasing in density.
Increase Data Center Temperatures
Increasing data center temperature to reduce cooling costs may seem counterintuitive when the goal is to maintain uptime, but running your data center above the typical 68°F to 71°F range has been shown to increase data center efficiency. Google has shared that it keeps its data centers as warm as 80°F to reduce energy usage. Keep in mind that to employ this method, you need to invest in high-efficiency equipment, and you must also have a robust monitoring system to prevent overheating.
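To make that last point concrete, here is a minimal sketch of the kind of threshold check a monitoring system might run when setpoints are raised. The sensor names, thresholds, and sample readings are assumptions for illustration, not a specific vendor’s API.

```python
# Minimal sketch of a rack-inlet temperature check that a monitoring system might
# run when operating at elevated setpoints. Sensor names, thresholds, and sample
# readings are hypothetical assumptions for illustration only.

WARN_F = 80.0   # warn as inlet temperatures approach the elevated operating point
CRIT_F = 89.0   # assumed hard limit before equipment is at risk

def check_inlet_temps(readings):
    """Return alert messages for any sensor above the warning or critical threshold."""
    alerts = []
    for sensor, temp_f in readings.items():
        if temp_f >= CRIT_F:
            alerts.append(f"CRITICAL: {sensor} at {temp_f:.1f}F (limit {CRIT_F}F)")
        elif temp_f >= WARN_F:
            alerts.append(f"WARNING: {sensor} at {temp_f:.1f}F (threshold {WARN_F}F)")
    return alerts

if __name__ == "__main__":
    sample = {"rack-a1-inlet": 76.4, "rack-b3-inlet": 81.2, "rack-c2-inlet": 90.5}
    for alert in check_inlet_temps(sample):
        print(alert)
```

In practice a check like this would feed whatever alerting or building management tooling you already run; the takeaway is that raising setpoints only pays off when overheating is caught early.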
Utilize Liquid Cooling
The last item we will review is liquid cooling. Liquid cooling is a budding data center trend, and the concept is much like that of a radiator. Although it is gaining popularity, many data center operators are hesitant to bring liquids into their data centers, and many also argue that liquid-based cooling is more expensive than air-based cooling. Still, Data Center Dynamics shared “that with total immersion cooling technology, expenses such as chillers, computer room air handling (CRAH) equipment, raised floors or ducting are no longer necessary.” In the same article, they also shared that without CRAH equipment and fans inside servers, “liquid cooling technology can reduce energy bills by up to 80 percent.” It will be interesting to see how liquid cooling trends over the next few years with the rise of high-density data centers.
Data Center Cooling for Modern Day Data Centers
Our engineers shared five solutions to help keep your data center cool this summer:
1) Employ Regularly Scheduled Maintenance
2) Optimize Server Racks for Cooling Efficiency
3) Rethink Your Data Center Architecture
4) Increase Data Center Temperatures
5) Utilize Liquid Cooling
We predict that as data center cooling concerns continue to rise, so will data center cooling best practices and technology. We’d love to hear how you maintain your cool while keeping your costs down in your data center; let us know in the comments below.
Are you looking for a partner to help you keep your data center cooling costs down? Learn more about DAMAC’s premium data center solutions at https://www.maysteel.com/data-center-solutions/ or contact our team of engineers.
©2019 Maysteel Industries, LLC