
In the past year, Artificial Intelligence (AI) has surged forward like a digital renaissance, echoing the rapid and transformative rise of the Internet in the late 1990s. It has revolutionized industries and redefined our daily lives with extraordinary speed and impact.
The impact of AI is set to grow even more substantially in the coming years. In fact, investment in generative AI reached $25.2 billion in 2023, nearly nine times the amount invested in 2022 and approximately 30 times the funding seen in 2019, as highlighted by Stanford University’s AI Index report.
The growth of AI also presents data center companies with opportunities to innovate, expand their service offerings, and cater to the evolving needs of AI-driven applications and enterprises. By embracing AI technologies and adapting their infrastructure and operations accordingly, data centers play a crucial role in enabling the broader adoption and success of AI across various sectors.
However, it also comes at a cost. AI currently requires 4.3 GW of data center power, and that demand is projected to reach up to 18 GW by 2028, a growth rate that outpaces overall data center power demand growth and presents capacity and sustainability challenges for data center players. Data centers must adapt to meet the evolving power needs of AI-driven applications effectively and sustainably.
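To put that projection in perspective, here is a back-of-the-envelope calculation of the implied growth rate. It is a sketch only: it assumes the 4.3 GW figure is a 2024 baseline and that demand compounds evenly through 2028; neither assumption comes from the figures cited above.

```python
# Back-of-the-envelope estimate of the implied annual growth rate in AI
# data center power demand. Assumptions (ours, not the article's): the
# 4.3 GW figure is a 2024 baseline and growth compounds evenly to 2028.
current_gw = 4.3      # AI data center power demand today (GW)
projected_gw = 18.0   # projected AI data center power demand by 2028 (GW)
years = 4             # assumed 2024 -> 2028 horizon

cagr = (projected_gw / current_gw) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")  # roughly 43% per year
```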
AI requires purpose-built data centers. It is not just an add-on application for existing facilities; it requires a fundamentally different architecture, including specialized IT infrastructure and dedicated power and cooling systems.
Powering Sustainable AI Data Centers
We predict that AI workloads will grow two to three times faster than legacy data center workloads and represent 15 to 20% of all data center capacity by 2028. More workloads will also start moving closer to the users at the edge.
Training Large Language Models (LLMs) often necessitates thousands of Graphics Processing Units (GPUs) working in unison. Large AI clusters typically draw roughly 1 MW to 2 MW, with rack densities ranging from 25 kW to 120 kW depending on the GPU model and quantity. These densities have significant implications for the power and cooling infrastructure behind each rack.
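As a rough illustration of what those figures mean for cluster layout, the sketch below converts cluster power and rack density into a rack count. The specific pairings are hypothetical examples drawn from the ranges above, not vendor specifications.

```python
import math

# Illustrative rack-count math for an AI training cluster, using the power
# and density ranges cited above. The specific pairings are hypothetical.
def racks_needed(cluster_mw: float, rack_kw: float) -> int:
    """Number of racks needed to house a cluster with the given total IT
    load (MW) at a given per-rack power density (kW)."""
    return math.ceil(cluster_mw * 1000 / rack_kw)

print(racks_needed(1.0, 25))   # 1 MW at 25 kW/rack  -> 40 racks
print(racks_needed(2.0, 120))  # 2 MW at 120 kW/rack -> 17 racks
```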
Currently, most data centers can only support peak rack power densities of about 10 to 20 kW. Deploying tens or hundreds of racks, each exceeding 20 kW, in an AI cluster will present substantial infrastructure challenges for data centers. At Schneider Electric, we specialize in optimizing physical infrastructure to meet AI requirements. Leveraging our site engineering expertise, we support customers in transitioning from low-density to high-density configurations.
We recently collaborated with NVIDIA to revolutionize data center infrastructure, enabling advancements in edge AI and digital twin technologies. We have since launched three retrofit reference designs for data center operators looking to add AI clusters to an existing facility, as well as a scalable new-build design for operators building an IT space dedicated to AI clusters. All of these designs are tailored for NVIDIA’s accelerated computing clusters and optimized for various applications, including data processing, engineering simulation, electronic design automation, computer-aided drug design, and generative AI.
By addressing the evolving demands of AI workloads, these reference designs will provide a robust framework for integrating NVIDIA’s accelerated computing platform into data centers, enhancing performance, scalability, and sustainability.
With rising energy costs and increasing environmental concerns, data centers must also prioritize energy-efficient hardware and infrastructure, including high-efficiency power supplies and renewable energy sources to reduce operational costs and carbon footprints. Schneider Electric’s infrastructure designs not only support AI operations but also address future energy challenges by facilitating the development of scalable data centers.
Keeping AI Data Centers Cool
AI data centers generate substantial heat. This increased heat output necessitates the use of liquid cooling to ensure optimal performance, sustainability, and reliability.
After the IT equipment itself, cooling systems are the second-largest energy consumer in a data center. In less densely utilized traditional data centers and distributed IT locations, cooling can account for 20 to 40% of the facility’s total energy consumption.
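To see what that share implies for overall facility efficiency, the simplified sketch below relates the cooling fraction to Power Usage Effectiveness (PUE). It assumes, purely for illustration, that IT and cooling are the only loads; real facilities also carry distribution losses, lighting, and other overhead that push PUE higher.

```python
# Rough relationship between cooling's share of facility energy and PUE
# (Power Usage Effectiveness = total facility energy / IT energy).
# Simplifying assumption (ours, not the article's): IT and cooling are the
# only loads, so PUE = 1 / (1 - cooling_share).
for cooling_share in (0.20, 0.30, 0.40):
    pue = 1 / (1 - cooling_share)
    print(f"cooling at {cooling_share:.0%} of facility energy -> PUE ~ {pue:.2f}")
```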
Liquid cooling is an architecture that offers data center companies many benefits at comparable server densities, including higher energy efficiency, a smaller footprint, lower total cost of ownership (TCO), enhanced server reliability, and lower noise. In Asia, data center companies are actively transitioning to liquid cooling to reduce power consumption at current densities, though not every approach adopted for that purpose is compatible with the requirements of AI workloads.
As the demand for AI processing power grows and thermal loads increase, liquid cooling has become a critical element in data center design, necessitating an innovative and comprehensive yet flexible approach. At Schneider Electric, we support customers in adopting liquid cooling with a diverse portfolio that covers everything from white space solutions to heat rejection strategies.
We also recently published a white paper titled “Navigating Liquid Cooling Architectures for Data Centers with AI Workloads.” This resource is designed to help data center companies navigate the intricacies of liquid cooling, providing insights into system design, implementation, and operational considerations.
Impact of AI on Sustainability
The dual nature of AI’s impact is evident: it has the potential to optimize energy usage, yet it also raises concerns about increased energy consumption.
Some industry experts foresee that accelerated computing, which drives the AI revolution, will enable us to achieve more with fewer resources in data center infrastructure. As accelerated compute increases individual rack density, the total number of racks in a data center might decrease substantially. Essentially, accelerated computing holds the promise of achieving greater efficiency overall.
However, it is crucial to carefully evaluate AI’s broader impact on energy consumption and the environment. In fact, Gartner predicts that 80% of CIOs will have performance metrics tied to the sustainability of the IT organization by 2027. When aiming to decrease energy consumption and CO2 emissions in IT and data centers, companies need to establish a factual baseline and have access to real-time and historical data.
Software enables companies to measure and report their data center performance based on historical data and trend analysis. Our EcoStruxure IT data center infrastructure management (DCIM) software combines this with AI and real-time monitoring to turn it into actionable insights for improved sustainability for customers. According to Forrester’s Total Economic Impact study commissioned by Schneider Electric, companies can achieve gross savings on utility expenses of up to 22.5% through EcoStruxure’s advanced energy management capabilities, which optimize power usage and cooling efficiency.
The recent addition of new model-based, automated sustainability reporting features to our EcoStruxure IT software offers even more visibility into energy and resource consumption, historical data analysis, and detailed metrics. These features combine 20 years of sustainability, regulatory, data center, and software development expertise with advanced machine learning to help organizations meet imminent regulatory reporting requirements.
Schneider Electric’s Commitment to Data Center Sustainability
Data centers operate with significant energy demands, posing challenges to environmental sustainability. Through optimizing energy efficiency, lowering carbon emissions, and enhancing operational resilience, Schneider Electric is committed to enabling data centers to operate responsibly, fostering a more sustainable future.
One example of our commitment to sustainability is our active participation in industry collaborations. Schneider Electric is a founding coalition member of the Infrastructure Masons Climate Accord, together with 50 companies. The accord aims to decrease carbon emissions across digital infrastructure materials, products, and power. It also seeks to establish global standards for carbon accounting in digital infrastructure, shaping market decisions to propel the industry toward achieving carbon neutrality.
Schneider Electric also released an industry-first guide to addressing the new physical infrastructure design challenges data centers face in supporting the shift to AI-driven workloads, setting the gold standard for AI-optimized data center design. The white paper, titled “The AI Disruption: Challenges and Guidance for Data Center Design,” provides invaluable insights and acts as a comprehensive blueprint for companies seeking to leverage AI to its fullest potential within their data centers, including a forward-looking view of emerging technologies to support high-density AI clusters in the future.