What Does the End of Moore’s Law Mean for the Data Center Industry? | Data Center Knowledge
In case you missed it, Moore’s Law – which predicts that computing power will double at a steady rate over time – is dead, or, at best, slowly dying. Computer chips are no longer gaining processing capacity as quickly as in decades past.
What does this change mean for data centers? Quite a bit, potentially. Read on to see how the slowdown in computing power growth could affect the data center industry.
What is Moore’s Law, and why is it dead?
Moore’s Law, named after Intel co-founder Gordon Moore, who proposed the concept in 1965, is the principle that the number of transistors that engineers can fit inside computer chips roughly doubles every two years. By extension, the computing power of the average chip should increase at the same rate, and the cost businesses pay for processing power should decrease.
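To make the doubling rule concrete, here is a minimal sketch of the idealized trend. The 1971 baseline (Intel's 4004, roughly 2,300 transistors) is a real data point; the projected figures are what the two-year doubling implies, not actual product counts.

```python
def projected_transistors(base_count: int, base_year: int, year: int) -> int:
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return round(base_count * 2 ** doublings)

# Idealized projection from the Intel 4004's ~2,300 transistors in 1971.
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, projected_transistors(2300, 1971, year))
```

Fifty years of doubling every two years takes ~2,300 transistors to tens of billions, which is roughly where flagship chips landed before the trend began to flatten.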
For decades, Moore’s statement proved to be mostly accurate. Computing capacity increased roughly at the rate he predicted.
But that is no longer true. While it may be too early to say that Moore’s Law is definitely dead, there is reason to believe that we have reached the physical limits of silicon-based CPUs. Without a practical alternative, engineers can no longer increase the computing power of chips as quickly or as cheaply as they have in years past.
It’s certainly possible that smart people will find ways around the current limitations of silicon — or that quantum computers could eventually become practical and completely change the game around computing power. But for now, the data shows that the rate of increase in processing power is slowing, with no clear sign that that trend will change anytime soon.
Moore’s Law and Data Centers
The fact that CPU capacity is not growing as quickly could have several major implications for data centers.
More data centers
Perhaps the most obvious is that we are likely to see more data centers being built.
This would probably happen even if Moore’s Law still held. Demand for digital services has long outpaced increases in processing power, meaning companies have had to increase the footprint of their IT infrastructure even as the per-server processing power of that infrastructure has increased.
But in a post-Moore’s Law world, we will need even more data centers. If servers stop growing more powerful from year to year, the only way to meet increases in user demand will be to deploy more servers, which means building more data centers.
Data Center Sustainability Challenges
An increase in the total number of data centers will exacerbate existing challenges related to data center sustainability. More servers mean higher rates of energy consumption, especially if the processing power delivered per watt stops improving along with transistor density.
This will likely mean that data center providers that can offer clean energy procurement will become even more attractive. So will the next generation of data center technologies, such as immersion cooling, which can reduce the carbon footprint of data center facilities.
More companies are entering the chip market
For decades, a relatively small number of vendors — namely Intel and AMD — have dominated the market for the computer chips that go into commodity servers. These companies could deliver steadily increasing processing power, which gave other businesses little incentive to get into the chip-making game.
But that has changed in recent years as companies like AWS have begun building their own chips, and the aging of Moore’s Law is likely to push such businesses to invest even more in CPU technology. The reason is that they will be looking for new and better ways to squeeze efficiency out of chips, especially for the specific use cases they deploy those CPUs for.
In other words, in a world where generic CPUs aren’t getting more powerful and cheaper by the year, companies have greater incentive to create their own special CPUs optimized for the use cases that matter most to them.
Workload optimization is increasing in importance
Reducing the CPU consumption of workloads has always been a smart move for businesses looking to save money on hosting costs. But in a post-Moore’s Law world, workload optimization will become even more crucial.
This means we will likely see more workloads move to containers, for example. The FinOps and cloud cost optimization market is also likely to boom as more and more businesses seek strategies to maximize the efficiency of their workloads.
The data center industry grew up in a world where computer chips were always growing in power and decreasing in cost. But that world is gone. We now live in the post-Moore’s Law era, or close to it.
The result is likely to be more data centers, more special-purpose CPUs, and greater pressure on businesses to optimize their data centers. Data center providers and their customers will have to adapt — or, alternatively, cross their fingers that the quantum revolution eventually happens and makes computing power ridiculously cheap, though that’s probably not a winning strategy.