Jon Atkin: The Power of Data Gravity 

Gregor: What components need to be in place to attract AI infrastructure investments in Germany?

Jonathan: To support AI infrastructure, we need an energy supply that is both cheap and clean. But beyond energy, we also need a robust power grid. By "grid," I mean the infrastructure required to transmit electricity effectively. Additionally, for low-latency AI applications, reliable network access is essential, though the specific requirements depend on the type of AI being deployed. For example, model training doesn’t require significant network access, as latency isn’t a critical factor. However, for inference—where split-second performance is necessary—proximity to network junctions becomes much more important. 

While all AI applications require power, the emphasis on "green" power is shifting slightly. The big Internet companies continue to pursue green energy goals, but with the rise of AI, the sheer need for energy often outweighs the green aspect. As a result, energy sources like natural gas, which is not 100% green but cleaner than many alternatives, have become more relevant. Two or three years ago, natural gas was seen as a declining energy source for data centers, but it is now on an increasing trajectory due to the growing demand for reliable energy. 

Gregor: Does AI change the game also for nuclear power plants? 

Jonathan: Well, we are experiencing a renaissance in nuclear power, too. It is zero-carbon, though not without challenges like radiation and waste management. Both large-scale nuclear plants and small modular reactors (SMRs) are gaining interest. Companies like Microsoft, Amazon, and Oracle have made public statements—and in some cases, commitments—indicating their support for nuclear energy. Nuclear power is being actively discussed as a viable option for meeting growing energy needs, though it is not without controversy. 

Nuclear energy faces the challenge of "NIMBY" (Not In My Backyard) objections. While nuclear energy proponents argue that modern technology has mitigated risks seen in past incidents, such as those in the 1970s or at Fukushima in 2011, public resistance to locating plants near residential areas persists. The key question is where new nuclear energy projects could be sited to minimize opposition and maximize feasibility. 

“Nuclear power plants face objections from the local communities – NIMBY: Not-in-my-backyard.” 

In the U.S., for example, Amazon has committed to using 300 megawatts of nuclear power from an existing plant in Pennsylvania, which may eventually scale up to a gigawatt. However, this energy won’t be immediately useful due to grid constraints. Significant investments will be needed to reinforce transmission infrastructure to connect this capacity to high-demand areas like Northern Virginia. Similarly, Microsoft has committed to reviving a retired nuclear plant at Three Mile Island, also in Pennsylvania, to support its energy needs. These projects highlight the potential of leveraging existing nuclear infrastructure to address energy shortages. 

Gregor: My observation with nuclear power is: for a single plant to start delivering energy, you have to wait more than 15 years, and you have to pay upwards of €30 billion in advance. 

Jonathan: Building new nuclear plants is indeed expensive and time-consuming. However, small modular reactors (SMRs) might offer a more compact and decentralized option. While still costly, they require less time and resources to deploy compared to traditional nuclear plants. That said, SMRs are not without challenges, particularly in gaining acceptance from local communities. 

For hyperscalers and other large energy consumers, nuclear power is becoming an integral part of the conversation around sustainable energy. As the demand for AI and other compute-heavy workloads grows, nuclear energy may play a key role in ensuring a stable and scalable energy supply. The future of these projects, however, will depend on overcoming public resistance, securing funding, and addressing regulatory hurdles. 

Gregor: Also, this debate is as old as time. Have you seen a small nuclear reactor in action? 

Jonathan: The layperson’s argument would be that nuclear submarines, which use even smaller reactors, have operated for decades without significant mishaps. This perspective suggests that scaling up from the technology used in submarines, rather than scaling down from large nuclear power plants, could be a more effective approach to deploying SMRs. However, while there are valid safety arguments from an engineering standpoint, the challenge arises when trying to place these reactors near communities. Small nuclear facilities, too, will face the NIMBY problem. 

As you pointed out, the cost calculations also remain uncertain because SMRs have not yet been deployed at scale. Without widespread adoption, it’s difficult to gauge their economic viability. I have heard subject matter experts argue that the existence of different categories of nuclear technology complicates decision-making. If left to the free market, the process of determining the best approach could take a long time. On the other hand, a standardized approach—such as the one adopted by France in the past—might streamline development and reduce costs. However, achieving this level of standardization would require significant industrial policy changes and regulatory alignment. 

Gregor: I like your submarine argument, but I still doubt we will see new nuclear power plants in any shape or form at scale anytime soon. However, coming back to the factors that make up a successful AI infrastructure: are there other major topics to address? 

Jonathan: Regulation remains a significant factor. As cloud requirements have grown, the need for local permitting and land use approvals has become increasingly important. Even if suitable land is available, its zoning may restrict it to industrial or agricultural use, requiring additional permits. Every local jurisdiction has its own set of hurdles, and national regulations add another layer of complexity. These issues aren’t unique to data centers; even something like building a shopping center involves similar debates. However, data centers face additional scrutiny because they are fundamentally more power-intensive than shopping centers or office buildings. 

The substantial energy demand of data centers creates unique constraints on the grid, elevating them to a special category. Historically, acquiring both land and energy was considered a unified process, but now these are often treated as separate challenges. Environmental regulations further complicate matters. For instance, if a data center is located near a small stream or river, developers must consider potential impacts on water quality, noise, and even wildlife, such as migratory birds in regions like California. 

Additionally, data centers often use water for cooling, which introduces another environmental consideration. Beyond these physical and environmental challenges, labor availability is another critical factor. Building and outfitting data centers requires highly skilled electrical contractors who are also in demand for constructing other structures, such as semiconductor manufacturing facilities. This competition for specialized labor adds another layer of difficulty for the industry. 

Gregor: Talking about labor, how many jobs do big data centers create locally? 

Jonathan: The construction phase of a data center creates hundreds of jobs, providing a temporary boost to the local economy. However, once the facility is operational, it typically requires only about 30 full-time staff to run around the clock. This means that while data centers generate significant activity and investment during construction, the labor demand diminishes once they are built, and workers often move on to the next project. As a result, data centers don’t provide a permanent increase in local employment but can contribute significantly to the local tax base. 

Gregor: Can data centers be considered anchor investments for local economic development? 

Jonathan: Ireland is a prime example of how tax incentives attracted major investments and led to a significant increase in local GDP and wealth. When Microsoft and Facebook (now Meta) first expanded to Europe, they chose Dublin. The Irish Development Authority (IDA) encouraged this by offering tax incentives, including lower corporate taxes, making Ireland an appealing destination for large internet companies. Similarly, in the U.S., some jurisdictions have little to no sales tax, which further incentivizes companies to locate there. 

While energy costs are often considered the primary factor in determining where to build AI data centers, the tax environment is equally critical. Servers, which must be replaced every 3–5 years, represent a significant expense and are taxed according to their location. Jurisdictions with low single-digit taxes can provide a considerable cost advantage compared to those with higher rates. This tax consideration is factored into the total cost of ownership over the 15–25 years a data center is expected to operate. 
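
To make the tax effect concrete, here is a minimal back-of-the-envelope sketch in Python. The fleet cost, refresh cycle, and tax rates are hypothetical figures chosen for illustration, not numbers from the conversation; it simply shows how a tax applied to every server refresh accumulates over a facility’s 15–25 year lifetime.

```python
# Illustrative sketch: how sales tax on recurring server refreshes adds up
# over a data center's total cost of ownership (TCO). All figures below are
# assumptions for illustration only.

def server_tax_over_lifetime(server_capex: float,
                             refresh_years: int,
                             lifetime_years: int,
                             sales_tax_rate: float) -> float:
    """Total tax paid on server hardware over the facility lifetime,
    assuming the full fleet is repurchased every refresh cycle."""
    refresh_cycles = lifetime_years // refresh_years  # number of fleet purchases
    return server_capex * refresh_cycles * sales_tax_rate

# Hypothetical example: a 500 M€ server fleet, refreshed every 4 years,
# over a 20-year facility lifetime, at three different tax rates.
fleet_capex = 500e6
for tax_rate in (0.00, 0.03, 0.07):
    tax_paid = server_tax_over_lifetime(fleet_capex, 4, 20, tax_rate)
    print(f"tax rate {tax_rate:.0%}: ~{tax_paid / 1e6:.0f} M€ in tax over 20 years")
```

Even a few percentage points of difference, applied to every refresh cycle, translate into a sizeable share of lifetime cost, which is why the tax environment is weighed alongside energy prices.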

In Europe, regional tax differences also play a role. While Germany’s tax rates appear to be uniform nationwide, other European countries have more variation. Ireland, for example, capitalized on this early, securing an early lead in cloud computing and social networking deployments by creating an attractive tax environment. 

Gregor: Will Frankfurt remain the German capital for IT infrastructure? 

Jonathan: Restrictions in grid and power supply in Frankfurt are pushing companies to explore Tier 2 locations for local presence. These companies are looking for land and power availability in other areas, but they will only commit if there is clear customer demand. Berlin is often mentioned as a potential second hub, and there is growing interest from major internet companies in exploring its capacity. However, meaningful demand in cities like Hamburg or Munich has yet to materialize, leaving them likely to remain secondary hubs for the foreseeable future. 

This situation is partly explained by the principle of data gravity. Established compute clusters, like those in Frankfurt, already house significant enterprise data. To perform useful computations or AI workloads, the infrastructure must be close to where the data resides. Frankfurt has become a magnet for investment due to this dynamic, and its central role is unlikely to diminish even with constraints on power and land. 

“Data center compute clusters often aggregate where data resides—this is data gravity.” 

For example, Mainova, the entity responsible for delivering a new substation to support Frankfurt’s power needs, may not complete its work until 2030—or potentially even 2032. Despite these delays, companies have little choice but to wait or get creative in finding small pockets of availability. Areas east of Frankfurt, such as Hanau and Offenbach, have become secondary zones of activity, but both are reportedly running out of power. Looking further east, locations like Dietzenbach might offer potential, as grid constraints appear less severe there. 

Once Mainova completes its infrastructure upgrades to the west of Frankfurt, a wave of new construction and capacity uptake is expected. While this may seem illogical given the constraints, Frankfurt’s role as a central hub persists due to the clustering effect of data centers. Much like a shopping mall that attracts visitors with a variety of stores, enterprises prefer to situate their operations near a variety of cloud providers. This clustering reduces operational risks and ensures access to multiple service options. 

Gregor: But does that mean no other location in Germany will ever have a chance to compete? 

Jonathan: Currently, no other German city has achieved the critical mass necessary to rival Frankfurt. However, North Rhine-Westphalia (NRW) shows promise. Microsoft, for instance, has obtained permits to construct a major site in the region, leveraging its historical industrial infrastructure, including access to brown coal and established network systems. This could potentially make NRW a second hub of data gravity for Germany, though construction has not yet begun. 

Berlin also remains a topic of interest. While it has yet to develop into a significant hub, its potential is being explored by some major players. For Microsoft and others, the strategy involves building near existing cloud clusters, ensuring flexibility to pivot from AI-specific workloads to other cloud services if demand shifts. 

Gregor: So, what is data gravity exactly? And is it really that powerful?   

Jonathan: The idea of moving data seamlessly around the world might seem straightforward, but in reality, it’s far more complicated and expensive than most people realize. For example, consider a medium-sized data center with around 1.2 petabytes of data. You might think that transferring this data over cutting-edge fiber optic networks would be quick and simple. However, even with advanced technology from companies like Ciena or Cisco, moving such a volume of data over a distance of just 300 kilometers could take months. Improving the speed and capacity of these networks is technically possible, but it would be prohibitively expensive. 

“Surprisingly, the fastest way to move large data is still by truck, not fiber.” 

Surprisingly, the fastest way to move a large amount of data is not through fiber optics but by physically transporting it via trucks. Companies like Amazon even had dedicated products for this purpose; I think they called it “Snowmobile”. The process involves downloading all the data onto storage devices, transporting them by truck to the destination, and then uploading the data at the other end. Astonishingly, this method remains faster than using the most advanced fiber optic networks available today. 
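
To put rough numbers on this, here is a minimal back-of-the-envelope sketch in Python. The sustained link throughputs and truck timings are assumptions for illustration, not figures from the conversation; the point is only how the two approaches compare as data volume grows.

```python
# Illustrative sketch: fiber transfer time vs. physical transport by truck.
# Throughputs and truck timings below are assumptions, not measured values.

def fiber_transfer_days(data_petabytes: float, effective_gbps: float) -> float:
    """Days to push the data set over a link at the given sustained throughput."""
    bits = data_petabytes * 1e15 * 8            # petabytes -> bits
    seconds = bits / (effective_gbps * 1e9)     # sustained goodput in bits per second
    return seconds / 86_400

def truck_transfer_days(copy_in_days: float, drive_days: float, copy_out_days: float) -> float:
    """Days to copy data onto storage devices, drive them over, and copy them off again."""
    return copy_in_days + drive_days + copy_out_days

data_pb = 1.2  # the medium-sized data center from the example
for gbps in (1, 10, 100):
    print(f"{gbps:>3} Gbit/s sustained: ~{fiber_transfer_days(data_pb, gbps):.0f} days over fiber")

# Assumed truck scenario: 5 days copying in, 1 day driving ~300 km, 5 days copying out.
print(f"truck: ~{truck_transfer_days(5, 1, 5):.0f} days door to door")
```

At modest sustained throughputs the fiber transfer stretches into weeks or months, while the truck scenario stays in the range of days, which is the intuition behind data gravity: past a certain volume, data effectively stops moving.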

Gregor: Thank you, Jonathan, for your time!