
AI DC -- Renaissance and New Thinking Required Article 5: Full Circle -- The Machine Is the Building Again, Human at the Controls

  • Writer: datacenterprimerja
  • 2 days ago
  • 6 min read

James Soh


This article speaks to all three audiences: C-level leadership, design and construction specialists, and operational leaders and the operations workforce.

Let me tell you about a moment of recognition.


Earlier this year I was studying for the NVIDIA NCA-AIIO certification. I came to it as someone with more than 30 years in the data center industry, and before that, as a systems administrator on a DEC VAX minicomputer cluster. I sat with the material on GPU architecture, NVLink fabric topology, cluster networking, workload scheduling, and inference serving.


And I kept recognising the same patterns I knew from the minicomputer era of the 1990s.


Not metaphorically. Structurally. The tightly integrated hardware and software stack. The specialist operator who must know the system end to end. The batch and interactive workload duality. The scheduler as a first-class system component. The liquid cooling as a physical necessity, not a design choice. The rack as a singular computer, not a collection of independent machines.


The AI data center is the original computing model. Rebuilt with 40 years of semiconductor progress underneath it and deployed at a scale that would have been incomprehensible in 1985. But the same discipline. The same integration. The same professional demands.

Seymour Cray was not wrong. The industry spent 30 years in a parenthesis. The parenthesis is closed.


What the Parenthesis Cost

The disaggregation era was productive and necessary. The cloud era democratised compute in ways that created industries and transformed economies. I am not arguing otherwise.

But the parenthesis had a cost that is only now becoming fully visible.


It produced an industry that optimised for the shell and forgot the machine. C-level leaders who start every investment decision with power capacity rather than compute requirements. Design and construction professionals who treat the data hall occupant as a generic tenant rather than the system the building exists to serve. Operations teams whose professional scope stops at the server room door.


It also produced, particularly in Southeast Asia, a structural separation between the data center industry and the IT industry that has calcified into institutional habit. Two communities. Two professional associations. Two career tracks. Two sets of assumptions about where one domain ends and the other begins.


That separation was tolerable when the machine was generic. It is not tolerable when the machine is a 227 kilowatt liquid-cooled GPU cluster running at continuous full load, generating tokens that power enterprise operations for the Fortune 100.


At its root, the parenthesis contracted the knowledge boundary of the industry. The people who planned, built, and operated data centers stopped needing to understand the computer inside them. That contraction was rational for three decades. It is now the defining professional challenge of the AI DC era. Recovering that boundary -- at every level of the industry, across all three audiences this series has addressed -- is what the renaissance requires.


The Token Is the New Batch Job

In the minicomputer era, the batch job was the unit of work. Every cycle the machine processed moved a job forward. Every idle cycle was waste. The operator who managed the machine understood this instinctively. Utilisation was not a metric. It was a professional ethic.

Today the token is the unit of intelligence. Every prompt, every reasoning step, every inference call, every agent interaction generates tokens. The machine exists to process them. An AI data center that does not run efficiently is wasting tokens the same way a poorly managed VAX cluster wasted cycles.


Jensen Huang put it plainly at GTC 2026. Every engineer will have an annual token budget. Tokens will be an entitlement, a resource allocation, a line item in the operating budget of every knowledge-intensive organisation. Not because it is a compelling vision. Because the economics of AI are already moving in that direction.
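Huang's token-budget framing can be made concrete with a back-of-envelope sketch. Every figure below -- daily token consumption, working days, and the blended price per million tokens -- is an illustrative assumption for the sake of the arithmetic, not a quoted price or an official projection:

```python
# Back-of-envelope sketch: an engineer's annual token budget as an
# operating-budget line item. All parameters are illustrative
# assumptions, not real prices or published figures.

def annual_token_cost(tokens_per_day: int,
                      working_days: int = 250,
                      usd_per_million_tokens: float = 5.0) -> float:
    """Annual spend for one engineer, given an assumed daily token
    consumption and an assumed blended price per million tokens."""
    annual_tokens = tokens_per_day * working_days
    return annual_tokens / 1_000_000 * usd_per_million_tokens

# Assume (for illustration only) an engineer whose agentic workflows
# consume about 2 million tokens per working day.
tokens_per_day = 2_000_000
annual_tokens = tokens_per_day * 250
cost = annual_token_cost(tokens_per_day)

print(f"Annual tokens: {annual_tokens:,}")   # 500,000,000
print(f"Annual cost:   ${cost:,.0f}")        # $2,500
```

The exact numbers matter less than the structure: once tokens are an entitlement, the line item scales linearly with agent count and workflow depth, which is why the infrastructure demand compounds the way the article describes.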


When that happens -- and it is a matter of when, not whether -- the demand on AI infrastructure will be of a different order of magnitude from what it is today. The Fortune 100 is already running active AI agents embedded in enterprise workflows. Agentic AI systems that spawn sub-agents to execute complex multi-step autonomous workflows are moving from pilot to production. Physical AI -- robotics, autonomous vehicles, industrial automation making real-time decisions in the physical world -- is the next step-change in compute demand.


Using AI to help write code or summarise documents is scratching the surface. It is the equivalent of using the internet to send faxes. The infrastructure, the professional knowledge base, and the organisational discipline must be built for what AI is becoming, not what it is today.


The Charge to Each Audience

This series was written for three professional communities. Each leaves with a specific charge. Underlying all three is the same imperative: recover the knowledge boundary. Know the machine.


To C-level leadership.

The business model built on selling white space to generic tenants is not the business model for an AI data center. The financial model, the power procurement strategy, the construction approach, and the go-to-market proposition must all be rebuilt from the compute outward. Start with the compute. Everything else follows.


The harder task is team composition. The knowledge boundary does not move because the boardroom resolves that it should. It moves when the people executing the programme -- the heads of development, the project directors, the programme leaders -- understand the machine they are building for. Finding, developing, and placing people with AI infrastructure knowledge in those roles is a C-level responsibility. The companies that do it early will have the capacity and capability advantage when agentic AI hits full enterprise production.


To design and construction specialists.

The facility that is designed from the shell inward, by a team that does not understand the AI system it houses, will underperform from day one. Deep knowledge of the DGX system, the liquid cooling circuit, the NVLink fabric topology, and the continuous full-load operating reality is now a prerequisite for D&C leadership on an AI DC project.


NVIDIA has published the Vera Rubin DSX AI Factory reference design. It is a publicly available blueprint for how codesigned AI infrastructure must be approached. Read it. Build your team's knowledge around it. Engage with the compute requirements before the design brief is fixed, not after. The profession has the engineering capability. The knowledge boundary is the gap to close. The D&C professional who crosses it is the one C-level leadership is looking for.

To operational leaders and the operations workforce.

The server room door is no longer a professional boundary. In an AI data center, a CDU fault is a compute outage. Megawatt-scale power swings during training are facility events with compute consequences. The thermal margin for error at continuous full load is measured in minutes.


The operational leader who redefines their team's knowledge scope to include the compute layer, who builds the bridging capability between facilities and compute operations, is building the most valuable operations team in the industry. The engineer who develops that bridging capability personally is the most valuable individual in the building. The knowledge boundary that contracted during the x86 era can be recovered. On the operations floor, that recovery is a daily professional choice.


The Deeper Problem

I want to close with the observation that sits behind everything this series has argued.

In Southeast Asia, many professionals in both the data center industry and the IT industry are talking about AI fluently. The vocabulary is there. The acronyms are deployed confidently. The LinkedIn posts reference the right companies and the right products.


But fluency is not depth. Knowing the names of things is not the same as understanding how they work, why they are built the way they are, and what they demand from the people who plan, build, and operate them.


Gen Z enters this industry as AI natives. They have not carried the institutional separation between IT and data center as a given. They have not spent careers optimising for a generic compute world. They will learn the AI DC as the natural environment it is for them, not as a disruption to assumptions they built careers on.


That is a structural advantage. The experienced professional who does not close the knowledge gap will find that advantage compounding against them.


The renaissance this series describes is not optional. It is already underway. The only question is whether the professionals who run this industry will lead it or follow it.


Full Circle

In 1985, the Cray-2 sat in a bath of fluorinert liquid coolant in a purpose-built facility, operated by specialists who knew the machine end to end, processing the most demanding computational workloads the world could generate.


In 2025, the NVIDIA Vera Rubin NVL72 sits in a 100 percent liquid-cooled rack in a purpose-built AI factory, demanding specialists who know the system end to end, processing the most demanding AI workloads the world can generate.


The machine is the building again.


The knowledge boundary the minicomputer era maintained is the boundary the AI DC era demands again. We need to know the machine -- the GPU rack as a computer -- and everything that supports it. The question is not whether the industry understands this. The question is how quickly the people who plan, build, and operate these facilities will develop the knowledge, the discipline, and the professional depth that the machine now driving everything demands.


The renaissance has begun, and we had better be the ones driving it.

This is the final article in the series AI DC -- Renaissance and New Thinking Required.
