
AI Is the New Normal: Building the AI Factory for Power, Profit, and Scale
12/19/2025 | 1h 2 mins.
As the data center industry enters the AI era in earnest, incremental upgrades are no longer enough. That was the central message of the Data Center Frontier Trends Summit 2025 session "AI Is the New Normal: Building the AI Factory for Power, Profit, and Scale," where operators and infrastructure leaders made the case that AI is no longer a specialty workload: it is redefining the data center itself. Panelists described the AI factory as a new infrastructure archetype: purpose-built, power-intensive, liquid-cooled, and designed for constant change. Rack densities that once hovered in the low teens of kilowatts have surged past 50 kW and, in some cases, toward megawatt-scale configurations. Facilities designed for yesterday's assumptions simply cannot keep up.
Ken Patchett of Lambda framed AI factories as inherently multi-density environments, capable of supporting everything from traditional enterprise racks to extreme GPU deployments within the same campus. These facilities are not replacements for conventional data centers, he noted, but essential additions, and they must be designed for rapid iteration as chip architectures evolve every few months. Wes Cummins of Applied Digital extended the conversation to campus scale and geography. AI demand is pushing developers toward tertiary markets where power is abundant but historically underutilized. Training and inference workloads now require hundreds of megawatts at single sites, delivered on timelines that have shrunk from several years to little more than one. Cost efficiency, ultra-low PUE, and flexible shells are becoming decisive competitive advantages.
Liquid cooling emerged as a foundational requirement rather than an optimization. Patrick Pedroso of Equus Compute Solutions compared the shift to the automotive industry's move away from air-cooled engines. From rear-door heat exchangers to direct-to-chip and immersion systems, cooling strategies must now accommodate fluctuating AI workloads while enabling energy recovery, even at the edge.
For Kenneth Moreano of Scott Data Center, the AI factory is as much a service model as a physical asset. By abstracting infrastructure complexity and controlling the full stack in-house, his company enables enterprise customers to move from AI experimentation to production at scale without managing the underlying technical detail.
Across the discussion, panelists agreed that the industry's traditional design and financing playbook is obsolete. AI infrastructure cannot be treated as a 25-year depreciable asset when hardware cycles move in months. Instead, data centers must be built as adaptable, modular systems capable of evolving as power, cooling, and compute requirements continue to shift. The session concluded with a clear takeaway: AI is not a future state to prepare for. It is already shaping how data centers are built, where they are located, and how they generate value. The AI factory is no longer theoretical, and the industry is racing to build it fast enough.
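The summary above leans on two metrics worth pinning down: rack density (kW of IT load per rack) and PUE (total facility power divided by IT power, where 1.0 is the theoretical ideal). A minimal sketch with purely illustrative numbers, not figures cited in the session:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

def racks_for_capacity(it_capacity_kw: float, kw_per_rack: float) -> float:
    """How many racks a fixed IT capacity supports at a given density."""
    return it_capacity_kw / kw_per_rack

# Illustrative: a 10 MW IT load in a facility drawing 11.5 MW total.
print(round(pue(11_500, 10_000), 2))   # 1.15

# The same 10 MW of IT capacity fills far fewer racks as density rises.
print(racks_for_capacity(10_000, 12))  # ≈ 833 legacy low-teens-kW racks
print(racks_for_capacity(10_000, 50))  # 200 racks at 50 kW each
```

The second comparison is why "multi-density" matters: a campus sized for low-teens racks holds hundreds of cabinets per 10 MW, while the same power envelope at AI densities concentrates into a much smaller, much hotter white space.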

Beyond the Core: Building Data Centers in America’s Next Frontier
12/19/2025 | 31 mins.
This bonus episode of EC&M On Air, sponsored by Wesco, explores how the rapid growth of AI, cloud computing, and digital infrastructure is reshaping the data center market and pushing development beyond traditional core locations. Host Ellen Parson is joined by Wesco subject matter expert David Speidelsbach to examine the forces driving unprecedented demand, the strain being placed on established data center hubs, and why emerging and tertiary markets are becoming the next frontier for development. The conversation dives into the benefits and challenges of building data centers in these new regions, including cost advantages, access to land and renewable energy, tax incentives, and the realities of labor, water, and infrastructure constraints. David also shares guidance for electrical contractors navigating these complex projects, highlighting the importance of early collaboration, strong partnerships, and supply chain coordination. Throughout the discussion, listeners gain practical insight into how contractors and suppliers can successfully support large-scale, mission-critical data center builds in an evolving market.

The Distributed Data Frontier: Edge, Interconnection, and the Future of Digital Infrastructure
12/17/2025 | 56 mins.
As AI workloads push data center infrastructure in both centralized and distributed directions, the industry is rethinking where compute lives, how data moves, and who controls the networks in between. This episode captures highlights from The Distributed Data Frontier: Edge, Interconnection, and the Future of Digital Infrastructure, a panel discussion from the 2025 Data Center Frontier Trends Summit. Moderated by Scott Bergs of Dark Fiber and Infrastructure, the panel brought together leaders from DartPoints, 1623 Farnam, Duos Edge AI, ValorC3 Data Centers, and 365 Data Centers to examine how edge facilities, interconnection hubs, and regional data centers are adapting to rising power densities, AI inference workloads, and mounting connectivity constraints. Panelists discussed the rapid shift from legacy 4–6 kW rack designs to environments supporting 20–60 kW and beyond, while noting that many AI inference applications can be deployed effectively at moderate densities when paired with the right connectivity. Hospitals, regional enterprises, and public-sector use cases are emerging as key drivers of distributed AI infrastructure, particularly in tier 3 and tier 4 markets. The conversation also highlighted connectivity as a defining bottleneck. Permitting delays, middle-mile fiber constraints, and the need for early carrier engagement are increasingly shaping site selection and time-to-market outcomes. As data centers evolve into network-centric platforms, operators are balancing neutrality, fiber ownership, and long-term upgradability to ensure today’s builds remain relevant in a rapidly changing AI landscape.
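One physical reason inference workloads pull toward regional and edge sites, as the panel describes, is the latency floor set by distance: light in optical fiber travels at roughly 200,000 km/s, about 5 µs per km one way, before any switching or queuing delay. A back-of-envelope sketch (route distances are illustrative, not from the discussion):

```python
FIBER_KM_PER_MS = 200.0  # ~200,000 km/s in glass → 200 km per millisecond

def min_round_trip_ms(route_km: float) -> float:
    """Lower bound on round-trip time from fiber propagation alone."""
    return 2 * route_km / FIBER_KM_PER_MS

# A regional edge site 100 km away vs. a distant core hub 1,500 km away.
print(min_round_trip_ms(100))   # 1.0 ms floor
print(min_round_trip_ms(1500))  # 15.0 ms floor
```

Real routes are longer than straight lines and add equipment delay, which is why middle-mile fiber constraints and early carrier engagement shape site selection as much as power does.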

Uptime Institute's Max Smolaks: Power, Racks, and the Economics of the AI Data Center Boom
12/16/2025 | 33 mins.
In this episode of the Data Center Frontier Show, DCF Editor in Chief Matt Vincent speaks with Uptime Institute research analyst Max Smolaks about the infrastructure forces reshaping AI data centers, from power and racks to cooling, economics, and the question of whether the boom is sustainable. Smolaks unpacks a surprising on-ramp to today's AI buildout: former cryptocurrency mining operators that "discovered" underutilized pockets of power in nontraditional locations—and are now pivoting into AI campuses as GPU demand strains conventional markets. The conversation then turns to what OCP 2025 revealed about rack-scale AI: heavier, taller, more specialized racks; disaggregated "compute/power/network" rack groupings; and a white space that increasingly looks purpose-built for extreme density. From there, Vincent and Smolaks explore why liquid cooling is both inevitable and still resisted by many operators—along with the software, digital twins, CFD modeling, and new commissioning approaches emerging to manage the added complexity. On the power side, they discuss the industry's growing alignment around 800V DC distribution and what it signals about Nvidia's outsized influence on next-gen data center design. Finally, the conversation widens into load volatility and the economics of AI infrastructure: why "spiky" AI power profiles are driving changes in UPS systems and rack-level smoothing, and why long-term growth may hinge less on demand (which remains strong) than on whether AI profits broaden beyond a few major buyers—especially as GPU hardware depreciates far faster than the long-lived fiber built during past tech booms. A sharp, grounded look at the AI factory era—and the engineering and business realities behind the headlines.
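The "spiky" load problem Smolaks raises can be pictured with a toy model: a battery or supercapacitor bank at the rack covers the difference between the instantaneous GPU draw and a smoothed setpoint, so the upstream UPS and utility see a flatter profile. A minimal moving-average sketch, purely illustrative and not a description of any vendor's implementation:

```python
def smooth_load(draw_kw, window=4):
    """Moving-average setpoint the upstream feed would see; a rack-level
    buffer supplies (or absorbs) the gap between draw and setpoint."""
    smoothed = []
    for i in range(len(draw_kw)):
        lo = max(0, i - window + 1)
        smoothed.append(sum(draw_kw[lo:i + 1]) / (i + 1 - lo))
    return smoothed

# Toy trace: synchronized training steps swinging between idle and peak.
trace = [20, 120, 20, 120, 20, 120, 20, 120]
flat = smooth_load(trace)
# Positive gap = buffer discharging to cover the spike; negative = recharging.
gap_kw = [d - s for d, s in zip(trace, flat)]

print(max(trace) - min(trace))            # raw swing: 100 kW
print(max(flat[3:]) - min(flat[3:]))      # smoothed swing after warm-up: 0.0
```

Once the window fills, the upstream feed sees a steady 70 kW instead of a 20-to-120 kW square wave, which is the effect UPS vendors and rack power shelves are now being asked to deliver at scale.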

Beyond the Grid: Natural Gas, Speed, and the New Data Center Reality
12/12/2025 | 56 mins.
In this Data Center Frontier Trends Summit 2025 session, moderated by Stu Dyer (CBRE) with panelists Aad den Elzen (Solar Turbines/Caterpillar), Creede Williams (Exigent Energy Partners), and Adam Michaelis (PointOne Data Centers), the conversation centered on a hard truth of the AI buildout: power is now the limiting factor, and the grid isn't keeping pace. Dyer framed how quickly the market has escalated, from "big" 48 MW campuses a decade ago to today's expectations of 500 MW-to-gigawatt-scale capacity. With utility timelines stretched and interconnection uncertainty rising, the panel argued that natural gas has moved from taboo to toolkit, often the fastest route to firm power at meaningful scale. Williams, speaking from the IPP perspective, emphasized that speed-to-power requires firm fuel and financeable infrastructure, warning that "interruptible" gas or unclear supply economics can undermine both reliability and underwriting. Den Elzen noted that gas is already a proven solution across data center deployments, and in many cases is evolving from a "bridge" to a durable complement to the grid, especially when modular approaches improve resiliency and enable phased buildouts. Michaelis described how operators are building internal "power plant literacy," hiring specialists and partnering with experienced power developers because data center teams can't assume they can self-perform generation projects. The panel also demystified key technology choices (reciprocating engines vs. turbines) as tradeoffs among lead time, footprint, ramp speed, fuel flexibility, efficiency, staffing, and long-term futureproofing. On AI-era operations, the group underscored that extreme load swings can't be handled by rotating generation alone, requiring system-level design with controls, batteries, capacitors, and close coordination with tenant load profiles.
Audience questions pushed into public policy and perception: rate impacts, permitting, and the long-term mix of gas, grid, and emerging options like SMRs. The panel’s consensus: behind-the-meter generation can help shield ratepayers from grid-upgrade costs, but permitting remains locally driven and politically sensitive—making industry communication and advocacy increasingly important. Bottom line: in the new data center reality, natural gas is here—often not as a perfect answer, but as the one that matches the industry’s near-term demands for speed, scale, and firm power.
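The recip-vs-turbine tradeoff the panel walks through has a simple sizing dimension: many small units versus a few large ones, which changes the redundancy math. A hedged sketch with illustrative unit ratings (real gensets and turbines vary widely by model; these numbers are assumptions, not figures from the session):

```python
import math

def units_needed(load_mw: float, unit_mw: float, redundancy: int = 1) -> int:
    """Units required to serve the load, plus N+redundancy spares."""
    return math.ceil(load_mw / unit_mw) + redundancy

# Illustrative: a 200 MW campus served behind the meter, N+1.
print(units_needed(200, 4))   # 51 units of a ~4 MW recip-class genset
print(units_needed(200, 25))  # 9 units of a ~25 MW turbine-class machine
```

More, smaller units mean cheaper redundancy per increment and finer phasing; fewer, larger units mean a simpler plant but a bigger step cost per spare, which is one axis of the lead-time, footprint, and staffing tradeoffs the panelists describe.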



The Data Center Frontier Show