
DataBank CFO Kevin Ooley on Financing for Scale in the AI Era
1/06/2026 | 21 mins.
In this episode of The Data Center Frontier Show, DCF Editor in Chief Matt Vincent speaks with Kevin Ooley, CFO of DataBank, about how the operator is structuring capital to support disciplined growth amid accelerating AI and enterprise demand. Ooley explains the rationale behind DataBank’s expansion of its development credit facility from $725 million to $1.6 billion, describing it as a strong signal of lender confidence in data centers as long-duration, mission-critical real estate assets. Central to that strategy is DataBank’s “Devco facility,” a pooled, revolving financing vehicle designed to support multiple projects at different stages of development, from land and site work through construction, leasing, and commissioning. The conversation explores how DataBank translates capital into concrete expansion across priority U.S. markets, including Northern Virginia, Dallas, and Atlanta, with nearly 20 projects underway through 2025 and 2026. Ooley details how recent deployments, including fully pre-leased capacity, feed a development pipeline supported by both debt and roughly $2 billion in equity raised in late 2024. Vincent and Ooley also dig into how DataBank balances rapid growth with prudent leverage, managing interest-rate volatility through hedging and refinancing stabilized assets into fixed-rate securitizations. In the AI era, Ooley emphasizes DataBank’s focus on “NFL cities,” serving enterprise and hyperscale customers that need proximity, reliability, and scale: DataBank delivers power, buildings, and uptime, while customers source their own GPUs. The episode closes with a look at DataBank’s long-term sponsorship by DigitalBridge, its deep banking relationships, and the market signals—pricing, absorption, and customer demand—that will ultimately dictate the pace of growth.

Beyond the Blueprint: The New Realities of Data Center Investment and Site Selection
12/29/2025 | 54 mins.
DCF Trends Summit 2025 Session Recap: As the data center industry accelerates into an AI-driven expansion cycle, the fundamentals of site selection and investment are being rewritten. In this session from the Data Center Frontier Trends Summit 2025, Ed Socia of datacenterHawk moderated a discussion with Denitza Arguirova of Provident Data Centers, Karen Petersburg of PowerHouse Data Centers, Brian Winterhalter of DLA Piper, Phill Lawson-Shanks of Aligned Data Centers, and Fred Bayles of Cologix on how power scarcity, entitlement complexity, and community scrutiny are reshaping where—and how—data centers get built. A central theme of the conversation was that power, not land, now drives site selection. Panelists described how traditional assumptions around transmission timelines and flat electricity pricing no longer apply, pushing developers toward Tier 2 and Tier 3 markets, power-first strategies, and closer partnerships with utilities. On-site generation, particularly natural gas, was discussed as a short-term bridge rather than a permanent substitute for grid interconnection. The group also explored how entitlement processes in mature markets have become more demanding. Economic development benefits alone are no longer sufficient; jurisdictions increasingly expect higher-quality design, sensitivity to surrounding communities, and tangible off-site investments. Panelists emphasized that credibility—earned through experience, transparency, and demonstrated follow-through—has become essential to securing approvals. Sustainability and ESG considerations remain critical, but the discussion took a pragmatic view of scale. Meeting projected data center demand will require a mix of energy sources, with renewables complemented by transitional solutions and evolving PPA structures. Community engagement was highlighted as equally important, extending beyond environmental metrics to include workforce development, education, and long-term social investment.
Artificial intelligence added another layer of complexity. While large AI training workloads can operate in remote locations, monetized AI applications increasingly demand proximity to users. Rapid hardware cycles, megawatt-scale racks, and liquid-cooling requirements are driving more modular, adaptable designs—often within existing data center portfolios. The session closed with a look at regional opportunity and investor expectations, with markets such as Pennsylvania, Alabama, Ohio, and Oklahoma cited for their utility relationships and development readiness. The overarching conclusion was clear: the traditional data center blueprint still matters—but power strategy, flexibility, and authentic community integration now define success.

AI Is the New Normal: Building the AI Factory for Power, Profit, and Scale
12/19/2025 | 1h 2 mins.
As the data center industry enters the AI era in earnest, incremental upgrades are no longer enough. That was the central message of the Data Center Frontier Trends Summit 2025 session “AI Is the New Normal: Building the AI Factory for Power, Profit, and Scale,” where operators and infrastructure leaders made the case that AI is no longer a specialty workload; it is redefining the data center itself. Panelists described the AI factory as a new infrastructure archetype: purpose-built, power-intensive, liquid-cooled, and designed for constant change. Rack densities that once hovered in the low teens have now surged past 50 kilowatts and, in some cases, toward megawatt-scale configurations. Facilities designed for yesterday’s assumptions simply cannot keep up. Ken Patchett of Lambda framed AI factories as inherently multi-density environments, capable of supporting everything from traditional enterprise racks to extreme GPU deployments within the same campus. These facilities are not replacements for conventional data centers, he noted, but essential additions, and they must be designed for rapid iteration as chip architectures evolve every few months. Wes Cummins of Applied Digital extended the conversation to campus scale and geography. AI demand is pushing developers toward tertiary markets where power is abundant but historically underutilized. Training and inference workloads now require hundreds of megawatts at single sites, delivered in timelines that have shrunk from years to little more than a year. Cost efficiency, ultra-low PUE, and flexible shells are becoming decisive competitive advantages. Liquid cooling emerged as a foundational requirement rather than an optimization. Patrick Pedroso of Equus Compute Solutions compared the shift to the automotive industry’s move away from air-cooled engines.
From rear-door heat exchangers to direct-to-chip and immersion systems, cooling strategies must now accommodate fluctuating AI workloads while enabling energy recovery—even at the edge. For Kenneth Moreano of Scott Data Center, the AI factory is as much a service model as a physical asset. By abstracting infrastructure complexity and controlling the full stack in-house, his company enables enterprise customers to move from AI experimentation to production at scale, without managing the underlying technical detail. Across the discussion, panelists agreed that the industry’s traditional design and financing playbook is obsolete. AI infrastructure cannot be treated as a 25-year depreciable asset when hardware cycles move in months. Instead, data centers must be built as adaptable, elemental systems: capable of evolving as power, cooling, and compute requirements continue to shift. The session concluded with one clear takeaway: AI is not a future state to prepare for. It is already shaping how data centers are built, where they are located, and how they generate value. The AI factory is no longer theoretical—and the industry is racing to build it fast enough.

Beyond the Core: Building Data Centers in America’s Next Frontier
12/19/2025 | 31 mins.
This bonus episode of EC&M On Air, sponsored by Wesco, explores how the rapid growth of AI, cloud computing, and digital infrastructure is reshaping the data center market and pushing development beyond traditional core locations. Host Ellen Parson is joined by Wesco subject matter expert David Speidelsbach to examine the forces driving unprecedented demand, the strain being placed on established data center hubs, and why emerging and tertiary markets are becoming the next frontier for development. The conversation dives into the benefits and challenges of building data centers in these new regions, including cost advantages, access to land and renewable energy, tax incentives, and the realities of labor, water, and infrastructure constraints. David also shares guidance for electrical contractors navigating these complex projects, highlighting the importance of early collaboration, strong partnerships, and supply chain coordination. Throughout the discussion, listeners gain practical insight into how contractors and suppliers can successfully support large-scale, mission-critical data center builds in an evolving market.

The Distributed Data Frontier: Edge, Interconnection, and the Future of Digital Infrastructure
12/17/2025 | 56 mins.
As AI workloads push data center infrastructure in both centralized and distributed directions, the industry is rethinking where compute lives, how data moves, and who controls the networks in between. This episode captures highlights from The Distributed Data Frontier: Edge, Interconnection, and the Future of Digital Infrastructure, a panel discussion from the 2025 Data Center Frontier Trends Summit. Moderated by Scott Bergs of Dark Fiber and Infrastructure, the panel brought together leaders from DartPoints, 1623 Farnam, Duos Edge AI, ValorC3 Data Centers, and 365 Data Centers to examine how edge facilities, interconnection hubs, and regional data centers are adapting to rising power densities, AI inference workloads, and mounting connectivity constraints. Panelists discussed the rapid shift from legacy 4–6 kW rack designs to environments supporting 20–60 kW and beyond, while noting that many AI inference applications can be deployed effectively at moderate densities when paired with the right connectivity. Hospitals, regional enterprises, and public-sector use cases are emerging as key drivers of distributed AI infrastructure, particularly in Tier 3 and Tier 4 markets. The conversation also highlighted connectivity as a defining bottleneck. Permitting delays, middle-mile fiber constraints, and the need for early carrier engagement are increasingly shaping site selection and time-to-market outcomes. As data centers evolve into network-centric platforms, operators are balancing neutrality, fiber ownership, and long-term upgradability to ensure today’s builds remain relevant in a rapidly changing AI landscape.

The Data Center Frontier Show