Eagle Mountain Data Now Offering High Density AI GPU Capacity at Fully Energized Anaheim Data Center
Pre-deployed NVIDIA GB300 clusters provide turnkey AI compute capacity, eliminating traditional 24-36 month infrastructure delays.
IRVINE, Calif.--(BUSINESS WIRE)--Eagle Mountain Data today announced that its OC 1 data center in Anaheim, California is now available for pre-lease, offering immediate access to high-density, AI-optimized GPU capacity across both bare metal and managed service models.
Powered by next-generation NVIDIA GB300 systems and direct-to-chip liquid cooling, OC 1 enables customers to deploy large-scale AI workloads without the delays, complexity, or capital requirements of building infrastructure themselves.
The facility is engineered with Eagle Mountain Data–owned clusters, enabling customers to lease turnkey AI compute capacity without sourcing hardware. OC 1 serves as the anchor site for a broader Southern California expansion, representing the first phase of a 45 MW multi-facility development.
As AI demand rapidly shifts from model training to real-time inference, Eagle Mountain Data delivers high-density, low-latency infrastructure with pre-deployed GPU clusters ready for rapid onboarding and immediate deployment.
Leveraging direct-to-chip liquid cooling, the facility is designed to support ultra-dense GPU environments with maximum efficiency, reliability, and scalability. The infrastructure supports up to 135 kW per rack today and is pre-engineered for next-generation 200 kW+ architectures, ensuring long-term compatibility with future NVIDIA platforms.
Strategically located in Southern California, the carrier-neutral facility offers proximity to major technology hubs and low-latency access to West Coast users, while providing a critical alternative to power-constrained markets such as Northern California.
Key features include:
- Up to 135 kW per rack for ultra-high-density deployments
- Forward-compatible cooling for 200 kW+ systems
- Industry-leading efficiency targets of <1.15 PUE and 0.0 WUE (zero-water cooling)
- 100% green utility power availability
- Direct connectivity to AWS Direct Connect and Microsoft Azure ExpressRoute
“Our focus is simple: immediate delivery of infrastructure purpose-built for AI training and inference at scale,” said Alan Niedzwiecki, CEO of Eagle Mountain Data. “We are enabling customers to access next-generation GPU capacity quickly and reliably, without the 24-36 month delays and complexity of traditional deployments.”
This announcement builds on Eagle Mountain Data’s recently announced joint venture with KDC Partnership, reinforcing its ability to deliver a national pipeline of high-performance AI infrastructure.
Customers interested in reserving capacity are encouraged to visit www.eaglemountaindata.com or contact sales@eaglemountaindata.com.
Contacts
Media Contacts:
Eagle Mountain Data Inc.
Thomas Schneider
press@eaglemountaindata.com
949-620-1590