Oracle has begun deploying thousands of NVIDIA Blackwell GPUs across its data centres, marking a significant step in supporting the next wave of agentic AI and advanced reasoning models. The company has completed the first phase of its liquid-cooled NVIDIA GB200 NVL72 rack deployment on Oracle Cloud Infrastructure (OCI), with the systems now available to customers via NVIDIA DGX Cloud and OCI.
Enabling large-scale AI computing
The initial installation pairs high-speed NVIDIA Quantum-2 InfiniBand with NVIDIA Spectrum-X Ethernet networking, designed to deliver low-latency, scalable performance across the cluster. It is supported by a full suite of software and database integrations developed jointly by NVIDIA and Oracle, and is tailored to the growing demand for accelerated computing and inference workloads.
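The networking fabric matters most for distributed workloads that synchronise gradients or activations across many GPUs. As a rough illustration only (not an Oracle- or NVIDIA-provided example, and not specific to OCI), a multi-node PyTorch job initialised with the NCCL backend will route its collective operations over InfiniBand or RoCE when such a fabric is available; the model and batch below are placeholders:

```python
# Minimal sketch: initialising a multi-node training job so NCCL can use the
# cluster's high-speed fabric (InfiniBand/RDMA) for GPU-to-GPU traffic.
# The model, sizes, and environment handling are illustrative placeholders.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Rank and world size are normally injected by the launcher (e.g. torchrun).
    dist.init_process_group(backend="nccl")  # NCCL uses InfiniBand/RDMA when present
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(4096, 4096).cuda(local_rank)  # stand-in for a real model
    ddp_model = DDP(model, device_ids=[local_rank])

    x = torch.randn(8, 4096, device=local_rank)
    loss = ddp_model(x).sum()
    loss.backward()  # gradient all-reduce travels over the cluster fabric

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Launched with a tool such as `torchrun` across several nodes, the same script scales from a single rack to a larger cluster; the low-latency interconnect is what keeps the all-reduce step from dominating training time.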
As one of the fastest-growing cloud service providers, OCI is among the first globally to bring NVIDIA’s Blackwell-powered GB200 NVL72 systems online. Oracle aims to scale its OCI Superclusters beyond 100,000 Blackwell GPUs to meet soaring AI demand. This expansion follows a period of rapid innovation in AI, with companies such as OpenAI launching new reasoning models in recent weeks.
Powering AI factories for real-world applications
The deployment of NVIDIA Grace Blackwell systems at Oracle aligns with a broader trend of transforming traditional cloud data centres into “AI factories”. These facilities are designed to generate intelligence at scale using agentic AI frameworks. The GB200 NVL72 platform is central to this vision, offering a rack-scale solution that includes 36 Grace CPUs and 72 Blackwell GPUs. It delivers high performance and energy efficiency for tasks such as reasoning model training and autonomous system development.
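To put the 100,000-GPU scaling goal mentioned earlier in rack terms: each GB200 NVL72 rack contributes 72 Blackwell GPUs and 36 Grace CPUs, so the target implies on the order of 1,400 racks. A back-of-the-envelope calculation (the GPU target is Oracle's stated goal; the rest is simple arithmetic):

```python
import math

GPUS_PER_NVL72_RACK = 72      # 72 Blackwell GPUs per GB200 NVL72 rack
CPUS_PER_NVL72_RACK = 36      # paired with 36 Grace CPUs per rack
TARGET_GPUS = 100_000         # Oracle's stated Supercluster scaling goal

racks_needed = math.ceil(TARGET_GPUS / GPUS_PER_NVL72_RACK)
print(f"Racks needed to exceed the target: {racks_needed}")        # 1389
print(f"Grace CPUs at that scale: {racks_needed * CPUS_PER_NVL72_RACK}")  # 50004
```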
Oracle offers flexible deployment models for the Blackwell systems, making them accessible to a wide range of customers. Public and private sector organisations can use OCI’s public cloud, government and sovereign cloud regions, or run the systems in their own data centres through OCI Dedicated Region and OCI Alloy.
Broad customer interest across sectors
The new racks are already drawing strong interest from a diverse set of customers, including major technology companies, enterprise clients, regional cloud providers, and government agencies planning to run workloads on the OCI GB200 systems. NVIDIA itself also intends to use the systems for a variety of purposes, including AI model training, autonomous vehicle development, chip design, and the creation of AI development tools.
The OCI-hosted NVIDIA DGX Cloud platform provides customers with a robust environment to deploy AI workloads. It comes with software, technical services, and ongoing support, making it easier to develop and manage AI solutions at scale across multiple industries.