News

Nvidia reportedly plans to launch a downgraded HGX H20 AI processor with reduced HBM memory capacity for China by July to comply with new U.S. export rules, if a new rumor is correct.
Nvidia's AI Enterprise suite, which covers a host of AI frameworks including access to its inference microservices (NIMs), costs $4,500 per GPU per year, or $1 per GPU per hour in the cloud.
Generally available as of October 3 on the AI Innovation Cloud, the eight-way HGX H200 provides up to 32 petaflops of FP8 deep learning compute and more than 1.1TB of aggregate HBM3e memory.
Supermicro's NVIDIA HGX B200 8-GPU systems utilize next-generation liquid-cooling and air-cooling technology. The newly developed cold plates and the new 250kW coolant distribution unit ...
The AI server company introduced over 30 solution stacks for Nvidia HGX B200, GB200 NVL72, and RTX PRO 6000 Blackwell Server Edition deployments. This enables rapid time-to-online for European ...
Interconnected with an Nvidia Quantum-2 InfiniBand networking platform, the supercomputer comprises 20 Nvidia HGX H100 systems and 160 Nvidia H100 Tensor Core GPUs.