Nvidia’s Ampere GPUs come to Google Cloud – ProWellTech
Nvidia today announced that its new Ampere-based data center GPUs, the A100 Tensor Core GPUs, are now available in alpha on Google Cloud. As the name suggests, these GPUs have been designed for AI workloads, as well as for data analysis and high-performance computing solutions.
The A100 promises a significant performance improvement over previous generations. Nvidia claims that the A100 can improve training and inference performance by more than 20x over its predecessors (though most benchmarks show improvements closer to 6x or 7x), and it delivers about 19.5 TFLOPS of single-precision performance and 156 TFLOPS for Tensor Float 32 workloads.
“Google Cloud customers often look to us to provide the latest hardware and software services to help them drive innovation on artificial intelligence and scientific computing workloads,” said Manish Sainani, director of product management at Google Cloud, in today’s announcement. “With our new A2 family of virtual machines, we are proud to be the first major cloud provider to bring Nvidia’s A100 GPUs to market, just as we were with Nvidia’s T4 GPUs. We are excited to see what our customers will do with these new capabilities.”
Google Cloud users can access instances with up to 16 of these A100 GPUs, for a total of 640 GB of GPU memory and 1.3 TB of system memory.
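For a rough sense of scale, the figures above can be combined into a back-of-envelope calculation. This is a sketch, not official documentation: the 40 GB per-GPU memory figure is inferred from the article's 640 GB across 16 GPUs, and the aggregate throughput numbers simply multiply the quoted per-GPU rates.

```python
# Back-of-envelope aggregates for a 16-GPU A100 instance,
# using only the per-GPU numbers quoted in the article.

GPUS_PER_INSTANCE = 16
GPU_MEMORY_GB = 640 / GPUS_PER_INSTANCE   # 40 GB of GPU memory per A100 (inferred)
FP32_TFLOPS_PER_GPU = 19.5                # single-precision, per the article
TF32_TFLOPS_PER_GPU = 156                 # Tensor Float 32, per the article

aggregate_fp32 = GPUS_PER_INSTANCE * FP32_TFLOPS_PER_GPU   # 312 TFLOPS
aggregate_tf32 = GPUS_PER_INSTANCE * TF32_TFLOPS_PER_GPU   # 2496 TFLOPS

print(f"Per-GPU memory:  {GPU_MEMORY_GB:.0f} GB")
print(f"Aggregate FP32:  {aggregate_fp32:.0f} TFLOPS")
print(f"Aggregate TF32:  {aggregate_tf32:.0f} TFLOPS")
```

Peak TFLOPS rarely translate directly into end-to-end speedups, which is one reason real benchmarks land well below the 20x headline claim.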