GPU          Price       7-day change
H100         $6.39/hr    1.2%
A100 80GB    $2.45/hr    0.5%
H200         $10.29/hr   0.8%
L40S         $1.28/hr    0.3%
T4           $0.24/hr    0.6%
L4           $0.45/hr    1.1%
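The hourly rates above translate directly into longer-horizon cost estimates. A minimal sketch (rates are taken from the table; the 730-hour month and 24/7 utilisation are assumptions for a continuously running instance):

```python
# Convert hourly on-demand rates into monthly cost estimates.
# Rates come from the pricing table above; assumes 24/7 utilisation
# and a 730-hour month (365 * 24 / 12).
HOURS_PER_MONTH = 730

rates = {
    "H100": 6.39,
    "A100 80GB": 2.45,
    "H200": 10.29,
    "L40S": 1.28,
    "T4": 0.24,
    "L4": 0.45,
}

def monthly_cost(hourly_rate: float, hours: int = HOURS_PER_MONTH) -> float:
    """Estimated cost of running one instance non-stop for a month."""
    return round(hourly_rate * hours, 2)

for gpu, rate in rates.items():
    print(f"{gpu}: ${monthly_cost(rate):,.2f}/month")
```

Spot prices and committed-use discounts would lower these figures substantially; this is an on-demand ceiling.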

Google TPU v2

TPU v2 architecture · 8GB memory · 45 FP16 TFLOPS

Cloud Pricing Today

About the Google TPU v2

The Google Cloud TPU v2 is a second-generation Tensor Processing Unit designed to accelerate machine learning training and inference. TPUs use a fundamentally different architecture from GPUs, optimised specifically for the matrix operations common in neural networks.
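One way to put the 45 FP16 TFLOPS figure in context is price-performance, i.e. dollars per TFLOP-hour. A rough sketch, where the TFLOPS number comes from the spec sheet above but the hourly rate is an illustrative placeholder, not a quoted Google Cloud price:

```python
# Price-performance in dollars per TFLOP-hour:
#   cost_per_tflop_hour = hourly_rate / peak_tflops
# 45 FP16 TFLOPS is the spec figure above; the rate below is a
# hypothetical placeholder, not an actual quoted price.
TPU_V2_FP16_TFLOPS = 45.0
hypothetical_rate = 1.35  # $/hr, illustrative only

def cost_per_tflop_hour(rate: float, tflops: float) -> float:
    """Dollars paid per hour for each TFLOPS of peak throughput."""
    return rate / tflops

print(f"${cost_per_tflop_hour(hypothetical_rate, TPU_V2_FP16_TFLOPS):.4f} per TFLOP-hour")
```

Peak TFLOPS is a theoretical ceiling, so real price-performance depends heavily on how well a workload keeps the matrix units fed.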

Memory (VRAM)
8 GB
FP16 Performance
45 TFLOPS
Architecture
TPU v2

Common Use Cases

ML training on Google Cloud · TensorFlow workloads · Research

Key Facts

Manufacturer
Google
Architecture
TPU v2
Accelerator Type
TPU
Primary Use
Training
Memory (VRAM)
8 GB
FP16 Performance
45 TFLOPS
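The key facts above are easy to carry around as a small structured record when comparing accelerators programmatically. A minimal sketch (the field names are my own illustration, not an API from this site):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Accelerator:
    """Spec record mirroring the Key Facts list above."""
    name: str
    manufacturer: str
    architecture: str
    accelerator_type: str
    primary_use: str
    memory_gb: int
    fp16_tflops: float

tpu_v2 = Accelerator(
    name="Google TPU v2",
    manufacturer="Google",
    architecture="TPU v2",
    accelerator_type="TPU",
    primary_use="training",
    memory_gb=8,
    fp16_tflops=45.0,
)

# Derived ratio: throughput per GB of on-chip memory.
print(tpu_v2.fp16_tflops / tpu_v2.memory_gb)
```

A frozen dataclass keeps the record hashable and immutable, which is convenient when spec records are used as dictionary keys in comparison tables.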


Track Google TPU v2 pricing over time

Get access to historical pricing data, regional analysis, and custom alerts.