GPU        Price       7-day change
H100       $6.39/hr    1.2%
A100 80GB  $2.45/hr    0.5%
H200       $10.29/hr   0.8%
L40S       $1.28/hr    0.3%
T4         $0.24/hr    0.6%
L4         $0.45/hr    1.1%

NVIDIA A100 40GB

Ampere architecture · 40GB memory · 312 FP16 TFLOPS · 400W TDP

About the NVIDIA A100 40GB

The NVIDIA A100 40GB is the standard Ampere-generation data centre GPU with 40GB of HBM2e memory. It was the workhorse GPU for AI training before the H100 and remains cost-effective for many training and inference workloads.

Memory (VRAM)
40 GB
FP16 Performance
312 TFLOPS
Power (TDP)
400W
Architecture
Ampere
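The cost-effectiveness point above can be made concrete with a quick price-performance calculation. A minimal sketch: the 312 FP16 TFLOPS figure comes from this page, but the H100's ~990 dense FP16 Tensor TFLOPS and the use of the A100 80GB hourly rate as a stand-in for the 40GB card are assumptions, not values from this page.

```python
def price_per_tflops_hour(price_usd_per_hr: float, fp16_tflops: float) -> float:
    """Dollars per FP16 TFLOPS-hour of peak compute; lower is more cost-effective."""
    return price_usd_per_hr / fp16_tflops

# A100 40GB: 312 FP16 TFLOPS (from this page); $2.45/hr borrows the
# A100 80GB rate quoted above as a stand-in (assumption).
a100 = price_per_tflops_hour(2.45, 312)

# H100: ~990 dense FP16 Tensor TFLOPS (assumed figure, not from this page).
h100 = price_per_tflops_hour(6.39, 990)

print(f"A100 40GB: ${a100 * 1000:.2f} per 1000 TFLOPS-hour")
print(f"H100:      ${h100 * 1000:.2f} per 1000 TFLOPS-hour")
```

Peak TFLOPS is only a first-order proxy: real cost-effectiveness also depends on how well a given workload saturates each card, and on memory capacity and bandwidth.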

Common Use Cases

AI training
Model fine-tuning
Batch inference
Scientific simulation

Key Facts

Manufacturer
NVIDIA
Architecture
Ampere
Accelerator Type
GPU
Primary Use
Training
Memory (VRAM)
40 GB
FP16 Performance
312 TFLOPS
Thermal Design Power
400W

Investment Tool

Calculate NVIDIA A100 40GB ROI

Estimate payback period, annual returns, and 3-year ROI with live Signwl pricing data.
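The ROI metrics named above reduce to simple arithmetic. A minimal sketch, assuming a hypothetical hardware cost, utilization rate, and electricity price (none of which come from this page); the 0.4 kW draw reflects the 400W TDP from the spec table:

```python
def gpu_roi(hardware_cost: float, rental_rate_hr: float,
            utilization: float = 0.7, power_kw: float = 0.4,
            electricity_per_kwh: float = 0.10, years: int = 3):
    """Payback period (years), annual net revenue, and N-year ROI for a rented-out GPU."""
    rented_hours = 8760 * utilization  # billed hours per year at assumed utilization
    revenue = rented_hours * rental_rate_hr
    # Rough: assume the card draws full TDP for every rented hour.
    power_cost = rented_hours * power_kw * electricity_per_kwh
    annual_net = revenue - power_cost
    payback_years = hardware_cost / annual_net
    roi = (annual_net * years - hardware_cost) / hardware_cost
    return payback_years, annual_net, roi

# Hypothetical inputs: $8,000 card cost, $2.45/hr rental
# (the A100 80GB rate quoted above, used as a stand-in).
payback, net, roi = gpu_roi(8000, 2.45)
print(f"Payback: {payback:.2f} years, annual net: ${net:,.0f}, {3}-year ROI: {roi:.0%}")
```

A real calculator would also account for cooling, networking, hosting fees, depreciation, and declining rental rates over the holding period, all of which lengthen the payback estimate.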
