Nvidia may debut an AI chip with on-chip SRAM at GTC 2026 to cut latency, but SRAM won’t replace HBM soon.
Nvidia may introduce a new AI inference chip at GTC 2026 using on-chip SRAM instead of external HBM to reduce latency and data movement.
The design integrates large SRAM blocks directly into the chip, potentially boosting performance for low-latency workloads like edge computing.
However, SRAM is far more expensive and far less dense than DRAM, limiting it to small, high-speed cache roles rather than replacing HBM for large-scale AI tasks.
Experts say SRAM won’t displace HBM or DRAM in the near term, as the three will likely coexist in a layered memory hierarchy.
The shift is expected to be gradual, with major memory makers like Samsung and SK hynix maintaining market dominance.