Nvidia may debut an AI chip with on-chip SRAM at GTC 2026 to cut latency, but SRAM won’t replace HBM soon.

Nvidia may introduce a new AI inference chip at GTC 2026 using on-chip SRAM instead of external HBM to reduce latency and data movement. The design integrates large SRAM blocks directly into the chip, potentially boosting performance for low-latency workloads like edge computing. However, SRAM is far more expensive and space-intensive than DRAM, limiting its use to small, high-speed cache roles rather than replacing HBM for large-scale AI tasks. Experts say SRAM won’t displace HBM or DRAM in the near term, as the three will likely coexist in a layered memory hierarchy. The shift is expected to be gradual, with major memory makers like Samsung and SK hynix maintaining market dominance.
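The "layered memory hierarchy" argument can be made concrete with the standard average-memory-access-time (AMAT) formula: fast on-chip SRAM serves hits, and misses fall through to slower external memory. The sketch below uses illustrative latency figures (they are assumptions for the example, not vendor specifications for any Nvidia, Samsung, or SK hynix product):

```python
# Back-of-envelope AMAT for a two-level hierarchy:
# on-chip SRAM backed by external HBM.
# Latency numbers are illustrative assumptions, not real chip specs.
SRAM_NS = 1.0    # assumed on-chip SRAM access latency (ns)
HBM_NS = 100.0   # assumed external HBM access latency (ns)

def amat(hit_rate: float) -> float:
    """Average access time: SRAM on hits, HBM on misses."""
    return hit_rate * SRAM_NS + (1.0 - hit_rate) * HBM_NS

for hr in (0.50, 0.90, 0.99):
    print(f"SRAM hit rate {hr:.0%}: average access {amat(hr):.1f} ns")
```

Even a modest SRAM hit rate slashes average latency, which is why a small, expensive SRAM tier can pay off as a cache without replacing HBM's capacity role.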
