Nvidia may debut an AI chip with on-chip SRAM at GTC 2026 to cut latency, but SRAM won’t replace HBM soon.
Nvidia may introduce a new AI inference chip at GTC 2026 that uses on-chip SRAM instead of external HBM to reduce latency and data movement.
The design integrates large SRAM blocks directly into the chip, potentially boosting performance for low-latency workloads such as edge computing.
However, SRAM is far more expensive and space-intensive than DRAM, limiting its use to small, high-speed cache roles rather than replacing HBM for large-scale AI tasks.
Experts say SRAM won’t displace HBM or DRAM in the near term, as the three will likely coexist in a layered memory hierarchy.
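To illustrate why the three tiers would coexist rather than one replacing the others, here is a minimal sketch of average access time in a layered hierarchy. The latency figures and hit rates are illustrative assumptions chosen only for order-of-magnitude contrast, not published specs for any Nvidia part.

```python
# Sketch of a layered memory hierarchy: SRAM serves hits first,
# misses fall back to HBM, and the remainder goes to DRAM.
# All numbers below are assumed for illustration only.
LATENCY_NS = {"sram": 1.0, "hbm": 100.0, "dram": 200.0}

def effective_latency(sram_hit: float, hbm_hit: float) -> float:
    """Average access latency (ns) given the fraction of accesses
    served by each tier; whatever misses both goes to DRAM."""
    dram_rate = 1.0 - sram_hit - hbm_hit
    return (sram_hit * LATENCY_NS["sram"]
            + hbm_hit * LATENCY_NS["hbm"]
            + dram_rate * LATENCY_NS["dram"])

# A large on-chip SRAM raises the hit rate and cuts average latency,
# even though HBM/DRAM still back the bulk of the working set.
print(effective_latency(sram_hit=0.6, hbm_hit=0.3))  # mostly served on-chip
print(effective_latency(sram_hit=0.1, hbm_hit=0.8))  # mostly served by HBM
```

The point of the sketch: a modest slice of fast on-chip SRAM shifts the average access time sharply downward, which is why it complements rather than replaces the larger, cheaper HBM and DRAM tiers.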
The shift is expected to be gradual, with major memory makers like Samsung and SK hynix maintaining market dominance.