News
As data centers face increasing demands for AI training and inference workloads, high-bandwidth memory (HBM) has become a ...
Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better ...
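The relationship between interface width, per-pin data rate, and stack bandwidth can be sketched with simple arithmetic. The 7.8125 Gb/s per-pin figure below is an illustrative assumption chosen to hit 2.0 TB/s over a 2048-bit interface, not a published Micron spec.

```python
# Sketch: peak HBM bandwidth from interface width and per-pin signaling rate.
# The per-pin rate used here is an assumption for illustration only.

def peak_bandwidth_tbps(interface_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in TB/s = (width in bits * per-pin Gb/s) / 8 bits-per-byte / 1000."""
    return interface_bits * pin_rate_gbps / 8 / 1000

# A 2048-bit HBM4 interface needs roughly 7.8 Gb/s per pin to reach 2.0 TB/s per stack:
print(peak_bandwidth_tbps(2048, 7.8125))  # 2.0
```

The same function shows why doubling the interface width from HBM3E's 1024 bits to HBM4's 2048 bits roughly doubles per-stack bandwidth at a comparable pin rate.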
As mass production of sixth-generation HBM4 nears, South Korean chip giants Samsung Electronics and SK Hynix are aggressively ...
AMD confirms its next-gen Instinct MI400 AI GPU will double compute performance, pack a huge 432GB of next-gen HBM4 at 19.6TB/sec, and launch in 2026.
ExtremeTech on MSN: AMD Launches Instinct MI350X and MI355X AI GPUs
The processors typically appear in sets of eight in the Instinct MI350 series platforms. AMD also provided platform ...
Micron expects the HBM total addressable market to grow from about $16 billion in 2024 to nearly $100 billion by 2030, ...
AMD announced its new open standard rack-scale infrastructure to meet the rising demands of agentic AI workloads, launching ...
AMD issued a raft of news at its Advancing AI 2025 event this week, an update on the company's response to NVIDIA's 90-plus ...
AMD confirms that its new Instinct MI350 series AI accelerators use Samsung's latest HBM3E 12-Hi memory, with up to 288GB of HBM3E inside its new AI chips.
AMD has unveiled its latest Instinct MI350 series GPUs designed specifically for data center AI tasks. These new GPUs use the CDNA 4 architecture and are built mostly on TSMC’s cutting-edge 3nm ...
There was a great deal of focus on the official launch of its Instinct MI350 and the higher-wattage, faster-performing MI355X GPU-based chips, which AMD had previously announced last year.
The Chosun Ilbo on MSN: Samsung secures AMD contract for HBM3E 12-stack, clears defect concerns
Samsung Electronics has secured a key supply deal with AMD, with its fifth-generation 12-layer HBM3E memory selected for the chipmaker's upcoming MI350 AI accelerators. The move marks a breakthrough ...