
Micron ships production-ready 12-Hi HBM3E chips for next-gen AI GPUs — up to 36GB per stack with speeds surpassing 9.2 GT/s


Micron formally announced its 12-Hi HBM3E memory stacks on Monday. The new products feature a 36 GB capacity and are aimed at leading-edge processors for AI and HPC workloads, such as Nvidia’s H200 and B100/B200 GPUs.

Micron’s 12-Hi HBM3E memory stacks offer 36 GB of capacity, 50% more than the 24 GB of the previous 8-Hi versions. The extra capacity lets data centers run larger AI models, such as Llama 2 with up to 70 billion parameters, on a single processor, which reduces the need for frequent CPU offloading, cuts communication delays between GPUs, and speeds up data processing.
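The quoted capacities can be sanity-checked with simple arithmetic. The sketch below assumes each DRAM die in the stack is a 24 Gb (3 GB) device, an assumption chosen to match the article's totals rather than taken from a Micron spec sheet:

```python
# Back-of-the-envelope check of the stack capacities quoted above.
# Assumption: each HBM3E DRAM die holds 24 Gb (consistent with the
# 36 GB / 24 GB totals, but not confirmed by the article itself).

DIE_GBIT = 24  # assumed density per DRAM die, in gigabits


def stack_capacity_gb(dies: int, die_gbit: int = DIE_GBIT) -> float:
    """Capacity of an HBM stack in gigabytes (8 bits per byte)."""
    return dies * die_gbit / 8


hi12 = stack_capacity_gb(12)  # new 12-Hi stack
hi8 = stack_capacity_gb(8)    # previous 8-Hi stack

print(hi12, hi8)                              # 36.0 24.0
print(f"{hi12 / hi8 - 1:.0%} more capacity")  # 50% more capacity
```

Under that assumption, 12 dies of 24 Gb each yield exactly the 36 GB per stack Micron quotes, and the step up from 8-Hi to 12-Hi accounts for the 50% capacity increase.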
