Micron introduces dense 256GB LPDDR5X module aimed squarely at AI servers

Eight SOCAMM2 modules can push server memory capacity to a massive 2TB.

AI inference workloads increasingly shift performance ...
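The 2TB figure follows directly from the stated module count and per-module capacity. A quick sketch of that arithmetic (the constants below are taken from the article; the variable names are illustrative):

```python
# Capacity figures as stated in the article.
MODULE_CAPACITY_GB = 256   # one 256GB SOCAMM2 module
MODULES_PER_SERVER = 8     # eight modules per server

total_gb = MODULE_CAPACITY_GB * MODULES_PER_SERVER
total_tb = total_gb / 1024  # using the binary (1024 GB/TB) convention

print(f"{total_gb} GB = {total_tb:.0f} TB")  # 2048 GB = 2 TB
```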