Design trade-offs in using DDR4 memory for enterprise applications

By Luke Collins  |  Posted: August 30, 2016
Topics/Categories: Blog - IP

How do you design large memory arrays for servers, storage and networking? A recent webinar explores some of the trade-offs involved in building such arrays for high-performance enterprise equipment using DDR4 memories.

Marc Greenberg, director of DDR IP marketing at Synopsys, argues in the webinar that the memory requirements of servers are rising in part because of a trend to hold complete databases in system memory for maximum performance.

Adding memory for such applications involves more than just plugging more DIMMs onto the same channel on a server’s PCB. Doing so adds capacitive load, which demands more power and slows signal edges, limiting operating frequencies. The extra DIMMs also create more discontinuities in the impedance of the signal lines, which must be countered by more sophisticated equalisation in the PHYs driving them.
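
As a back-of-the-envelope illustration of the loading effect (the values here are hypothetical, not taken from the webinar): treating the channel as a simple RC network, each extra DRAM load adds input capacitance and stretches the 10–90% rise time, t_r ≈ 2.2·R·C.

```python
# Rough sketch of how added loads slow signal edges on a shared channel.
# Driver impedance and per-pin capacitance below are illustrative values,
# not figures from the webinar or any datasheet.

DRIVER_IMPEDANCE_OHMS = 34.0   # assumed output driver strength
PIN_CAPACITANCE_PF = 1.5       # assumed capacitance per DRAM input pin

def rise_time_ns(n_loads: int) -> float:
    """10-90% RC rise time: t_r = 2.2 * R * C_total, in nanoseconds."""
    c_total_farads = n_loads * PIN_CAPACITANCE_PF * 1e-12
    return 2.2 * DRIVER_IMPEDANCE_OHMS * c_total_farads * 1e9

for loads in (1, 2, 4, 8):
    print(f"{loads} load(s): rise time ~ {rise_time_ns(loads):.3f} ns")
```

Rise time scales linearly with the number of loads in this simplified model, which is why doubling the DIMMs on a channel quickly eats into the timing budget at DDR4 speeds.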

The webinar goes on to compare three alternative ways to add more DRAM, using unbuffered (UDIMM), registered (RDIMM) or load-reduced (LRDIMM) modules, before taking a look at the promise of die-stacking techniques to increase capacity without boosting line loading. It also explains why simply building bigger memory die may not be as straightforward a solution as it appears.

There is a discussion of some of the advanced equalisation techniques necessary to transmit and receive data at the 3200Mbit/s data rates of DDR4, and a look at the challenges of building effective controllers for high-speed multibank systems that use these fast memories.
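
One widely used technique in this class is decision-feedback equalisation (DFE), in which the receiver subtracts a scaled copy of the previously decided symbol from each incoming sample to cancel post-cursor inter-symbol interference. The webinar does not detail any particular implementation; the toy one-tap DFE below (tap weight and signal levels are assumptions) only illustrates the principle.

```python
def dfe_one_tap(samples, tap=0.25, threshold=0.0):
    """Minimal one-tap decision-feedback equaliser sketch.

    Each received sample is corrected by subtracting `tap` times the
    previous *decided* symbol (levels +1/-1), cancelling first
    post-cursor inter-symbol interference, then sliced to a bit.
    """
    decisions = []
    prev_symbol = -1.0  # assume the line idled at logic 0
    for sample in samples:
        corrected = sample - tap * prev_symbol
        bit = 1 if corrected > threshold else 0
        decisions.append(bit)
        prev_symbol = 1.0 if bit else -1.0
    return decisions

# Samples distorted by 25% post-cursor ISI recover cleanly:
print(dfe_one_tap([0.75, -0.75, -1.25, 0.75]))
```

Real DDR4 receivers combine several such taps with other equalisation stages, and the feedback loop must close within one bit time at 3200Mbit/s, which is what makes these circuits hard to build.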

Enterprise systems have much tougher requirements than consumer-grade computers for reliability (not failing), availability (keeping going even after a failure) and serviceability (support for diagnostics). Error-correcting codes (ECC) are a useful way to support these goals, and the webinar explores some of the trade-offs involved in various ECC implementations.
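
ECC DIMMs typically store eight check bits alongside each 64-bit word, enough for single-error-correct, double-error-detect (SECDED) protection. The same principle can be seen at small scale in a Hamming(7,4) code; this sketch is purely illustrative, not the scheme any particular product uses.

```python
def hamming74_encode(data4):
    """Encode 4 data bits into a 7-bit Hamming codeword.

    Parity bits sit at positions 1, 2 and 4 (1-based), each covering
    the codeword positions whose index has that bit set.
    """
    d = data4
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_correct(code7):
    """Return (corrected data bits, 1-based error position or 0)."""
    c = list(code7)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # points at the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1          # correct the single-bit error
    return [c[2], c[4], c[5], c[6]], syndrome
```

Adding one more overall parity bit turns this into a SECDED code, letting the controller distinguish a correctable single-bit flip from an uncorrectable double-bit error, which is the kind of trade-off (overhead versus coverage) the webinar walks through.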

It also covers some of the challenges of implementing memory refresh, the advantages of DDR4’s databus inversion feature, which limits the crosstalk created when large numbers of bits switch simultaneously, and techniques for making command and address signals more robust.
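
The idea behind databus inversion (DBI) is simple: if driving the next byte would toggle most of the data lines at once, send its complement and assert a DBI flag so the receiver can undo the inversion. The sketch below shows a transition-counting rule for illustration; the exact encoding DDR4 uses is defined in the JEDEC specification and differs in detail.

```python
def dbi_encode(prev_byte: int, byte: int):
    """Toy transition-limiting DBI encoder.

    If sending `byte` after `prev_byte` would toggle more than 4 of the
    8 data lines, transmit the inverted byte with the DBI flag set, so
    at most 4 lines ever switch simultaneously.
    """
    transitions = bin((prev_byte ^ byte) & 0xFF).count("1")
    if transitions > 4:
        return (~byte) & 0xFF, 1   # inverted data, DBI line asserted
    return byte, 0                 # data sent as-is
```

Capping simultaneous transitions reduces both crosstalk between neighbouring lines and the current spikes drawn by the output drivers.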

There’s also a look at Synopsys’ range of DDR4 IP for enterprise applications, which has been designed to address many of the issues described above.

Watch the webinar here.

