What is emulation?
Emulation is the use of a specialist computer (an ‘emulator’) to automatically map a system-level or RTL representation of a design onto its internal, often custom, programmable gate array for use in the functional verification of a design’s hardware and software.
The emulation market is currently undergoing major growth. Analyst Gary Smith has been quoted as saying it will rise in value from $118M in 2008 to $211M in 2012. Several factors lie behind this growth:
- The increasing gate count, even for today’s medium-sized system-on-chip designs, is leading to what many project managers consider unacceptably long RTL simulation software runtimes.
- Time-to-market pressures mean that strategies that enable more efficient hardware/software co-verification are becoming increasingly attractive. Emulation offers this despite a comparatively high upfront cost.
- Today’s emulators are architected both for use in the cloud (so they can be shared by geographically distributed design teams) and for scalability (several units can be connected on the same project to further reduce runtimes).
- Emulators offer shorter compile times as well as far greater debug and waveform visibility than FPGA prototyping.
- The system-level capabilities of today’s emulators enable their use earlier in design projects than before to assist in decisions made at higher levels of abstraction. This also helps justify their cost.
Why do companies use emulation?
Emulation used to be a ‘rich man’s game’. Emulators would cost upwards of $1M, and RTL simulation in software could deliver similar results and debug visibility at lower cost within an acceptable time.
Today, emulators typically run at 1MHz (though speeds of up to 5MHz are claimed), while simulators run at around 100Hz for large designs (100M+ gates). This implies that you would need 10,000 simulators to match the speed of one emulator – in reality, most simulations run across many CPUs in a compute farm, but the runtime differences remain marked. What takes days in simulation can take only hours in emulation.
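The 10,000x figure above follows directly from the ratio of the quoted clock rates. A quick back-of-the-envelope sketch makes the days-versus-hours contrast concrete (the billion-cycle workload is an illustrative assumption, not a vendor figure):

```python
# Back-of-the-envelope runtime comparison using the rates quoted above.
EMULATOR_HZ = 1_000_000   # ~1MHz typical emulation speed (up to 5MHz claimed)
SIMULATOR_HZ = 100        # ~100Hz RTL simulation for a large (100M+ gate) design

speedup = EMULATOR_HZ / SIMULATOR_HZ
print(f"Speedup: {speedup:,.0f}x")                # → Speedup: 10,000x

cycles = 1_000_000_000    # one billion design clock cycles (assumed workload)
sim_days = cycles / SIMULATOR_HZ / 86_400         # 86,400 seconds in a day
emu_minutes = cycles / EMULATOR_HZ / 60
print(f"~{sim_days:.0f} days vs ~{emu_minutes:.0f} minutes")  # → ~116 days vs ~17 minutes
```

Note that even at a 10,000x cycle-rate advantage, long regressions still consume real wall-clock time on an emulator, which is one reason farms of connected units are attractive.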
At the same time, emulators can provide waveform outputs for debug. While RTL simulation offers more granularity, emulators include features that can guide the design team to the waveforms most likely to expose a problem.
By comparison with FPGA prototyping, emulators provide far greater debug visibility and can be stopped and restarted from the point where a bug or waveform of interest is encountered. Historically, FPGA prototypes had to complete their run, and then be restarted from the beginning once any issue had been encountered, although one recent innovation in the rival sphere claims to have overcome this issue.
Nevertheless, many project managers still feel that major designs should have simulation, emulation and FPGA prototyping stages, with the ‘raw’ FPGA being brought into play once the RTL is very mature.
Finally, the increasing importance of embedded software is perhaps the main source of the current growth wave for emulation.
The advanced emulators provided by the market’s three main players – Cadence Design Systems (Palladium XP), Mentor Graphics (Veloce) and Synopsys (ZeBu) – are all capable of handling both system-level (e.g. C, C++, transaction-level models) and RTL designs.
This enables software to be ported to the emulator early in the verification process, and to run relatively quickly, so that code can be written and debugged well ahead of its final analysis on an FPGA prototype. This can be a major advantage in terms of time-to-market.
As a result, emulators and emulation farms are becoming common in design companies of all sizes.
What other advantages does emulation offer?
In addition to the speed, abstraction, software verification and visibility benefits, emulators are now being sold as ‘green’ devices.
As well as being quicker than compute farm-based simulation, emulation arguably consumes far less energy than a software simulation approach – on the order of 2,000 times less, according to one vendor’s analysis.
Scalability means that while emulators continue to represent a high initial investment, they can be maintained and operated for much longer than the original standalone boxes. One obvious limitation is that features added in a new generation may not be portable to the legacy infrastructure.
Another growing area of activity and innovation is the interplay with tools elsewhere in the design flow.
In May 2012, Cadence announced a much closer integration of its Palladium XP emulator with other parts of its verification offering within a new-look System Development Suite. This allows essentially the same design files to be moved across interconnected simulation, emulation and prototyping (virtual and FPGA) platforms. The initiative appears to be a clear response to methodologies that address increasing complexity by drawing on all the verification and prototyping techniques as needed, rather than making either/or decisions between them.
In April 2012, Mentor announced a new generation of emulator, Veloce II. One key feature is closer integration with test and simulator program creation. This approach also looks toward multi-stage verification and prototyping strategies, but identifies the emulator as a “hub” for the various activities.
In October 2012, Synopsys announced that it was directly entering the emulation market with the acquisition of EVE, which developed the ZeBu suite of emulators. It will now integrate these products with its existing prototyping and verification products. This guide will add further details on how that integration is executed in due course.
What are the limitations of emulation?
RTL simulation continues to be the preferred environment for low-power analysis, one of the dominant metrics for many of today’s most complex SoCs. Such simulations may, to save time, be undertaken on blocks rather than entire designs. Indeed, simulators continue to offer the greater granularity of view for debug generally. However, some projects use a combination of emulation and simulation as a design matures, to balance views into the code against acceleration options (see below).
An emulator will not run at real-world speed. Nor, for many typical designs, will an FPGA prototype, though the latter will get a lot closer. As such, the FPGA may be needed for designs that must be exercised under end-use conditions, such as portable devices or those likely to encounter harsh environments.
Emulation does still require a larger initial investment than an FPGA prototyping board, although its proponents say this can be offset by use across multiple projects and also because compilation carries less risk than trying to take mature RTL and port it to a ‘standard’ array.
Innovation in emulation
In-circuit emulation (ICE) has been an important growth area for the technique. It allows the emulator to serve as a replacement for the intended device and have peripherals (USB, PCIe, etc.) connected to it. This is widely seen as another way in which emulators seek to secure some of the ground taken by FPGA prototypes in replicating real-world conditions.
Another area is hardware-assisted acceleration. Here, according to Cadence’s Rav Avinun, “The design is typically running on the hardware while a simulation testbench runs on the workstation. If the testbench is transaction-based, the result is transaction-based acceleration, or as some say, co-emulation.” Thus, you draw on the integration between the simulation and emulation environments (most emulators are designed to offer this feature).
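The advantage of the transaction-based variant can be sketched with a toy cost model. This is not any vendor’s API: it simply assumes each workstation-to-emulator round trip carries a fixed overhead, so batching many pin-level cycles into one transaction amortises that overhead (the 256-cycle burst size is an assumed figure):

```python
# Toy model: why transaction-based acceleration ("co-emulation") beats
# cycle-by-cycle, signal-level co-simulation. All figures are assumptions.

CYCLES_PER_TRANSACTION = 256   # e.g. one bus burst per transaction (assumed)
ROUND_TRIP_COST = 1.0          # fixed cost per workstation<->emulator hop

def signal_level_cost(total_cycles: int) -> float:
    """Pin-level lockstep: one round trip for every design clock cycle."""
    return total_cycles * ROUND_TRIP_COST

def transaction_level_cost(total_cycles: int) -> float:
    """Co-emulation: one round trip per whole transaction."""
    trips = total_cycles / CYCLES_PER_TRANSACTION
    return trips * ROUND_TRIP_COST

cycles = 1_000_000
ratio = signal_level_cost(cycles) / transaction_level_cost(cycles)
print(f"Communication overhead reduced by {ratio:.0f}x")  # → 256x
```

In this model, the saving scales directly with how much work each transaction encapsulates, which is why transaction-based testbenches are the natural fit for acceleration.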
With its Veloce II launch, Mentor Graphics introduced the concept of VirtuaLAB, which replaces external peripherals with RTL models of technologies like USB and Ethernet. The advantage is that the appropriate model is always ‘on hand’, and an environment can be created that geographically dispersed members of a design team can all access, rather than only those able to physically connect peripherals.