Hardware description languages such as SystemVerilog use the symbol 'X' to describe any unknown logic value. If a simulator is unable to decide whether a logic value should be a '1', '0', or 'Z' for high impedance, it will assign an X. This causes problems for two reasons. The first is that an X may be converted inadvertently to a 'known' state by overly optimistic simulation code. The second is that gate simulators can generate excess X values because they generally apply more pessimistic rules.
The situation is not helped by the traditional use of the X to express 'don't-care' conditions for the purposes of synthesis, as well as for an unknown state in simulation.
X states can propagate through a simulation, multiplying uncertainty and potentially hiding bugs.
This has become a bigger issue in recent years because of the use of power-gating architectures to save energy. In this design approach, a powered-down block no longer drives reliable signals into downstream logic, which may not have been designed to cope with indeterminate inputs. As blocks move between power states, errors can propagate across the chip. Any logic block that has not been properly reset may also generate X values.
For example, a simulator will apply an X to any memory location that has not been initialized. Reset logic is expensive, particularly in terms of routing overhead, so it will rarely be feasible to apply a reset to every memory element at restart. Logic that has been designed with uninitialized memories and a weak reset strategy may be prone to more undiscovered bugs due to X propagation.
A common source of unwanted X optimism is when downstream logic states are assigned using 'if-then-else' or 'case' statements. Because an X will not satisfy the logic test, the block falls through to the else branch or the default case. This may convert the X to a 'known' value or propagate it further into the simulation, masking a bug.
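This optimistic if-else behaviour can be sketched with a small Python model of three-valued logic ('0', '1', 'x'); the function and the toy mux are purely illustrative, not any vendor's code:

```python
# Minimal 3-value model: each signal is '0', '1' or 'x' (unknown).
# In Verilog, "if (sel)" is true only when sel evaluates to 1, so an
# X on sel silently falls into the else branch -- X optimism.

def mux_if_style(sel, a, b):
    """Mimics: if (sel) out = a; else out = b;"""
    if sel == '1':
        return a
    return b          # 'x' and '0' both land here

# An unknown select quietly picks b, and the X vanishes:
print(mux_if_style('x', '1', '0'))  # prints 0 -- the X is gone
```

The uncertainty on `sel` leaves no trace in the output, which is exactly how a bug can hide.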
X pessimism can occur when signals converge, for example in a multiplexer. The simulator must assign an X to the output whenever an X on one input is not overridden by the other known signals feeding the block, even in cases where real hardware would resolve to a known value.
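The pessimistic case can be modelled the same way at the gate level. Below is an illustrative Python sketch of a 2:1 mux built from 4-state gate primitives (Z folded into X for simplicity); when the select is X but both data inputs are 1, real hardware would output 1, yet the gate model still reports X:

```python
# 3-value gate primitives ('0', '1', 'x'), following Verilog's
# 4-state truth tables with Z folded into X.

def v_not(a):
    return {'0': '1', '1': '0'}.get(a, 'x')

def v_and(a, b):
    if a == '0' or b == '0':
        return '0'          # a known 0 dominates AND
    if a == '1' and b == '1':
        return '1'
    return 'x'

def v_or(a, b):
    if a == '1' or b == '1':
        return '1'          # a known 1 dominates OR
    if a == '0' and b == '0':
        return '0'
    return 'x'

def mux_gate_level(sel, a, b):
    """out = (sel & a) | (~sel & b) -- a gate-level 2:1 mux."""
    return v_or(v_and(sel, a), v_and(v_not(sel), b))

# Both data inputs are 1, so the hardware output is 1 regardless
# of sel -- but the gate-by-gate evaluation still yields X:
print(mux_gate_level('x', '1', '1'))  # prints x
```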
There are a number of ways of dealing with X propagation. One is to analyze the waveform generated by a simulator – many simulators color-code these signals to make them easier to pick out. However, this involves painstaking manual inspection and design insight to work out whether the X is dealt with correctly or not.
Some simulators can be set up to generate random values in place of Xs, on the basis that differences in behaviour with otherwise identical input vectors should point to X propagation issues. However, the errors caused by X propagation can be subtle and only turn up in rare cases, which may not be encountered during most simulation runs. Another possibility is more exhaustive simulation.
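The random-substitution idea can be sketched as follows; the `dut` function and stimulus here are invented stand-ins for a real design and test vector, not any simulator's mechanism:

```python
import random

# Sketch of random X-substitution: rerun the same stimulus with
# each X replaced by a fresh random 0/1 choice per seed; if the
# outputs differ between runs, an X is reaching visible behaviour.

def substitute_xs(vector, rng):
    """Replace every 'x' bit with a random '0' or '1'."""
    return [rng.choice('01') if bit == 'x' else bit for bit in vector]

def dut(vector):
    """Toy stand-in design: 2-state AND of the first two bits."""
    return '1' if vector[0] == '1' and vector[1] == '1' else '0'

stimulus = ['x', '1']   # one unknown input bit
results = {dut(substitute_xs(stimulus, random.Random(seed)))
           for seed in range(32)}
# More than one distinct result means the X affects the output.
print(sorted(results))
```

As the article notes, this only flags the problem if the affected behaviour actually appears in the vectors that were run.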
The 'xprop' simulation technology employed by recent versions of Synopsys' VCS will replace every X it encounters with both a 0 and 1 to calculate all possible values, and then merge them to decide which value should be driven to the output. VCS employs a number of merge techniques to reflect different expectations, including a pessimistic approach more akin to gate-level simulation, and a more hardware-like scenario in which any output that cannot be merged to a known value is converted to an X.
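The substitute-and-merge scheme can be illustrated in Python; the function names and the single merge rule shown (agreeing results keep their value, disagreements become X) are an assumption for illustration, not VCS internals:

```python
from itertools import product

# Sketch of an xprop-style evaluation: replace each X input with
# every 0/1 combination, evaluate in 2-state logic, then merge.

def xprop_eval(func, inputs):
    """Evaluate func over all 0/1 substitutions of the X inputs."""
    x_positions = [i for i, v in enumerate(inputs) if v == 'x']
    results = set()
    for combo in product('01', repeat=len(x_positions)):
        trial = list(inputs)
        for pos, val in zip(x_positions, combo):
            trial[pos] = val
        results.add(func(trial))
    # Merge: a single agreed value survives, otherwise X.
    return results.pop() if len(results) == 1 else 'x'

def and_gate(v):
    return '1' if v[0] == '1' and v[1] == '1' else '0'

print(xprop_eval(and_gate, ['x', '0']))  # prints 0: the 0 masks the X
print(xprop_eval(and_gate, ['x', '1']))  # prints x: both outcomes possible
```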
The CVC simulator from Tachyon-DA takes a different approach, changing the default Verilog semantics so that any X is treated as possibly being a 0, a 1 or an X.
On the basis that prevention is better than cure, Mike Turpin's seminal 2003 SNUG Boston paper "The dangers of living with an X" contained a number of recommendations for writing HDL that is more likely to avoid X propagation, as well as advice on verification techniques:
"For one-hot logic on a critical path, write the RTL directly in a sum-of-products form (rather than case) and add a one-hot assertion checker.
"Avoid using if-statements, as they optimistically interpret Xs. Instead use ternary (that is, conditional ?) operators or priority-encoded case statements.
"For case statements, cover all reachable 2-state values with case-items and always add a default (but only use it to assign Xs, to avoid X-optimism)."
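The first recommendation can be illustrated with a Python model of three-valued logic; this is my own sketch of the sum-of-products style for a one-hot select, not code from the paper:

```python
# Sum-of-products one-hot mux in 3-value logic ('0', '1', 'x'):
# out = (sel0 & a) | (sel1 & b), plus a companion one-hot check.

def v_and(a, b):
    if '0' in (a, b):
        return '0'
    return '1' if (a, b) == ('1', '1') else 'x'

def v_or(a, b):
    if '1' in (a, b):
        return '1'
    return '0' if (a, b) == ('0', '0') else 'x'

def sop_mux(sel0, sel1, a, b):
    """X on an unselected data leg is masked by the 0 select bit."""
    return v_or(v_and(sel0, a), v_and(sel1, b))

def is_one_hot(sel0, sel1):
    """One-hot assertion: exactly one select bit must be '1'."""
    return [sel0, sel1].count('1') == 1

print(sop_mux('1', '0', '1', 'x'))  # prints 1: unselected X is masked
print(sop_mux('x', '0', '1', '0'))  # prints x: unknown select is reported
```

Unlike the if-style mux, this form is X-accurate: it masks an X on an unselected leg but still reports an X when the select itself is unknown.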
However, for complex control logic, it can be difficult to be sure that these coding techniques lead to an X-accurate, rather than X-optimistic or X-pessimistic, design.
Some of these issues can be detected by linting tools such as Ascent Lint. Cadence Design Systems is working on a combination of formal techniques that will yield what the company currently calls "super linting", and which will form part of a range of verification 'apps'.
Formal verification can be used to exhaustively prove that Xs in the design are unreachable, or will not affect the correct operation of the logic. Techniques vary, but in the main they divide into automatic property checking to prove that an X is unreachable, and interactive property checking to prove that an X is never stored in a register. Another possibility is to use automatic property checking to investigate code coverage reports.
Vendors have added specific support for X propagation checks to their formal-verification suites. For example, Mentor Graphics' Questa AutoCheck is designed to identify common errors automatically on a push-button basis. For issues such as X propagation, deadlock, combinational loops and overflow, the tool generates the assertions that perform the necessary coverage checks without input from the user.
As part of its range of apps based on its core formal-verification technology, Jasper Design Automation has introduced one for X propagation checks. The app treats each X-prone signal as both a 0 and a 1 to mimic hardware behavior. The tool provides a set of flows that checks clocks and resets, examines control structures to see which areas of the code Xs can propagate to, and determines whether Xs can be propagated out of the design.
Real Intent has proposed a methodology that combines design-oriented checks, assisted by linting tools, with formal tools such as Ascent XV focused on the verification stages. The methodology is described by Lisa Piper in this article at Tech Design Forum.