At the CICC conference earlier this week, held online because of the Covid-19 outbreak, Intel director of circuit research Vivek De described the need for IoT-oriented SoCs to become much more adaptable to power, temperature and ageing while not falling victim to the kinds of physical hacks such devices might suffer.
Similar to the Razor system developed at the University of Michigan more than a decade ago, Intel has been working on prototypes that use error tracking and adaptive clocking to remove the margins that lead to excess energy consumption in low-voltage designs. In effect, the pipeline manager continually tunes the clock and voltage to keep critical paths completing just in time to beat the clock edge. Should a voltage droop lead to a missed deadline, error detection kicks in and re-executes the affected instruction. If there are too many errors, the clock speed drops a notch or the voltage is pushed up.
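The control loop described above can be sketched in a few lines. This is a toy model, not Intel's implementation: the error probability, step sizes and error budget are invented purely to show how replay counts feed back into frequency and voltage.

```python
import random

# Hypothetical sketch of a Razor-style adaptive voltage/frequency loop.
# All thresholds, step sizes and the timing-error model are illustrative.

V_MIN, V_MAX = 0.55, 1.0      # volts
F_MIN, F_MAX = 200, 1000      # MHz
ERROR_BUDGET = 3              # replays per window before backing off

def run_window(voltage, freq, n_ops=1000):
    """Execute one window of instructions and count timing errors.

    In this toy model an error becomes more likely as frequency rises
    beyond what the current voltage can sustain."""
    margin = voltage * 1000 - freq          # crude headroom proxy
    p_err = max(0.0, -margin / 1000)
    # Each error would trigger a replay of the affected instruction.
    return sum(random.random() < p_err for _ in range(n_ops))

def adapt(voltage, freq):
    errors = run_window(voltage, freq)
    if errors > ERROR_BUDGET:
        # Too many replays: drop the clock a notch, or raise the voltage.
        if freq > F_MIN:
            freq = max(F_MIN, freq - 25)
        else:
            voltage = min(V_MAX, voltage + 0.05)
    elif errors == 0:
        # Zero errors means wasted margin: push the frequency back up.
        freq = min(F_MAX, freq + 25)
    return voltage, freq
```

Run repeatedly, the loop settles into hovering just below the point where errors appear, which is exactly the margin-recovery behaviour the prototypes aim for.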
Similar controls can help deal with ageing effects. De proposed the deployment of numerous on-die sensors to track not just temperature, voltage and current but also the degradation of cores caused by stress and ageing, all feeding into an onchip controller. As the degradation suffered by cores will depend on how they are used in the field, one possibility is to deploy a number of cores in parallel and tune how they are used over time. As one becomes overstressed, others take over, in a manner similar to wear-levelling in flash memory. Alternatively, work might be distributed across more cores over time so that individual processors can operate at a lower voltage and with less stress on the circuitry. That calls for a lot of interaction between software, firmware and hardware, as well as a very fine-grained power-delivery system.
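The wear-levelling analogy can be made concrete with a minimal scheduler sketch. The class and stress model here are hypothetical; in a real system the stress increment would come from the on-die degradation sensors De described rather than a fixed per-task constant.

```python
# Illustrative sketch of wear-levelling across cores, by analogy with
# flash wear-levelling. Names and the stress model are invented.

class Core:
    def __init__(self, cid):
        self.cid = cid
        self.stress = 0.0   # ageing proxy, fed by on-die sensors in reality

def pick_core(cores):
    """Route the next task to the least-stressed core."""
    return min(cores, key=lambda c: c.stress)

def schedule(cores, tasks, load_per_task=1.0):
    for _ in range(tasks):
        pick_core(cores).stress += load_per_task
    return [c.stress for c in cores]

cores = [Core(i) for i in range(4)]
print(schedule(cores, tasks=8))   # → [2.0, 2.0, 2.0, 2.0]
```

Because each task lands on the least-worn core, accumulated stress stays even across the pool, so no single core ages out ahead of the others.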
Intel has shown a number of onchip voltage-regulator designs and components at conferences such as the VLSI Circuits Symposium, IEDM and ISSCC over the years. Bringing regulation onchip makes it possible to tune voltage and current for individual cores rather than the larger islands targeted by offchip power-management units, though it raises issues of process compatibility and their effect on the power architecture. Intel’s current thinking is a two-stage delivery system based on a digital buck-regulator architecture coupled with finer-grained digital LDO (DLDO) units. “These convert a low-current supply to a high current as close to the point of use as possible,” De said. The buck has higher overall efficiency and a better output range, but the DLDO is easier to integrate and deploy over individual execution units.
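A back-of-the-envelope calculation shows why the two stages are split this way. A linear regulator's efficiency is roughly its output voltage divided by its input voltage, so the DLDO's headroom must be kept small; the numbers below are illustrative, not Intel figures.

```python
# Rough efficiency of a two-stage chain: a buck converter drops the
# input to an intermediate rail, then a digital LDO regulates down to
# the point of use. All voltages and efficiencies are made up.

def ldo_efficiency(v_out, v_in):
    # A linear regulator burns the headroom, so efficiency ~ Vout/Vin.
    return v_out / v_in

def chain_efficiency(eta_buck, v_mid, v_core):
    return eta_buck * ldo_efficiency(v_core, v_mid)

# Small LDO headroom preserves overall efficiency...
print(round(chain_efficiency(0.90, 0.95, 0.85), 2))  # → 0.81
# ...while a large intermediate rail throws much of it away.
print(round(chain_efficiency(0.90, 1.20, 0.85), 2))  # → 0.64
```

This is why the buck stage, with its better conversion efficiency, does the heavy lifting, and the easily integrated DLDOs only trim the final rail per execution unit.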
Such fine-grained control could provide attackers with detailed clues as to what the processor is doing and, by analyzing that activity, let them extract information that should remain secret, such as cryptographic keys. De said that, though these attacks tend to require physical access, the nature of the IoT means there is a good chance an attacker will be able to get it. “The devices must be protected against tampering and side-channel attacks. If they aren’t, they can compromise the security of the entire system.”
As well as building IP that resists active, glitching-type attacks, Intel’s engineers worked on a way of controlling the power regulators themselves to disguise their operation from an attacker observing the system passively. “The voltage regulator control loops provide small signal transformations,” De said, effectively desynchronizing incremental voltage changes from crypto operations. “It makes it harder for the attacker to measure power for information extraction,” De said, adding that the approach makes it possible to trade performance against attack resistance.
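The desynchronization idea can be sketched as random dithering of the regulator's setpoint. The function and its parameters below are hypothetical illustrations of the principle, not Intel's control law.

```python
import random

# Hypothetical sketch: apply small, randomly sized voltage steps through
# the regulator's control loop so that the supply-current trace no longer
# lines up with the crypto datapath's activity. Parameters are invented.

def dithered_setpoints(base_mv, n_steps, max_step_mv=10, seed=None):
    rng = random.Random(seed)
    # Each control-loop update perturbs the setpoint by a small bounded
    # amount, kept small enough that logic timing still closes.
    return [base_mv + rng.randint(-max_step_mv, max_step_mv)
            for _ in range(n_steps)]

trace = dithered_setpoints(850, 8, seed=1)
```

Tightening `max_step_mv` reduces the performance cost of the dithering but also its masking effect, which is the performance-versus-resistance trade-off De mentioned.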
The question is when to run the attack countermeasures. One possibility is to train a system to detect the kinds of activity that point to side-channel attacks. As these are statistical techniques, they need a lot of data to work, a factor that might be turned against the attacker: a heavily loaded device might start to slow down operations – a tactic often deployed against glitching attacks – and beef up its countermeasures.
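A detector exploiting the attacker's need for volume might look like the sketch below. The class, threshold and slowdown factor are all invented for illustration; a real detector would likely be a trained classifier over richer telemetry.

```python
from collections import Counter

# Sketch of starving a statistical side-channel attack of data: watch
# for an unusually high volume of identical crypto requests and respond
# by slowing operations and arming countermeasures. Numbers are invented.

RATE_THRESHOLD = 100   # repeated invocations per window deemed suspicious

class SideChannelMonitor:
    def __init__(self):
        self.window = Counter()
        self.countermeasures_on = False

    def record(self, op_signature):
        self.window[op_signature] += 1
        if self.window[op_signature] > RATE_THRESHOLD:
            # Statistical attacks need many traces; deny them the volume.
            self.countermeasures_on = True

    def delay_factor(self):
        # Slow suspicious workloads down, a tactic also used against
        # glitching attacks, and leave normal traffic at full speed.
        return 10 if self.countermeasures_on else 1
```

Because countermeasures only arm under suspicious load, a device running normal traffic pays no slowdown at all.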
Some side-channel attacks might even be detected directly. De described the use of inductance sensors next to individual cores to determine whether an EM probe has been placed close to a cryptocore. Once a probe is detected, the device could activate its full countermeasures, but save power by leaving most of them disabled at other times. In doing so, the area of the average IoT device will creep up, but its power consumption will be kept to a manageable level.