Chiplets may have to prove themselves for secure operation

By Chris Edwards |  No Comments  |  Posted: November 8, 2021
Topics/Categories: Blog - EDA, IP, PCB

How much can you trust the chiplets inside a multichip package? Not nearly as much as you would like, explained University of Florida professor Mark Tehranipoor in an online seminar organized by packaging trade group MEPTEC in early November.

Trust will be a particular problem for devices bought off the shelf that need to work in a sensitive environment, Tehranipoor argued: “When you receive a chiplet from the market and you want to use it in your system, you have very limited ability to perform security and trust verification.”

Though purchasers have the option to run various penetration and fuzzing tests to try to trigger any covert hardware Trojans, he argued that most Trojans will be designed to react only to specific triggers, which makes them tough to detect using these techniques.

On the other hand, teams responsible for a device produced by an untrusted supply chain can at least compare the design they produced with the physical copies they receive: they can check that the parts behave as expected, or decap and analyse samples to check that the layout matches the GDSII. Even this method is not perfect, as Trojans implanted to weaken key generators have proved extremely difficult to detect even with a full physical analysis. However, the assumption is that triggerable covert circuits will stand out in a one-to-one comparison simply by virtue of appearing as additional wiring and transistors.

Chiplet provers

One option Tehranipoor put forward is to bring that element of physical comparison to commercial off-the-shelf products. His group at Florida and the startup he founded, Caspia Technologies, have worked on backside imaging to make it easier to analyse the physical layout without going through a full decapping process. The substrate of the silicon does need to be etched down to a very thin layer to improve the resolution of the imaging, but Tehranipoor said it could be a relatively quick process, on the order of hours.

What the technique does need is cooperation from the chiplet vendor and their willingness to provide the source layout as sent to the foundry. “It requires that chiplet companies come in and be part of this effort,” he said.

However, rather than demand the vendor provide detailed and proprietary design data to its security-conscious customers, Tehranipoor envisages third-party “provers” acting as the interface. They would receive the GDSII under non-disclosure terms and perform the imaging to compare the designs. The process envisaged by Tehranipoor uses a combination of image processing and machine learning to generate suitable high-resolution data that can be compared against the original design.

“The whole process takes around 30 hours today. Our goal is to bring it down to 10 hours. That requires more efficient image processing and pattern recognition but I believe it’s possible to do that,” he claimed.
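The core of that comparison step can be illustrated with a toy model. The sketch below is hypothetical and not Tehranipoor's actual pipeline: it assumes both the reference GDSII and the reconstructed backside image have already been reduced to aligned binary occupancy grids, and simply flags any feature present on the imaged die but absent from the reference as candidate extra wiring or transistors.

```python
# Hypothetical sketch: compare a reconstructed backside image against the
# reference layout. Both are assumed to be pre-aligned binary occupancy
# grids (1 = metal/active area present). Cells present in the image but
# absent from the reference are flagged as possible Trojan structures.

def diff_layout(reference, imaged):
    """Return (x, y) coordinates where the imaged die has features the reference lacks."""
    suspicious = []
    for y, (ref_row, img_row) in enumerate(zip(reference, imaged)):
        for x, (ref_cell, img_cell) in enumerate(zip(ref_row, img_row)):
            if img_cell and not ref_cell:  # extra structure on the die
                suspicious.append((x, y))
    return suspicious

reference = [
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]
imaged = [
    [1, 1, 0, 0],
    [0, 1, 1, 1],  # one extra cell at (3, 1)
    [0, 0, 1, 1],
]

print(diff_layout(reference, imaged))  # [(3, 1)]
```

In practice the hard part is everything this sketch skips: the image processing and machine learning needed to turn raw backside images into a grid accurate enough for such a comparison, which is where the 30-hour runtime goes.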

Chiplet-based design provides an opportunity to rethink the way hardware is assembled and used. Instead of having to trust that devices are authentic and reliable, devices can be forced to verify themselves at runtime. Tehranipoor proposed using extensions of techniques that have been considered for PCB-level integration and for systems that need to verify upgrades and spare parts in the field.

Locked logic

In this model, chips are expected to implement logic locking, in which they do not activate unless they can verify the environment in which they are running. This generally takes place through some kind of authentication process based on public-key infrastructure. Under this model, each chiplet that needs security verification contains a chiplet security IP (CSIP) core that implements the in-package protocol as well as functions such as an odometer to determine whether the chiplet has been activated before and for how long it has operated. The odometer functions may work in concert with a blockchain or some other kind of shared-ledger system to ensure that compromised parts cannot have their clocks wound back should an attacker find a way to reset counters.
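A minimal sketch of this activation flow might look like the following. It is an illustration, not the protocol described in the talk: a real design would use asymmetric public-key authentication, whereas HMAC with a shared provisioning key stands in here to keep the example self-contained, and the `Chiplet`/`hsm_activate` names are invented for the example.

```python
import hashlib
import hmac
import os

class Chiplet:
    """Toy model of a logic-locked chiplet with an activation odometer."""

    def __init__(self, key):
        self._key = key          # provisioned secret (PKI key pair in a real part)
        self.activations = 0     # odometer: how many times the part was unlocked
        self.unlocked = False

    def respond(self, challenge):
        # Prove possession of the key without revealing it.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

    def unlock(self, tag, challenge):
        # Only activate if the other side also knows the key.
        if hmac.compare_digest(tag, self.respond(challenge)):
            self.unlocked = True
            self.activations += 1
        return self.unlocked

def hsm_activate(chiplet, key, max_prior_activations=0):
    """Security-module side: authenticate the chiplet, rejecting used parts."""
    # Odometer policy: a 'new' part with prior activations may be recycled.
    if chiplet.activations > max_prior_activations:
        return False
    challenge = os.urandom(16)
    tag = hmac.new(key, challenge, hashlib.sha256).digest()
    return chiplet.unlock(tag, challenge)

key = b"provisioned-secret"
print(hsm_activate(Chiplet(key), key))   # True: fresh part authenticates and unlocks

recycled = Chiplet(key)
recycled.activations = 3                 # odometer betrays prior use
print(hsm_activate(recycled, key))       # False: rejected as a possible recycled part
```

The shared-ledger element described above would sit behind the odometer check, so that resetting the on-chip counter alone would not be enough to pass the recycling test.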

Each chiplet is then interrogated by a chiplet hardware security module (CHSM) that in Tehranipoor’s model is loaded into an FPGA at boot time. The use of an FPGA provides a degree of flexibility in how I/Os can be routed to and from the CHSM as well as the ability to run a number of runtime functions. It may also contain tamper-detection sensors in addition to those that might be integrated into the core chiplets as well as the logic-unlocking code needed to fully activate the multichip module. The CHSM may also route signals to the CSIP from foundry or assembly house machines.

“Every one of those chiplets can communicate with an HSM in the foundry environment,” Tehranipoor said. “And that HSM can communicate with the cloud. It allows you to control the test process remotely: it does not allow any overproduction or remarking of the parts to happen.”

As a result, Tehranipoor sees the mechanisms enforced by the CSIP and CHSM combination as useful weapons against part recycling, a common trick used by counterfeiters in which faulty or simply old parts are recovered from existing hardware and put back into the supply chain as ’new’ parts. Another possible avenue is protection against ransomware attacks, implemented using a form of side-channel analysis carried out in the FPGA that determines whether firmware has changed from its reference implementation.
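One common way to frame that kind of side-channel check, sketched below under stated assumptions, is to correlate a measured power or EM trace against a golden trace captured from known-good firmware; modified firmware executes differently and drags the correlation down. This is a generic illustration of the idea, not the specific analysis Tehranipoor's CHSM performs, and the threshold value is arbitrary.

```python
# Hypothetical sketch: detect firmware modification by correlating a measured
# side-channel trace against a golden reference trace (Pearson correlation).

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    std_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    std_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (std_a * std_b)

def firmware_unchanged(reference_trace, measured_trace, threshold=0.95):
    """Flag firmware as unmodified only if the traces correlate strongly."""
    return pearson(reference_trace, measured_trace) >= threshold

golden   = [0.1, 0.4, 0.9, 0.3, 0.7, 0.2]            # trace from reference firmware
same     = [0.11, 0.41, 0.88, 0.31, 0.69, 0.21]      # noisy run of the same code
tampered = [0.5, 0.1, 0.2, 0.9, 0.1, 0.8]            # different execution profile

print(firmware_unchanged(golden, same))      # True
print(firmware_unchanged(golden, tampered))  # False
```

Real traces would of course need alignment and averaging over many runs before a simple correlation test like this would be meaningful.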

