The International Test Conference got into full stride on Tuesday (September 20th) and inevitably the word you heard a lot was ‘bottleneck’.
There’s no questioning the importance of test in terms of time-to-market, and it’s an issue that splits two ways: technology and management. For a necessarily quick blog post, I’m going to set aside technology here and offer a few observations on the management side.
An interesting presentation from a team at NXP Semiconductors looked at the problematic issue of analog test. The team observes that AMS/RF test represents somewhere in the region of 60-80% of test cost (by contrast, research from Infineon Technologies suggests that memory and digital test together may represent only 13%).
That obvious brake on time-to-market is one that NXP is now looking to address through fast sensitivity analysis. Applying the strategy to an automotive IC (a market that demands zero defects), the company believes it can halve the necessary test run from 400 cases to 200, even with the quality constraints it faces.
We’ll look more closely at the specifics of how it achieved that in a more detailed technical article once the conference closes. One overarching observation, though: as much as the company uses technology to analyze the value of its various test cases and winnow them down, the work is being done outside the tool arena and more as a control strategy.
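NXP’s actual sensitivity analysis isn’t detailed in this post, but the general idea of winnowing a test set — keep only the tests that add marginal fault coverage — can be sketched as a simple greedy selection. Everything here (the test names, the fault sets, the coverage target) is invented for illustration:

```python
# Hypothetical sketch of test-set pruning by marginal coverage.
# Not NXP's method: test names and fault sets are invented.

def prune_tests(tests, required_faults):
    """Greedily keep the test that detects the most not-yet-covered
    faults, until the required faults are covered or no test helps."""
    selected = []
    covered = set()
    remaining = dict(tests)  # test name -> set of faults it detects
    while covered < required_faults and remaining:
        # Rank remaining tests by how many *new* faults each adds.
        name, faults = max(remaining.items(),
                           key=lambda kv: len(kv[1] - covered))
        if not (faults - covered):
            break  # no remaining test adds coverage
        selected.append(name)
        covered |= faults
        del remaining[name]
    return selected, covered

tests = {
    "t_gain":   {"f1", "f2", "f3"},
    "t_offset": {"f2", "f3"},   # fully shadowed by t_gain
    "t_psrr":   {"f4"},
    "t_noise":  {"f3", "f4"},   # shadowed by t_gain + t_psrr
}
kept, covered = prune_tests(tests, {"f1", "f2", "f3", "f4"})
# Two of the four tests suffice for full coverage here.
```

On real silicon the “value” of a test would come from measured sensitivities and defect data rather than known fault sets, and a zero-defect market would impose hard constraints on what can be dropped — which is exactly why this ends up being a control-strategy question rather than a pure tooling one.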
A presentation from Qualcomm and OptimalTest had a very similar flavor. Tools take you so far but then you have to manage the supply chain and the mushrooming data that today’s highly integrated designs create.
Again, the ‘push-pull’ model that Qualcomm uses here is interesting. It pushes tested die to a die bank and its SATs then pull those die and build them to the fabless giant’s specifications.
What that requires is visibility across the supply chain, compatibility in the exchange of data and sharing in ways that engineers, managers and subcontractors can use as quickly as possible to improve yield and catch problems early.
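The push-pull flow described above — the fab pushes tested die into a die bank, and each SAT pulls die against a build specification — amounts to a shared producer/consumer inventory. A minimal sketch, with invented part numbers and a made-up spec format (Qualcomm’s actual systems are of course far richer):

```python
from collections import deque

# Illustrative model of a push-pull die bank. All names are hypothetical.

class DieBank:
    def __init__(self):
        self._bins = {}  # part number -> FIFO queue of tested die IDs

    def push(self, part, die_id):
        """Fab side: deposit a tested die into the bank."""
        self._bins.setdefault(part, deque()).append(die_id)

    def pull(self, part, qty):
        """SAT side: draw up to qty die against a build spec."""
        bin_ = self._bins.get(part, deque())
        return [bin_.popleft() for _ in range(min(qty, len(bin_)))]

bank = DieBank()
for i in range(5):
    bank.push("PN-100", f"die-{i}")   # fab pushes five tested die
order = bank.pull("PN-100", 3)        # a SAT pulls three to build
```

The interesting problems sit around this loop rather than inside it: every push and pull carries test data that the fabless company, its foundry and its SATs all need to see in compatible form, quickly enough to act on yield excursions.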
And there’s this staggering number from Qualcomm: the company’s designs are now generating something in the region of 4TB of data every quarter. That volume is only likely to increase, and it places a huge internal IT management burden on the company, not only to curate the data on a day-to-day basis but also to maintain systems that can keep doing so over the near-to-middle term without repeated rethinks of its quality and engineering processes.
Issues such as analog test (and increasingly the further burdens imposed by proliferating I/Os and the arrival of ‘true’ 3D design in all its variants) will be ‘hot buttons’ for some time to come. They will dominate attention, but behind them the complexity of control over test procedures is the one that appears to be pushing itself to the front of users’ thoughts.