IC Manage expands big-data work
IC Manage is expanding its work on big data in EDA with the creation of a labs program that aims to work with clients on novel ideas for analyzing the gigabytes of output from chip-design tools.
At the Design Automation Conference in 2015, IC Manage launched its Envision product, which shows progress on projects based on data pulled from design-management tools. The company is now adding a tool that monitors functional verification results, with plans to expand into physical verification and timing analysis, among other areas.
“We believe that big data will have more impact on the semiconductor design business sooner than the AI or machine-learning market will. The evolution of big data and technology is much further along,” said Dean Drako, president and CEO of IC Manage.
The Envision Verification Analytics tool takes in design data from the EDA tools. The data on the design hierarchy and test cases is used to help organize the more unstructured data that the tools produce during operation.
“The reality of this is that it’s mostly unstructured data. Most of the big data [in EDA] is log files: big, long lists of what happened. You are supposed to sift through them and find what’s useful.”
In practice, the only time a log file gets much attention is when things go wrong, and a series of grep operations then tries to locate when and how. Similar to ARM’s work on a private cloud for verification data, IC Manage aims to pull much more usable information out of the data to indicate where projects may be running into problems or where there is potential for speedups.
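The grep-style sifting described above can be sketched in a few lines. This is a hypothetical example, not IC Manage's implementation: the log format, severity keywords (`UVM_ERROR`, `UVM_FATAL`), and function name are all assumptions for illustration.

```python
import re

# Hypothetical simulator-log format, e.g.:
#   "2024-03-01 12:00:05 UVM_ERROR test_fifo overflow"
LINE_RE = re.compile(
    r"^(?P<ts>\S+ \S+)\s+(?P<sev>UVM_ERROR|UVM_FATAL)\s+(?P<rest>.*)$"
)

def find_failures(log_lines):
    """Return (timestamp, message) pairs for error/fatal lines only."""
    failures = []
    for line in log_lines:
        m = LINE_RE.match(line)
        if m:
            failures.append((m.group("ts"), m.group("sev") + " " + m.group("rest")))
    return failures
```

Manual sifting like this recovers only the failure lines; the structured context (which block, which test) is exactly what it leaves behind.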
“What we have done is take the big data technology and marry that to the structured data to create a hybrid,” Drako noted.
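The hybrid Drako describes amounts to joining records parsed from unstructured logs onto the structured design hierarchy. A minimal sketch, with an assumed hierarchy schema and made-up block and test names:

```python
# Structured side (assumed schema): block -> parent subsystem,
# as it might be pulled from a design-management tool.
hierarchy = {
    "fifo_ctrl": "memory_subsys",
    "axi_bridge": "interconnect",
}

# Unstructured side: records already parsed out of raw simulator logs.
log_results = [
    {"block": "fifo_ctrl", "test": "t_overflow", "status": "FAIL"},
    {"block": "axi_bridge", "test": "t_burst", "status": "PASS"},
]

def failures_by_subsystem(hierarchy, log_results):
    """Roll parsed log records up the design hierarchy,
    counting failures per subsystem."""
    rollup = {}
    for rec in log_results:
        subsys = hierarchy.get(rec["block"], "unknown")
        if rec["status"] == "FAIL":
            rollup[subsys] = rollup.get(subsys, 0) + 1
    return rollup
```

The structured metadata is what turns a flat list of pass/fail lines into a report that can be aggregated by subsystem or team.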
To try to find more applications, IC Manage has set up its Big Data Labs. “When we developed the first big data product, we learned how to collect data from the tools. Getting the data into some kind of data warehouse that we can use is a big job. Then we said we are going to develop our second big-data product. We also said we could see this as a very repeatable model. Folks are coming to us with ideas on how to deploy this technology,” Drako said. “We can do lots of these tools using the same underlying technology. We believe it will be applicable to functional verification, physical verification, timing analytics, and power among other things.”
The analytics software generates customizable, interactive reports of verification progress. “You can drill down based on the design hierarchy,” Drako said. “Ask ‘what changed yesterday? What broke all the regression tests?’ for example.”
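The "what broke the regression tests yesterday?" query boils down to comparing two days of results. A hypothetical sketch (the result encoding is assumed, not IC Manage's):

```python
def newly_broken(yesterday, today):
    """Tests that passed yesterday but fail today, i.e. the regressions
    most likely caused by the latest changes."""
    passed_then = {t for t, s in yesterday.items() if s == "PASS"}
    failing_now = {t for t, s in today.items() if s == "FAIL"}
    return sorted(passed_then & failing_now)
```

Restricting the comparison to tests under one node of the design hierarchy gives the drill-down behaviour Drako describes.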
“In timing analysis, you will be looking at slack times. I’ve got a massive number of timing nodes to deal with, but I want to see how convergence is going. Maybe there are 100,000 blocks and half of them have converged. I can see at what rate the rest are converging. Team X may be showing a high rate but another group is moving more slowly. We’ve got a problem. Maybe they are not getting server time to run their jobs, or it may be to do with different design styles,” Drako explained.
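The convergence tracking Drako outlines can be sketched from daily snapshots of how many blocks have closed timing (positive slack). The snapshot format and function name are illustrative assumptions:

```python
def convergence_status(snapshots):
    """Given daily snapshots like {"total": N, "converged": M},
    return (fraction converged now, blocks converging per day)."""
    total = snapshots[-1]["total"]
    latest = snapshots[-1]["converged"]
    days = len(snapshots) - 1
    rate = (latest - snapshots[0]["converged"]) / days if days else 0.0
    return latest / total, rate
```

Computing this per team is what surfaces the situation in the quote: one group converging quickly, another lagging, with the gap prompting questions about server time or design style.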