Mentor Graphics has released a version of its Flotherm heat-flow analysis tool that makes it possible to build accurate models of heat flow through electronic systems, from initial concept through to prototype, without the extensive manual rework these analyses have traditionally needed.
“The issue of heat is becoming more critical in electronics,” said Keith Hanna, director of marketing and product strategy at Mentor Graphics’ mechanical analysis division, pointing to data gathered over the years from sources such as Mentor’s own PCB awards. These show that the area consumed by components has halved since 1995 while component density has risen 3.5 times.
At the same time, the use of forced-air cooling has become more problematic. Consumers do not like the noise and data centre operators do not like the air-conditioning bills. In automotive, the situation is even more serious. Hybrid and electric vehicles need high-density power electronics, but excess heat has led to a number of expensive, high-profile vehicle recalls.
Roland Feldhinkel, product line director for Mentor Graphics’ mechanical analysis division, said the problem today is that heat simulation normally gets left to the end of a project, largely because the job is hard to do earlier.
Although a part of the Flotherm XT launch is the ability to transfer data more easily between the electrical and physical domains, the key problem the company had to overcome to get to some form of pushbutton analysis lay in the computational fluid dynamics (CFD) calculations that are needed for accurate thermal analysis.
“CFD is quite complex technology and can’t be applied automatically in each stage of the design process,” said Feldhinkel.
As with Spice simulations, convergence can be troublesome in CFD, but repairing the problems is even more involved. CFD requires two, often manually intensive, stages of data preparation. One is to build a mesh that describes the physical and heat-transfer properties of the system. The second is to get a mesh that will support convergence during simulation. A third, less apparent, problem is that simulation accuracy depends on the quality of the mesh: if the resolution is too coarse at a key point, the results may miss a critical hotspot.
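The mesh-resolution point can be illustrated with a toy example (not Flotherm's solver, just a sketch): sampling a hypothetical board-temperature profile with a narrow hotspot on a coarse grid can miss the peak entirely, while a finer grid captures it.

```python
import numpy as np

# Hypothetical 1-D board temperature profile: 40 degC background plus a
# very narrow hotspot centred at x = 0.503 (say, under a small die).
# All numbers here are invented for illustration.
def temperature(x):
    return 40.0 + 60.0 * np.exp(-((x - 0.503) / 0.002) ** 2)

coarse = np.linspace(0.0, 1.0, 11)    # 11 nodes, 0.1 spacing
fine = np.linspace(0.0, 1.0, 1001)    # 1001 nodes, 0.001 spacing

# The coarse mesh's nearest node (x = 0.5) sits outside the hotspot,
# so the sampled peak badly underestimates the true 100 degC maximum.
print(f"coarse-mesh peak: {temperature(coarse).max():.1f} degC")
print(f"fine-mesh peak:   {temperature(fine).max():.1f} degC")
```

The fine grid lands a node exactly on the hotspot and reports the full peak; the coarse grid sees only the tail of the Gaussian.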
Electronics designs compound the problem through their use of many tiny components. They have the benefit of being mostly rectangular, which is good for CFD meshes, but they are being squeezed into curved packaging to make them look more attractive to consumers, a factor driven by “the Apple aesthetic”, said Hanna.
This has a knock-on effect on the componentry used on PCBs. Heatsinks are being machined to fit underneath curved surfaces and DRAM SIMMs are being tilted to reduce system height. These changes stop the components from being described as objects with edges that lie on the pure Cartesian grid fluid-flow specialists would like to use.
The result, using traditional techniques, is a much more complex mesh, because each cell must enclose a region that is either fluid or solid. Hanna explained that these approaches derive from concepts developed by Professor Brian Spalding at Imperial College, London at the beginning of CFD technology. But an alternative approach was taken by researchers in the Soviet Union, starting in the 1980s, that made greater use of empirical experiments to gauge how fluid flows around complex objects.
Feldhinkel said: “They had a tremendous lack of computer resources but practically unlimited access to all testing facilities in the Soviet Union. The CFD data was calibrated against nature from the beginning.
“The result was that the boundary layer does not need to be solved with a mesh,” Feldhinkel added.
Instead the tool can use ‘wall functions’ – formulas that take account of the cell’s resistance to transporting momentum and heat – derived from the empirical data. As it decomposes the design, the tool matches the 3D structure to a mesh of reasonably regular cells that contain different wall functions.
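Flotherm XT's own wall functions are proprietary, but the general idea can be sketched with the textbook two-layer wall function for a smooth wall: instead of meshing the boundary layer, the near-wall velocity is taken from an empirical correlation, linear in the viscous sublayer and logarithmic above it.

```python
import math

KAPPA = 0.41  # von Karman constant
B = 5.2       # log-law intercept for a smooth wall

def u_plus(y_plus):
    """Textbook two-layer wall function (illustrative, not Mentor's
    formulas): dimensionless velocity u+ as a function of dimensionless
    wall distance y+."""
    if y_plus < 11.0:
        # Viscous sublayer: u+ = y+
        return y_plus
    # Logarithmic law of the wall: u+ = ln(y+)/kappa + B
    return math.log(y_plus) / KAPPA + B

print(u_plus(5.0))    # inside the viscous sublayer
print(u_plus(100.0))  # in the log-law region
```

A cell adjacent to a wall applies a correlation of this kind to account for momentum (and, with an analogous thermal law, heat) transport, so the mesh never needs to resolve the boundary layer itself.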
“The software does not necessarily converge immediately but the tool will re-identify the solver parameters and start again if it cannot converge. It works with a knowledge base inside the software,” said Feldhinkel.
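The automatic re-run behaviour Feldhinkel describes can be pictured as a retry loop over a library of solver settings; the sketch below is purely illustrative, and the parameter names and fallback ordering are invented, not Mentor's actual knowledge base.

```python
# Hypothetical "knowledge base" of solver settings, ordered from fast
# to most robust. All names and values here are illustrative.
KNOWLEDGE_BASE = [
    {"relaxation": 0.7, "scheme": "second-order"},
    {"relaxation": 0.5, "scheme": "second-order"},
    {"relaxation": 0.3, "scheme": "first-order"},  # most conservative fallback
]

def solve_with_retries(run_solver):
    """Try each parameter set in turn until the solver converges,
    mirroring the automatic restart behaviour described above."""
    for params in KNOWLEDGE_BASE:
        if run_solver(params):
            return params
    raise RuntimeError("no parameter set converged")

# Toy stand-in for a CFD run: converges only once relaxation is low enough.
chosen = solve_with_retries(lambda p: p["relaxation"] <= 0.5)
print(chosen)
```

The point is that the user never sees the failed runs: the tool steps through progressively safer settings until one converges.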
“People can do 30 or 40 simulations a week with this approach,” Hanna claimed. “It means they can do a lot more conceptual designs.”
For initial concept designs, the tool can take fairly primitive data: a rough-and-ready layout that shows the position of critical devices such as processors and their estimated power consumption. “With a representation of the enclosure, you can do a thermal simulation,” said Feldhinkel.
As the design progresses, real PCB data can replace the space models, exchanged with Flotherm XT using the CCZ file format supported by Mentor Expedition. The company aims to add support for other tools such as Cadence Design Systems’ Allegro and Zuken’s PCB software.