Major players in embedded development tools debate some of the key issues facing them and their customers.
The embedded systems sector is one of the most dynamic markets in engineering. We corralled three senior executives from major players in the sector—specifically Mentor Graphics, MontaVista Software and Wind River—to quiz them on some of the main trends. In alphabetical order, let’s meet our contributors.
Dan Cauchy is vice president of marketing at MontaVista, and has held a number of managerial roles within the company ranging across its activities in embedded Linux, mobile and mobile Internet devices, in-vehicle infotainment (IVI) and carrier/service provider products. He is also chairman of the Carrier Grade Linux Work Group at the Linux Foundation and represents MontaVista at the SCOPE Alliance and the GENIVI Alliance. Dan holds a bachelor’s degree in electrical engineering (with a computer engineering major) from the University of Ottawa.
Tomas Evensen is vice president and general manager of the Wind River Tools product division and also the company’s chief technology officer. He joined Wind River as part of the acquisition of Integrated Systems in 2000 and before that was vice president of engineering at Diab-SDS. Tomas has more than 24 years of experience in the OS and embedded tools area and received his MSEE from the Royal Institute of Technology in Stockholm, Sweden.
Glenn Perry is general manager of the Embedded Software Division at Mentor Graphics. He has more than 20 years of experience in the electronics industry, focused on the simulation and analysis of systems and on IC design. Prior to joining Mentor, Glenn held engineering and management positions at Analogy, Harris Semiconductor, Sandia National Laboratories and the United States Air Force Weapons Laboratory. He studied electrical engineering in the USAF and at the University of New Mexico.
State of play
We began by asking our panel for their feelings on the economy for embedded systems, and the prevailing view was relatively positive. It even included the suggestion that some areas held up well during a severe recession, the strongest being perhaps those less exposed to the sudden collapse in consumer spending.
“There have been pockets of continued activity,” said Evensen. “Fortunately, Wind River has fared well despite the recession and ended 2009 with one of its best quarters in nearly two years. This is an encouraging indication of the future growth potential of the embedded market segment.”
Ongoing forecasts—particularly from VDC Research—have pointed to continuing opportunities, even as companies cut back engineering resources and delay projects.
“One interesting statistic they have is that the embedded processor market grew more than 7% in 2009,” noted Mentor’s Perry. “That was nearly twice what general purpose processors achieved. The embedded market did well during the recession relative to other markets, and it’s showing signs of recovering.”
The obvious follow-up to observations like this was to ask where some of the sweet spots—or at least more resilient ones—might be. MontaVista’s Cauchy said that three readily came to mind.
“New initiatives like the recently announced MeeGo [uniting Intel’s Moblin software platform and Nokia Maemo user interface (UI) framework], the continued adoption of [Google’s] Android [platform], and the increased use of multicore in designs will all help drive new projects this year,” he said.
Evensen cited the growing potential for multicore to take more of the embedded market, having already reached preeminence in the general purpose processing business.
“We expect 2010 to be the year of the multicore ‘hockey-stick’ effect,” he said. “The market is beyond the education and discovery stage; early adopters have jumped on the virtualization bandwagon; and those pioneers will drive the acceleration of adoption, resulting in unprecedented innovation in 2011 and beyond.”
Perry said his company is also watching these areas closely, particularly multicore and Android, but added that he could see a further overarching trend at the functional/UI level.
“The display technology that’s now mainstream in the mobile space—thanks to the iPhone or any of the Google devices—has reached high enough volumes that the price points are quite low. Meanwhile, the software behind it has become very sophisticated and enough people have been exposed to the technology that it has also raised the bar in all embedded markets that have any type of display,” he said. “Solutions that enable this adoption of advanced UIs with 3D graphics on touch displays are a tremendous growth area.”
Google’s open source platform was originally targeted at mobile phones, but its potential to broaden the Linux infrastructure—where it is occasionally described as ‘Linux without tears’—has raised expectations for Android in the broader marketplace. All the panelists noted its potential, but added that the technology is no panacea.
“One of the biggest challenges with Android is the commercialization aspect,” said Cauchy. “Like any open source project, you can’t simply download it, install it, and ship it. There is a lot of work to make Android commercially viable in a production device. This includes custom hardware support, application integration, bug fixing, as well as testing and verification.”
MontaVista has already launched an Android Commercialization Services offering that combines these types of tools and services. Such ventures are now common among the bigger embedded players. However, Perry noted that some emphasis must still be on how Android makes life easier in itself.
“The analogy I would draw comes from PC software development in the days before Windows hit the market. The effort I had to make to get from DOS to a UI was huge. And when I was done, that effort—spent on essentially a commodity technology—was totally disproportionate to what I put into the important stuff, the ‘secret sauce’. Then when you looked at my application compared to some other company’s application there was virtually no compatibility,” he said.
“Along came Windows, and suddenly there was this nice thick layer that sat on top of DOS, which obfuscated a lot of those problems and commoditized the things that probably should have been commoditized. In some ways, I view Android in the same way today on a Linux platform in that it brings a layer on top of Linux that obfuscates a lot of the complexity of Linux, has a rich UI environment, and it gives you the kind of connectivity and Internet-type capabilities people expect in an application today.”
The reality is that Android is still very much a newcomer, particularly beyond handsets. That is not bad news for the vendors. It means that there is complexity to manage and the scope to explore and promote new application areas with new products. Evensen noted as much, and he also picked out some of the differences between the markets for which the platform was designed and those use-cases being considered today.
“For example, set-top boxes (STBs) are frequently being explored as a new frontier for Android. As stationary devices for digital media, STBs require different technology from mobile phones. Technology differences range from Android’s native support for digital video standards to simply supporting different screen sizes,” he said.
“The use of Android for IVI devices also presents unique challenges. For consumer electronics, the key power management issue is saving battery life. However, IVI power management—or more accurately, power state management—defines the device’s behavior across changes in power state. If the driver unlocks the car door, often the IVI device will begin to boot—if the engine isn’t started, the power state manager defines what to do with the boot process. Similarly, if the engine is turned off, often the device is not fully powered down right away, and the power state manager defines when and how that happens. Neither situation has an equivalent in the mobile handset industry.”
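Evensen’s description of IVI power management amounts to a small state machine. The C sketch below is purely illustrative: the state and event names are our own simplification of the behavior he describes, not taken from any production IVI stack.

```c
#include <assert.h>

/* Illustrative IVI power-state machine. All names here are
 * hypothetical, invented for this sketch. */
typedef enum {
    PWR_OFF,               /* device unpowered */
    PWR_BOOTING,           /* booting before the engine runs */
    PWR_RUNNING,           /* fully up */
    PWR_SHUTDOWN_PENDING   /* engine off, grace period running */
} pwr_state_t;

typedef enum {
    EVT_DOOR_UNLOCKED,
    EVT_ENGINE_STARTED,
    EVT_ENGINE_OFF,
    EVT_SHUTDOWN_TIMEOUT
} pwr_event_t;

pwr_state_t pwr_next(pwr_state_t s, pwr_event_t e) {
    switch (s) {
    case PWR_OFF:
        /* Unlocking the door starts the boot before the engine runs. */
        return (e == EVT_DOOR_UNLOCKED) ? PWR_BOOTING : s;
    case PWR_BOOTING:
        return (e == EVT_ENGINE_STARTED) ? PWR_RUNNING : s;
    case PWR_RUNNING:
        /* Engine off does not cut power immediately. */
        return (e == EVT_ENGINE_OFF) ? PWR_SHUTDOWN_PENDING : s;
    case PWR_SHUTDOWN_PENDING:
        if (e == EVT_ENGINE_STARTED) return PWR_RUNNING;  /* driver back */
        if (e == EVT_SHUTDOWN_TIMEOUT) return PWR_OFF;    /* grace over */
        return s;
    }
    return s;
}
```

A real power state manager would also drive timers, suspend/resume hooks and peripheral power rails; the point of the sketch is that the transitions themselves, not battery life, define the design.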
There may be several reasons why the embedded systems market is only now beginning to adopt multicore processing technologies. Some point to its inherent conservatism, particularly for safety-critical applications. Others note that difficulties persist in programming for multicore platforms. And then, there is an architectural contrast where general purpose processing has followed a homogeneous model (multiple identical cores) whereas the performance optimization required in embedded systems may favor a heterogeneous model (multiple different cores ‘tuned’ for their tasks).
“At the top level, I think it splits into a couple of different use-cases and you get different problems depending on which of those you’re facing,” said Perry. “If you’re trying to use multiple cores on top of a single OS and distribute your processing needs across them, that’s a really difficult challenge.
“In networking applications, our customers are telling us they spend as much as six months optimizing from first bring-up to what they believe they can achieve in terms of performance. When you ask them what’s happening there, why is it taking so long, they say the problem is they have legacy code. The legacy code has assumptions in it that don’t inherently allow the code to use the processing power that’s available, and it’s not particularly easy to understand those in profile and understand what to change to get a truly distributed processing platform.”
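A minimal C sketch of the kind of legacy assumption Perry describes (the workload and all names here are hypothetical): the legacy worker funnels every update through one global lock, so extra cores mostly wait on it, while the refactored worker keeps per-thread partial results and merges them once at the end.

```c
#include <pthread.h>

#define NTHREADS 4
#define NITEMS   100000

/* Legacy pattern: one shared accumulator behind one global lock.
 * Correct, but every core serializes on g_lock in the hot loop. */
static pthread_mutex_t g_lock = PTHREAD_MUTEX_INITIALIZER;
static long g_total;

static void *worker_legacy(void *arg) {
    (void)arg;
    for (int i = 0; i < NITEMS; i++) {
        pthread_mutex_lock(&g_lock);
        g_total++;
        pthread_mutex_unlock(&g_lock);
    }
    return 0;
}

long run_legacy(void) {
    pthread_t t[NTHREADS];
    g_total = 0;
    for (int i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], 0, worker_legacy, 0);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], 0);
    return g_total;
}

/* Refactored pattern: no shared state in the hot loop; each thread
 * writes its own partial result, merged after join. */
static void *worker_partitioned(void *arg) {
    long local = 0;
    for (int i = 0; i < NITEMS; i++)
        local++;
    *(long *)arg = local;
    return 0;
}

long run_partitioned(void) {
    pthread_t t[NTHREADS];
    long partial[NTHREADS] = {0};
    for (int i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], 0, worker_partitioned, &partial[i]);
    long total = 0;
    for (int i = 0; i < NTHREADS; i++) {
        pthread_join(t[i], 0);
        total += partial[i];
    }
    return total;
}
```

Both versions compute the same answer; the profiling effort Perry mentions is largely about finding where a design still looks like the first version.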
Perry then described another case where there are multiple OSs running on a single platform and distributed across multiple cores.
“So, you could have several cores serving one OS and a single core serving another OS. We see opportunities with very heterogeneous, very different types of OSs in a single platform. Multicore really enables that,” he said. “Although you do have communication challenges between them, peripheral management, driver management, power management and so on. However, from our point of view, those are the kind of challenges that people will need tools and services to help them with.”
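One common building block for the inter-OS communication Perry mentions is a shared-memory ring buffer, similar in spirit to the queues used by virtio/rpmsg-style transports. The single-producer, single-consumer sketch below is a simplified illustration of the idea; a real implementation would also need memory barriers and cache handling appropriate to the hardware.

```c
#include <stdint.h>

/* Hypothetical SPSC ring buffer placed in memory shared between two
 * OSs (or cores). The producer side owns head, the consumer side
 * owns tail, so neither writes the other's index. */
#define RING_SIZE 8u   /* power of two keeps wraparound cheap */

typedef struct {
    volatile uint32_t head;       /* advanced by producer only */
    volatile uint32_t tail;       /* advanced by consumer only */
    uint32_t buf[RING_SIZE];
} ring_t;

int ring_put(ring_t *r, uint32_t v) {
    if (r->head - r->tail == RING_SIZE)
        return -1;                         /* full */
    r->buf[r->head % RING_SIZE] = v;
    r->head++;                             /* publish after the write */
    return 0;
}

int ring_get(ring_t *r, uint32_t *v) {
    if (r->head == r->tail)
        return -1;                         /* empty */
    *v = r->buf[r->tail % RING_SIZE];
    r->tail++;
    return 0;
}
```

Because each side only ever writes its own index, the two OSs need no shared lock, which is exactly why this shape is popular when the peers may not even run the same kernel.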
Evensen believes that tool vendors have also made progress more recently in giving embedded developers more realistic access to multicore technologies after a “relatively slow and steady” start.
“The technology, expertise and products have improved in the last couple of years. For example, multicore-aware tools with JTAG debugging now provide in-depth visibility into system behavior. Additionally, the development of efficient, real-time, embedded hypervisors can now provide virtualization on multicore platforms, allowing customers to create the ideal architecture for their needs,” he said.
“But there is still considerable work to be done on behalf of embedded software vendors to ease the challenges of the multicore transition. Tools can still be improved to provide more visibility to system behavior while reducing the complexity of the information so that software engineers can more easily understand it. Ultimately, vendors will need to take the time to educate the customers and shepherd them through this transition.”
Standards may also play an important role here. One initiative that has attracted much attention is the Multicore Association’s drive to promote best practices in programming for multicore platforms.
“The work done by the Multicore Association is definitely a step in the right direction. Encouraging customers to adopt open standards gives them the benefit of portability and predictable outcomes,” said Evensen. “However, much of the technical risk in migration to multicore still stems from existing source code that may not adhere to particular standards—say, POSIX—or to good concurrent programming design. In the future, however, standards will pay off significantly.”
The panelists then offered some high-level thoughts on the current trends in Linux generally. According to VDC, 18% of embedded developers are already using the OS, while 27% say that they plan to. Mentor’s Perry said that this kind of data points towards a significant shift in the user base.
“If you look at it historically, Linux has been for folks who are more accustomed to working at the bare-metal command line,” he said. “But as usage has grown and you’ve started attracting more developers who don’t come from that gene pool, I think they are going to need more help in terms of tools and services.”
MontaVista’s Cauchy sees the development model also playing a key role here, both for new and existing users.
“One of the changes we see is a move toward source-based development, away from binary distributions. This provides developers much more flexibility in the development process, allowing them to easily customize their code,” he said.
“This started with open-source tools such as bitbake and the OpenEmbedded Community. With our latest release, MontaVista Linux 6, we’ve commercialized bitbake in the MontaVista Integration Platform [MVIP] and moved to a source-based delivery model. We now deliver market-specific distributions [MSDs] designed for specific processors as source code, not binaries, allowing developers to use our distribution, plus pull in their own code and/or community code via the MVIP. Also, the MSDs are based on the semiconductor vendor’s Linux technology, so they are feature-compatible with what you receive from your board vendor, only now they are also of commercial quality with added features.
“We call this ‘Aligning the Embedded Linux Supply Chain’. It makes the transition to commercial-quality Linux easier and requires less effort and rework than using traditional commercial Linux distributions.”
Evensen’s Wind River has its own version of this kind of strategy. “Our Linux product line follows a ‘pristine-source’ approach, providing full visibility into the patches and modifications applied to any open source component of the distribution,” he said. “This enables developers to update specific patches easily, saving time and effort spent deconstructing binary and partially binary distribution solutions, and allows seamless interaction with open source developer communities.”
If there is one all-encompassing conclusion to draw from these executives’ thoughts, it is that not only is the embedded market vibrant, but it also continues to generate a need for the tools and services that these vendors provide. These days, such reasons for optimism are more than enough for many in business.