What’s more complex than a system? Well, multiple systems connected together seems like a logical next step. With the combination of technology-node reduction and new fields of application, we are starting to see the systems on chip (SoCs) of yesterday being combined into the systems of today: a collection of SoCs, each with different interfaces, coexisting to deliver a common application.
Furthermore, with the evolution of the automotive, mobile, artificial intelligence (AI) and other markets, the verification task is becoming critical and very difficult. This is particularly true for sensitive application areas like automotive, military/aerospace, healthcare and industry. The difficulty stems not only from the complexity and size of the systems, but also from the ever-present pressure of time-to-market.
What is a system-of-systems design?
Imagine several systems in the same piece of equipment –– a car, for example, is a system-of-systems design. There are multiple, highly complex and different systems talking to one another in a very entangled environment, each with a specific function that contributes to a greater whole. In addition to the traditional digital environment seen in SoCs, system-of-systems designs combine physical-world constraints like sensor inputs and mechanical interactions. A perfect example is found in the validation requirements for autonomous vehicles.
Hierarchical view of autonomous vehicle development
Developing an autonomous vehicle (AV) involves several technological domains, including mechanical engineering, sensor technologies, AI, connectivity, electrification, big data and cloud computing. Embedded software is the glue that binds these domains together.
Taking a close look, we can see that an AV design is carried out on four hierarchical levels, as shown in Figure 1.
At the bottom lies the integrated circuit (IC) level that may include one or more SoC designs. An AV SoC operates in three stages from input to output: sensing, computing and acting.
Sensing captures real-world images, videos, vehicle conditions, environmental conditions and geographical coordinates, and converts them into signal data for processing. Computing processes the signal data to produce actions that the acting stage implements.
Sensing is implemented via a large set of different types of sensors that monitor conditions such as sight, sound, temperature, humidity, pressure, wind, rain, snow, day/night, road conditions, vehicle status, and more. Computing is implemented in state-of-the-art SoCs that may or may not integrate sensors. Acting is realized via microcontrollers that carry out activities including steering, braking and accelerating.
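The three-stage flow above can be sketched as a minimal pipeline. Everything here is a hypothetical illustration (the class and function names are invented for this sketch, not part of any real AV software stack):

```python
# Minimal sketch of the sense -> compute -> act pipeline described above.
# All names (SensorReading, sense, compute, act) are hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    kind: str      # e.g. "camera", "radar", "temperature"
    value: float   # digitized signal data

def sense() -> list[SensorReading]:
    """Sensing: capture real-world conditions as signal data."""
    return [SensorReading("camera", 0.9), SensorReading("radar", 12.5)]

def compute(readings: list[SensorReading]) -> dict[str, float]:
    """Computing: turn signal data into actuation commands."""
    obstacle_close = any(r.kind == "radar" and r.value < 20.0
                         for r in readings)
    return {"brake": 1.0 if obstacle_close else 0.0,
            "throttle": 0.0 if obstacle_close else 0.3}

def act(commands: dict[str, float]) -> str:
    """Acting: microcontrollers apply steering, braking, accelerating."""
    return "braking" if commands["brake"] > 0.5 else "accelerating"

print(act(compute(sense())))  # radar reads 12.5 m -> obstacle close -> braking
```

The point of the sketch is the one-way data flow: each stage consumes only the previous stage's output, which is what lets each stage be modeled, and verified, independently.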
The next level up is the system level. It consists of multiple interconnected electronic control units (ECUs), each encapsulating a few or several ICs, that supervise vehicle operations under different driving scenarios. For example, one ECU controls the engine, while others handle the transmission, the braking system, the heating system, dashboard operations, and so on.
Above the system level sits the vehicle level, which encompasses the entire car. Here the surrounding environment affects connectivity, traffic affects what happens to the car, and vehicle performance and dynamics are driven at both the system and vehicle levels. All of these converge at the vehicle level.
At the top of the hierarchy, a connectivity or mobility level encompasses the three levels below. Connectivity could be V2V for vehicle-to-vehicle communication, V2M for vehicle-to-multivehicle, 5G for interconnection between cars, or V2C for vehicle-to-cloud, among other possibilities.
All these levels are interconnected and feed back to the IC level.
Two critical aspects of AV development are safety and security, and both must be addressed early in the design cycle. Thoroughly testing a design for safety and security requires massive validation and verification cycles because of the number and variety of sensors.
The role of AI in autonomous vehicles
Envisaging the creation of a self-driving vehicle is not a new endeavor, but dramatic advances in AI during the past few years have made it a concrete possibility.
At the core of the AV SoC lies a neural network (NN) that needs to be trained. The task is mind-boggling, since the NN must learn from a vast spectrum of driving scenarios and a massive number of variants. For example, a simple scenario occurs when a pedestrian unexpectedly crosses the road in front of an oncoming self-driving vehicle. The AV must react quickly and safely. The scenario may have many variations, including but not limited to weather conditions such as sunshine, rain, snow, fog, daylight or darkness; road conditions ranging from dry to wet and slippery; traffic conditions; vehicle conditions including speed, tires, brakes and faulty electronics; the nature of the pedestrian, who could be young and athletic or old and infirm; and many more.
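The combinatorial growth behind "a massive number of variants" is easy to make concrete with a toy count. The parameter lists below are illustrative, not a real scenario taxonomy:

```python
# Toy illustration of how variants of ONE scenario multiply.
# Parameter values are examples only, not an actual scenario taxonomy.
from itertools import product

weather    = ["sunshine", "rain", "snow", "fog", "darkness"]
road       = ["dry", "wet", "slippery"]
traffic    = ["light", "moderate", "heavy"]
vehicle    = ["nominal", "worn tires", "weak brakes", "faulty electronics"]
pedestrian = ["young and athletic", "elderly", "distracted"]

variants = list(product(weather, road, traffic, vehicle, pedestrian))
print(len(variants))  # 5 * 3 * 3 * 4 * 3 = 540
```

Five short lists already yield 540 variants of a single pedestrian-crossing scenario; a realistic taxonomy with dozens of parameters, each with many values, explodes far faster, which is why NN training for an AV is so compute-hungry.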
Training an NN is a compute-intensive task in general. For an AV, the computing power required is far higher because of the sheer number of driving scenarios.
Designing and verifying an AV chip
Designing an AV chip starts with collecting specifications –– requirements generated at the four hierarchical levels and passed down to the IC level. These requirements are voluminous and complex, making design a daunting challenge.
But if designing the chip is intimidating, verifying and validating it together with the embedded software against a comprehensive and exhaustive set of real-time scenarios is an astounding task.
Is it possible to test an AV design with real-time scenarios? Akio Toyoda, CEO of Toyota, has estimated that 14.2 billion kilometers of testing is needed, which implies that the testing should be performed on a physical test car loaded with real silicon. At a theoretical average speed of 50 km/hour, the time required would approach 300 million hours, or more than 30,000 years. Obviously, this is not possible.
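The estimate is easy to check with back-of-the-envelope arithmetic, assuming an average test speed of 50 km/h:

```python
# Back-of-the-envelope check of the road-testing estimate above,
# assuming an average speed of 50 km/h.
total_km  = 14.2e9   # 14.2 billion kilometers of testing
avg_speed = 50.0     # assumed average speed, km/h

hours = total_km / avg_speed
years = hours / (24 * 365)

print(f"{hours / 1e6:.0f} million hours")  # ~284 million hours
print(f"{years:,.0f} years")               # ~32,400 years
```

Even with a fleet of thousands of test vehicles running around the clock, the wall-clock time would still be measured in decades, hence the case for a virtual approach.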
Instead, what is needed is a high-performance verification/validation platform that operates on accurate digital models of AV chip designs exercised by a virtual environment that mimics real-time scenarios. The key here is a virtual environment made up of computer models.
That environment is what Siemens has developed under the name PAVE360. Built on the concept of a digital twin, it consists of a complete AV verification and validation environment modeled at the system level that represents a twin image of the physical vehicle and its on-the-road surroundings. The digital twin comprises digital models of the entire AV environment, including sensors, processors, actuators, ECUs, connectivity networks and driving scenarios.
Hardware emulation: the ideal AV verification and validation platform
The verification/validation technology that makes building a digital twin possible is state-of-the-art hardware emulation technology. While hardware emulation has been used in the semiconductor industry for 30 years, it has evolved and improved radically since its introduction.
Today’s best-in-class hardware emulator can verify designs of any size and complexity at megahertz speeds –– 10,000 times faster than software-based logic simulation –– with high throughput to process large volumes of incoming data in a short time, and with low latency.
The design-under-test (DUT) can be exercised by a virtual test environment connected through a broad range of interface protocols modeled via transactors, which are software models of actual hardware protocols.
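Conceptually, a transactor translates one high-level transaction from the virtual environment into the many low-level bus operations the DUT expects. A minimal sketch follows; every name in it is hypothetical, and no real vendor transactor API is implied:

```python
# Conceptual sketch of a transactor: a software model that converts
# high-level transactions into low-level bus operations for the DUT.
# All names are hypothetical; real transactors are vendor-specific.

class FakeDUT:
    """Stand-in for the emulated design: a tiny memory-mapped device."""
    def __init__(self):
        self.regs: dict[int, int] = {}

    def bus_write(self, addr: int, data: int) -> None:
        self.regs[addr] = data

    def bus_read(self, addr: int) -> int:
        return self.regs.get(addr, 0)

class SensorTransactor:
    """Bridges virtual sensor frames to the DUT's register interface."""
    FRAME_BASE = 0x1000  # hypothetical memory-mapped frame buffer

    def __init__(self, dut: FakeDUT):
        self.dut = dut

    def send_frame(self, pixels: list[int]) -> None:
        # One high-level call becomes many low-level bus writes.
        for offset, px in enumerate(pixels):
            self.dut.bus_write(self.FRAME_BASE + offset, px)

dut = FakeDUT()
SensorTransactor(dut).send_frame([10, 20, 30])
print(dut.bus_read(0x1001))  # second pixel landed at FRAME_BASE + 1 -> 20
```

The design point is the split: the test environment speaks in whole frames, while the transactor alone knows the pin- and register-level protocol, so either side can be swapped out independently.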
A modern emulator supports full DUT internal visibility for reading its activity and access for overwriting its status to perform what-if analysis and, ultimately, accelerate design debug. Embedded software can be integrated with underlying hardware and validated on the DUT mapped inside the emulator, shortening time-to-market and reducing the risk of creating faulty silicon.
By splitting the driving scenarios and running them on multiple copies of the AV digital twin, a DUT can be verified long before silicon availability, for greater efficiency. Remote, 24/7 access to the digital twin lets geographically dispersed teams collaborate on pre-silicon verification and validation. Pre-silicon testing tasks such as power analysis, security, safety and performance benchmarking can be accomplished in real time.
A virtual test environment for AV verification and validation
The key to implementing a digital twin of an AV is the ability to interface the emulator to a software, or virtual, test environment that removes all dependencies on the physical world. The setup combines an accurate register transfer level (RTL) description of the DUT, mapped inside the emulator, with sensors and actuators described in high-level languages such as C/C++. The emulator provides total visibility into the RTL activity to perform design debug and quickly zero in on design anomalies.
The virtual test environment implements driving scenarios captured and modeled through an ad-hoc software program and replayed via the sensors. It is possible to replay traffic flows and mimic vehicle interactions with other vehicles or with the surrounding environment. Actuators modeled in software are simulated via a high-level simulator fed with the responses of the DUT. The entire setup is a closed-loop environment, shown in Figure 2.
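The closed loop (replayed scenario feeds the sensors, the DUT drives the actuator simulator, and the updated vehicle state feeds back into the next scenario frame) can be sketched in a few lines. All the components below are hypothetical stand-ins for the real emulation setup:

```python
# Sketch of the closed-loop setup: scenario frames drive a stand-in DUT,
# DUT responses drive a simple actuator model, and the resulting vehicle
# state feeds back into the next frame. All components are hypothetical.

def dut_model(distance_to_obstacle: float) -> float:
    """Stand-in for the emulated DUT: returns a brake command in [0, 1]."""
    return 1.0 if distance_to_obstacle < 15.0 else 0.0

def actuator_model(speed: float, brake: float, dt: float = 0.1) -> float:
    """High-level actuator simulator: braking reduces speed."""
    decel = 8.0 * brake  # m/s^2 when fully braking
    return max(0.0, speed - decel * dt)

# Replay one scenario: pedestrian 30 m ahead, car at 14 m/s (~50 km/h).
distance, speed = 30.0, 14.0
while speed > 0.0:
    brake = dut_model(distance)           # sensors -> DUT
    speed = actuator_model(speed, brake)  # DUT -> actuators
    distance -= speed * 0.1               # environment update (feedback)

print(f"stopped with {distance:.1f} m to spare")  # car halts short of impact
```

Because every element in the loop is software, the same scenario can be replayed deterministically with the DUT's internal state fully visible, which is exactly what physical road testing cannot offer.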
Pre-silicon validation of an autonomous vehicle design is a concrete possibility today, and Siemens’ PAVE360 offers such a validation platform. Based on a virtual test environment built around a state-of-the-art hardware emulator, a digital twin of an AV and its surroundings can complete thorough verification and validation ahead of silicon, drastically reducing the need to drive a physical AV prototype for 14.2 billion kilometers.