What hardware verification can learn from software development

By Jim Thomas | Posted: July 10, 2015
Topics/Categories: Embedded - Integration & Debug, EDA - Verification

Jim Thomas is director of software testing at specialist test and verification company TVS. He has more than thirty years' experience in the software industry with a focus on high-integrity software development.

What can hardware designers learn from software development teams? Attendees at the June Intelligent Testing conference, held in Bristol, UK, heard about a number of software development strategies that hardware designers could adapt and adopt, including two presentations on applying agile techniques to code testing.

Keynoter Rob Lambert of NewVoiceMedia, which develops contact centre software, explained how a need to respond more quickly to changing requirements led to a shift from annual to weekly software releases. The team did this by adopting agile methodologies, prioritizing work, bringing development and operations together, instilling a culture in which ‘everyone tests’, and using data to ensure that testing is relevant.

Whatever data was available from live operations was used to direct testing. Much of the testing was automated, supplemented by manual exploratory testing with an emphasis on discovery rather than 'blind' testing. The transition to this new approach wasn't achieved overnight, but with backing from senior management and a collective desire to change among the majority of development, test and operations engineers, it succeeded. Pervasive testing is now firmly embedded in the culture.

Colin Deady of Capita spoke about using behaviour-driven development (BDD) with a zero known defect (ZKD) approach to deliver rock-solid software, without any known issues, into production environments.

The Capita team wanted to improve the quality of its software releases and reduce the duration of system testing. An agile mindset, with close collaboration between analysts, developers and test engineers, was central to this. BDD enabled the team to build up a large number of automated regression test scenarios that could be executed, under continuous integration, within five minutes.
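The article doesn't say which BDD tool Capita used, but a minimal sketch using Python's behave library (an assumption; the account scenario and figures are invented for illustration) shows the shape of such an automated scenario. The Gherkin text lives in a .feature file and each step maps to a Python function:

```python
# features/steps/account_steps.py -- a minimal BDD sketch using Python's
# behave library (assumed here; the talk did not name a specific tool).
# The matching .feature file would contain:
#
#   Scenario: Withdrawing within the balance
#     Given an account with a balance of 100
#     When the user withdraws 30
#     Then the balance is 70

from behave import given, when, then

@given("an account with a balance of {amount:d}")
def step_account(context, amount):
    context.balance = amount

@when("the user withdraws {amount:d}")
def step_withdraw(context, amount):
    context.balance -= amount

@then("the balance is {expected:d}")
def step_check(context, expected):
    assert context.balance == expected, f"expected {expected}, got {context.balance}"
```

Because scenarios like this are plain text agreed between analysts, developers and testers, large regression suites can accumulate quickly and run unattended under continuous integration.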

The team wanted to go further than just passing regression tests, moving to a ‘fix every defect’ culture that would minimize rework and instill greater pride in the quality of the software within the team. In practice this meant trying to fix every known defect by the end of each day, guaranteeing to fix every known defect by the end of the week and always before delivery, and never allowing more than three known defects in the code. The result has been much cleaner software builds, no defect triage meetings, and much improved team morale.
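The talk described a team discipline rather than a tool, but a rule like 'never more than three known defects' is straightforward to automate as a continuous-integration gate. A hypothetical sketch, where the tracker query is a stand-in for a real defect-tracker API:

```python
# A hypothetical CI gate enforcing a zero-known-defect-style threshold:
# the build fails whenever the defect tracker reports more open defects
# than the agreed limit.
import sys

MAX_KNOWN_DEFECTS = 3

def open_defect_count() -> int:
    # Stand-in for a real query against the team's defect tracker;
    # hard-coded so the sketch stays self-contained and runnable.
    return 2

def main() -> None:
    count = open_defect_count()
    if count > MAX_KNOWN_DEFECTS:
        print(f"Build blocked: {count} known defects exceeds the limit of {MAX_KNOWN_DEFECTS}")
        sys.exit(1)
    print(f"OK: {count} known defects, within the limit of {MAX_KNOWN_DEFECTS}")

if __name__ == "__main__":
    main()
```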

Returning to the embedded systems theme, Guillem Bernat of Rapita Systems explained how to achieve 100% code coverage. Structural coverage analysis (also referred to as code coverage) is an important component of critical systems development. Many standards/guidelines, such as DO-178C (in aerospace) and ISO 26262 (in automotive), recommend or mandate the use of code coverage analysis to measure the completeness of requirements-based testing strategies.

It can be difficult to achieve complete (100%) coverage in some situations: for example, when requirements or tests are missing; code is dead or deactivated; defensive programming techniques have been used; certain combinations of events are impossible; or there are compiler-introduced errors.
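To make the defensive-programming case concrete, here is a small Python sketch (my illustration, not from the talk). Run under coverage.py with `coverage run --branch`, the guard branch is reported as missed, because the caller's own validation means no test can reach it:

```python
def scale_reading(raw: int) -> float:
    """Convert a raw 10-bit sensor reading to a 0.0-1.0 fraction."""
    if raw < 0 or raw > 1023:
        # Defensive guard: read_sensor() below already rejects
        # out-of-range values, so no requirements-based test can
        # drive execution into this branch.
        raise RuntimeError(f"reading out of range: {raw}")
    return raw / 1023.0

def read_sensor(raw: int) -> float:
    # Upstream validation makes the guard above unreachable in practice.
    if not 0 <= raw <= 1023:
        raise ValueError(f"bad raw value: {raw}")
    return scale_reading(raw)
```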

It may be possible to add or fix requirements or tests, remove dead code and so on, but there are often circumstances where the only option is to justify why code can't be executed or tested. Managing these justifications across software releases is usually a manual process, separate from the tool-generated coverage results and prone to error. Rapita Systems' on-target coverage analysis tool addresses this by incorporating the justifications into a single coverage report that can show 100% 'explained' code coverage. It's a neat solution to an old problem.
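Rapita's own annotation format isn't described in the article, but open-source tools illustrate the same idea: with Python's coverage.py, a `# pragma: no cover` marker excludes a line or block from the report, so the justification for the unreachable guard in the sketch above can live beside the code itself (the review-record reference is hypothetical):

```python
def scale_reading(raw: int) -> float:
    """Convert a raw 10-bit sensor reading to a 0.0-1.0 fraction."""
    if raw < 0 or raw > 1023:  # pragma: no cover
        # Justification: defensive guard. Input is range-checked by the
        # caller before this function is invoked, so this branch cannot
        # be reached in testing (see review record DR-042, hypothetical).
        raise RuntimeError(f"reading out of range: {raw}")
    return raw / 1023.0
```

Keeping the justification next to the excluded code means it travels with the source across releases, rather than living in a separately maintained document.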

Other talks at the fourth annual Intelligent Testing conference, organised by Test and Verification Solutions in Bristol on 18 June, covered On-Target Testing in the Simulink Model-Based Design Environment, Testing of Avionics and the Security Development Lifecycle.

Copies of the slides and recordings for all the talks can be found on the TVS website at: http://www.testandverification.com/conferences/intelligent-testing/intelligent-testing-conference-2015/

Author

Jim Thomas is director of software testing at specialist test and verification company TVS. He has more than thirty years of experience in the software industry with a focus on high-integrity software development. After beginning his career as a developer, Thomas progressed into project and operational management roles where he was responsible for the successful delivery of complex software-based systems, which were typically mission- or business-critical, to both private and public sector clients across a range of sectors including aerospace and defence, transportation, finance and telecoms.
