Verification and Validation: An Editorial Primer

By Jack Callahan

Better, Faster, Cheaper: Pick any two. In the face of decreasing budgets and shortened schedules, the first of these criteria is often the loser. While software quality will always be a highly desirable property, it is much more difficult to plan for and measure than dollars and milestones. It is far easier to meet cost and schedule deadlines by rushing to produce code, reducing testing, and eliminating reviews. Even in high-assurance systems, quality is under tremendous pressure despite the serious consequences of catastrophic failures. The proof is in the pudding: high rates of failure on large and small software projects, poor quality, and a general dissatisfaction with the state of the art of software development. Many solutions seek to address the quality-versus-productivity problem, but building complex, high-quality software systems takes hard work, not a silver bullet. To paraphrase Fred Brooks' ICSE '95 keynote speech, it will take many silver bullets working together to solve these problems.

Verification and validation (V&V) is a process that applies many different techniques to analyze software projects at all stages of development. V&V employs whatever techniques are cost-effective, necessary, and appropriate to provide timely analysis to the software developer and customer. In all cases, V&V seeks to work directly with the software developer to provide constructive critiques and guidance. V&V can be performed in-house on small efforts or contracted to an independent vendor on large projects. In the latter case, it is called Independent Verification and Validation (IV&V) because the reviews are performed by an organization that is technically, managerially, and financially separate from the development contractor, and therefore unbiased. In all cases, V&V allows customers and developers to better assess the risks and progress of their efforts.

In 1993, a National Research Council report to the U.S. National Aeronautics and Space Administration (NASA) [Leveson, N., An Assessment of Space Shuttle Flight Software Development, National Academy of Sciences, 1993] strongly recommended that V&V should continue to be performed on Space Shuttle flight software upgrades and called on NASA to institute V&V on other software-intensive projects. In response to this report, NASA's Office of Safety and Mission Assurance established the IV&V Facility in Fairmont, WV (currently affiliated with NASA's Ames Research Center at Moffett Field, CA). Today, IV&V efforts on the Space Shuttle, International Space Station, and many other NASA projects are conducted and managed at Fairmont.

In conjunction with the V&V analysts, a team of researchers is conducting applied research in software analysis. This work includes aspects of requirements engineering, design analysis, measurement, and testing. One important theme of the applied research in V&V is to combine existing analysis techniques and inject them into the development-V&V dialogue. For example, in one project we combined SCR-style specifications, model checking techniques, and specification-based testing to help V&V track requirements analysis and subsequent testing activities. This approach leverages existing work in specific areas while integrating it into the development lifecycle alongside other development and analysis activities.
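To give a concrete (if toy) flavor of this kind of integration, consider the Python sketch below. It is only an illustration under invented assumptions: the mode table, mode and event names, and function names are hypothetical and not drawn from any NASA project, and the actual SCR work used its own toolset and model checkers rather than code like this. The sketch checks an SCR-style mode transition table for determinism, then walks it breadth-first to derive event sequences that can serve as specification-based test cases.

    # Hypothetical sketch: an SCR-style mode transition table is checked
    # for determinism, then walked breadth-first to derive event sequences
    # that exercise each reachable transition -- simple specification-based
    # test cases. Table contents and names are invented for illustration.
    from collections import deque

    # SCR-style mode transition table: (current mode, event, next mode).
    TABLE = [
        ("Off",     "power_on",  "Standby"),
        ("Standby", "arm",       "Armed"),
        ("Standby", "power_off", "Off"),
        ("Armed",   "disarm",    "Standby"),
        ("Armed",   "power_off", "Off"),
    ]
    INITIAL = "Off"

    def check_determinism(table):
        """Flag any (mode, event) pair that enables two different transitions."""
        seen = {}
        for mode, event, target in table:
            prior = seen.setdefault((mode, event), target)
            if prior != target:
                raise ValueError(f"nondeterministic: {mode} on {event}")

    def transition_covering_tests(table, initial):
        """Return, for each reachable transition, a shortest event sequence
        from the initial mode that ends by taking that transition."""
        step = {(m, e): t for m, e, t in table}
        paths, frontier = {initial: []}, deque([initial])
        while frontier:                      # BFS over reachable modes
            mode = frontier.popleft()
            for (m, e), t in step.items():
                if m == mode and t not in paths:
                    paths[t] = paths[mode] + [e]
                    frontier.append(t)
        return [paths[m] + [e] for (m, e) in step if m in paths]

    check_determinism(TABLE)
    for case in transition_covering_tests(TABLE, INITIAL):
        print(" -> ".join(case))

Real SCR analyses go much further (completeness of condition and event tables, for example), and a model checker can verify temporal properties of the specification that no table walk will catch; the sketch only hints at how such pieces feed one another in the development-V&V dialogue.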

V&V is no panacea. It takes hard work and dedicated, trained personnel who have a critical eye for common problems in software projects. If applied early and effectively, V&V can help maintain balance in the cost-schedule-quality equation throughout the development process [SEES Report, NASA Langley Research Center, 1996]. If applied late or reluctantly (so-called "11th-hour V&V"), it can fail to have any effect and prove cost-ineffective [The Cost Effectiveness of IV&V, Technical Report, NASA Jet Propulsion Laboratory, 1985]. In most cases, we have observed that V&V serves a fundamental and inseparable role in software development by providing continual, synergistic guidance to the software developers. Indeed, we have found it difficult to measure V&V effectiveness in general due to the "Hawthorne effect" of its presence on a project.

We are planning to hold a joint V&V workshop at the International Symposium on Software Testing and Analysis (ISSTA '98) in January of 1998. We invite participants who are interested in software verification and validation, specifically in applied/empirical software research and in integrated software engineering and analysis methods. For more information, visit our WWW site or contact the NASA/WVU Software Research Lab.