Review Tool

There are a number of initiatives, such as CWE, MISRA, CERT-C and ISO xxx, to improve the quality, reliability and security of software. Each of these initiatives specifies a set of rules governing aspects of the structure, implementation or behavior of software, providing a checklist against which source code can be evaluated for conformance with the standard.

The Review Tool supports a guided review of your software with respect to such checklists. With the tool, you're able to methodically identify those specific portions of your software that pertain to the checks you've elected to review. Imagix 4D's general analysis and visualization functionality helps you to assess whether those specific portions are a concern or violation.

Using the Review Tool in a team environment, concerns and assessments can be recorded by the reviewers and then commented on by the software’s authors. An audit trail for documentation or submission is naturally created as part of the process.

In addition to guiding you through this process, the tool automates much of the identification and assessment activity, as well as the documentation.

Components / Terminology

The Review Tool, and this section of the user manual, use some terminology that is unique to the guided review process and not shared with other parts of Imagix 4D. These terms, described in more detail later, include:

Check A description of a style or behavior of the software that is a potential concern for the correctness, consistency, compliance, performance, security or another desired property of the software, together with an ordered set of steps that lead to the identification and assessment of the portions of your software tied to that property. There is typically a direct correspondence between a rule in a standard or guideline and a check in the Review Tool.

Checklist A set of checks corresponding to the set of rules that make up a given standard or guideline. Checklists are project-independent resources from which the specific checks used in a given project / review are selected.

Review A set of checks against which the software in an Imagix 4D project is evaluated. In addition to the check definitions from the checklist, the review includes project-specific data that is collected and recorded about each check.

Step A specific action in identifying or assessing some portions of your software. Each check is made up of an ordered list of such steps.

Probe An artifact tied to a specific portion of your software. A probe might identify a symbol, a line in a source file containing some symbol, a note, or a result of some automated analysis. A step results in a set of probes being created. Probes can also serve as input to a step, resulting in the creation of downstream probes.

Rating An assessment of whether a given probe is in conformance with the rule being evaluated. The final step in each check typically identifies the probes directly related to the rule being reviewed. It is these probes that are assigned a rating.

Note A text string associated with an item: a review, check, step or probe. Notes provide the ability to comment on each item and add to the documentation record being built up.
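The relationships among these terms can be sketched as a small data model. This is purely an illustration of the terminology above, not the Review Tool's actual internal representation; all class and field names are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, List, Optional

class Rating(Enum):
    # The three assessments a reviewer can assign to a probe
    NOT_AN_ISSUE = "not an issue"
    CONCERN = "concern"
    VIOLATION = "violation"

@dataclass
class Probe:
    """An artifact tied to a specific portion of the software."""
    target: str                      # e.g. a symbol, or "file.c:42"
    rating: Optional[Rating] = None  # assigned in the final review step
    notes: List[str] = field(default_factory=list)

@dataclass
class Step:
    """One action in a check; consumes upstream probes, produces new ones."""
    instruction: str
    action: Callable[[List[Probe]], List[Probe]]

@dataclass
class Check:
    """Typically corresponds to one rule in a standard or guideline."""
    rule_id: str                     # e.g. an identifier from the chosen checklist
    description: str
    steps: List[Step]                # ordered list of steps

@dataclass
class Review:
    """The selected checks plus the project-specific data collected about them."""
    checks: List[Check]
    notes: List[str] = field(default_factory=list)
```

Reading the model top-down mirrors the workflow: a review holds checks, a check's ordered steps create probes, and the probes reaching the final step receive ratings and notes.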

Available Checklists

A number of existing checklists are available for defining a specific review. Individual checks from across this series of checklists can be combined into a review, and can be supplemented by checks of your own design.

CWE The Common Weakness Enumeration standard contains a large set of rules for guiding review of software in order to improve its security, quality and reliability. The Imagix CWE checklist includes checks for over 200 of these rules, focusing on the rules that cannot be fully tested automatically by static analyzers and are most efficiently evaluated by static analysis augmented by source code inspection. Checklists are available for CWE 2.8 and later.

MISRA C 2012 MISRA C, from the Motor Industry Software Reliability Association, was originally developed for reviewing embedded software in automotive applications. Having proven highly effective, its use has expanded into a wide variety of safety-critical industries, including the rail, aerospace, military and medical sectors. The Imagix MISRA C 2012 checklist provides guided checklist review of the complete set of directives and rules in the current version of the standard (MISRA C:2012).

MISRA C++ 2008 Building on the widespread adoption of the MISRA C standards, the Motor Industry Software Reliability Association produced a set of guidelines for the use of C++ in critical systems, similar to those produced for C. The Imagix MISRA C++ 2008 checklist provides guided checklist review of all of the rules in the current version of the MISRA C++:2008 standard, with the exception of the rules in chapter 14.

AUTOSAR C++ 2014 Extending the MISRA C++ 2008 standard both to newer versions of C++ and to additional coding guidelines, AUTOSAR produced a set of guidelines for the use of C++, particularly in automotive applications. The Imagix AUTOSAR C++ 2014 checklist provides guided checklist review of the current version of the AUTOSAR C++14 standard.

HIS + MISRA C The Hersteller Initiative Software standard, originally developed by a consortium of German automotive manufacturers, defines a base set of metrics for the assessment of embedded, real-time, safety-critical software. The Imagix HIS + MISRA C checklist includes checks for the full set of HIS metrics: 15 metrics that apply to a software project, plus 3 metrics that compare two revisions of a software project. Of the 15 metrics, 2 are actually summaries of rule violations against a subset of MISRA C. The checklist also includes individual checks for the 96 underlying MISRA C directives and rules.

Imagix Checks Information displays in Imagix 4D present the results of a wide range of quality checks and analyses, ranging from software metrics, to checks of generally agreed-upon design and coding practices, to verification of global variable and intertask data flow. In addition to being directly displayed and explored in the GUI, these results can be imported into the Review Tool to create a review. That review can then guide a methodical examination of all results that are out of spec. Once the results are in the Review Tool, SARIF export can be used to share them with external tools.

Imagix Delta Analysis Delta Analysis identifies structural differences between versions of your software. The normal Delta Analysis reports can be used to manually guide quality assurance activities and to design testing of the changes. By using the checklist to automatically generate a review with an entry for each change, the Review Tool enables you to enhance the rigor of the whole process.

SARIF Import Through the Static Analysis Results Interchange Format, results from complementary static analysis tools can be imported into Imagix 4D. The process creates a review containing information about each defect identified by the external static analyzer. Then, leveraging Imagix 4D's unique functionality for program understanding, you can use the Review Tool to fully understand each identified defect, assess the severity and urgency of a fix, and record the review results to compare against future runs of the static analyzer.
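To make the SARIF exchange concrete, the following sketch parses the fields most relevant to a review (rule, message, source location) from a minimal SARIF 2.1.0 document. The tool name, rule id and file path are invented for illustration; this shows the general structure of the interchange format, not anything specific to Imagix 4D's importer:

```python
import json

# A minimal SARIF 2.1.0 document, as an external static analyzer
# might produce it (tool name, rule id and paths are made up).
sarif_text = """
{
  "version": "2.1.0",
  "runs": [{
    "tool": {"driver": {"name": "ExampleAnalyzer"}},
    "results": [{
      "ruleId": "EX-1001",
      "level": "warning",
      "message": {"text": "Possible buffer overflow"},
      "locations": [{
        "physicalLocation": {
          "artifactLocation": {"uri": "src/parse.c"},
          "region": {"startLine": 42}
        }
      }]
    }]
  }]
}
"""

doc = json.loads(sarif_text)
for run in doc["runs"]:
    tool = run["tool"]["driver"]["name"]
    for result in run.get("results", []):
        loc = result["locations"][0]["physicalLocation"]
        print(f'{tool}: {result["ruleId"]} at '
              f'{loc["artifactLocation"]["uri"]}:{loc["region"]["startLine"]} '
              f'- {result["message"]["text"]}')
# ExampleAnalyzer: EX-1001 at src/parse.c:42 - Possible buffer overflow
```

Each such result would become one entry in the generated review, ready to be examined, rated and annotated.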

Creating and Performing a Review

The review process starts by choosing a base checklist and then selecting the specific checks from that checklist to apply against the software in your current project. The resulting review initially consists of just the selected checks, with no probe data, ratings or notes.

The process of performing the review consists of selecting each check and performing each step. You're guided through this process in the Check display, which indicates each step and its associated action. Many of the steps can be completed automatically. For the steps that require human intelligence to study the code, instructions in the step explain what to look for, and menu items leverage the visualization engine of Imagix 4D. As each step is completed, its resulting probes are passed along to any downstream steps requiring those results.
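The flow of probes through a check's ordered steps can be pictured as a simple pipeline. This is a minimal illustration only; the probe representation and step functions are hypothetical, not the tool's actual implementation:

```python
from typing import Callable, List

# A probe is represented here as just a source location string,
# e.g. "parser.c:128"; a step is a function from probes to probes.
Step = Callable[[List[str]], List[str]]

def run_check(steps: List[Step], probes: List[str] = None) -> List[str]:
    """Run each step in order, feeding its probes to the next step."""
    probes = probes or []
    for step in steps:
        probes = step(probes)
    return probes  # probes reaching the final (review) step get rated

# Hypothetical two-step check: identify calls to malloc, then keep
# only the calls that fall within the review's scope.
find_malloc_calls = lambda _: ["alloc.c:12", "util.c:88", "test/mock.c:5"]
in_review_scope = lambda ps: [p for p in ps if not p.startswith("test/")]

print(run_check([find_malloc_calls, in_review_scope]))
# ['alloc.c:12', 'util.c:88']
```

In the tool, the first of these steps would typically be automated, while the later, judgment-based steps are where the instructions and visualization support come in.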

The final step of each check is typically a review step, where the probes directly related to the check have been identified. In the review step, you're able to assess each probe and assign a rating of whether the probe is not an issue, is a concern, or is a violation with respect to the check.

Throughout this process, artifacts are automatically recorded and stored by the Review Tool. These include the probes that are identified and the notes that are attached, along with a record of when and by whom each action was completed. Progress can be tracked in the Review widget, which presents the overall list of checks for the review along with the progress on each check. At any time throughout the process, the current review summary and/or the details of selected checks can be exported. When the SARIF (.sarif) format is selected, the resulting export contains review-wide detail.

Advanced Operation

The following topics in using the Review Tool are described in later sections:

Multi User Operation – Passing Reviews Between Participants For collaboration between the person(s) performing the review and others, such as code owners, whose comments become part of the review record, strategies for locking and releasing the review improve teamwork.

Multi User Operation – Partitioning Reviews Between Participants For medium to large projects where the review tasks are shared across multiple participants, strategies for applying the Review Tool's partitioning and combining functions lead to greater efficiency.

Managing the Evolution of Reviews The Review Tool tracks the review over time. You're able to compare the current review with previous states, or return to a previous state.

Reviewing Later Versions of Software Part of the benefit of reviewing your software with the Review Tool is the leverage it provides for reviews of subsequent versions of your software, or reviews of similar software. You’re able to carry old results forward as well as use them for comparison with new results.

Creating / Writing Your Own Checklist In addition to using checks from the checklists supplied with the Review Tool, you can define your own checks and checklists, supporting your own rules against which you'd like to review your software. See the Checklist API for details.