Code Review

There are a number of initiatives, such as CWE, MISRA, AUTOSAR, CERT-C and various ISO standards, to improve the quality, reliability and security of software. These initiatives each specify a set of rules constraining aspects of the structure, implementation or behavior of software, and provide a checklist against which to evaluate source code for conformance with the standard.

The need for software review is also triggered by the ongoing changes made to source code throughout the software lifecycle. At specific milestones during the development process, such as before source code check-in or before release of an update to manufacturing, rigorous review of software changes is an important contributor to software quality.

Conducting a software review to determine and document standards compliance or change control can represent a significant effort, requiring resources from several parts of a software development team. Through its guided checklist reviews, Imagix 4D's Review tool reduces the overall effort in several important ways:

  1. Automates many of the steps in the review process
  2. Seamlessly integrates static analysis results with any necessary source analysis and visualization
  3. Creates documentation and an audit trail automatically as part of the process
  4. Distributes effort among reviewers and developers, producing a single, unified output
  5. Facilitates review management, partitioning tasks and tracking progress as well as results
  6. Generates intellectual property that can be reused to simplify later reviews of the same or similar software

Software reviews of standards compliance and code changes

Automated Checklists

Rules for the Common Weakness Enumeration
The Review tool uses guided checklists to automate as much of the process as possible. The checking for a given rule is broken down into a series of steps. This decomposition is designed so that 1) steps requiring human interaction are minimized, focusing on items that require the reviewer's intelligence and judgement, and 2) complementary steps can be automated.

In addition, where possible the checks use common building block steps so that work can be done once and shared across the checking of a series of rules.
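The decomposition described above can be pictured as a small pipeline: automated steps filter candidate code locations with a predicate, while manual steps are queued for a reviewer's judgement. The sketch below is purely illustrative; the `Step` and `execute_rule` names are hypothetical and do not reflect Imagix 4D's internal implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical model of one guided-checklist rule: an ordered series of
# steps, where automated steps narrow the candidate "probes" (source
# code locations) and manual steps collect what remains for inspection.
@dataclass
class Step:
    name: str
    automated: bool
    run: Optional[Callable] = None  # predicate used by automated steps

def execute_rule(steps, probes):
    """Apply automated steps in order; queue manual steps for a reviewer."""
    pending_manual = []
    for step in steps:
        if step.automated:
            probes = [p for p in probes if step.run(p)]
        else:
            pending_manual.append((step.name, list(probes)))
    return probes, pending_manual

# Example rule: automatically find calls to free(), then ask a human
# to judge whether a double free actually occurs.
steps = [
    Step("find calls to free()", True, lambda p: p["callee"] == "free"),
    Step("inspect for double free", False),
]
probes = [{"callee": "malloc", "line": 5}, {"callee": "free", "line": 10}]
remaining, manual = execute_rule(steps, probes)
```

Because the automated filtering runs first, the reviewer is only asked to inspect the locations that survive it, which is the point of minimizing the steps requiring human interaction.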
Checklists include:
  • MISRA C and C++, AUTOSAR C++, and HIS for assessing embedded, real-time, safety-critical software
  • CWE for testing compliance with Common Weakness Enumeration software security rules
  • Delta Analysis for reviewing structural changes between versions of source code
  • SARIF Import for analyzing results from external static analysis tools
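SARIF (Static Analysis Results Interchange Format) is a standardized JSON format, so importing results from an external tool amounts to walking its `runs`/`results` structure. The sketch below shows the kind of data such an import consumes; the analyzer name, file name, and rule ID are invented for illustration, and the extraction function is an assumption, not Imagix 4D's importer.

```python
import json

# Minimal SARIF 2.1.0 document, as an external analyzer might emit.
# (Hypothetical tool name, rule ID, and file path, for illustration.)
sarif_text = """
{
  "version": "2.1.0",
  "runs": [{
    "tool": {"driver": {"name": "ExampleAnalyzer"}},
    "results": [{
      "ruleId": "CWE-476",
      "message": {"text": "possible NULL pointer dereference"},
      "locations": [{
        "physicalLocation": {
          "artifactLocation": {"uri": "src/parser.c"},
          "region": {"startLine": 42}
        }
      }]
    }]
  }]
}
"""

def extract_findings(sarif_doc):
    """Flatten SARIF results into (rule, file, line, message) tuples."""
    findings = []
    for run in sarif_doc.get("runs", []):
        for result in run.get("results", []):
            loc = result["locations"][0]["physicalLocation"]
            findings.append((
                result.get("ruleId"),
                loc["artifactLocation"]["uri"],
                loc["region"]["startLine"],
                result["message"]["text"],
            ))
    return findings

findings = extract_findings(json.loads(sarif_text))
```

Each flattened finding can then be treated like any other probe in the review repository.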

Integrated Visualization

Review integrates visualization results
The software visualization and analysis central to Imagix 4D is ideal whenever static analysis needs to be supplemented with source code inspection. Early steps sometimes involve manually identifying particular portions of the software, such as resource manipulation functions. And the final step in all rule checks is inspecting the identified source code, assessing whether a violation occurs.

Supplementing this, the Review tool records the results and observations of each reviewer, incorporating those results into the review repository where they can be leveraged by downstream automated analysis.
Parts of the source code requiring manual inspection and identification include:
  • Initialization functions
  • Signal handlers
  • Command interpreters
  • Resource locking functions
  • Functions decompressing data
  • Interrupt protection schemes
  • Resource manipulation functions
  • Tasks / Threads

Audit Trail

Automated documentation of review activities and results
Throughout the review process, reviewers identify areas of the software relevant to the rule being checked, and assess potential violations. The Review tool captures the results of each process step into the review repository, creating documentation and an audit trail. The reviewer, time and any associated comments are automatically recorded.
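Conceptually, this is an append-only log: each review action becomes one timestamped record. The following is a minimal sketch of such a record, assuming a repository that stores one entry per action; the field names and `record_action` helper are hypothetical, not the tool's actual schema.

```python
from datetime import datetime, timezone

# Hypothetical audit-trail entry: who did what, to which probe, when.
def record_action(repository, reviewer, action, target, comment=""):
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "reviewer": reviewer,
        "action": action,    # e.g. "identify" or "assess"
        "target": target,    # e.g. a probe, as file:line
        "comment": comment,
    }
    repository.append(entry)
    return entry

repo = []
record_action(repo, "alice", "assess", "src/io.c:120", "false positive")
```

Because the reviewer, time, and comment are captured automatically at the moment of the action, the audit trail requires no separate documentation effort.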

While the actual review results are the primary objective of the review process, the documentation and audit trail are often required as additional deliverables.
Both user actions and automated calculations are recorded for future reference:
  • Results of static analysis runs
  • Comments from developers
  • Identification of Probes
  • File Calls
  • Assessment of Probes
Information that can be integrated:
  • Results from other tools

Shared Tasks

Partitioned reviews aid load sharing
The magnitude of a review, and the knowledge it requires, often demand that the overall review effort be distributed among multiple reviewers, with additional support provided by developers who 'own' specific portions of the source code.

The Review tool supports sharing such efforts across your review team. The ability to partition a review simplifies the assignment of individual pieces to specific reviewers. Using the repository to share and track the assessments and comments being made supports the teamwork.
Software architect (typical tasks):
  • Define the tasks and interrupt protections that exist in the code
Code reviewer:
  • Identify code that meets check criteria
  • Inspect code that implements feature
Code owner:
  • Comment on assessments by reviewer
  • Identify corrective actions
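One natural way to partition such a review is by code ownership, assigning each probe to the reviewer responsible for the directory it falls in. This is a sketch under that assumption; the `partition` function and the ownership map are illustrative, not a feature description.

```python
# Hypothetical partitioning of probes among reviewers, keyed by the
# source directory prefix that each reviewer "owns".
def partition(probes, ownership):
    """ownership maps a path prefix to the assigned reviewer."""
    assignments = {}
    for probe in probes:
        for prefix, reviewer in ownership.items():
            if probe["file"].startswith(prefix):
                assignments.setdefault(reviewer, []).append(probe)
                break
    return assignments

ownership = {"src/drivers/": "alice", "src/ui/": "bob"}
probes = [
    {"file": "src/drivers/can.c", "line": 88},
    {"file": "src/ui/menu.c", "line": 14},
]
assignments = partition(probes, ownership)
```

With assignments recorded in the shared repository, each reviewer's assessments and comments remain visible to the rest of the team.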

Review Management

Management of software review activities
Software reviews can represent a significant effort, involving multiple reviewers and technical responders from development and QA teams. Partitioning the review into multiple subreviews facilitates distributing the effort across these resources.

Progress and results can be tracked both at the level of the individual subreviews and for the review as a whole. This information can be used both to redirect review assignments and to target corrective actions in the software itself.
Progress measures:
  • Checks started
  • Checks completed
Results measures - total and by check:
  • Total probes
  • Assessed probes
  • Probes rated as concerns
  • Probes rated as violations

(checks ~= rules)
(probes ~= source code locations)
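The progress and results measures above can be rolled up from the per-probe data. Here is a minimal sketch, assuming each probe carries its check name and an assessment of `None` (unassessed), `"ok"`, `"concern"`, or `"violation"`; the `summarize` function and field names are hypothetical.

```python
# Hypothetical roll-up of review metrics, per check (i.e. per rule).
def summarize(probes):
    totals = {}
    for p in probes:
        t = totals.setdefault(p["check"], {"total": 0, "assessed": 0,
                                           "concern": 0, "violation": 0})
        t["total"] += 1
        if p["assessment"] is not None:
            t["assessed"] += 1
        if p["assessment"] in ("concern", "violation"):
            t[p["assessment"]] += 1
    return totals

probes = [
    {"check": "CWE-121", "assessment": "violation"},
    {"check": "CWE-121", "assessment": None},
    {"check": "CWE-476", "assessment": "ok"},
]
summary = summarize(probes)
```

A review manager can read progress (assessed versus total probes) and results (concerns and violations) from the same summary, per check or in aggregate.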