
How to Achieve Higher Levels of the Capability Maturity Model Integration (CMMI) Through Traceability

Sept. 20, 2023
The Capability Maturity Model Integration (CMMI) is a recognized standard for engineering best practices that reduce the risk of defects, delays, cost overruns, and recalls. Learn the best practices, how to progress, and the benefits of traceability.


The Capability Maturity Model Integration (CMMI), developed at Carnegie Mellon University’s Software Engineering Institute, is a recognized standard for engineering best practices that reduce the risk of defects, delays, cost overruns, and recalls. Organizations that choose to adopt CMMI strive to progress up the five levels in the maturity model by implementing sequentially more advanced best practices spanning the engineering development process.

Jama Software is honored to have been chosen by Carnegie Mellon as the primary tool used in its Master of Science in Software Engineering program to train the next generation of software engineering leaders in best practices for requirements management, reviews, verification, validation, and process performance management.

The CMMI defines its best practices in terms of goals, practices, and artifacts. It does not, however, address the underlying systems and data architecture required to enable those practices, deliver those artifacts, and achieve those goals. The systems-architecture reality for most engineering organizations is highly fragmented: the data needed to manage the engineering product and process (user needs, system-level requirements, approvals, component-level requirements, model designs, component requirement decompositions, interface definitions, test cases, test results, risk analyses, validations, traceability analyses, etc.) is spread across hundreds of siloed tools, spreadsheets, emails, and chat threads, with little certainty that any given piece of information reflects the latest version or that all of its interdependencies have been updated.

The main reason for this landscape of siloed tools is that each engineering discipline is empowered to choose a best-of-breed tool that optimizes engineer productivity within its team. The breadth of functionality these tools cover in total, spanning all engineering disciplines, precludes any single software vendor from providing one tool that could replace them all to the satisfaction of every engineer across disciplines.

The result is that while each discipline achieves its own objectives with its chosen best-in-class tool, the data needed to achieve CMMI goals, practices, and artifacts remains unstructured, unconnected, and unmeasurable. This poses a serious challenge for the goals, practices, and artifacts that must span multiple disciplines to control, manage, and improve the engineering process.

To advance along the maturity model, every engineering organization (regardless of size) needs a unified data model architecture and automated synchronization spanning its best-of-breed tools. Without these improvements, most engineering organizations struggle to achieve Level 2 (Managed), and those that do can only do so in a highly manual, after-the-fact manner that generally fails to deliver the desired benefits.

Let’s look at a few specific examples from CMMI that demonstrate the need for a unifying data model, along with an overview of how to achieve it. The first is a core practice from the Requirements Management process area for Level 2 (Managed): establish bidirectional traceability from high-level requirements through decomposed requirements and work products across engineering disciplines, and generate and maintain a traceability matrix. (A minimal code sketch of such a matrix follows the list below.)

There are two ways companies can approach achieving this traceability practice: after-the-fact traceability or Live Traceability.

  • After-the-fact traceability occurs after the product has been developed and is typically a highly manual effort to re-create artifacts demonstrating traceability that should have been established during development but was not. This effort is undertaken solely to comply with industry standards and to satisfy auditor requests for demonstration of process maturity.
  • Live Traceability occurs in real time as the product development process progresses. It improves overall productivity by ensuring engineers across disciplines are always working from the most recent, correct versions, and it reduces the risk of negative product outcomes (delays, defects, rework, cost overruns, recalls, etc.) through early detection of issues. The benefits of early detection are significant: research by INCOSE found that issues not found until verification and validation are 40 to 110 times more costly than if found during design. For this reason, most companies want Live Traceability but are stuck with legacy tools and spreadsheets that do not support it. Because each engineering discipline is allowed to choose its own tooling, the result is a large number of tools with no relationship rules or mechanism to create Live Traceability across them.
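To make the artifact concrete, here is a minimal sketch, in Python, of how a traceability matrix can be derived from recorded trace links. The item IDs and data structures are illustrative, not Jama Connect’s implementation.

```python
# Minimal sketch: deriving a requirements-to-test traceability matrix
# from recorded trace links. All names are illustrative.

from collections import defaultdict

# Trace links captured during development: (upstream_id, downstream_id)
trace_links = [
    ("REQ-1", "TEST-1"),
    ("REQ-1", "TEST-2"),
    ("REQ-2", "TEST-2"),
    # REQ-3 has no downstream link -- a traceability gap
]

requirements = ["REQ-1", "REQ-2", "REQ-3"]
tests = ["TEST-1", "TEST-2"]

links = defaultdict(set)
for up, down in trace_links:
    links[up].add(down)

# Print the matrix: one row per requirement, one column per test.
print(" " * 8 + "  ".join(f"{t:>7}" for t in tests))
for req in requirements:
    row = "  ".join(f"{'X' if t in links[req] else '-':>7}" for t in tests)
    print(f"{req:<8}{row}")
```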

So How Do You Achieve Live Traceability?

Step 1: Define a Traceability Model

Live Traceability requires a model of the key process elements and the relationship rules among them to monitor during the development process. Below is a sample relationship rule diagram from Jama Connect that defines a common data model spanning best-of-breed tools, enabling engineering organizations to manage traceability in real time and improve process performance. Relationship rules vary by industry and by company-specific requirements; best-practice templates are provided to comply with industry standards and are configured to meet client-specific needs. The definition of a traceability model also forms the foundation for model-based systems engineering (MBSE), since it defines model elements and their relationships to one another consistently across the entire system architecture.
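As an illustration of what such a relationship-rule model amounts to in data terms, here is a small Python sketch. The item types and rules shown are a common pattern assumed for the example, not a prescribed Jama Connect configuration.

```python
# Illustrative sketch of a traceability model: item types plus the
# relationship rules that connect them.

RELATIONSHIP_RULES = {
    # (upstream type, downstream type): relationship name
    ("user_need", "system_requirement"): "defined_by",
    ("system_requirement", "subsystem_requirement"): "decomposed_into",
    ("subsystem_requirement", "test_case"): "verified_by",
    ("system_requirement", "risk"): "mitigated_by",
}

def is_allowed(upstream_type: str, downstream_type: str) -> bool:
    """Check whether a proposed trace link conforms to the model."""
    return (upstream_type, downstream_type) in RELATIONSHIP_RULES

print(is_allowed("user_need", "system_requirement"))  # True
print(is_allowed("user_need", "test_case"))           # False: violates the model
```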

Step 2: Set Up Continuous Sync for Siloed Tools/Spreadsheets

Once the relationship rules are defined, the next step is to set up continuous sync with best-of-breed tools and spreadsheets used by the various engineering disciplines. The traceability diagram below shows a typical example of best-of-breed tools and where they sync in the Jama Connect relationship model to deliver Live Traceability.
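Conceptually, a continuous sync is a connector that pulls changed items from each tool and upserts them into the common data model. The sketch below shows the shape of such a loop; the endpoints, field names, and polling approach are hypothetical stand-ins, not a documented integration.

```python
# Hedged sketch of a continuous-sync loop. All URLs and fields are
# hypothetical placeholders for a given best-of-breed tool's API.

import time

import requests  # third-party HTTP client: pip install requests

TASK_TOOL_URL = "https://tasktool.example.com/api/stories"    # hypothetical
TRACE_STORE_URL = "https://tracestore.example.com/api/items"  # hypothetical

def sync_once(last_sync_ts: float) -> float:
    """Pull stories changed since the last sync and upsert them."""
    resp = requests.get(TASK_TOOL_URL, params={"updated_since": last_sync_ts})
    resp.raise_for_status()
    for story in resp.json():
        # Map the source tool's fields onto the traceability model's item type.
        item = {
            "external_id": story["id"],
            "type": "user_story",
            "name": story["title"],
            "upstream_requirement": story.get("requirement_id"),
        }
        requests.post(TRACE_STORE_URL, json=item).raise_for_status()
    return time.time()

# A production connector would run continuously or react to webhooks;
# a few polling iterations are enough to show the shape of the loop.
last = 0.0
for _ in range(3):
    last = sync_once(last)
    time.sleep(60)
```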

Most companies prioritize the areas of the traceability model most likely to produce costly issues in the absence of a continuous sync. Most commonly, these areas are:

  • Software task management: Directly linking requirements to the user stories that decompose them enables Live Traceability through the software development process, including testing and defect management.
  • Test automation: Test cases are managed in Jama Connect to align with requirements and ensure traceability across all engineering disciplines, with test automation results synced to the traceability model at the verification step.
  • Risk analysis (DFMEA/FMEA): Risk analysis is most often conducted in multiple Microsoft Excel spreadsheets, and the assumption has long been that Live Traceability was not possible with Excel. Jama Connect is the first requirements management solution to enable Live Traceability with Excel functions and spreadsheets. Risk teams can now work in their preferred spreadsheets and, for the first time, achieve Live Traceability to stay in sync with changes made by any engineering team.
  • Model-based systems engineering (MBSE): The first step in MBSE is to define a relationship model between all product requirements. Once a relationship model is defined, specifications can be determined through modeling. Jama Connect uniquely provides model-based requirements that sync logically with a SysML modeling tool such as No Magic’s Cameo.

Step 3: Monitor for Exceptions

Live Traceability provides the ability, for the first time, to manage by exception the end-to-end product development process across all engineering disciplines. The traceability model defines expected process behavior that can be compared to actual activity to generate exceptions. These exceptions are the early warning indicators of issues that most often lead to delays, cost overruns, rework, defects, and recalls. Below is a sample exception management dashboard in Jama Connect.
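A rough sketch of the underlying mechanic: management by exception amounts to diffing the relationships the traceability model expects against the links actually recorded. The data structures below are illustrative.

```python
# Minimal sketch of exception generation: surface items that are missing
# a relationship the traceability model says they must have.

EXPECTED = {
    # item type -> relationship every item of that type must have
    "system_requirement": "verified_by",
    "risk": "mitigated_by",
}

items = [
    {"id": "REQ-1", "type": "system_requirement", "links": {"verified_by": ["TEST-1"]}},
    {"id": "REQ-2", "type": "system_requirement", "links": {}},  # gap
    {"id": "RISK-1", "type": "risk", "links": {"mitigated_by": ["REQ-1"]}},
]

def find_exceptions(items):
    for item in items:
        required = EXPECTED.get(item["type"])
        if required and not item["links"].get(required):
            yield f"{item['id']}: missing '{required}' link"

for exception in find_exceptions(items):
    print(exception)  # -> REQ-2: missing 'verified_by' link
```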

Benefits of Live Traceability

The main benefits of Live Traceability across best-of-breed tools are as follows: 

  • Reduce the risk of delays, cost overruns, rework, defects, and recalls through early detection of issues via exception management, avoiding the 40- to 110-times cost penalty of issues identified late in the process.
  • Achieve CMMI Level 2 maturity for Requirements Management with no after-the-fact manual effort.
  • Eliminate disruption to engineering teams, which continue working in their chosen best-of-breed tools with no need to change tools, fields, values, or processes.
  • Increase productivity and satisfaction of engineers with the confidence that they are always working on the latest version, reflective of all changes and comments.

Another core goal of CMMI Level 2 is to involve stakeholders in the requirement review and approval process (see table below). Let’s examine how companies achieve this goal either through meetings or online reviews.

CMMI Level 2 (Managed) Requirements Management

There are two ways to implement this practice: meetings or online reviews. Most engineering organizations still handle stakeholder approvals through large, lengthy meetings in which all relevant engineering disciplines scroll through the requirements document for feedback. This is a highly inefficient approach that hurts engineering productivity and morale and, given the format, fails to capture relevant comments, feedback, revisions, and approvals from stakeholders. More mature engineering organizations have brought the review and approval process online to improve the quality and timeliness of feedback, capture all version and approval histories, and improve engineer productivity and morale. Let’s examine how companies have brought reviews online with Jama Connect Review Center.

Review Center allows teams to send product requirements for review, define what’s required, invite relevant stakeholders to participate, collaborate, and iterate on resolving issues and approving agreed-upon requirements. By simplifying the revision and approval process, Review Center streamlines reviews and facilitates collaboration, giving stakeholders easy access to provide feedback where required. Jama Connect enables both informal and formal online review processes to support this CMMI best practice.

Formal Reviews

The formal review process enabled by Review Center is shown below:

Review Center enables teams to define a review, invite participants, gather and incorporate feedback from relevant project stakeholders, iterate, track the review’s overall progress, and capture approval signatures if required. Reviewers can respond to a conversation that’s taking place, as well as mark items as “Approved” or “Rejected” to complete the review. Inside Review Center, reviewers can also add electronic signatures to reviews in order to comply with regulatory standards. Jama Connect captures the date and time of completed reviews for auditing, tying each signature to the document under review.
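As a rough illustration of the record-keeping this workflow implies (not Jama Connect’s actual schema), the sketch below models a review whose signatures are timestamped and bound to a document revision.

```python
# Illustrative data model for a formal review: decisions, electronic
# signatures, and timestamps tied to a specific document revision.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Signature:
    reviewer: str
    decision: str  # "Approved" or "Rejected"
    signed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class Review:
    document: str
    revision: int
    signatures: list[Signature] = field(default_factory=list)

    def sign(self, reviewer: str, decision: str) -> None:
        # Each signature is bound to this document revision and timestamped
        # so the approval history can be audited later.
        self.signatures.append(Signature(reviewer, decision))

    def is_complete(self, required: set[str]) -> bool:
        approved = {s.reviewer for s in self.signatures if s.decision == "Approved"}
        return required <= approved

review = Review(document="System Requirements Spec", revision=3)
review.sign("alice", "Approved")
review.sign("bob", "Approved")
print(review.is_complete({"alice", "bob"}))  # True
```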

Informal Reviews

Organizations that want the quality-review benefits of Jama Connect but are not bound to produce formal requirements documents may take a more iterative approach: a “rolling” review, in which the set of requirements included changes with each revision. For example, each requirement has a “state” field with values such as Draft, Ready for Review, or Approved.

On the project side of Jama Connect, requirement owners mark the requirements they feel are “Ready for Review.” Using a Jama Connect Advanced Filter, a review is then started that pulls in only the requirements marked “Ready for Review.” Moderators can also edit requirements directly in the review based on feedback from approvers. With this methodology, each review is much smaller in scope and can typically be completed faster.

On a regular cadence, the moderator reviews feedback, makes changes to requirements as necessary, and updates a requirement’s status to “Approved” once the required stakeholders have approved it. When a new revision is published, Jama Connect pulls new requirements into the review and cycles out requirements that are “Approved” (these no longer meet the filter criterion state = "Ready for Review").

This allows teams to review requirements on a regular cadence, or sprint, cycling requirements into the review when they are ready for feedback and out of the review when they are “Approved.” Almost any item of content created in Jama Connect may be sent for review, including requirements, designs, test cases, test plans, and test cycle results.
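The rolling-review mechanic reduces to a simple state filter. The sketch below illustrates it using the state values from the example above; the data structures are hypothetical.

```python
# Sketch of the "rolling review" mechanic: each revision pulls in items whose
# state matches the filter and drops items that have since been approved.

items = [
    {"id": "REQ-1", "state": "Ready for Review"},
    {"id": "REQ-2", "state": "Approved"},  # cycled out of the next revision
    {"id": "REQ-3", "state": "Draft"},     # not yet ready
    {"id": "REQ-4", "state": "Ready for Review"},
]

def next_revision_scope(items):
    """Apply the filter (state == 'Ready for Review') to build the review."""
    return [i["id"] for i in items if i["state"] == "Ready for Review"]

print(next_revision_scope(items))  # ['REQ-1', 'REQ-4']

# After the moderator approves REQ-1, republishing narrows the scope again.
items[0]["state"] = "Approved"
print(next_revision_scope(items))  # ['REQ-4']
```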

"Review Center is facilitating communication. It has ensured a shared view of the world and agreement from all stakeholders. There are no surprises anymore. Jama Connect enables us to review documents and make decisions easily with everyone coming to a shared conclusion. If we compare it to reviewing the spreadsheets and Word documents versus doing a review in Jama Connect Review Center, it’s about an 80% reduction in time, for sure." — Craig Grocott, Head of Systems Engineering.

Achieving CMMI Level 2 requires defining a development process and adhering to it. Below is a core goal for CMMI Level 2—evaluate adherence to the requirements management process.

CMMI Level 2 (Managed) Requirements Management

Achieving this goal requires the ability to decompose requirements across engineering disciplines and maintain upstream and downstream traceability as the project progresses through significant changes and rework. Without an underlying system architecture and common data model, this goal is unattainable for most organizations.

Attempts to manage through Word and Excel become unwieldy and unable to meet the requirements for Live Traceability, leading to defects, delays, cost overruns, and recalls.

Below, you can see how easy it is to manage traceability in Jama Connect and view requirements multiple levels upstream and downstream in a trace view. Jama Connect’s Traceability Model defines the data model across best-of-breed tools to capture actual behavior for traceability and management by exception.
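Under the hood, a multi-level trace view is a walk over a directed graph of trace links. The following sketch, with illustrative item IDs, shows one way to compute the upstream and downstream views.

```python
# Sketch of a multi-level trace view: given trace links as a directed graph,
# walk upstream or downstream any number of levels from a starting item.

from collections import defaultdict, deque

links = [  # (upstream, downstream)
    ("NEED-1", "REQ-1"), ("REQ-1", "REQ-1.1"), ("REQ-1", "REQ-1.2"),
    ("REQ-1.1", "TEST-1"), ("REQ-1.2", "TEST-2"),
]

downstream = defaultdict(list)
upstream = defaultdict(list)
for up, down in links:
    downstream[up].append(down)
    upstream[down].append(up)

def trace(start, graph):
    """Breadth-first walk returning (item, depth) pairs."""
    seen, out, queue = {start}, [], deque([(start, 0)])
    while queue:
        item, depth = queue.popleft()
        out.append((item, depth))
        for nxt in graph[item]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return out

print(trace("REQ-1", downstream))  # REQ-1 and everything below it
print(trace("TEST-1", upstream))   # TEST-1 back up to NEED-1
```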

Achieving CMMI Level 3 requires defining a development process and adhering to it. Below is a core goal for CMMI Level 3—establishing a verification process and adhering to it.

CMMI Level 3 (Defined) Verification

Companies are achieving this goal through Jama Connect by establishing a Traceability Model that requires test verification for requirements and managing by exception through dashboard reporting to ensure verification happens across all requirements.

Below is a sample verification dashboard used to achieve this goal, with customer-specific information redacted. Here you can see how the Verification Leader manages the function through exception management. Specific widgets on the dashboard track requirements without tests, failed tests, tests without linked requirements to verify, bugs without tests, and risks without upstream or downstream traceability.
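Each of those widgets boils down to a query over the traced data. The sketch below, using illustrative in-memory records rather than any real dashboard API, shows the logic behind three of them.

```python
# Sketch of the dashboard queries behind three of the widgets named above:
# requirements without tests, failed tests, and tests without requirements.

requirements = [{"id": "REQ-1"}, {"id": "REQ-2"}]
tests = [
    {"id": "TEST-1", "verifies": "REQ-1", "result": "pass"},
    {"id": "TEST-2", "verifies": None, "result": "fail"},  # orphaned test
]

verified = {t["verifies"] for t in tests if t["verifies"]}

widgets = {
    "requirements_without_tests": [r["id"] for r in requirements if r["id"] not in verified],
    "failed_tests": [t["id"] for t in tests if t["result"] == "fail"],
    "tests_without_requirements": [t["id"] for t in tests if t["verifies"] is None],
}

for name, hits in widgets.items():
    print(f"{name}: {hits}")
```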

The Traceability Model established in Jama Connect defines the expected behavior against which all activity can be compared to generate exceptions that can be managed through the dashboard. Without this system architecture and data model, managing by exception becomes extremely manual and productivity-killing, if not impossible.

CMMI Level 4 requires organizations to have developed predictive scores and benchmarks that enable management to identify product development risks early and remediate them at a much lower cost than if they were not identified until late in development or after product release into the market.

The table below shows the definition of this core Level 4 goal.

CMMI Level 4 (Quantitatively Managed) Process Performance

Leading companies are achieving this goal by applying Jama Software’s Traceability Score and benchmarking engineering projects internally and externally against peer companies. Jama Software is the first to measure traceability, thanks to our clients’ participation in a benchmarking dataset of over 40,000 complex product development projects spanning aerospace, automotive, consumer electronics, industrial, medical devices, semiconductors, space systems, and more. All of this is made possible by our core product, Jama Connect, which supports the largest community of engineers using requirements management SaaS (Software as a Service) in the world.

To formally measure traceability, we have established the Traceability Score. The Traceability Score measures the level of actual process adherence to the expected traceability model and can be used to compare performance across projects, teams, divisions, and companies. This score can also determine impacts on schedule, budget, cycle times, risk, and quality.

Traceability Score definition

Traceability Score = # of established relationships ÷ # of expected relationships among model elements, as specified by the project’s traceability model.

The following diagram illustrates how the calculation builds up (a small worked example in code also follows the list):

  1. At the individual requirement level, we can identify each expected relationship defined in a project’s traceability model (e.g., a user need defined by requirements, requirements further refined by sub-requirements, test cases that should verify each requirement, etc.). We can then count how many of these relationships have been established to get an individual requirement’s traceability.
  2. Going one level higher, to measure traceability within a particular element type (e.g., user needs, requirements, tests, etc.), we can sum the number of expected and established relationships across the set of items, giving us traceability at the element-type level.
  3. Finally, we can sum the number of expected and established relationships across all element types, giving us the project’s total Traceability Score.
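Here is a small worked example, assuming the score is the ratio of established to expected relationships, per the definition above; the items and counts are made up for illustration.

```python
# Worked example of the Traceability Score buildup at the item,
# element-type, and project levels.

items = [
    # each item: element type, expected relationship count, established count
    {"type": "requirement", "expected": 2, "established": 2},  # fully traced
    {"type": "requirement", "expected": 2, "established": 1},  # partial
    {"type": "test", "expected": 1, "established": 0},         # untraced
]

def score(subset):
    expected = sum(i["expected"] for i in subset)
    established = sum(i["established"] for i in subset)
    return established / expected if expected else 1.0

# Level 1: an individual item
print(f"item score: {score(items[1:2]):.0%}")             # 50%

# Level 2: one element type
reqs = [i for i in items if i["type"] == "requirement"]
print(f"requirement-type score: {score(reqs):.0%}")       # 75%

# Level 3: the whole project
print(f"project Traceability Score: {score(items):.0%}")  # 60%
```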

Correlations & Hypothesis Test Results

As a process management tool, the value of the Traceability Score is that it quantifies actual adherence to the specified approach. To determine best practices from the data, we ran statistical tests to understand how differing levels of project adherence to Live Traceability impact desired outcomes.

As we have shown, the Traceability Score measures actual adherence to the defined traceability model. The systems engineering discipline, the V-model, quality engineering, and more all rely on the intuition that this approach yields better results. Anecdotal evidence abounds to support this intuition, but until now the dataset needed to test the hypothesis statistically has been lacking.

Using our dataset, we determined that Traceability Scores exhibit statistically significant correlations with the following outcomes, rejecting the null hypothesis that these correlations were purely random.
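For readers curious what such a test looks like, here is an illustrative sketch (with made-up numbers, not the actual study data) of a nonparametric comparison between high- and low-traceability projects:

```python
# Illustrative hypothesis test: compare an outcome metric between projects
# with high vs. low Traceability Scores and test whether the difference
# could plausibly be random. The samples below are fabricated.

from scipy.stats import mannwhitneyu  # pip install scipy

# Hypothetical cycle times (days) for top- vs. bottom-quartile projects
high_trace = [12, 15, 11, 14, 13, 10, 16]
low_trace = [30, 25, 41, 28, 35, 33, 27]

# One-sided test: are high-traceability cycle times stochastically smaller?
stat, p_value = mannwhitneyu(high_trace, low_trace, alternative="less")
print(f"U={stat}, p={p_value:.4f}")
if p_value < 0.05:
    print("Reject the null hypothesis: the difference is statistically significant.")
```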

1. Faster time to market

The first three tests focus on how Traceability Scores impact cycle time. Do higher Traceability Scores lead to faster test case execution and defect identification? This is a fundamental value asserted by systems engineering and the V-model: that earlier detection of defects leads to fewer delays and a much lower cost to correct.

We measured the times listed below and observed performance improvements of 2.1x to 5.3x between top and bottom performers. Having passed both of our statistical tests, higher Traceability Scores were found to lead to faster test case execution and defect detection.

  1. Median Time to Execute Test Cases (2.6x faster)
  2. Median Time from Test Start to Defect Detection (5.3x faster)
  3. Median Time to Identify the Set of Defects (2.1x faster)

2. Higher quality

The remaining tests focus on how Traceability Scores impact quality. Do higher Traceability Scores lead to a higher-quality product? This is a fundamental value asserted by systems engineering and the V-model: that a commitment to test case creation and execution leads to a higher degree of requirement verification and product quality.

We measured the aspects of testing and verification listed below and observed performance improvements of 1.9x to 2.9x between top and bottom performers. Having passed both of our statistical tests, higher Traceability Scores led to more tests being completed and a higher percentage of passed tests.

  1. Percent of Requirements with Verification Coverage (1.9x higher)
  2. Percent of Requirements Verified (2.1x higher)
  3. Initial Test Case Failure Rate (2.4x lower)
  4. Final Test Case Failure Rate (2.9x lower)

Conclusion

As noted at the outset, the CMMI defines its best practices in terms of goals, practices, and artifacts but does not address the underlying systems and data architecture required to enable those practices, deliver those artifacts, and achieve those goals. For most engineering organizations, that architecture is highly fragmented, with the data needed to manage the engineering product and process spread across hundreds of siloed tools, spreadsheets, emails, and chat threads.

As we have shown, it is extremely challenging, if not impossible, to move up the CMMI maturity model without addressing the underlying systems architecture and data model. Carnegie Mellon has chosen our software to train its students, and leading companies have deployed Jama Connect in the ways described above to achieve their CMMI objectives.

For those interested in exploring this topic further, we encourage you to reach out and have a conversation with one of our experts.

Sources
https://www.cmmi.co.uk/cmmi/cmmi.html
https://resources.jamasoftware.com/whitepaper/requirements-traceability-benchmark