
Looking for help to get started with IBM DOORS NG?

IBM Rational DOORS Next Generation (DOORS NG) is a web-based requirements management solution for complex software and systems engineering environments. This playlist is for anyone who wants to learn about IBM Rational DOORS Next Generation. The topics are at a high level, providing an overview of key use cases and product features.

More information: http://goo.gl/eZ0cI
Using DOORS NG with OSLC: https://jazz.net/library/article/1197
Latest DOORS NG information: https://jazz.net/products/rational-do…
Intermediate list of videos: https://www.youtube.com/playlist?list…
Advanced/Administration list of videos: https://www.youtube.com/playlist?list…
Configuration management videos: https://www.youtube.com/playlist?list…


What is IBM ELM/ALM?

IBM® Engineering Lifecycle Management (ELM) integrates a set of products to provide a complete set of applications for software or systems development. ELM includes IBM Engineering Requirements Management DOORS® Next (DOORS Next), IBM Engineering Requirements Management DOORS (DOORS), IBM Engineering Workflow Management (EWM), IBM Engineering Test Management (ETM), IBM Engineering Systems Design Rhapsody® – Model Manager (RMM), and IBM Engineering Lifecycle Optimization – Engineering Insights (ENI), together with the Jazz™ Team Server. This solution is designed for requirements analysts, developers, systems engineers, and testers.

The following diagram shows the development lifecycle that the solutions support. To see overviews of the applications that are represented in the image, click the boxes. For example, click Validate and verify to see an overview of ETM.

This image shows the functions in the development lifecycle supported by the solutions.

To support the development lifecycle, ELM products let you link artifacts across applications, as shown in the following figure and examples:

Figure 1. ELM connects analysts, developers, and testers

This graphic shows the relationships between ELM disciplines, as described by the following examples.

Requirements:

  • Requirements are implemented by iteration plans and validated by test plans.
  • Requirements are elicited, documented, elaborated, and validated by analysts. Their implementation progress is tracked in work items, and their correctness is validated by test cases.

Implementation:

  • Project managers and development managers use iteration plans to implement requirements in the context of a development schedule.
  • Team leads plan the iterations using iteration plans, where the work is divided further into tasks.
  • Developers work on defects that are submitted by testers as a result of test execution.

Testing:

  • The test team links requirements to test plans and test cases. 
  • Testers link test cases to work items to ensure coverage of the implementation.
  • Testers run test cases and submit defects for test failures.

ELM integrates the work of analysts, developers, and testers by providing the following cross-application features (a small OSLC example follows the list):

  • Link between artifacts across applications: For example, you can link test cases to work items and requirements.
  • Hover over a link to see details about the link target: For example, testers can monitor the status of a defect that they reported to the development team.
  • Track status across projects by adding widgets from different applications to a dashboard: For example, you can add a widget that shows the defects that are blocking testers.
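
As a concrete illustration of what such a cross-application link looks like on the wire, here is a minimal Python sketch that fetches a requirement over OSLC, the open protocol behind these integrations (see the OSLC article linked earlier). The server URL, resource path, and credentials are placeholders, and the exact link properties a real server returns depend on its configuration.

```python
# Minimal sketch, assuming a hypothetical Jazz server at jazz.example.com.
# OSLC resources are RDF; link properties such as oslc_rm:validatedBy
# (test case) and oslc_rm:implementedBy (work item) appear in the response.
import requests

session = requests.Session()
session.auth = ("analyst", "secret")          # placeholder credentials
session.headers.update({
    "Accept": "application/rdf+xml",          # standard OSLC representation
    "OSLC-Core-Version": "2.0",
})

# Placeholder resource URL for a single requirement artifact.
resp = session.get("https://jazz.example.com/rm/resources/REQ-123")
resp.raise_for_status()
print(resp.text[:500])  # inspect the RDF/XML for the link targets
```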

Jazz Team Server

The Jazz Team Server provides the foundational services, such as user management and license management, that enable the ELM applications to work together as a single logical server. In this way, the Jazz Team Server serves as an integration hub for the applications. After you install the ELM products, you install product license keys into the Jazz Team Server to permit access to the capabilities provided by the applications. For details about the topologies supported for new or upgraded installations, see Planning the deployment and installation.

Products and applications

For a detailed overview of the products and applications in ELM, see the following topics:

Part of the Jazz community 

ELM products are developed transparently on the open and extensible Jazz platform. On Jazz.net, you can download the products and their milestones, track development schedules, join discussion forums, open enhancement requests, and interact with the product developers. To learn more about the products, see the developer-written articles in the Jazz.net library or the topics about complex deployment scenarios on the Deployment wiki.

More information

To learn more about ELM, see these resources: 

  • ELM on Jazz.net: Learn about the new features, read the release notes, and download the binaries to install the solution.
  • ELM videos: These videos highlight the configuration management capabilities of the solution. 
  • ELM sandbox: You can try a series of exercises in an online sandbox to learn more about a broad range of capabilities across the application development lifecycle.
  • ELM demo series: This set of recorded demonstrations offers a full lifecycle walk-through, and videos that highlight specific industry needs, in-depth tools, and practice topics.

What is Digital Twin?


What’s New in Engineering Requirements Management DOORS Family 7.0.1

Building on the themes of 7.0, our next release is just around the corner. Since 7.0.1 has come so soon after 7.0, we recommend that anyone planning to upgrade go straight to 7.0.1. This blog covers releases of all the Requirements Management tools, including:

  • IBM Engineering Requirements Management DOORS Next 7.0.1
  • IBM Engineering Requirements Management DOORS 9.7.2
  • IBM Engineering Requirements Quality Assistant (RQA)

DOORS Next 7.0.1

One of the themes of DOORS Next V7 is to extend the overall scale of data that can be managed using a DOORS Next RM server. This work has continued in V7.0.1, and we can now support up to 1,000 concurrent users working on a single RM server using an Oracle database.

When changesets are used to modify requirements, dependencies between changesets often arise: multiple people change the same requirements, or they make changes in the same module while its structure is changing. DOORS Next V7.0.1 allows these dependencies to be overruled when selecting changesets for delivery.

Trace columns can be tailored to display more succinct information, including traversable link indicators rather than verbose URLs.

ReqIF support has been improved in the way attachments and graphical elements are handled as part of requirements information. Where possible, DOORS Next now imports OLE elements from applications like DOORS directly into attachments in DOORS Next. DOORS Next also exports graphical elements, such as diagrams, in a format that can be viewed, but not edited, in other requirements tools, including DOORS.
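
Because ReqIF is a standardized XML format, its contents are easy to inspect outside the tools. The following Python sketch lists the spec objects in an export; the file name is a placeholder, and real DOORS or DOORS Next exports carry much richer data (attributes, attachments, OLE objects, diagrams).

```python
# Minimal sketch: enumerate SPEC-OBJECT entries in a ReqIF file using
# only the standard library. "export.reqif" is a placeholder file name.
import xml.etree.ElementTree as ET

REQIF_NS = "http://www.omg.org/spec/ReqIF/20110401/reqif.xsd"

tree = ET.parse("export.reqif")
for spec_object in tree.getroot().iter(f"{{{REQIF_NS}}}SPEC-OBJECT"):
    # IDENTIFIER and LAST-CHANGE are standard ReqIF attributes.
    print(spec_object.get("IDENTIFIER"), spec_object.get("LAST-CHANGE"))
```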


DOORS 9.7.2

DOORS V9.7 was introduced to enhance usability while focusing on integrating DOORS more closely with the IBM Engineering Lifecycle Management (ELM) portfolio as a whole.

  • Extended support for collaboration with ELM and Global Configurations, enabling DOORS users to see and create trace links to ELM (OSLC) data placed under configuration management.
  • A number of reporting improvements; most significant is the ability to report on requirements volatility using the Jazz Reporting System Report Builder.
  • ReqIF interaction with IBM Engineering DOORS Next has been improved to allow OLE data from DOORS to be directly sent to DOORS Next as attachments without the need for wrapping elements. It is also possible to see DOORS Next diagrams for review in DOORS.
  • The DOORS database explorer shows modules and their baselines to make it easier to open the correct version. Optionally, the module explorer can limit the display to active objects covered by an active filter.

Requirements Quality Assistant

Requirements Quality Assistant is a hosted solution with updates typically released monthly.

  • RQA can be added to existing deployments or can now be purchased as part of a dedicated DOORS Next SaaS environment, bringing the power of AI through Requirements Quality Assistant
  • Requirement managers get the full capability of DOORS Next Analyst to optimize communication and collaboration across teams
  • Requirement authors receive coaching from RQA to improve the quality of a requirement as it is being written

RQA scores requirements against criteria consistent with the INCOSE Guidelines for Writing Good Requirements.  The tool is pre-trained to detect 11 quality issues and can be extended with more through the support of IBM services.
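
RQA's actual scoring is done by trained AI models, so the following Python sketch is only a toy illustration of the idea: flag a few ambiguity patterns of the kind the INCOSE guidance warns about. The term list and weights are invented for the example.

```python
# Toy illustration only; not RQA's algorithm. Flags a few ambiguous
# phrases and returns a simple score (higher is better).
import re

AMBIGUOUS_TERMS = ["as appropriate", "etc", "user-friendly",
                   "fast", "adequate", "and/or", "tbd"]

def score_requirement(text: str) -> tuple[int, list[str]]:
    issues = [f"ambiguous term: {term!r}"
              for term in AMBIGUOUS_TERMS
              if re.search(rf"\b{re.escape(term)}\b", text, re.IGNORECASE)]
    if "shall" not in text.lower():
        issues.append("no imperative 'shall'")
    return max(0, 100 - 25 * len(issues)), issues

print(score_requirement("The system shall respond fast, etc."))
# -> (50, ["ambiguous term: 'etc'", "ambiguous term: 'fast'"])
```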


After analyzing requirements, you can see the issues found by RQA in the list of attributes in DOORS Next and DOORS. Use the issue guidance to modify requirements and reduce ambiguity. For more information, see Checking DOORS Next requirements with RQA.

In recent releases, we have refined the accuracy and scoring and you can now measure the quality of your project or module and use Dashboard views to provide insights on problem projects.

These are only a few of the improvements introduced with DOORS Next, DOORS, and RQA. A full list can be found in the product documentation for DOORS Next, DOORS, and RQA.

Thank you for your continued support and stay safe.

by Richard Watson


The Good, the Bad, and the Ugly in software and system testing

This article was written by Bartosz (Bart) Chrabski of SmarterProcess – with some minor contributions from yours truly.

You may remember an old western movie titled The Good, the Bad and the Ugly, starring Clint Eastwood. Software and system testing can sometimes feel the same way.


In today's dynamic product development arena, requirements keep changing and evolving. There is a relationship between three dimensions – namely cost, quality, and time.

Ideally, we would have sufficient budget, time, and resources – product teams would then be able to implement a high-quality product. In practice, it's rarely the case that projects have enough budget and time. Often projects run over budget and experience time constraints. As a result, testing efforts are rushed and the quality of the product suffers.

Verification and validation of products – software and hardware – are some of the most critical steps in the development process and typically consume 30% to 35% or more of the total cost and effort in most projects.

Testing is often the least planned part of the development lifecycle. This lack of rigor can lead to the delivery of lower quality products and applications, which have a negative impact on customer satisfaction.

This article outlines some of the most common challenges encountered in testing efforts and presents several recommended practices. These practices will improve the efficiency and accuracy of your testing processes.

Problems resulting from poor testing processes have many different sources, often from poor planning or execution of testing tasks.

Below are some of the causes we have observed in real world projects.

• Lack of an independent test team
Most small project teams do not have an independent test team. The people who act as testers are at the same time developers, engineers, or analysts – often focused on other responsibilities, which may lead to incomplete or ineffective testing. Depending upon your industry, regulations may require a separate dedicated testing team.

• Limited understanding of the testing process
In small projects, a project manager often also acts as the test manager and may lack the appropriate knowledge and skills to plan and execute test activities – and the responsibilities of the two roles may conflict.

• Poor test planning
Planning the testing process is rarely perfect – testing is often only done to the extent that time is available. Sometimes testing is like an exorcism designed to get rid of evil spirits in a project. Often the person responsible for planning the testing may not be familiar with the testing process and may miss important steps.

• Lack of qualified resources
It can be difficult to find staff with the motivation and skills to do the testing. Since locating the appropriate skills can be challenging, positions are often filled with inexperienced testers, which may lead to incomplete or ineffective testing.

• No test data (no data-driven testing)
In our experience, clients often underestimate the importance of good test data. Frequently, test data does not cover all the conditions that can occur in the application; in some cases, no test data is delivered at all. The result is that not all scenarios are tested, which leads to quality issues (a minimal data-driven example follows).
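
As a minimal illustration of data-driven testing (independent of any IBM tooling), here is a pytest sketch where one test body runs against a table of inputs, including boundary and invalid values. The function under test is a stand-in invented for the example.

```python
# Sketch of data-driven testing with pytest: the same test logic runs
# against a table of inputs, including boundary and invalid values that
# hand-picked happy-path data tends to miss.
import pytest

def parse_quantity(raw: str) -> int:
    """Stand-in for the code under test."""
    value = int(raw)
    if value < 0:
        raise ValueError("quantity must be non-negative")
    return value

@pytest.mark.parametrize("raw, expected", [
    ("0", 0),         # boundary: smallest valid value
    ("1", 1),         # typical value
    ("99999", 99999)  # large value
])
def test_parse_quantity_valid(raw, expected):
    assert parse_quantity(raw) == expected

@pytest.mark.parametrize("raw", ["-1", "abc", ""])  # invalid conditions
def test_parse_quantity_rejects_bad_input(raw):
    with pytest.raises(ValueError):
        parse_quantity(raw)
```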

• Lackluster test environment or product configuration
Some practitioners involved in testing underestimate the importance of getting the test environment and its configuration set up correctly. It's standard practice to treat development, test, and production systems differently – mostly because they have differing security, data, and privacy controls. Testing in production can lead to corrupted or invalid production data, leaked protected data, overloaded systems and more.

If there is a separate dedicated test environment it may not match the proposed production environment. The wider the gap between test and production, the greater the probability that the delivered product will have more defects. It is common for test teams to clone the production data and use it for testing purposes. This approach can be time-consuming, error-prone and may not meet the data protection policies.

• Poor release management
Many projects do not have a well-documented release management process for testing purposes. This lack of rigor may lead to inconsistencies. Often, we have seen situations where a patch designed to correct a problem injects new ones, which may lead the system to fail.

• Inadequate defect management
In small organizations, defects are sometimes not tracked centrally, or are tracked manually using spreadsheets or email. This approach leads to inaccuracies or to defects that are never corrected, and manual tracking imposes a considerable amount of work to maintain the process.

• No central repository of test cases
Many legacy products/systems may have been used for years, and often a test case repository is not available or maintained. Where repositories are maintained, they usually contain just the latest requested changes and not the complete functionality of the product/system. New team members will struggle to learn the full functionality and to perform tests without reference to past test cases. This can lead to incomplete and error-prone testing.

• Incomplete regression tests
When test case repositories do exist, they are often outdated or incomplete. As functionality changes, test cases should also be maintained and updated to match those changes. Regression tests that are limited to new capability result in poor test coverage and subsequently lead to new defects being injected into a partially tested product.

• Limited testing automation
Often, repetitive testing is performed manually. Automation can aid in test environment building as well as functional regression testing, load testing, coverage testing, and release management (a small fixture-based sketch follows).
We suggest you evaluate the automation available in-house and from third parties, and assess the value of using automation versus the cost and effort. Is this a product/system that will be around for a long time, so that the automation effort will continue to pay off? Or is this a short-term fix with a limited life?
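
One inexpensive form of environment automation is to build a disposable environment per test run. The sketch below uses a pytest fixture and SQLite purely as an illustration; the schema and data are invented.

```python
# Sketch of automating test environment setup with a pytest fixture: each
# test gets a fresh, isolated SQLite database instead of sharing (or worse,
# testing against) a production system.
import sqlite3
import pytest

@pytest.fixture
def test_db(tmp_path):
    """Build a disposable environment per test; tear it down afterwards."""
    db = sqlite3.connect(tmp_path / "test.db")
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
    db.execute("INSERT INTO orders (qty) VALUES (3)")
    db.commit()
    yield db
    db.close()

def test_order_count(test_db):
    (count,) = test_db.execute("SELECT COUNT(*) FROM orders").fetchone()
    assert count == 1
```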

• Lack of training
Members of the project teams may not be familiar with the tools that are available to them or may not be trained in how to use them. Consequently, although the organization has tools, they are not used. Lack of tool usage and knowledge can limit the ability to effectively track errors, supervise the entire process, or define measures and metrics to manage system development effectively.

• Lack of knowledge of available methodologies
Many project teams lack documented and understood testing methodologies and processes. Failing to follow existing methodologies and processes, or the absence of a documented approach, leads to an inefficient testing process. The adopted methodologies and methods do not always have to be written down as official documents, but they must be understood and applied in practice.

• Measures and metrics
Often, organizations collect data on ongoing projects but do not analyze the testing processes to seek improvements. For example, if poorly documented or poorly understood requirements are found to be creating testing errors, the requirements elicitation and documentation process needs to be improved. This postmortem analysis provides an opportunity for iterative improvement of the testing phase.
We recommend testing retrospectives to see what the team has learned and what can be done to improve the overall development process.

Best Practices

The following are recommended market practices. They are not applicable to every test team or every organization. Based on our practical experience, we suggest:

  • Have an independent test team, whenever possible.
  • Plan the participation of the test team at an early stage of product development. It's important to have the test team participate in the analysis stage and aid in assessing the functional and non-functional requirements in terms of their validation capabilities and the associated testing workload.
  • Define a strategy for testing the software with the customer.
  • Provide for collaboration among engineers, architects, designers, project managers, developers and testers in planning activities related to the testing process.
  • Prepare test data together with the construction of test cases. The data created should be subject to versioning and baselining.
  • When test environments use production test data, special attention should be paid to disguising or modifying the test data in order to ensure compliance with data protection legislation.
  • Create separate environments for testing and development. The test team should keep the test environment as compatible as possible with production.
  • Focus on the preparation and verification of the test environment before the start of each test phase.
  • Use tools to support configuration management, error tracking and requirements management to facilitate the work and increase its efficiency.
  • Build and maintain a test case repository that can be accessed by the project team.
  • Maintain traceability between tested functionality (requirements) and test cases in a matrix or by another method. This provides information on the functionality being retested, limiting the time and scope of testing; traceability relationships reduce the number of regression tests by tracking the relationship between requirements and test cases (a toy matrix sketch follows this list).
  • Maintain a repository for unit tests; use component simulation tools to support team sharing.
  • Use measures and metrics in the project to analyze the results. The data collected should help to improve the software development process and the efficiency of the team.
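
As a toy sketch of the traceability idea from the list above: given a requirement-to-test-case mapping, coverage gaps fall out immediately. The IDs and mapping are invented; in practice this data would come from the requirements and test management tools rather than a hard-coded dict.

```python
# Minimal sketch of a requirements-to-test-case traceability matrix.
traceability = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],               # gap: requirement with no test case
}

covered = [r for r, tcs in traceability.items() if tcs]
uncovered = [r for r, tcs in traceability.items() if not tcs]

print(f"coverage: {len(covered)}/{len(traceability)} requirements")
print("untested requirements:", ", ".join(uncovered) or "none")
```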
In our experience, a good testing process is one of the most important activities to ensure the delivery of value to both the development team and the client.

Regardless of the type of project, testing should be given special attention. Testing must be well planned, executed in a repeatable documented manner by qualified and trained people. Without this supervision it will be difficult to call tests effective.

The IBM Solution

IBM Engineering Lifecycle Management (ELM) solutions can help you avoid common software development traps.

Lack of planning, lack of metrics, lack of collaboration with stakeholders, ineffective test management, and lack of test automation all lead to problems. When we don't measure how we're doing and don't continually make improvements, the risk escalates and the project can get out of control.

IBM software test management solutions incorporate many best practices that help you avoid these common traps and enjoy the benefits. IBM Test Management is a collaborative, web-based quality management solution that offers comprehensive test planning and test asset management from requirements to defects. It enables teams to seamlessly share information, use automation to speed project schedules, and rely on metrics for informed release decisions. It is available both on premises and as a SaaS solution.

Key capabilities include:

  • Communications support – Support communication among teams that are geographically dispersed using features such as event feeds, integrated chat, review, approval and automated traceability.
  • Automation tools integration – IBM Test Manager integrates with many test automation tools, including third-party tools, homegrown scripts, and more. Execute tests with all kinds of tools and collect test results, all from a central location.
  • Advanced reporting capabilities – Address the needs and concerns of quality managers, business analysts and release management using advanced reporting capabilities – making it easier to assess readiness for delivery.
  • Comprehensive test plans – Provides test plans that clearly describe project quality goals and exit criteria, while tracking responsibilities and prioritized items for verification and validation.
  • Risk-based testing – Provides risk-based testing for prioritizing the features and functions to be tested based on importance and likelihood or impact of failure, supporting risk management best practices.
  • Requirement tools integration – Test Manager works with the IBM requirements management tools. You can link test cases to requirements and mark test cases as suspect whenever the linked requirements are modified.