19. Polarion for Testing and Quality Assurance

This chapter covers the Polarion features that support quality assurance and testing activities. The main features and capabilities include:

Test Management

The topics in this section explain how to find and begin using Polarion's test management and QA support features.

Test Management Projects and Templates

Polarion products with QA/testing support include one or more project templates with built-in support for creating, managing, and executing test cases. These project templates are pre-configured with Work Item types, link roles, and workflows that support quality assurance and testing. Some project templates also support defining and managing requirements, so that requirements can be easily linked to the test cases that verify them. This section briefly describes the QA and testing support provided in several project templates.

Project Templates

This section introduces the standard project templates that include quality assurance and testing support. For a complete listing of project templates with more detailed descriptions, see User Reference: Default Project Templates.

Software Specification Project

Use this template for projects that emphasize creating and managing software requirements and test case specification documents.

Software QA Project

This template supports development of software requirements, risk management, and test management specifications. A variant of this template has a standard Java project structure set up for an automated build process managed by Maven-2, enabling you to run unit tests automatically as part of the build.

Software Project

This template supports the full software application lifecycle based on a V-model process, including testing and quality assurance. A variant of this template has a standard Java project structure set up for an automated build process managed by Maven-2, enabling you to run unit tests automatically as part of the build.

Agile Software Project

This template supports the full software application lifecycle based on the SCRUM methodology, including testing and quality assurance. A variant of this template has a standard Java project structure set up for an automated build process managed by Maven-2, enabling you to run unit tests automatically as part of the build.

System QA

This template supports system engineering/mechatronics analysis, planning, development, testing, and maintenance, with an emphasis on quality assurance.

Exploring Template Capabilities

The best way to become familiar with these project templates is to work with a Polarion administrator to set up a "sandbox" folder in the repository where you can create experimental projects from the different templates with QA and test management support. (For a list of the default project templates distributed with Polarion products, see User Reference: Default Project Templates.) Depending on your administration policy, your administrator may give you permission to create projects in this folder, or may need to create your experimental projects in the sandbox for you.

The procedure for creating projects is described in the Administrator's Guide. See Creating & Managing Projects: Project Administration. In your sandbox area, try creating a project from each of the foregoing project templates, selecting the template on the relevant New Project wizard screen. Note that you must be using a license that provides the template(s) you wish to try; you can always download an evaluation copy of Polarion to try the various project templates, as the evaluation license provides all of them. Remember too that any of the "stock" project templates can be customized, or copied and the copies customized, to suit your exact needs.


Introducing the QA Project Template

A project based on the "QA Project" project template has a space named Testing (under Documents & Wiki in Navigation) which contains the following:

  • Document: Test Specification. This is a placeholder Document that you can optionally use to write up test case specifications. You can mark portions of the content as Test Case type Work Items, which are then tracked and managed with the project's workflow.

    For information on working with Documents, see User Guide: Documents and Wiki Topic.

  • Page: Test Case Traceability. This page is a comprehensive report that shows which Requirements have (and do not have) linked Test Cases.

Process Overview

There are 3 main activities involved in a "QA Project" project:

  • Creating and managing test specifications (Test Case type Work Items)

  • Creating and managing Test Runs, which log the execution of Test Cases

  • Executing Test Runs and reviewing the results

The following sections of this chapter explain each of these activities.

Creating and Managing Test Specifications

In Polarion, specifications for individual tests are broken down into "Test Case" type Work Items. Test cases may be aggregated in a Document, or created individually in the integrated tracker. With either approach, the test cases are managed with the project's workflow through each step of your process. You can import existing test cases you have written up in Microsoft Office documents, or you can author them "from scratch" using Polarion tools. Once you have Test Case type Work Items created, you can link them for traceability: to other Test Cases, to Requirement type Work Items verified by the Test Cases, to Defect type Work Items that you may create during testing, or in any other way that creates exactly the traceability you need.

Importing from Microsoft Office

If you have test cases written up in Microsoft Word or Microsoft Excel, you can import the existing documents to Polarion, creating Test Case type artifacts in projects which can then be tracked and managed with project workflows. If you import from Microsoft Word, you create a new LiveDoc Document, with Test Case type Work Items created according to rules you set up to recognize such artifacts in the original document content. If you import from Microsoft Excel, you can import either into a LiveDoc Document, if you prefer a document-based approach, or into the integrated tracker, if you prefer a tool-based approach.

For more detailed information on importing information to create Work Items, please see User Guide: Work Items Topic: Importing Work Items.

Importing Multivalued Data

You may want to specify data in an Excel worksheet to be imported into multivalued fields in Polarion. For example, the Category field can contain multiple values. Your project might also have some multivalued custom fields. If properly formatted in the Excel sheet, the Polarion importer will recognize multiple values and import them correctly. There are 2 options for formatting multiple values:

  • Specify each value in its own cell under the field's column heading, and merge the heading cell so that it spans all the value cells.

  • Specify all values in a single cell under the column for the field, separating each value with a comma character.

Figure 19.1. Multivalued Data Import

Alternatives for formatting Excel data for import to Polarion multivalued fields

Recognition of multivalued data can be toggled on or off in the import wizard using the Separate multiple values option. When this option is checked, delimited values in a mapped column are imported to Polarion as multiple values in the target field. Recognized delimiters are comma (,) and semi-colon (;) characters. Detection of merged heading cells in Excel is automatic.
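
For example, assuming a multivalued Category field, a single cell under the Category heading containing delimiter-separated values imports as three distinct values (a hypothetical sketch; the column names are your own):

    Title            | Category
    Login validation | usability, security, performance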

For reference information on supported fields, see the User Reference topic Import from Excel: Multivalued Data and Fields.

Defining Test Steps in Excel

When defining Test Cases having discrete test steps, you can format your Excel worksheet in such a way as to facilitate import into Polarion. Here are a few tips:

  • Specify a column with a name like "Title" to identify each Test Case. You can include another column such as "Description" for the general description of each Test Case.

  • Define columns corresponding to those defined in your project's Test Step Columns configuration. For example: "Step Name", "Step Instruction", "Expected Result".

  • When writing up the test steps, use one row for each step. Then, when these are complete, vertically merge the Title and Description columns so that they encompass the test step rows (see figure below).

    You may wish to leave an empty row between test cases. This can give you an additional import rule condition when importing your test cases.

Figure 19.2. Test Steps in Excel

One way to format test cases with test steps to facilitate import into Polarion.

When you import the Excel workbook, you can then map test steps to the Test Steps field of Test Cases. For more information on importing from Excel and mapping columns to fields, see the User Guide topic Importing from Excel.
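
The following sketch illustrates one hypothetical layout of the kind described above (the column names are assumed to match a default Test Step Columns configuration; "(merged)" marks cells vertically merged with the row above):

    Title    | Description        | Step Name | Step Instruction        | Expected Result
    Login OK | Verify valid login | Step 1    | Open the login page     | Login form is shown
    (merged) | (merged)           | Step 2    | Enter valid credentials | Credentials accepted
    (merged) | (merged)           | Step 3    | Click Log In            | Home page is shown

An empty row below the last step can then separate this Test Case from the next one in the worksheet, giving the importer an additional rule condition.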

The following sections discuss creating new Work Items in existing Documents and the Work Item Tracker.

Defining Test Cases in Documents

Polarion's LiveDoc Documents ("Documents") are often preferred for defining test cases by teams who are accustomed to working with document-based test case specifications. Content in Documents can be marked as Test Case type Work Items, which can be tracked and managed with the project workflow yet still appear as content in a document.

When you create a Document for specifying test cases, be sure to select Test Case as the Work Item type the new Document should contain. (If you accidentally select a different type, or you want to have multiple Work Item types in the Document, you can configure the Document after it is created.)

To learn all about how to work with Documents, please go through the User Guide chapter: Documents and Wiki Topic. You can also find several video tutorials on working with Documents on the Polarion Software web site.

TIP

Projects can be configured by an administrator to include a table of manual test steps in Test Case type Work Items. When Test Steps support is enabled, discrete test steps can be defined in this special Test Steps field. The results of individual test steps can then be logged when testers execute manual tests.

When configured this way, the Document's Work Item presentation will display the Test Steps table in Work Items. Rows can be added to the table using the Add Row button, which appears when the cursor is located inside the test steps table when the Description field is in Edit mode.

If you do not see the Test Steps in Test Case items in a Document, it is possible that another user changed the Work Item presentation to some setting that does not display test steps. To show the Test Steps table in Document-based Work Items, you can set the Document's Work Item Presentation. Click the drop-down control beside the Work Item icon in the Document Editor toolbar, and on the menu choose Configure. In the dialog, select one of the options that includes Test Steps.

For more information, see Executing a Test Run and the Administrator's Guide topic Enabling Manual Test Steps.

Exporting Test Documents to Word

You can use the Round-trip for Microsoft Word feature to export test specification Documents to Word for review, comment, and collaboration. For more information, see the User Guide topic: Sharing Documents Using Word Round-trip.

You should note the following when exporting test specification Documents containing Test Steps in Test Cases:

  • A Document that is configured to show Title, Description and Test Steps, or Title and Test Steps, can be exported for Review, Prioritization, or Collaboration.

  • When exported for Review or Prioritization, the Test Steps can be commented. Comments will be written to the Description field upon re-import to Polarion.

  • The content of test steps cannot be edited in the exported file.

Defining Test Cases in the Tracker

If your team prefers a tool-based approach to defining test cases, you can create Test Case type Work Items directly in the integrated Work Item tracker. If your project is based on one of the standard Polarion project templates with QA/testing support, the Test Case type is pre-configured. If your project's configuration has been customized, the Work Item type corresponding to a test case may have some different name, which you should substitute for "Test Case" in the procedure that follows.

To define a new Test Case in the tracker:

  1. Log in and open the relevant project. Your user permissions for the project must allow you to create new Work Items.

  2. In Navigation, select the Work Items topic.

  3. On the toolbar of the Work Items page, click the Create icon and choose Test Case on the menu. A new empty Work Item form appears in the viewer/editor pane of the page.

    (You cannot create new Work Items in the Matrix and Time Sheet views of the Work Items topic.)

  4. Fill in the fields, making sure to supply a value for all required fields. Optionally create links to other Work Items, specify hyperlinks to relevant resources, add attachments, etc.

  5. When finished, click the Create button on the viewer/editor toolbar to create the new Test Case item.

TIP

Projects can be configured by an administrator to include a table of manual test steps in the Description field of Test Case items. When Test Steps support is enabled, the results of individual test steps can be logged for manual tests. If there are multiple tables in the Description, the table of test steps must be the first table in the field. Rows can be added to the table using the Add Row button, which appears when the cursor is located inside the test steps table when the Description field is in Edit mode. For more information, see Executing a Test Run and the Administrator's Guide topic Enabling Manual Test Steps.

Linking Test Cases

In the default configuration of projects based on a Polarion project template that provides QA/testing support, Test Cases can be linked to each other and to other Work Item types such as Defects or Requirements/User Stories. Linking is vital for future traceability and impact analysis. You can link items to others in the same project or in a different project. (In systems configured for multiple concurrent Polarion servers, it is possible to link to items residing on a different server, but such linking is not as robust as linking items within or across projects residing on the same server.)

Typical links in QA/testing projects include:

  • Linking Test Case items to Requirement items with a "verifies/is-verified-by" link role (relationship).

  • Linking Defect items to Test Case items with an "is-triggered-by/triggers" link role (relationship). When using the test execution features of Polarion's QA/testing project templates, Defect items with this link role are created automatically upon failure of tests.

Linking of Work Items is discussed in more detail in User Guide: Managing Work Items: Linking Work Items.

Specifying Manual Test Steps

You can optionally define discrete Test Steps in Test Cases. Your project administrator can enable this functionality in the project configuration. (See the Administrator's Guide topic: Enabling Manual Test Steps.)

When correctly configured, a table of test steps appears in the Custom Fields section of new Test Case items (unless the administrator opted to display the table separately). Testers can specify each test step as a row in the Test Steps table. The table is rendered as executable test steps in a panel in the Execute Test Case section of the Test Case Work Item when the Test Case is executed during a Test Run.

Figure 19.3. Test Case with Test Steps Configured

Testers can log results of individual test steps

The advantage of this approach is threefold:

  • Any failure can be linked and easily traced to the test step in which it occurred.

  • Test Steps data is separated from the Description field, which can be then reserved for general information about the Test Case.

  • If a Test Run is exported to Excel for manual execution of tests offline, only test step and results data are involved in the round-trip. Other Test Case data, such as the Description, cannot be modified by external testers.

Editing Test Steps in Documents

When Test Steps are enabled in your project, they are a field of your Test Case type Work Items, so the steps can be edited in Test Cases defined in Documents. You need to set the Presentation Configuration in the Document, selecting either Title and Test Steps or Title, Description and Test Steps in the Content column of the Configure Work Item Presentation dialog (see User Guide topic Multiple Work Item Types for more information on this dialog).

Creating and Managing Test Runs

Test Runs log instances of the execution of a defined set of Test Cases. Test Runs may log manual tests or automated execution of unit tests run as part of a build. A Test Run also contains information about its status, the results of the instance of test execution it represents, the test environment, and the build tested.

An individual Test Run is based on a Test Run Template. Polarion products that support quality assurance and testing come with pre-defined templates which can be customized or used as the basis for custom Test Run Templates.

Creating a Test Run

To create a new Test Run in your project:

  1. In Navigation, click on Test Runs. The Test Runs page loads displaying a table of existing Test Runs.

  2. In the toolbar of the Test Runs table, click the Create button to launch the Create Test Run dialog.

  3. In the ID field, enter a unique identifier. (The value you enter must not be used by another Test Run in the project.)

  4. In the Template field, select the desired Test Run template from the list of available templates.

  5. Click the Create Test Run button to complete the procedure.

Selecting (Planning) Test Cases

In a Test Run based on a Test Run Template that has been configured for manual selection of Test Cases (see Specifying Test Case Selection), you must manually select the Test Cases before executing the Test Run. You can either select the test cases in the Work Item Tracker, or in a Document.

Tracker Approach

To begin the selection operation:

  1. Open your project if not already open.

  2. In Navigation, select Test Runs, and in the upper pane of the Test Runs page, select the Test Run for which you want to specify Test Cases.

  3. In the Test Run detail pane, click Actions and choose Select Test Cases.

This action opens the Table view of the Work Items topic and displays a special Test Run Planning panel.

Figure 19.4. Select Test Cases for a Test Run

Manual selection of Test Cases for a Test Run - add or remove Test Cases

You can browse the Test Cases in the view and select those that should be executed when the Test Run is executed. Your selections are added to the count of waiting Test Cases for the Test Run you are populating. Items in the table that are currently added to the waiting Test Cases show a colored highlight at their left border (see figure above).

By default, all Work Items of all types are shown in the view. You can use a query to reduce the number of items displayed. For example, you might query for items of type Test Case, having a given Severity and/or Status.
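
As a hypothetical illustration of such a query (the field values shown depend on your project's configuration):

    type:testcase AND severity:smoke AND status:open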

To add a Test Case to the queue of waiting Test Cases, click the Plus (+) icon in its row in the table. To remove an added Test Case, click the Minus (-) icon that appears in its row (again, refer to the previous figure).

When you have added all Test Cases that should be executed when the Test Run is executed, click the Save Test Run button at the bottom of the Test Run panel. You can optionally close the panel using the Close icon on the panel header.

If you want to execute the Test Run you just populated, select the Test Runs topic in Navigation, select the Test Run in the list of Test Runs, and click the Execute Test button.

TIPS

  • Click the Test Run ID shown in the Test Run panel to navigate to the Test Run.

  • In the Test Run panel, click Waiting to filter the table and show just the waiting Test Cases. Click Executed to filter the table and show only the executed Test Cases.

  • You can select multiple items in the table (use the check box on each row) and add or remove them from the Waiting list all at once using the respective buttons in the Test Run panel.

Document Approach

To begin the selection operation:

  1. Open your project if not already open.

  2. In Navigation, select Test Runs, and in the upper pane of the Test Runs page, select the Test Run for which you want to specify Test Cases.

  3. In the Test Run detail pane, click Actions and choose Select Test Cases.

    The Table view of the Work Items topic opens and displays a special Test Run panel.

  4. Go to Navigation and select the Document containing the Test Cases to be added to the Test Run. The Test Run panel will remain open in the Document sidebar when the Document opens.

You are now ready to add Test Cases from the Document to the Test Run. Please refer to the following figure:

Figure 19.5. Select Test Cases in a Document

You can optionally populate a Test Run with selected Test Cases contained in a Document

  • To add a Test Case, select it by clicking anywhere in its content (selection is indicated by the gold left border), then either click the Plus (+) icon in the left margin, or the Add button in the Test Run panel.

    The count of waiting Test Cases is incremented in the Test Run panel.

  • To add multiple Test Cases, select all of them (Shift + Click in most browsers), then click the Add button in the Test Run panel.

    The count of waiting Test Cases is incremented in the Test Run panel by the number of selected items. Note that multiple items must be contiguous in order to be selected at once.

  • To remove a previously added item from the queue of waiting Test Cases, select the Test Case in the Document body, and click either the Minus (-) icon in the left margin, or the Remove button in the Test Run panel.

  • After adding all the Test Cases you want to be tested in the Test Run, click the Save Test Run button in the Test Run panel.

    Testers can now execute the Test Run.

TIP

In large Documents with many Test Cases, you can invoke Operations & Filter and create a query to show only a subset of the Test Cases contained in the Document.

For example, a query like severity:smoke would limit the display of Test Cases to only those having the specified Severity level for a Smoke Test type of test.

Editing an Existing Test Run

After creating a new Test Run (or at some other time, if information changes), you may want to edit Test Run details such as the test environment. After selecting the Test Run you want to edit:

  • Click the Properties button to edit the Test Run properties.

  • Click the Actions button and choose Customize Test Run Page to edit the wiki content of the Test Run.

Adding Attachments to a Test Run

You can attach any type of file to a Test Run, either while in the process of creating it, or later on when setting the result. For example, you might attach a file containing test result output from some external testing tool. To add or manage attachments, click the Attachments icon in the Test Run toolbar to scroll the page to the Attachments field.

If your system is configured to support search on attachment properties and content, you can search on properties of Test Run attachments. In the Test Runs topic, you can search on attachment title, attachment author ID or name, attachment file name, content, date updated, and size.

In site search (in Navigation), you can search attachments to Test Runs via full-text search.

Executing a Test Run

Automated tests are run by the Polarion build tool according to the build configuration, so you do not need to explicitly invoke Test Run execution. This section discusses manual execution of Test Runs in Polarion. If manual testing will be done externally to Polarion, see Executing Test Runs Externally in this chapter.

As previously mentioned, a Test Run represents an instance of executing a set of test cases, which have been defined in Test Case type Work Items. At some point after creating a Test Run, a tester manually performs the testing steps defined in the test cases of the Test Run. By invoking Execute Tests on a Test Run before manually executing the test cases, the tester sets the stage for logging the results of the testing and creating records of testing activity.

Even if you use some external tool to execute test cases, testers can still use a Test Run to log the results of the external testing so that the history and traceability are maintained in Polarion.

To execute a manual testing Test Run:

  1. Open the Test Run's page by clicking Test Runs in Navigation and selecting the target Test Run in the table of Test Runs. If there are many Test Runs, you can use the filter control in the pane to zero in on the one you want.

  2. Click the Execute Test button. An instance of the tracker opens in a new browser page or tab, with the Table view current, and the Test Case type Work Items of the Test Run listed in the table.

    Figure 19.6. Launching a manual test

    Launch manual test execution on the Test Run page

  3. Select the first Test Case in the table. Its detail loads in the lower half of the page.

  4. Scroll down to the Execute Test section. If your project is configured for Test Steps support, this panel contains individual test steps from the test steps table in the Description field, and you will be able to log the results of each test step:

    Figure 19.7. Test Case with Test Steps

    Log results of individual test steps and overall Test Case result

    If your project is not configured for test steps, then the panel contains only the Test Case Verdict field and buttons to mark the overall Test Case result. You can also add attachments to the Test Case Verdict using the Add Attachment icon in the panel.

    Figure 19.8. Test Case Without Steps

    Log only result for Test Case as a whole

If the Execute Test panel does not have individual test steps:

  1. Click Start Test Case.

  2. Perform all the test steps to determine the result.

  3. Optionally enter a comment in the Test Case Verdict field and click on the button marking the overall testing result: Passed, Failed, or Blocked. If there are untested Test Cases in the table, and the Skip to the next Test Case after execution option is selected, the next Test Case in the Test Run is automatically selected and ready for testing to begin on that Test Case. As long as the option is checked, this will happen until all Test Cases in the Test Run have been executed.

If the Execute Test panel does have individual test steps:

  1. Click Start Test Case.

  2. Execute the first test step, note the result of the step in the Actual Result field, and click the button that indicates the result of the step: Passed, Failed, or Blocked. Note that you can select from a list of previous results for the current step by clicking Recent.

    You can optionally attach a file (a screenshot image, for example) to the test step. An Add Attachment icon appears when you click in the Actual Result field. For more information, see the User Reference topic: Test Step Attachments.

  3. Repeat the above for all test steps listed in the section.

  4. Optionally enter a comment in the Test Case Verdict field and click on the button marking the overall testing result: Passed, Failed, or Blocked.

    If there are untested Test Cases in the table, and the Skip to the next Test Case after execution option is selected, the next Test Case in the Test Run is automatically selected and ready for testing to begin on that Test Case. As long as the option is checked, this will happen until all Test Cases in the Test Run have been executed.

    Note that it is not required to mark results for any of the test steps. You can mark all, some, or none. The Actual Result of every step is written to the test record even if the test step has no Status.

Deleting Old Test Runs Manually

At some point you may decide you don't need to preserve all the Test Runs that have been executed. This is especially likely if you have integrated automated test execution with Polarion's build feature, creating new Test Runs with every build. You can delete one or more Test Runs from the Test Runs topic. You must have permission for this action; otherwise, you will see a message and the delete operation will be canceled.

To delete one or more Test Run(s):

  1. Select the Test Runs topic in Navigation.

  2. In the table of existing Test Runs, select the Test Run(s) you want to delete. To select multiple Test Runs for deletion, check the box on the row of each Test Run.

  3. If you are deleting a single Test Run, go to the lower pane, click Actions and choose Delete.

    If you selected multiple Test Runs to delete, a button labeled Delete [N] selected Test Runs appears in the lower pane of the Test Runs page (where [N] is the number of Test Runs selected in the upper pane). Click the button to delete all the selected Test Runs.

Automated Cleanup of Old Test Runs

If you have many Test Runs as a result of automated testing, it may not be practical to delete old Test Runs manually. In this case, Polarion provides an automated job that will delete Test Runs you don't need to keep. For information, see the Administrator's Guide topic: Configuring Cleanup of Old Test Runs.

Preserving Test Runs in History

Every Test Run has a field "Keep in History", accessible in the Test Run's properties. When checked, the Test Run will be preserved in Baselines. Before starting automated cleanup of Test Runs, you should mark those Test Runs that should be preserved in history for future reference. For example, you might keep all Test Runs for verification tests that were executed at some project milestone. Project managers should be aware of when the automated cleanup job is scheduled to run, and should mark any new Test Runs that should be preserved in history before the next scheduled Test Run cleanup.

Marking is simple when there are only a few Test Runs and only one or two to be marked. You can browse your project's Test Runs and mark each Test Run individually in its Properties. If there are many Test Runs in the system, and/or many to be flagged for preservation in history, then you will need to use bulk marking of Test Runs with the Keep in History flag.

To bulk mark multiple Test Runs:

  1. In Navigation, open the Test Runs topic.

  2. On the Test Runs page, in the table of Test Runs, check the box on the row of every listed Test Run that you want marked with the "Keep in History" flag.

    If there are many Test Runs, you can run a query to filter the table. If your query returns only Test Runs that should be marked, you can mark all Test Runs in the table by checking the box in the table header.

  3. Once you have selected the Test Runs in the table, then in the selection panel in the lower portion of the page, click the Keep in History button to set the "Keep in History" flag on all selected Test Runs. (The selection panel appears when more than one Test Run is selected in the table.)

Reviewing Test Execution Records

Polarion maintains a record of all instances of the execution of a Test Case during any Test Run which includes the Test Case. You can review the most recent records for any Test Case in the Test Records section of the Work Item Editor when a Test Case is selected. You can view the past test execution records up to the limit set in the system configuration, selecting the period in the drop-down list provided. The Test Records section shows the results of the reported test executions, and provides links to each of the Test Runs. The current content of the Test Records section is included if the Test Case is sent to Print.

Each Test Record is linked to the revision of the test case that was executed during the testing session that resulted in the creation of the record.

Viewing Test Run History

To view the history of a Test Run:

  1. With the relevant project open, click Test Runs in Navigation, and in the table of Test Runs, select a Test Run by clicking on its row. The detail of the Test Run loads in the lower part of the page.

  2. On the toolbar of the Test Run detail, click the History toggle button. A table of the historical revisions of the Test Run replaces the Test Run detail information.

You can access the detail of any revision by clicking on the revision number. To exit the history and return to the detail, click the History toggle button again.

Customizing Test Run Templates

Test Run Templates are accessible in the Test Runs topic (Navigation > Test Runs). Click the Manage Templates button on the toolbar of the table of Test Runs. The table is replaced by a table of Test Run Templates.

You can customize any existing Test Run Template by modifying the Test Run properties, by modifying the Test Run page (in which the Test Run is displayed in browsers), or both. Alternatively, you can create a copy of an existing Test Run Template with a new ID and name, and customize the copy. The latter approach is recommended when customizing the default Test Run Templates provided by Polarion.

To create a copy of a Test Run Template to customize:

  1. Select an existing Test Run Template in the table of Test Run Templates.

  2. On the toolbar of the detail (bottom) pane of the selected template, click Actions > Save as Template.

  3. In the dialog, provide a unique ID for the new Test Run Template, and a descriptive name.

After creating the duplicate Test Run Template, you can proceed to modify it as described in the next sections. You can modify the Test Run Template's properties and the Test Run Template page.

Editing Test Run Template Properties

The simplest modification is to edit the Test Run properties. Properties include such data as the Test Run type, testing platform, and environment. Every Test Run based on a modified Test Run Template will have the property values specified in the template. A simple graphical user interface is provided for editing the template properties.

To modify Test Run Template properties:

  1. Access a Test Run Template as previously described.

  2. With the desired Test Run Template selected in the table of templates, click the Properties button on the toolbar of the template detail pane (lower half of the page).

You can edit property fields in place, or click the Edit button to place all non-read-only fields into edit mode.

Note that the values for some properties, like test Type, are defined in the global or project Testing configuration in Administration. You can select from the configured values, but you cannot change the values themselves. For more information, see Administrator's Guide: Configuring Testing.

Specifying Test Case Selection

In the Test Run Template properties, it is important to specify the method of populating new Test Runs (created by users and based on the template) with Test Cases to be executed. You do this by setting a value for the Select Test Cases field. The field presents the following list of choices:

  • Manually - users will manually select the Test Cases to be executed in all new Test Runs based on the template.

  • By Query on Create - Test Cases will be selected by a query that runs automatically when a new Test Run based on the template is created. The set of waiting Test Cases is static: it will not change if Test Cases meeting the query criteria are added after the Test Run is created but before it is executed, and nothing is added to the set if the Test Run is executed multiple times, even if new Test Cases meeting the criteria have been added to the system since the last execution.

    Specify the query, in Lucene syntax, in the required Query field of the Test Run Template properties (see the example query following this list).

  • By Query on Execute - Test Cases will be selected by a query that runs automatically when a Test Run based on the template is executed. The set of Test Cases waiting to be executed is not static: waiting Test Cases are taken from the current set of Test Cases in the system meeting the query criteria at the time a user executes the Test Run, and could even change during execution each time the view refreshes.

    Specify the query, in Lucene syntax, in the required Query field of the Test Run Template properties.

  • From LiveDoc on Create - Test Cases for Test Runs will be read from the specified Document when a new Test Run based on the template is created. The set of waiting Test Cases of the Test Run is static and will not change if new Test Cases are added to the Document after a Test Run is created but before it is executed, nor will any be added to the set if the Test Run is executed multiple times.

    Specify the Document in the required Document field of the Test Run template properties. Enter Document Name if the Document resides in the project's Default Space, otherwise enter Space Name/Document Name. You can optionally specify a query in the Query field to select a subset of the Test Cases contained in the specified Document. For example, you might query for Test Cases with Status value that is not "Draft".

  • From LiveDoc on Execute - Test Cases for new Test Runs will be read from a Document when a Test Run based on the template is executed. The set of Test Cases waiting to be executed is not static. Waiting Test Cases are taken from the current set of Test Cases in the Document at the time a user executes the Test Run, and could even change during the execution each time the view refreshes.

    Specify the Document in the required Document field of the Test Run template properties. Enter Document Name if the Document resides in the project's Default Space, otherwise enter Space Name/Document Name. You can optionally specify a query in the Query field to select a subset of the Test Cases contained in the specified Document. For example, for a Smoke test, you might query for Test Cases with a Severity value of "Smoke".

  • Automation - Test Runs based on the template will be created by an automated process, and results of test execution will be imported to Polarion.
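
As a hypothetical illustration, a template set to By Query on Create or By Query on Execute might specify a Lucene query like the following (the type, status, and severity values shown depend on your project's Work Item configuration):

    type:testcase AND NOT status:draft AND severity:smoke

For the LiveDoc options, the Document field would contain, for example, Testing/Test Specification for a Document named Test Specification residing in a space named Testing.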

Modifying Test Run Template Page

Test Runs display as wiki pages. You can use the full range of wiki mark-up language to change the column layout, element styling, and content of Test Run pages based on a Test Run template you have customized.

To modify a Test Run Template's wiki page:

  1. Access a Test Run Template as previously described.

  2. With the desired Test Run Template selected in the table of templates, click Actions > Customize Test Run Page.

An instance of the Wiki Editor appears in the detail panel of the Test Run Templates page, displaying all the wiki mark-up that renders the look and feel of Test Runs based on the template, including macros and scripting code. You can modify the page according to your needs. For more information, see the User Guide topic Editing Wiki Pages.

Integrating with External Testing Tools

Polarion provides the option of using external testing tools, with Polarion as the central repository for tracking and reporting the results of testing. For example, you might use Eclipse TPTP for automated load and performance tests, and Selenium for automated user interface testing, importing the results of every test execution to Polarion, where the results are visible to stakeholders and the history of testing is maintained. You can use Polarion in this way with any external tool that can export test results to the xUnit XML format.

To integrate xUnit test results from an external testing tool with Polarion, you configure each of your external tools to write an xUnit file to a folder accessible to Polarion, and configure Polarion to periodically check each folder for the presence of a test results file. A scheduled job named xUnitFileImport is provided in Polarion which checks a specified folder; if any file is found there, the test results it contains are imported into an automatically-created Test Run. The test results file is then deleted from the disk, leaving the folder empty until the external testing tool executes another round of testing and writes a new results file to the folder.
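
For orientation, a results file in the de facto xUnit/JUnit XML shape looks something like the following minimal sketch (the attributes written vary by tool; see the xUnit File Example in the User Reference for the exact format Polarion accepts):

    <?xml version="1.0" encoding="UTF-8"?>
    <testsuite name="LoginTests" tests="2" failures="1" errors="0" time="3.42">
      <!-- A passing test case -->
      <testcase classname="com.example.LoginTest" name="testValidLogin" time="1.20"/>
      <!-- A failing test case with a failure message -->
      <testcase classname="com.example.LoginTest" name="testInvalidPassword" time="2.22">
        <failure message="Expected error dialog was not shown"/>
      </testcase>
    </testsuite>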

For information on scheduling and configuring jobs, see Administrator's Guide: System Maintenance: Configuring the Scheduler. The job may also be invoked explicitly by a user in the Monitor.

Job Configuration Syntax

The following listing shows the job configuration syntax. The cron expression is just an example of one you might use to specify when the job should run; the expression shown ("0 * * * * ?") fires at the start of every minute. (For more information on cron expressions, see Administrator's Reference: Scheduler cron information and examples.)

<job cronExpression="0 * * * * ?" disabled="false" id="xUnitFileImport" name="xUnitFileImport" scope="system">
    <path>c:\Temp\xunit</path>
    <project>playground</project>
    <idPrefix>TR</idPrefix>
    <userAccountVaultKey>xUnitFileImportUser</userAccountVaultKey>
</job>

<path> is the folder where Polarion should check for a test results file.

<project> is the name of the project into which the test results data are to be imported.

<idPrefix> is a value to be prepended to the automatically-generated ID of the Test Run that is created when the test results file is imported. Note that the Group ID field of the Test Run is filled with the name of the imported test results file (minus the extension).

The value xUnitFileImportUser references keys in the User Account Vault file (described below). If no user is specified in these keys, or if the <userAccountVaultKey> element is empty, then the job can only be run by a user invoking it from the Monitor, and the job runs on that user's account. The job will fail if allowed to run automatically according to the cronExpression schedule, because such jobs run on the system user account, which does not have write access.

In order to run the job automatically, it is necessary to configure a user in the User Account Vault on whose behalf the job will run. The User Account Vault is a properties file stored in the file system of the Polarion installation at: data/workspace/user-account-vault/accounts.properties. The file contains the following 2 keys, which are referenced by xUnitFileImportUser when that value is specified in the <userAccountVaultKey> element of the job configuration (see the example job listing above):

    xUnitFileImportUser.user=[USER_ID]
    xUnitFileImportUser.password=[USER_PASSWORD]

...where [USER_ID] and [USER_PASSWORD] are the ID and password, respectively, of the user on whose behalf the job should run. The specified user must have write permissions for the project specified in the <project> element.
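
For example, with a hypothetical user account dedicated to automated imports, the two keys might read:

    xUnitFileImportUser.user=testrunner
    xUnitFileImportUser.password=s3cr3tPassw0rd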

See also: User Reference: xUnitFileImport Job Parameters, Job Example, and xUnit File Example.

Executing Test Runs Externally

You have the option to export Test Runs to Microsoft Excel, manually execute test cases and associated test steps externally to Polarion, and then import the results into Polarion via the Excel file previously exported, automatically creating test records in the system. This approach can be useful if you have external contractors who perform tests but do not have access to your Polarion system.

You can also use the same method to manually record the results of externally-run automated tests in an exported Excel file, importing the results back to Polarion. However, if you use automated testing tools, you may prefer to set up automated import of automated test results into Polarion. See Integrating with External Testing Tools for information on this.

Begin by exporting the Test Run to be executed externally to a Microsoft Excel workbook file.

  1. Open the project containing the Test Run and Test Cases you want to execute.

  2. In Navigation, select Test Runs. On the Test Runs page, select the Test Run you want to manually execute or for which you want to manually record results of external automated tests.

  3. Click the Export Tests to Excel link and save the exported Excel file to your local file system. You may send this file to someone else who will actually perform the manual tests (or manually record the results of externally-run automated tests).

You or your colleague or contractor can now execute each of the Test Cases in the Test Run in whatever way that is normally done, performing the steps by hand or running some tool, and logging the result of each Test Case. When manually executing tests, as each Test Case is executed, log the result of each test step in the Step Verdict column of the Excel sheet for each Test Case. Record the overall result of each Test Case in the Test Verdict column. Optionally add or change the content of the Comment column. (These are the only columns you can edit; all others are protected. If you try to edit other columns or add rows to the table, the action is disallowed and a message dialog appears.)

After all tests for the Test Run have been performed and their results entered in the Excel sheet, the tester returns the saved Excel file to the Polarion user who exported the Test Run, who then imports the results file back to Polarion so that the results can be recorded and tracked there.

  • Be sure all changes in the Excel file are saved before initiating the import.

  • Access the pertinent Test Run in Polarion if it is not still open in your browser.

  • In the lower pane of the Test Run page, click the Import Results from Excel link.

  • In the Import Results from Excel dialog, click the Browse button and select the previously exported Excel file on your local file system.

    Optionally check Overwrite conflicts. If checked, the values in the Excel file you are importing will overwrite any conflicting values in the Polarion Test Cases.

    Click Start to launch the import process.

When the process is finished, the dialog informs you of the result: successful import or import failure. Assuming success, it displays statistics about the number of Test Cases executed and the number that passed. In either case, there is a link to the import log file, which displays the process log in a new browser instance.

A successful import creates a new Test Run record. You can access records via the Actions menu button in the header of the Test Run detail pane on the Test Runs page.

Note

An Excel workbook exported for external test execution does not allow adding new test cases. You can use the Import from Excel feature to define new test cases in Excel and import them to create new Test Case type Work Items in Polarion.

Test Export Templates

Polarion provides a default template for exporting Test Cases to Microsoft Excel for external testing. It should be sufficient for most situations. However, administrators can download the default template file which can then be used as the basis for one or more custom export templates. An administrator can upload customized template files which users can subsequently select as templates for Excel files generated by exporting Test Runs to Excel. For information, see Administrator's Guide: Configuring Testing: Configuring Test Export Templates.

Reporting Testing Status and Results

Polarion provides visibility and information about the status of testing and testing results. The two main vehicles are:

  • Polarion Live Dashboards (Polarion ALM and Polarion Reviewer)

  • Polarion Live Report pages

Dashboards and reports access the most current project and testing data in the repository at the time they are updated. Dashboards are updated according to the configuration of the scheduled job that updates dashboards, and can be explicitly updated to show up-to-the-minute information. Report pages are updated when viewed, and can be exported to PDF any time. When you use a product license that supports it, the Quality topic appears in the Topics section of the Navigation panel. This topic provides a number of dashboards summarizing key quality-related information and statistics. For more information, see User Guide: Quality Dashboards, Audits, and Metrics.

You can create custom report pages in the wiki, especially for quality assurance and testing reports. The Testing space, available in projects based on one of Polarion's project templates that provide it, is a good place to create your report pages. QA and testing project templates provide some standard reports such as Test Case Traceability and Testing Dashboard. You can modify any of the default report pages, or copy and paste their content to a new page which you modify to suit your needs.

Test Management Macros

Polarion products with QA and testing support provide a number of macros designed to support testing activities and reporting. For example, the {import-automated-test-results} macro enables page users to launch import of test results.

You can find a complete reference to these macros, including syntax and usage examples, in the Test Management section of the Wiki Syntax Help, which is accessible when you are editing any wiki page. You can also access it at the following URL on your Polarion server:

http://yourpolarionserver.yourdomain.com/polarion/#/wiki/Doc/SyntaxHelp#HTestManagement ...

Note

Beginning with version 2012, several test management macros were deprecated in favor of new features and removed from the Wiki Syntax Help. However, these macros can still be used. See Appendix: Deprecated Macros: Test Management.

See also: User Reference: Querying for Test Record Work Items.

Reporting Test Results from Baselines

Test results and records are included in Baselines. So, for example, if a Baseline is created at the time of a release, you can go into the Baseline to see which test cases were executed at the time of the release. In baseline view, the Test Runs topic shows data as they were at the time the Baseline was created.

When working in a Baseline:

  • The Test Run detail page is read-only.

  • Test Run properties are not present.

  • Test Run detail pages can be printed or exported to PDF.

  • If you open any Work Item referenced by the Test Run, it is opened in baseline view provided it does not point to a particular revision of the item.