This guide will help you configure your PractiTest account in line with best practices. It is organized as a sequence of steps to follow to ensure your account is set up properly.
Before you start with the configurations, we recommend taking a few minutes to watch the video below, which covers the entire PractiTest flow.
If you need help with anything, please don't hesitate to let us know! You can reach out to our support team directly from the platform.
Getting Started Checklist:
- Set Up Your PractiTest Projects
- Configure Bug Tracker Integration
- Create Tests Infrastructure Using Fields
- Import/Create Tests
- Structure Tests Using Filters
- Create and Organize Test Sets
- Import/Create System Requirements & User Stories
- Run Your First Test Cycle
- Manage Issues
- Dashboards & Reports
- Test Automation Integration
Set Up Your PractiTest Projects
PractiTest projects are independent entities. By default, your account is created with one project. The first thing you need to consider is whether you need to work with multiple projects, or within a single project framework.
As a best practice, we recommend working within a single project framework when testing is done by the same team and on the same product. Using dynamic fields and filters, you can organize your project data in a hierarchical structure and display it in different views based on different types of product components, features, modules, and even products.
The main reason we consider a single project framework best practice is reporting. Each project has its own unique set of dashboards and reports, so to create a consolidated dashboard for all of a team's or product's testing activities, it is best to work with a single project.
These are the main reasons why you may want to have more than one project in PractiTest:
- When you have multiple independent teams working on separate projects
- When your company has multiple development projects with different external customers and you don’t want them to see each other’s information
- When you are testing two or more completely separate products with no overlapping tests or issues
Adding New Projects
You can add new projects from the Project Management tab of the Account Settings. When adding a new project, you have 2 options:
Clone from project
If you choose to clone from an existing project, you will have the option to choose from which project you would like to clone and what configuration settings of this project you want to have in your new project.
You can independently choose to copy your fields (custom and system), your user & permissions groups and workflow, and/or your users and their groups’ assignments.
Choose from template
When selecting the “choose from template” option, you can choose the methodology your project structure will be based on.
The following options are available:
Traditional - supports Waterfall methodology or similar
Traditional + Automation - same as Traditional with a few additions to support automation
Agile - supports Agile, DevOps and Agile-like methodologies
Agile + Automation - same as Agile with a few additions to support automation
Start from scratch - clean and empty project
Demo data - you can also add demo data to your project to see an example of a populated project setup. To add demo data, tick the ‘Add demo data to my project’ checkbox at the bottom of the page. If you are planning to use this project for your actual work, we don’t recommend adding demo data to it.
Renaming your project
Once your new project is created based on the selected template, make sure you edit the default name assigned to your project to reflect the project goals, and avoid unnecessary confusion in the future. You can edit project names from the project management tab of account settings, by clicking the pencil icon next to the project name.
Configure Bug Tracker Integration
If you are working with an independent bug tracker such as Jira, start by configuring the integration between PractiTest and the bug tracker you are using.
You can find all our integration help guides here.
Create Tests Infrastructure Using Fields
Before you start importing or creating your tests in PractiTest, we highly recommend that you take the time to consider how you want to structure and categorize them, and set up your fields based on the categories. By setting up the fields before you start creating your tests, you can ensure your project will be organized from the very beginning, and that will make test set creation and reporting much easier later on.
Here are a few examples for tests categorization options you can use in your project:
- By Modules
- By Components
- By Features
- By Test Levels (Sanity, Regression, etc.)
Once you have the structure in mind, start by setting up fields based on your categories.
System Fields
System fields will be available in every PractiTest project by default. You can edit your system fields, and decide where you want to display them from the fields section of settings.
Depending on the project template you selected, a few custom fields will be created for your project by default. Please note that you can edit and remove those fields as well, from the custom fields section of the fields page.
Custom Fields
To create a new custom field, navigate to settings - fields. Scroll down the page, and click ‘Create new custom field’. We highly recommend using list type custom fields in most cases. Using list fields, you can pre-define a set of values that can be selected to populate the field.
After adding the values, make sure you link the field to the ‘Test’ entity by ticking the box next to ‘Tests’; this applies the field to tests in the Test Library. After saving the field, you will see it on the screen of each of your tests, and you can populate it for every new test you create or import.
You can find a detailed step-by-step guide for setting up your custom fields here.
Import/Create Tests
Importing Tests
When you are done with the initial setup for your fields, you can start importing your tests from Excel/Google Sheets. To access the import page, click on Settings - Import & Export. Then click on ‘Import tests’. In the tests import screen, all the fields you created for your project will appear.
Instead of populating the fields for each test individually, you can populate them directly from the import process. In the import page, you need to map the columns of your spreadsheet to the fields shown on the page.
For example, if in my spreadsheet test names are listed in column ‘C’, I will delete the default letter ‘A’ and replace it with ‘C’.
Please note: if your spreadsheet contains a row of headers, you will need to tick the ‘ignore first row’ box.
You can find the full import guide here.
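To make the column-letter mapping concrete, here is a small sketch of what that mapping does behind the scenes. This is purely illustrative (PractiTest's importer handles this for you), and the sheet contents and field names are invented for the example:

```python
import csv
import io

def col_index(letter: str) -> int:
    """Convert a spreadsheet column letter ('A', 'B', ... 'AA') to a 0-based index."""
    idx = 0
    for ch in letter.upper():
        idx = idx * 26 + (ord(ch) - ord("A") + 1)
    return idx - 1

# Hypothetical spreadsheet export: test names in column C, test level in column D.
sheet = "id,owner,name,level\n1,dana,Login works,Sanity\n2,lee,Logout works,Regression\n"
mapping = {"Test Name": "C", "Test Level": "D"}  # PractiTest field -> column letter

rows = list(csv.reader(io.StringIO(sheet)))[1:]  # skip header (the 'ignore first row' box)
tests = [{f: row[col_index(c)] for f, c in mapping.items()} for row in rows]
```

Entering ‘C’ next to the ‘Test Name’ field in the import page is exactly this kind of lookup: the importer reads column C of each row into that field.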
Creating Tests
There are three types of tests you can create in PractiTest - Scripted, Exploratory and Automated.
Scripted Tests
Scripted tests contain predefined steps for the tester to verify.
The general tab of the test contains the test name and description, metadata, all system and custom fields related to the test, and a comments section.
The steps tab contains all steps information.
The traceability tab is where linkage to requirements and issues is defined and displayed.
The history tab includes the history log of the tests.
Exploratory Tests
Exploratory tests contain a charter (the mission for the exploratory session) and guide points for the executing tester. During the session, the tester can add annotations on the fly based on their findings.
Exploratory tests are structured like scripted tests, but have a 'Charter', and 'Guide Points' fields instead of the 'Description' field.
Automated Tests
There are a few different types of automated tests you can create, based on the methods you use for test automation integration. This guide will cover the different types in a later chapter.
You can find the complete test library guide here.
Structure Tests Using Filters
Working with filters allows you to have a flexible structure for your tests. The concept is simple - you define criteria for a filter, and tests that meet the criteria appear under it. This allows the same tests to appear in several different filter trees, under different categories.
For example, one of my tests relates to the ‘Login’ feature. It is also a sanity test. By assigning the ‘Login’ value for the ‘Feature’ field, and the ‘Sanity’ value for the ‘Test Level’ field, I can make sure the test will appear under the ‘Feature’ filter tree AND under the ‘Test Level’ filter tree.
Creating filtered views based on the fields you defined ensures you can easily find tests in the library by focusing on a smaller group of tests related to a particular category, instead of looking at a view containing all your tests. It also allows you to focus reporting items on a specific category of tests, and to create test sets faster.
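The Login/Sanity example above can be sketched as criteria matching. This is a conceptual illustration only (the test names and field values are invented), showing how one test surfaces under two different filter trees:

```python
# Each filter is a set of field criteria; a test appears under every filter it matches.
tests = [
    {"name": "Valid login", "Feature": "Login", "Test Level": "Sanity"},
    {"name": "Password reset", "Feature": "Login", "Test Level": "Regression"},
]

filters = {
    "Feature: Login": {"Feature": "Login"},
    "Test Level: Sanity": {"Test Level": "Sanity"},
}

def matches(test, criteria):
    """A test matches a filter when all of the filter's field values agree."""
    return all(test.get(field) == value for field, value in criteria.items())

views = {name: [t["name"] for t in tests if matches(t, crit)]
         for name, crit in filters.items()}
# "Valid login" appears under both the Feature and Test Level filter trees.
```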
Creating Filters
To create a new filter - navigate to the test library and click ‘+ New Filter’. Add a name for your filter, then add criteria in the filter criteria section. Finally, select which fields you want to add to the view of your filter from the custom fields section of this page.
Auto filters - The auto filter feature allows you to create filters and sub filters automatically based on any of your list type fields. To enable this option, tick the ‘Auto filter’ checkbox in the filter creation page. Then, select a list type field in the filter criteria section. Your filter will be named automatically based on the selected field.
You can find a detailed step-by-step guide for setting up your filters here.
Create and Organize Test Sets
The next step is to create your test sets. The Test Sets & Runs module is where test execution takes place and test results are recorded. A Test Set is a group of tests that you want to run together for a certain purpose.
Test Sets allow you to organize your testing activities in the same way that you organize your work into cycles, tasks or assignments.
Good candidates for Test Sets can be, for example:
- Tests that focus on a certain part of the system, such as GUI or Database
- Tests that belong to a certain task, such as Regression or Sanity
- A set of tests that need to be run by a single tester during a day or calendar week
Terminology Clarification: Test Sets, Test Instances & Test Runs
When you add a test to a test set, the system creates an instance of this test within the test set. An Instance is a dynamic copy of a test from the test library and it allows you to run the test as many times as you need to as part of multiple test sets.
When you run your instance, the test (including all the steps it contains) is copied to a new Test Run. This ensures that you are running the latest version of the test from the test library. You can have multiple instances of the same test in a single test set or across different test sets, and you can run each instance as many times as you like.
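The relationship between a library test, its instances, and their runs can be sketched as a small data model. This is an interpretive sketch of the copy-on-run behavior described above, not PractiTest's actual implementation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Test:            # lives in the Test Library
    name: str
    steps: List[str]

@dataclass
class Run:             # frozen copy of the test at the moment it ran
    steps: List[str]
    status: str

@dataclass
class Instance:        # dynamic pointer to a library test, inside a Test Set
    test: Test
    runs: List[Run] = field(default_factory=list)

    def run(self, status: str) -> Run:
        # Copying the steps at run time means every run uses the
        # latest version of the test from the library.
        r = Run(steps=list(self.test.steps), status=status)
        self.runs.append(r)
        return r

login = Test("Login works", ["Open page", "Enter credentials"])
inst = Instance(login)
inst.run("PASSED")
login.steps.append("Verify dashboard")   # the library test evolves...
second = inst.run("PASSED")              # ...and the next run picks up the change
```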
Test Set Fields and Filters
In the Test Sets module, we recommend using time based items to organize your sets. For example - sprints, releases, cycles, and versions. You can also use different categories such as features, test levels (regression, sanity, etc), and assignments. This, of course, depends on the methodology you use and your overall process.
As mentioned in the Tests section, we highly recommend using list type fields in most cases.
You can find the full Test Sets & Runs guide here.
Import/Create System Requirements & User Stories
The requirements module is where you create and manage your system requirements and user stories, and create coverage for them by linking them to tests. You can either create and define your requirements/user stories directly in PractiTest, or import them from an Excel spreadsheet.
Importing Requirements From Excel
The import process for requirements is similar to the test import process. Please note that when you import requirements, you can import their linkage to tests as well, using the traceability field. To access requirements import, navigate to settings - import & export. Then select ‘Import Requirements’.
Creating Requirements in PractiTest
When creating a new requirement in the system, first fill in the requirement's information in the Name and Description fields. Then you can proceed by linking the requirement to the tests that cover it, from the traceability tab.
Syncing Requirements From a Third Party System
If you are working with an integrated third-party system like Jira, you can sync tickets from the other system into PractiTest, and create coverage for them by linking tests from the traceability tab.
You can find additional instructions for setting this up when integrated with Jira here.
If you are working with another system, you can find the details in the dedicated integration help guide page. You can find all integration guides here.
Organizing Requirements
As in every other module in PractiTest, the recommended method for organizing your requirements is to first use fields to categorize them, and then create filter views to sort your requirements based on those categories.
Requirements Status Field
The requirements status field will be populated automatically based on the status of the tests you link to the requirement.
Below you can find the criteria for each status:
NOT COVERED – There are no tests linked to this Requirement.
NO RUN – There is at least one test linked to the requirement. None of the linked tests ran.
STARTED – There is at least one test linked to the requirement that started running. None of the linked tests failed or was set to Blocked.
PASSED – All linked tests ran and passed.
BLOCKED – At least one of the linked tests was set to Blocked. None of the linked tests failed.
FAILED – At least one of the linked tests failed.
N/A – All tests linked to this requirement are marked as N/A.
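The criteria above amount to a precedence-ordered rollup. The function below is an interpretive sketch of that logic (not PractiTest's actual implementation), useful for reasoning about which status a requirement will show:

```python
def requirement_status(run_statuses):
    """Roll up the statuses of a requirement's linked tests into a
    requirement status, following the criteria listed above."""
    if not run_statuses:
        return "NOT COVERED"
    if all(s == "NO RUN" for s in run_statuses):
        return "NO RUN"
    if any(s == "FAILED" for s in run_statuses):
        return "FAILED"          # FAILED takes precedence over BLOCKED
    if any(s == "BLOCKED" for s in run_statuses):
        return "BLOCKED"
    if all(s == "N/A" for s in run_statuses):
        return "N/A"
    if all(s == "PASSED" for s in run_statuses):
        return "PASSED"
    return "STARTED"             # some tests ran, none failed or blocked
```

For example, a requirement whose linked tests ran PASSED, BLOCKED, and FAILED shows FAILED, because a failure overrides a block.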
Run Your First Test Cycle
Congratulations, your project is now ready to go live! Below are a few tips for your first test execution with PractiTest.
To make sure your test cycle is documented correctly, verify that the test sets executed for this cycle have the relevant value populated in the cycle field. If you use a ‘Release’ or ‘Sprint’ field instead, make sure it is populated properly.
Assigning Test Sets and Instances
Make sure you are using the ‘Assigned to’ field to assign test sets to testers. If your workflow requires a few testers to execute a single test set, you can also assign individual instances to different testers. To do that, navigate to settings - fields, edit the ‘Assigned to’ field, and activate it for ‘Instances’.
Running Instances & Reporting Issues
To run an entire test set, click the ‘Run Now’ button from the test set screen. This will initiate test execution for all test instances residing in the test set in the order they were added. You can also run an individual instance from the test set, by clicking on ‘Run’ next to it.
When running a new test instance, you will be redirected to a new test run directly. If a test instance has been run previously, you will first get redirected to the instance page, showing a summary of all its previous runs.
In both cases, a new run will be created for your instance(s). In the run screen of a scripted test, you will see all the steps' information (name, description, and expected results) on the left side of the screen, and a field for the actual result on the right. You can assign each individual step a status. When you find an issue during a test run, you can report it directly from the run using the 'Fail & Issue' option. Or, if the issue already exists in the system, use the ‘Link existing issue’ option to link it to the step.
The Tester Field
‘Tester’ is a system field which is deactivated by default. By enabling it, you will be able to see which tester actually ran each one of the instances from the test instances grid. This field will populate automatically.
To enable the tester field, navigate to settings - fields. Edit the ‘Tester’ field from the system fields section, and activate it for instances. Once enabled, you can add the field as a column to your test instances grid using the ‘Columns’ option.
See the full test sets and runs guide here.
Manage Issues
Issues/Bugs are managed in the Issues module. Issues that were reported from test runs will be added directly to the issues module. You can also create issues directly from the issues module.
To create an organizational structure for your Issues modules, we recommend using fields and filters.
In the issues module, you can also use the fast filters option to create a quick filter for your issues.
Issues Workflow
The workflow editor, which you can access from settings, allows you to customize the life cycle of your issues to match your needs. You can add custom statuses to issues, and define the transitions between existing issue statuses.
See the full guide for issues workflows here.
Defect Age
The defect age field will be added to every issue created in your project. This field allows you to keep track of the time (in days) that the issue has been open. You can also generate a dashboard graph to reflect the data from this field.
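The calculation behind the field is simple date arithmetic, sketched here for clarity (the function name is illustrative, not part of PractiTest):

```python
from datetime import date

def defect_age(opened_on: date, today: date) -> int:
    """Number of days the issue has been open."""
    return (today - opened_on).days

age = defect_age(date(2024, 3, 1), date(2024, 3, 15))  # 14 days open
```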
Traceability
In the traceability tab of your issues, you can see the tests and requirements that the issue is linked to.
Deleting Issues
Issues deletion is disabled by default. To enable issue deletion for a project, navigate to the account settings - project management tab. Then, edit the relevant project and tick the ‘Enable issue deletion’ box. Only account owners have the permission to enable the deletion.
See the full issues module guide here.
Bug Tracker Integration
When working with a bug tracker integration, issues reported from test runs will be reported directly to the integrated bug tracker. If you are using a two-way integration, a copy of the issue will also be created in PractiTest, and the name, description, and status fields will be synced between the PractiTest issue and the bug tracker’s issue.
Jira Integration
As described above, when working with a two-way Jira integration, issues you report from PractiTest to Jira will be created in your PractiTest issues module as synced copies. Additionally, you can decide to sync tickets that were created directly in your integrated Jira projects, either using their IDs or using an existing filter from your Jira project to bulk-sync tickets. To sync tickets, click the small arrow next to ‘+New Issue’, then click ‘Sync a new issue from Jira’.
Dashboards & Reports
Dashboards
PractiTest dashboards are fully customizable. We highly recommend adding new dashboard items and tabs and customizing the default dashboard to tailor it for your needs and your specific projects. Every dashboard tab can contain up to 8 items, and you can create an unlimited number of tabs for each PractiTest project.
Dashboard Items
When creating a new dashboard item, you first need to select the entity you want to focus on: Tests, Test Sets, Instances, Runs, Issues or Requirements. To focus on test execution data, select the Instances entity; to show test automation data, select the Runs entity. After selecting the entity, select the item type you would like to create, and press continue. Next, select the filter you want to focus on from the selected entity.
For example, if I want to create an item that will display test execution data from my last cycle (cycle 3) - I will create an instance based item and the ‘Cycle 3’ filter.
By ticking the box next to ‘After saving stay on this page to preview the Dashboard Item’, you can preview your item in this same page, and make changes to it until you are satisfied with the result.
Test, Test Set, and Last Run Fields in Instance-Based Items
When creating items based on instances, you can choose to show data from fields linked to tests, test sets, and runs in your dashboard item. In the drop-down list of fields, fields starting with an entity name followed by a colon are fields from one of those other levels. Fields with no entity name at the beginning are fields from the instance level.
For example, if I want to show information from a ‘Feature’ field linked to my tests in the test library - I will select the field ‘Tests: Feature’ as my X field.
See the full guide for dashboards here.
Cloning Dashboard Tabs
You can reuse dashboard items you already created by cloning a dashboard tab. For example, to reuse a dashboard tab you created for your previous cycle in your current cycle, clone the previous cycle’s tab and simply change the filters the dashboard items focus on to the new cycle’s filters.
External Dashboards
The external dashboard feature allows you to share your dashboard tabs as an external URL, with your stakeholders and non-PractiTest users.
See more information about external dashboards here.
Reports
While dashboards provide you with a high level view of your project, you can utilize reports to get a deep level look into your data. There are two main types of reports available for all modules, and a few additional types depending on the module you want to create the report for.
Like with dashboards, test execution reports should be created based on the instance entity, and test automation execution reports should be created based on the run entity.
When creating a new report, make sure you select a filter from the relevant module to display information for a specific slice of your data. For example, to create a report based on information from ‘Cycle 3’, I will create an instance report with the ‘Cycle 3’ filter selected.
Tabular summary reports - Excel-based reports. Using tabular summary reports, you can fully customize the data displayed in the report, deciding which fields and which charts to display. Tabular with steps reports are identical to tabular summary reports, but also include steps data and issues data.
Detailed reports - PDF-based reports that provide detailed information about the selected entity and scope.
Scheduling Reports
When creating a report, you can decide to schedule it to run on a daily, weekly or monthly basis. The report will be regenerated with the settings you define for it based on the selected interval, and will be sent by email to users you select from the reports settings.
See more information about the reports module here, and report samples here.
Test Automation Integration
With PractiTest, you can run, control, view and manage your automated testing alongside your manual testing, giving you complete visibility into your entire process in one place. Below you can find details about the different methods available for integrating your automated testing into PractiTest.
API
We have a fully featured JSON-based REST API that covers all modules of the platform and can integrate PractiTest with any automated testing framework. Our API allows you to fetch details of test sets from the system and then push back the results of your automated runs into PractiTest. Our API documentation is here.
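As a minimal sketch of the fetch-and-push flow, the helper below builds the URL and JSON body for reporting one automated run against an instance. The endpoint path and attribute names follow our reading of the public API v2 documentation, and the project and instance IDs are invented; verify everything against the API docs linked above before use:

```python
import json

API_BASE = "https://api.practitest.com/api/v2"

def build_run_payload(project_id: int, instance_id: int, exit_code: int):
    """Return the URL and JSON body for reporting one automated run.
    By convention, an exit code of 0 means the run passed."""
    url = f"{API_BASE}/projects/{project_id}/runs.json"
    body = {"data": {"type": "instances",
                     "attributes": {"instance-id": instance_id,
                                    "exit-code": exit_code}}}
    return url, json.dumps(body)

url, body = build_run_payload(4820, 123, 0)  # hypothetical project/instance IDs
# POST this with your email and API token as HTTP basic-auth credentials and a
# Content-Type: application/json header, using any HTTP client you prefer.
```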
FireCracker
FireCracker is a PractiTest-developed tool that allows you to integrate any CI/CD framework and any XML test result file with your PractiTest project. FireCracker allows you to parse and modify XML report files and upload them into PractiTest easily and automatically. See the step-by-step guide for using FireCracker here.
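FireCracker does the XML parsing and uploading for you; for a feel of the kind of file it consumes, here is a sketch of reading a JUnit-style report (a common XML result format, shown with invented test names) and summarizing pass/fail results with the standard library:

```python
import xml.etree.ElementTree as ET

# A minimal JUnit-style report: a test case with a <failure> child failed.
report = """<testsuite name="smoke" tests="3">
  <testcase name="login"/>
  <testcase name="logout"/>
  <testcase name="search"><failure message="timeout"/></testcase>
</testsuite>"""

root = ET.fromstring(report)
results = {case.get("name"): ("FAILED" if case.find("failure") is not None else "PASSED")
           for case in root.iter("testcase")}
```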
xBot
xBot is a PractiTest internal automation framework that allows users to run (or initiate a run by scheduling) automated test scripts on a remote machine from the PractiTest UI. See the step-by-step guide for using xBot here.