When performed manually, software testing can be a laborious and time-consuming process, and is not always effective at identifying certain classes of defects. The benefit of test automation is that these kinds of tests can be performed effectively, quickly and repeatedly. For your business, this is a cost-effective way to ensure your software products have a considerably longer maintenance life. Too often, however, manual tests are automated with little or no visibility, so tests end up being re-tested manually because the manual and automated efforts are not in sync.

Unfortunately, even organisations with mature automation capabilities spend more time than necessary investigating automated test logs, due to a disconnect between the original manual test case design and what the automated test actually verified.

The problem is generally amplified when there are two or more separate testing teams, for example:

  1. The Test Analysts – the manual testing team(s), responsible for designing tests and executing them manually before handing them over for automation
  2. The Test Engineers – the automated testing team, responsible for developing, executing and supporting the automated tests

Automated testing should not alter your standards for test execution and reporting. The team should not feel that automation is a mysterious process because of missing detail or unconfirmed test coverage against the steps designed by the Test Analyst.

We recommend working from an integrated test management system, where your automated tests are managed and reported within the same platform the rest of the team uses for manual testing. This means that when you package a set of test cases, for example your regression testing suite, both your manual testing team and your automation solution will be executing and reporting from the same suite or package within your test management system.

Over time, as you build out your automated tests, your manual effort should reduce towards your intended automation target, allowing the testing team to focus on test results and on the tests and issues that haven't been automated.

Successfully integrating automated and manual testing

Both commercial and open-source automation tools come with robust reporting features. However, as mentioned above, their integration with manual testing is limited, which prevents the steps executed from being recorded correctly against the test case details. If your automation and test management systems cannot be integrated, then as a minimum you need to implement a solution where your steps and results are linked to the test case details, with all the necessary information and associated test execution comments.

Vansah integrated automated software testing solution

The process starts with creating readable test scripts that can be managed and supported by both automated and manual testers.

Your manual testers simply design the tests by defining the steps and expected results in Vansah as a test case.

Once you have defined your tests, it is recommended to verify that each test works manually, capturing the actual results and detailed logs that the automation team can reference when they construct the automated script to be linked to the test case. Linking the automated script to the manual case is managed through the automation service available for your preferred robot of choice.
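To make this concrete, here is a rough sketch of a manually designed test case as structured data: numbered steps with expected results that both the manual and automation teams can reference. The keys and field names below are illustrative assumptions, not Vansah's actual schema:

```python
# Illustrative only: a manual test case represented as structured data,
# with steps and expected results an automation engineer can later reference.
test_case = {
    "key": "LOGIN-TC-01",  # hypothetical test case key
    "name": "User can log in",
    "steps": [
        {"step": 1, "action": "Open the login page",
         "expected": "Login form is displayed"},
        {"step": 2, "action": "Enter valid credentials and submit",
         "expected": "User is redirected to the dashboard"},
    ],
}

# Print the designed steps in a readable form for review.
for s in test_case["steps"]:
    print(f'{s["step"]}. {s["action"]} -> {s["expected"]}')
```

Capturing the design in a structure like this is what allows automated results to be reported back against the exact steps the Test Analyst wrote.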

In Vansah, each test case step is identified by a unique automation ID. In the example below we refer to the second step by its automation ID, i.e. 132.

From your automated script, simply reference the step by passing the code 132 as the value for the test step; see below.
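The general pattern looks like the sketch below. The exact API call and field names depend on your automation service and chosen robot, so the function and payload fields here are hypothetical illustrations of passing the step's automation ID (132) alongside the result:

```python
import json

def build_step_result(test_case_key, automation_id, result, comment=""):
    """Assemble a result payload keyed by a step's automation ID.
    NOTE: the field names here are illustrative, not Vansah's actual schema."""
    return {
        "testCase": test_case_key,    # hypothetical test case key
        "testStepId": automation_id,  # the automation ID from Vansah, e.g. 132
        "result": result,             # e.g. "PASSED" or "FAILED"
        "comment": comment,           # execution comment shown with the result
    }

# Report a pass against the second step (automation ID 132):
payload = build_step_result("REG-101", 132, "PASSED", "Login succeeded")
print(json.dumps(payload))
```

Your automation service would then send a payload like this to the test management platform, so the result appears against step 132 of the original manual test case.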


Now, from Vansah, you will see the results recorded against the test case, linked to the test case step (132).

The overall test case is still shown as In Progress (PASSED), since the entire test execution against the test case hasn't yet completed; however, testers can view the results in real time and be alerted once the tests have completed.


TestPoint's Vansah™ QA automation framework will rapidly increase test coverage and automation adoption by ensuring your tests are visible to the team, aligned to your testing governance, and supported by real-time, detailed automation reporting.