Test Plan Templates for the Hosted Test Environment

Before the hosted environment is upgraded from Suite 8.10 to Suite 8.11, we want to make sure that you have an opportunity to test rules or reports in a safe environment and confirm that everything works properly before the upgrade. By creating a test plan and test cases, you have an organized record that can help you discuss any concerns with your Mitratech Support or Services representative.

What is a Test Plan?

A test plan represents a comprehensive effort that covers all requirements listed in the Technical Specification provided by Mitratech. A test plan contains a number of test cases, which are the sets of conditions and steps that a tester uses to test a scenario. For each rule, report, or other implementation being tested, Mitratech recommends creating a separate test plan in its own Excel workbook. Each test plan workbook has a separate tab for each test case.
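The template is an ordinary Excel workbook, so if you need several blank copies of the layout described above (one Test Plan tab plus a tab per expected test case), a short script can generate them. The following is only a minimal sketch, not the official Mitratech template; it assumes the Python openpyxl library, and the file name and test case count are made up for illustration.

from openpyxl import Workbook

def create_test_plan_workbook(filename, test_case_count=3):
    """Create a workbook with one Test Plan tab and one tab per planned test case."""
    wb = Workbook()

    # The default sheet becomes the Test Plan tab, with the field labels in column A.
    plan = wb.active
    plan.title = "Test Plan"
    for label in ("Business Requirement", "Specification File",
                  "Specification Summary", "Scope and Assumptions"):
        plan.append([label, ""])

    # One tab per expected test case, mirroring the template layout.
    for n in range(1, test_case_count + 1):
        wb.create_sheet(f"Test Case {n}")

    wb.save(filename)

# Example: a blank test plan for a single rule with three planned test cases.
create_test_plan_workbook("InvoiceAuditRule_TestPlan.xlsx")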

Creating a Test Plan

  1. Open the test plan template workbook and save it with a unique name.
  2. On the Test Plan tab, enter the following:
    • Business Requirement: The name of the rule, report, or implementation being tested.
    • Specification File: Name of technical specification document related to the custom work. Mitratech includes the most recent version of the technical specification in the software package deliverable. If you do not have a copy, contact your Mitratech account representative.
    • Specification Summary: A description, in your own words, of what the rule does.
    • Scope and Assumptions
      • Scope example: If the company only imports LEDES 2000 invoices, then LEDES 98 invoices do not need to be tested.
      • Assumptions example: The rule has been tested before and is undergoing a minor update. The tester could reasonably assume that, besides specific testing of the new functionality, the rest of the rule's requirements will only require a basic regression test.
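For instance, the Test Plan tab for a hypothetical LEDES 2000 invoice-validation rule could be filled in as shown below. This is a sketch only, again assuming the Python openpyxl library and the blank workbook generated above; the rule name, specification file name, and scope text are invented for illustration.

from openpyxl import load_workbook

wb = load_workbook("InvoiceAuditRule_TestPlan.xlsx")
plan = wb["Test Plan"]

# Column A already holds the field labels; column B holds the values.
plan["B1"] = "LEDES 2000 invoice validation rule"
plan["B2"] = "InvoiceAuditRule_TechSpec_v1.2.docx"
plan["B3"] = "Rejects any invoice line item billed above the timekeeper's approved rate."
plan["B4"] = ("Scope: only LEDES 2000 invoices are imported, so LEDES 98 invoices are not tested. "
              "Assumption: this is a minor update, so existing behavior only needs a basic regression test.")

wb.save("InvoiceAuditRule_TestPlan.xlsx")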

Creating a Test Case

  1. Make a copy of the Test Case 1 tab for each test case that you expect to have.
  2. For the first test case, enter the following background information on the Test Case 1 tab:
    • Test Case: Name your test cases in a way that makes sense to anyone who will refer to the test cases in the future.
    • Description: 
      • What rule or report is being tested?
      • What behavior is being verified?
      • What test data is involved in the test?
      • What is the test environment?
    • Preconditions:
      • User data dependency
        • What type of user should be logged in? For example, an administrator or an end user.
        • What type of user role or permissions should the user have?
      • Dependencies on test environment
        • Does the tester have proper access to the test environment?
        • Does the tester’s computer need to have any particular requirements?
        • Does the tester need to use a particular browser?
      • Prerequisites
        • What configuration needs to be done before the test? For example, a rule needs to be imported and enabled.
        • Dependencies on any other test cases: does this test case need to be run before or after another test case?
    • Constraints: List any possible issues that will place constraints on testing. For example, the test might require another, separate rule to be running in order for testing to be completed.
    • Prepared by: The name of the creator of the test case. 
    • Tested by: The name of the person executing the test case.
    • Date Prepared: The date the test case was written.
    • Date Tested: The date the test case was executed.
  3. In the User Activities section, enter the following information:
    • Activity Description: In sequential order, list the actions the tester must perform to successfully execute a test. Each step should include only one action to avoid ambiguity during testing.
    • Expected Results: Each test step should clearly describe the expected outcome of that verification step. Include information such as what page or screen you expect to appear after the step and any updates you expect. If needed, you can also attach screenshots or documents to the relevant step.
    • Actual Results: What actually happened. If the results are not the same as the expected results, include any relevant information such as error messages or screenshots.
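If you prefer to record test cases programmatically in the same workbook, the background fields and User Activities rows could be written as in the sketch below. As before, this is illustrative only: it assumes the Python openpyxl library, the workbook created in the earlier sketches, and invented test data.

from openpyxl import load_workbook

wb = load_workbook("InvoiceAuditRule_TestPlan.xlsx")

# Copy the Test Case 1 tab as the starting point for a new test case.
case = wb.copy_worksheet(wb["Test Case 1"])
case.title = "Test Case 2"

# Background information for the test case.
case.append(["Test Case", "Reject line item billed above approved rate"])
case.append(["Description", "Verify the rule flags a LEDES 2000 line item billed above the timekeeper's approved rate"])
case.append(["Preconditions", "Administrator logged in; rule imported and enabled in the test environment"])
case.append(["Prepared by", "A. Analyst"])
case.append(["Date Prepared", "2024-01-15"])

# User Activities: one action per step, with expected and actual results.
case.append(["Step", "Activity Description", "Expected Results", "Actual Results"])
case.append([1, "Import the sample LEDES 2000 invoice", "Invoice appears on the invoice list with a Pending status", ""])
case.append([2, "Open the flagged line item", "The adjustment reason states that the rate exceeds the approved rate", ""])

wb.save("InvoiceAuditRule_TestPlan.xlsx")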
       

Best Practices

  • When writing a test case, the test steps should be detailed enough that a user with a functional understanding of the application can understand them and recreate the test results without difficulty.
  • Keep test cases small, aiming for fewer than 25-30 steps. Smaller test cases help narrow down the cause of any ambiguous results, decrease the number of steps during retesting, and help the tester focus on the intended result.
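As a rough way to check the step-count guideline across a large workbook, you could count the rows on each test case tab. The sketch below assumes the layout from the earlier examples, where the user-activity steps begin after six rows of background fields; that offset is an assumption you would adjust for your own template.

from openpyxl import load_workbook

BACKGROUND_ROWS = 6   # assumed background rows before the step table; adjust for your template
MAX_STEPS = 30

wb = load_workbook("InvoiceAuditRule_TestPlan.xlsx")
for name in wb.sheetnames:
    if not name.startswith("Test Case"):
        continue
    steps = max(wb[name].max_row - BACKGROUND_ROWS, 0)
    if steps > MAX_STEPS:
        print(f"{name}: {steps} steps - consider splitting into smaller test cases")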