
Technical info and insight on using Oracle Documaker for customer communication and document automation

  • ODEE
    November 17, 2015

Automating Oracle Documaker Interactive and WIP Edit Plug-In Using OpenScript – Part 2

Andy Little
Technical Director

Welcome back to the continuation of our discussion on testing automation with Oracle Documaker! In our last post on Documaker regression testing, we explained the differences between keyword-driven and data-driven frameworks, with our testing strategy modeled after the framework proposed by Carl Nagle. For data-intensive applications such as Oracle Documaker Interactive, a data-driven framework is preferable because it more closely mirrors how the application is actually used. In this post, we will review the testing framework and process used by the Oracle Documaker Quality Assurance team for automated testing of Documaker Interactive in several typical use cases. Let's get started, shall we?

Test Framework and Design

As Nagle pointed out, a framework for automation is indispensable in creating repeatable tests that achieve repeatable results with the same or similar input data. The first step in the design of the automated testing process is to define the parameters of the test: what is being tested, how it should be tested, and how we'll know whether the test passed or failed. We know we're testing Documaker Interactive, and we know that it will either work to user satisfaction or it won't - so the first and last items are done. But how will we test it? Documaker Interactive offers many possible function paths a user can take to achieve similar ends, so we need to determine the typical paths and automate those. A good rule of thumb is to consider what 80% of test cases should look like, and design test cases around those processes first. A typical test case might look like this (keep in mind that I'm abstracting some of the details for the sake of clarity - your actual test cases should be much more detailed); a sketch of one way to capture such a scenario in code follows the list:

  • Inputs: one or more recipients, each with address information, supplied via external data source
  • Functions: user creates a new transaction in Documaker Interactive, edits a few fields, saves, opens for edit again, completes.
  • Outputs: completed PDF document.
  • Pass/Fail: Pass if no errors occur and outputs contain expected data (e.g. supplied by external data source and the user). Fail if errors occur or data does not match.
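
To make the scenario abstraction a bit more concrete, here's a minimal sketch of how a scenario definition like the one above could be represented in plain Java. The class and field names are purely illustrative - they aren't part of OATS or Documaker - but a simple holder like this is all a scenario script needs to read its inputs and expected results from:

import java.util.List;

// Illustrative only: a simple holder for one scenario's definition.
public class TestScenario {
    private final String name;                       // scenario identifier, e.g. its row key in a data file
    private final List<String> recipients;           // inputs: addressee data from the external data source
    private final List<String> requiredFieldValues;  // inputs: values the user keys into required fields
    private final String expectedPdf;                // output: the completed PDF we expect to verify
    private final boolean enabled;                   // whether this scenario runs in the current suite

    public TestScenario(String name, List<String> recipients, List<String> requiredFieldValues,
                        String expectedPdf, boolean enabled) {
        this.name = name;
        this.recipients = recipients;
        this.requiredFieldValues = requiredFieldValues;
        this.expectedPdf = expectedPdf;
        this.enabled = enabled;
    }

    public boolean isEnabled() { return enabled; }
    // remaining getters omitted for brevity
}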

What we've defined above is an abstraction of a testing scenario - a structured test with defined inputs, outputs, and results. As you can see, the scenario is very granular, but is not specific to data - it's a functional test case to verify that the software does what it's supposed to do. It's entirely possible that your own test cases can (and should) be more specific to data, especially if you have an enterprise-wide system that accommodates more than one data source or services more than one line of business. Once you have a library of scenarios built up, you'll have a test suite, which you can then use for regression testing on software upgrades, functional changes, and more. At this point you're probably thinking, "Right, ok, I have all that. You said we were actually going to test something?" You're right, I did say that, and we will - to do that, we're going to use a few software packages to assist us:

  • Oracle Application Testing Suite - also known affectionately as OATS - which is available here and includes OpenScript, which is documented here.
  • Java class Robot - to generate native system events based on keyboard and/or mouse interaction - JavaDoc is here.

For the purposes of this post, we're going to assume you've already installed OATS and Oracle Documaker Enterprise Edition (ODEE), of which Documaker Interactive is a part. If you haven't installed OATS, see the link provided above. If you haven't installed ODEE, I have a series of blog posts that detail an end-to-end installation and configuration of ODEE.

Scripts, Hierarchy, and Data Files

Testing is executed within OATS using a hierarchy of inheritance and execution. At the top of the hierarchy is the master script, which coordinates execution of all the lower-level scripts. The next level is the component script, which, as the name implies, defines the collection of scripts for a software component. Finally, there is the scenario script, which is the lowest level and includes all of the details outlined above - inputs, outputs, functions, and pass/fail criteria. Here's a handy diagram in which we have defined multiple components and we show the scenario detail for one component:


The Documaker QA team has a test suite for Documaker Interactive that uses a data file to house all the testing configuration elements used by the master, component and scenario scripts in the test suite. For convenience, the data file is a spreadsheet which contains multiple worksheets, each with unique data that can be replicated and modified to extend the test cases as necessary. The data file is read
during the initialization phase of test execution and is stored in a global location for reference across multiple component and scenario levels. All three levels of the automation script use this data file. Let's review the hierarchical test phases and how the data file is used:

  • The master script controls the test. This script checks the component (specifically, the application) and the platform being tested (e.g. Documaker Interactive on Windows). The master script references data in the ODEE_Components worksheet of the data file to know which components to execute, then calls the appropriate component-level scripts in order. The ODEE_Components worksheet contains the following details: component name, release, environment (operating system), test run by, and date run. The component and environment cells are drop-down fields. Based on the selections made in the fields in this worksheet, the script picks the applicable URL and executes the associated test script. (A sketch of how these worksheets might be read and used to drive the scripts appears after this list.)





  • The component scripts determine which test scenarios will be run for a component. Each component script references one or more scenarios which are detailed in the TestScenarios worksheet of the data file. Each scenario can be turned on or off for a given component test using the Run Status column value of Y (include in test) or N (exclude from test). The TestScenarios sheet contains all the scenarios for the automation test and the supporting test data for all scenarios. When there are multiple values, such as form names or attachment file names, the values should be separated by a semicolon (;). Refer to the example worksheet below. For Scenario_001, look at the Required Fields column and you will see the semicolon-delimited value "34564675;566787;37,500". This string will be parsed by the scenario script and populated into the required fields.





  • Scenario scripts are the actual tests that are executed. Each scenario is created as a separate method in OpenScript, based on the required functionality that needs to be performed. These methods can be reused and called by other scenarios, so a basic scenario can have many variations with little additional code needed to support each.
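
To tie the pieces above together, here is a minimal sketch of (a) loading the data file once during initialization into a globally visible structure, and (b) walking the ODEE_Components and TestScenarios worksheets to drive execution. Everything here is illustrative rather than the Documaker QA team's actual code: the Apache POI dependency, the TestDataFile class name, and the column positions are all assumptions, and OpenScript's own databank facilities could be used instead.

// Assumption: a spreadsheet library such as Apache POI is available to the script.
import java.io.File;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class TestDataFile {
    // worksheet name -> rows -> cell values, loaded once and shared by all script levels
    public static final Map<String, List<List<String>>> SHEETS =
            new HashMap<String, List<List<String>>>();

    public static void load(String path) throws Exception {
        Workbook wb = WorkbookFactory.create(new File(path));
        for (int s = 0; s < wb.getNumberOfSheets(); s++) {
            Sheet sheet = wb.getSheetAt(s);
            List<List<String>> rows = new ArrayList<List<String>>();
            for (Row row : sheet) {
                List<String> cells = new ArrayList<String>();
                for (Cell cell : row) {
                    cells.add(cell.toString());   // keep every cell as text for simplicity
                }
                rows.add(cells);
            }
            SHEETS.put(sheet.getSheetName(), rows);
        }
    }
}

With the worksheets loaded, the master and component levels can read their control data along these lines (column indexes are illustrative - use whatever layout your worksheets actually have):

// Master level: walk ODEE_Components to decide which component scripts to call.
List<List<String>> components = TestDataFile.SHEETS.get("ODEE_Components");
for (List<String> row : components.subList(1, components.size())) {    // skip the header row
    String component   = row.get(0);    // e.g. Documaker Interactive
    String environment = row.get(2);    // e.g. Windows
    // ... pick the applicable URL and execute the matching component-level script
}

// Component/scenario level: honor the Run Status flag and split multi-value cells.
List<List<String>> scenarios = TestDataFile.SHEETS.get("TestScenarios");
for (List<String> row : scenarios.subList(1, scenarios.size())) {
    if (!"Y".equalsIgnoreCase(row.get(1))) {                  // Run Status: N = exclude from this run
        continue;
    }
    String[] requiredFields = row.get(5).split(";");          // e.g. "34564675;566787;37,500"
    // ... call the scenario method, keying requiredFields into the document
}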

The data file has other supporting sheets that are used by the various testing scripts for automation and control:

  • Interactive_Users - this sheet contains credentials, user roles, and approval levels (a small illustrative lookup against this sheet appears after the list).

  • Interactive_forms - this sheet contains all forms with approval levels used for different test scenarios. When you add a new form, that form gets added to both the object library and to this worksheet. From there, the form can then be used across all scenarios. To do this, add the form name to the Forms_List column in the TestScenarios worksheet.

  • Addressees - this sheet is used to add addressees to the data set, which will be shown on the Addressee tab while creating documents within Documaker Interactive. A new addressee can be added in the same pattern as defined in the Addressees worksheet.
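
Because these supporting sheets live in the same data file, they can be read through the same globally loaded structure. Purely as an illustration (the sheet access and column positions are assumptions, reusing the TestDataFile sketch above), a scenario script might pull an approver's credentials like this:

// Illustrative lookup of the first approver-level user from the Interactive_Users sheet.
List<List<String>> users = TestDataFile.SHEETS.get("Interactive_Users");
for (List<String> row : users.subList(1, users.size())) {    // skip the header row
    String userId        = row.get(0);
    String password      = row.get(1);
    String approvalLevel = row.get(2);
    if ("Approver".equalsIgnoreCase(approvalLevel)) {
        // ... log in to Documaker Interactive as this user for the approval step
        break;
    }
}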

Putting the Test Together

We have outlined the test cases and the test data for control and execution. Now comes the fun part - we actually need to build the test! But before we do that, I must remind you that it is important to figure out how wide your test cases ought to be. By width I mean how much of the system's functionality the test case should cover. It's tempting to make a scenario cover an entire end-to-end test, across all layers of the system, from upstream data feed to downstream printing or electronic delivery. With OATS you have the power to do that, but as a wise man once said, "With great power comes great responsibility," and test design is no different! A good practice, which is reinforced by the OATS hierarchical design, is to limit a scenario to functionality within a component. That way, you can limit the Documaker Interactive test to include only the functionality that's needed within that component, and external components can be covered by other scenarios. Why am I saying this now? Because as you're going to find out, we're jumping right into Documaker Interactive - no creating a transaction, dropping data, invoking a web service, or anything else. Our assumption will be that the data is there, because it was provided by another test scenario and therefore should be tested there. This keeps our test scenario smaller and easier to manage.

While we're on the subject of test scenarios, I should point out that you can use the file system to your advantage here as well - since you're going to have a data file out there with all the control parameters for your scenarios, you can also create an attachments folder and use it to store any test documents that you will be attaching in Documaker Interactive (keeping in mind our plan to segregate test scenarios by component, we'll assume this attachment is coming from a user desktop or provided by an external system).

As mentioned above, we're going to use the OpenScript component of OATS in combination with the Java Robot class. If you have used Documaker Interactive before, you know that it uses a plug-in called WIP Edit to facilitate data entry onto documents. Part of the process for test script creation includes the ability to record user interaction with a browser, which then generates the OpenScript code that you can customize. The OpenScript recording capability will capture user interaction with web components, but it cannot capture events within WIP Edit, so for those we will use the Robot class to programmatically generate keyboard and mouse input. The screen shot below illustrates the area of differentiation between web components and WIP Edit - note that the WIP Edit area is everything below the toolbar, inclusive of the form set tree and the document preview window:


When recording your scripts, you'll need to note what input events (keyboard/mouse) are occurring that aren't going to be captured by the recording. In the screen shot above, I have clicked on Zoom Normal, which is a web component as it's in the toolbar. When I go back to edit the recorded script, I'll need to programmatically move the mouse and simulate clicks from the point of departure from web components. Here's a code snippet of how this will work:

oracle.oats.scripting.modules.functionalTest.common.api.internal.types.Point p =
    web.element("{{obj.ODEE_Interactive.NewDoc_Document Tab_Zoom_Normal button}}").getElementCenterPoint();

Once the position of the Zoom Normal button is captured, I need to move the pointer 40 pixels down and 40 pixels to the left using the Java Robot object to place the mouse pointer on the document:

robot.mouseMove(p.x-40, p.y+40);

Now we'll execute a right-click to expose the context menu, move the pointer, and execute a left click to select the "Check Required Fields" menu item:

// Right CLICK
robot.mousePress(InputEvent.BUTTON3_MASK); 
robot.mouseRelease(InputEvent.BUTTON3_MASK);


// Moving to 1st option in right click menu "Check required fields"
robot.mouseMove(p.x-40+87, p.y+40+13);

// Left CLICK
robot.mousePress(InputEvent.BUTTON1_MASK);
robot.mouseRelease(InputEvent.BUTTON1_MASK);
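
Pulling the snippets above together, here is one way the whole right-click sequence might be wrapped in a helper method inside your OpenScript script class. The method name, the setAutoDelay pause, and the exception handling are my own additions - only the offsets and button masks come from the snippets above - so treat it as a sketch rather than a finished implementation:

// Imports needed for the native-event calls (added at the top of the script class):
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.InputEvent;

// Illustrative helper: given the center point of the Zoom Normal toolbar button,
// right-click on the document below it and choose the "Check Required Fields" entry.
private void checkRequiredFieldsViaContextMenu(int anchorX, int anchorY) throws AWTException {
    Robot robot = new Robot();
    robot.setAutoDelay(100);                      // brief pause between native events

    // Drop 40 pixels down and 40 pixels to the left of the button, onto the document
    robot.mouseMove(anchorX - 40, anchorY + 40);

    // Right-click to open the WIP Edit context menu
    robot.mousePress(InputEvent.BUTTON3_MASK);
    robot.mouseRelease(InputEvent.BUTTON3_MASK);

    // Move onto the "Check Required Fields" menu item and left-click it
    robot.mouseMove(anchorX - 40 + 87, anchorY + 40 + 13);
    robot.mousePress(InputEvent.BUTTON1_MASK);
    robot.mouseRelease(InputEvent.BUTTON1_MASK);
}

Calling it with the p.x and p.y values from the getElementCenterPoint() call shown earlier keeps the coordinate math in one place if the offsets ever need to change.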

There! From here we can continue to flesh out the remainder of the scenario until the test case is completed. This means populating any fields with data (e.g. from your data file to simulate user input), submitting for approval, generating previews, and the like - whatever is required for your test case. A special footnote: dialog boxes generated from WIP Edit are detectable by OpenScript, so it is not necessary to use the Robot class to interact with these elements. Have fun putting together your scenarios - when you're done, it's time to execute the tests with OATS. A few pointers here:

  • ErrorScreens - your OATS scripts can store a screenshot of the browser window at the time an error occurs during execution of a test scenario, which is quite helpful for seeing what happened from a user perspective. Screen captures are stored in this directory and are named according to the release, build, environment, and scenario undergoing testing. Note that this particular naming convention is specific to Documaker QA's testing scripts, so you don't have to replicate it as-is. (A generic screen-capture sketch appears after this list.)
  • OpenScriptLogs - logs for the test are stored in this subdirectory. Every activity, along with the values entered into web fields, gets logged. Logs can be used for troubleshooting in the event of a test failure. If multiple iterations of the test are run on the same environment, release, and build, the log is appended; when the environment, release, and/or build changes, a new log file is created. This file gets initialized when the main script is executed.
  • TestReports - the test report of each successful test run is stored in the TestReports subdirectory. This file gets initialized when the main script is executed. The test report is in *.xls format. If the test run is aborted or stopped for any unknown reason, the test report is not generated; the log file in the OpenScriptLogs subdirectory will contain everything up to the last successful test step.
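
OpenScript has its own results and screenshot facilities, so treat the following purely as an illustration of the ErrorScreens idea: a framework-agnostic way to grab a full-screen capture with the same java.awt.Robot class we're already using, named according to whatever convention you adopt. The method and folder handling here are assumptions, not part of OATS:

import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Illustrative only: capture the whole screen and save it under the ErrorScreens folder.
private void saveErrorScreen(Robot robot, String fileName) throws Exception {
    Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
    BufferedImage capture = robot.createScreenCapture(screen);
    ImageIO.write(capture, "png", new File("ErrorScreens", fileName + ".png"));
}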

I hope you've enjoyed this glimpse into the world of regression testing, and that you were able to glean something useful that you can implement within your own environment. If you need assistance with regression testing, OATS, OpenScript, or any of the other technologies or concepts mentioned herein, please head over to the Oracle Forum and post a query. Until next time!
