
Testing Process Part 2 Draft 1 Comments


Revision as of 10:19, 3 December 2007

Alan Haggarty

  • One comment I have on them, though: is the implication that all future test pass 1s will be full test passes and test pass 2s will be smoke test passes? I did not realize this was set as policy, nor do I agree with it. In the last test pass it seemed to be just a coping strategy for the fact that we started test pass 1 without a good driver. It doesn't solve that problem. What assurance do we have that the code changes between test pass 1 and test pass 2 will be appropriate for smoke testing only?
    • Response (Paul Slauenwhite): Ultimately it is the responsibility of the project lead to determine what is run for either test pass (e.g. the test plan). From a project perspective, we schedule less time for TP2, so by definition it is a subset of the test cases run for TP1. For example, if a defect was fixed between TP1 and TP2, the project lead would ensure test cases are run in TP2 to cover the function in and around the fix. Given the resources, the test automation, and the short iteration cycles, there should be no need to run two full test passes per iteration. Ideally, we should be trying to get TP1 and TP2 down from 3 weeks to 1 - 1.5 weeks.

Paul Slauenwhite

    • Remove 1.0.
    • Move 2.0, 3.0, and 4.0 to part 1 (see my comments on part 1).
    • Why do we need section 5.0? Are there really exceptions now that Jonathan has automated the control channel tests? If they cannot be fully automated, they should be manual tests with instructions on how to run the automated bits.
    • Section 6.0 is the real value of this document, which needs to be completed.

Jonathan West

Here are my section 5 comments, as requested:

BTW, section 5.1 has a typo: it should read the 'JVMPI tests' rather than the 'JVMTI tests'.


5.2 JVMTI Profiler Test Suites:

Like the JVMPI Profiler tests, the JVMTI profiler tests are run like regular JUnit tests, but require manual configuration of a local configuration file. Unlike the JVMPI tests, it is not required to install a test server on the remote machine, or to edit remote configuration files, as these functions are now fully integrated into the local test installation.
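To illustrate the local configuration step described above, here is a minimal sketch of loading a test configuration with `java.util.Properties`. The property names and paths are hypothetical; the actual TPTP configuration file keys are not given on this page.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Properties;

/**
 * Sketch of reading the local configuration file that the JVMTI profiler
 * test suites depend on. Key names here are invented for illustration.
 */
public class ProfilerTestConfig {

    // Parse java.util.Properties-style "key=value" text.
    static Properties load(String text) {
        Properties p = new Properties();
        try {
            p.load(new ByteArrayInputStream(text.getBytes()));
        } catch (IOException e) {
            // Cannot happen for an in-memory stream.
            throw new RuntimeException(e);
        }
        return p;
    }

    public static void main(String[] args) {
        // In practice this text would come from the local configuration file.
        String sample = "jvmti.agent.path=/opt/tptp/agent\ntarget.jvm=java5\n";
        Properties config = load(sample);
        System.out.println(config.getProperty("jvmti.agent.path")); // prints /opt/tptp/agent
    }
}
```

A JUnit test would then read such a file in its setup step before launching the profiled JVM, which is why the file must be edited manually per machine.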


5.3 Platform.Communication.Control.Channel.Test automated test:

The control channel tests are run like a regular JUnit test, with test execution being fully automated. The JUnit test will connect to the remote hosts being tested, configure the environment, run the test, and compare the results from the hosts. Presently, z/OS on zSeries and AS400 are not supported. Execution results must still be generated manually.
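The final "compare the results from the hosts" step above could be sketched as follows. This is not the actual test code; it only illustrates checking that every remote host produced the same result sequence (host names and result values are invented).

```java
import java.util.Arrays;
import java.util.List;

/**
 * Sketch of the result-comparison step of an automated multi-host test:
 * the run passes only if all hosts report identical results.
 */
public class ControlChannelResultCompare {

    // True if every host's result list matches the first host's list.
    static boolean resultsAgree(List<List<String>> perHostResults) {
        List<String> first = perHostResults.get(0);
        for (List<String> results : perHostResults) {
            if (!first.equals(results)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Hypothetical per-test outcomes collected from two remote hosts.
        List<String> hostA = Arrays.asList("PASS", "PASS", "FAIL");
        List<String> hostB = Arrays.asList("PASS", "PASS", "FAIL");
        System.out.println(resultsAgree(Arrays.asList(hostA, hostB))); // prints true
    }
}
```

A JUnit assertion on such a comparison is what makes the run fully automated; only the human-readable execution report still has to be written up by hand, as noted above.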


5.4 Platform.Communication.Request.Peer.Monitor automated test:

While request peer monitoring is still a manual test, it is a tool-assisted manual test, meaning that through automation of installation, configuration, and execution, the total test run time is reduced by about 80% compared with a fully manual run. Execution results must still be generated manually. z/OS on zSeries and AS400 are not supported.

--Jgwest.ca.ibm.com 11:59, 29 November 2007 (EST)
