


4.5 Test Automation Initiative

Members

  • Paul Slauenwhite (IBM)
  • Alan Haggarty (IBM)
  • Joel Cayne (IBM)
  • Jonathan West (IBM)
  • Joanna Kubasta (IBM)
  • Kiryl Kazakevich (Intel)

Goals

  • Consolidation of our testing process and infrastructure.
  • Consolidation of custom test frameworks (Test Dashboard, AC/Profiler test server, variants).
  • Reuse of Test Project framework (ASF, reporting, etc.) and tools (AGR, JUnit, etc.).
  • Automatic test execution and report generation with every build (BVT).
  • Integrate EMMA with the build and test process for generating code coverage reports for all manual and automated testing.

Benefits

  • Specialization of testing skills (e.g. builds).
  • Provide an infrastructure that makes it easy for committers to contribute automated test cases/suites.
  • Manual labor savings by using automated tests, which are automatically executed with every build.
  • Decreased time/cost to resolve defects, since they are uncovered earlier in the code-test-build cycle.
  • Decreased testing overlap by a) localizing test execution and reporting to one group and b) using code coverage statistics.

Delivery dates

  • November 14:
    • First draft of the first testing process document for review.
    • Proof of concept (PoC) for integrating ASF with the TPTP builds, for PMC approval and integration into the TPTP builds.
    • Status update for the PMC.
    • Measure cost savings.
  • November 21:
    • Review of the first testing process document.
    • First draft of the remaining testing process documents for review.
    • Identify and resolve bugs.
  • November 28:
    • Final version of the first testing process document.
    • Review of the remaining testing process documents.
    • Status update including cost savings for the PMC.
    • Presentation to TPTP committers.
    • Each project automates its manual test suites and converges its existing automated test suites over time to leverage the test infrastructure.

Testing Process

  • Summary:
    • A very lightweight testing process for TPTP.
    • Extends the existing TPTP Testing Strategy. The existing technical content will remain unchanged, aside from review changes.
    • Intended as a step-by-step instruction manual for testing TPTP in each type of testing scenario.
    • Sections include:
      • Getting tests.
      • Creating tests.
      • Running tests.
      • Saving test results.
      • Reporting results.
      • BVTs.
    • Two (or more) documents:
      1. High-level steps for using the testing infrastructure, intended for first-time users. This document will reference the other document(s).
      2. Low-level, detailed discussion of the motivation and design of the infrastructure, intended for TPTP adopters or extenders.
      3. Additional documents detailing specific testing topics.


Common Test Infrastructure

  • Summary:
    • Reference platform:
      • Developers are expected to unit test on the reference platform before checking code in to CVS.
      • Target: IBM Java 1.5 (latest SR) and Windows XP/x86.
      • Goal to expand to a second reference platform for better test coverage (target: Sun Java 5 and Linux/x86).
      • IBM JREs for Windows and Linux (see IBM Development Package for Eclipse) are publicly available from developerWorks.
      • Once the PoC is complete, Intel will focus on integrating the testing for the native components (e.g. JVMTI profiler) with the test infrastructure.
    • ASF is integrated with the TPTP builds by invoking the automated tests on the reference platform via ANT (a small Java sketch of this invoke/interrogate/report flow appears after this list):
      • Invoke tests.
      • Interrogate results.
      • Generate test reports.
    • TPTP test types:
      • AGR for functional UI testing.
      • TPTP JUnit and TPTP JUnit Plug-in tests for unit testing (an example JUnit test case appears after this list).
    • TPTP build infrastructure:
      • Extract test suites from CVS.
      • Provision reference platform (Agent Controller and Eclipse/TPTP).
      • Install and configure Agent Controller and Eclipse/TPTP on the reference platform.
      • Clean-up reference platform.
      • Post test reports (test and code coverage) on TPTP web site.
      • Email notifications to component leads for failing test suites.
    • Limitations/Issues:
      • Cannot test with the Integrated Agent Controller (IAC), since it cannot coexist with the Agent Controller running on the reference platform.
      • Intel needs to inform IBM that their portion of the build is complete (see defect 200351).
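
The TPTP JUnit test type named above wraps ordinary JUnit test cases. The sketch below is a minimal JUnit 3 style example of the kind of test a TPTP JUnit test asset could reference; the class name and the java.util.ArrayList behaviour it exercises are chosen purely so the example is self-contained, and are not taken from the TPTP test suites.

    import java.util.ArrayList;
    import java.util.List;

    import junit.framework.TestCase;

    // Minimal JUnit 3 style test case of the kind a TPTP JUnit test asset
    // could reference; it exercises java.util.ArrayList only so that the
    // example compiles on its own.
    public class ListBehaviourTest extends TestCase {

        public void testNewListIsEmpty() {
            List list = new ArrayList();
            assertTrue(list.isEmpty());
        }

        public void testAddIncreasesSize() {
            List list = new ArrayList();
            list.add("entry");
            assertEquals(1, list.size());
        }
    }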
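
The ASF/ANT scripts that actually drive the BVT are not reproduced on this page. Purely as a sketch of the invoke/interrogate/report flow listed above, the Java below assembles a JUnit 3 suite (reusing the ListBehaviourTest class from the other example on this page), runs it, and reports the outcome; in the real infrastructure these steps are performed by ASF's ANT integration rather than a hand-written runner.

    import junit.framework.TestResult;
    import junit.framework.TestSuite;

    // Conceptual sketch only: invoke the tests, interrogate the results, and
    // signal success or failure to the calling build script.
    public class BvtFlowSketch {

        public static void main(String[] args) {
            // Invoke tests: assemble and run a JUnit 3 suite.
            TestSuite suite = new TestSuite("TPTP BVT sketch");
            suite.addTestSuite(ListBehaviourTest.class);
            TestResult result = new TestResult();
            suite.run(result);

            // Interrogate results.
            System.out.println("Tests run: " + result.runCount()
                    + ", failures: " + result.failureCount()
                    + ", errors: " + result.errorCount());

            // Report: a non-zero exit code lets the calling build mark the
            // BVT as failed.
            System.exit(result.wasSuccessful() ? 0 : 1);
        }
    }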

[Figure: AsfBVT.JPG]

Metrics

In order to evaluate this initiative, we are measuring the labor costs (as accurately as possible) in Person Weeks (PW) to run:

  1. full test pass
  2. smoke test pass


  • Before Initiative:
  Project      Full Test Pass Cost (PW)   Smoke Test Pass Cost (PW)
  Platform     5                          2.5
  Test         5                          2.5
  Trace        3                          1.5
  Monitoring   3                          1.5
  Total        16                         8


  • With Test Automation:
  Project      Full Test Pass Cost (PW)   Smoke Test Pass Cost (PW)
  Platform     4                          2
  Test         4                          1.5
  Trace        3                          1.5
  Monitoring   2.5                        1
  Total        13.5                       6

Comparing the two tables, the projected savings are 2.5 PW per full test pass (16 down to 13.5) and 2 PW per smoke test pass (8 down to 6).

