WTP Performance Tests/How To
Revision as of 10:32, 6 October 2008

How to create a performance test case?

The Performance Tests HowTo document describes well how to create a performance test case. Useful tips:

  • Start with a clean Derby DB instance. Do not reuse a DB that was used for other purposes.
  • Always make an archive of the Derby DB before each run. Wrongly configured runs of the performance tests may "pollute" the DB with unneeded data; in such a case you can easily restore the DB from the archive.
  • Make sure that you are not connected to the Derby DB during test execution. An active connection may prevent the establishment of new connections, which can result in a NullPointerException.
  • Use the following pattern for the -Declipse.perf.config property:
-Declipse.perf.config=build=<build_id>;host=<host_where_the_tests_are_run>;jvm=<jvm_used>

Examples:

-Declipse.perf.config=build=3.0_200808010000_200808010000;host=my_host;jvm=sun
-Declipse.perf.config=build=I20080915-0000;host=my_host;jvm=sun
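As a rough illustration of what the Performance Tests HowTo describes, a minimal test case extends PerformanceTestCase from the org.eclipse.test.performance plug-in and brackets the measured scenario with startMeasuring()/stopMeasuring(). This is only a sketch: it requires the plug-in on the classpath, and openAndCloseEditor() stands in for whatever scenario you actually measure.

```java
import org.eclipse.test.performance.Dimension;
import org.eclipse.test.performance.PerformanceTestCase;

public class MyPerformanceTest extends PerformanceTestCase {

    public void testOpenEditor() {
        // Report this scenario's elapsed time in the generated summary graphs.
        tagAsSummary("Open editor", Dimension.ELAPSED_PROCESS);
        for (int i = 0; i < 10; i++) {
            startMeasuring();
            openAndCloseEditor(); // hypothetical scenario under test
            stopMeasuring();
        }
        commitMeasurements();
        assertPerformance();
    }
}
```

Running several measured iterations before committing, as above, reduces the influence of JIT warm-up and GC noise on the recorded numbers.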

How to create a baseline?

A baseline is a performance test run whose results are used for comparison with future runs. A baseline can easily be created by running the performance test suite as a JUnit Plug-in Test with Program arguments:

-os ${target.os} -ws ${target.ws} -arch ${target.arch} -nl ${target.nl}

and VM arguments that could look like:

-Xms256M -Xmx256M
-Declipse.perf.dbloc=C:\eclipse\perf\wtp-perf-db
-Declipse.perf.config=build=3.0_200808010000_200808010000;host=my_host;jvm=sun

It is very important that the build part in the eclipse.perf.config VM argument has the format X.Y_YYYYMMDDHHMM_YYYYMMDDHHMM, where

  • X.Y is the version of the build, and
  • YYYYMMDDHHMM is the timestamp of the build.

This format will later be required by the graph generation tool.
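Because the graph generation tool parses this format, it can help to validate the build id before launching a baseline run. A small self-contained sketch (the regex and class name are our own, not part of the Eclipse tooling):

```java
import java.util.regex.Pattern;

public class BaselineBuildId {
    // X.Y_YYYYMMDDHHMM_YYYYMMDDHHMM, e.g. 3.0_200808010000_200808010000
    private static final Pattern BASELINE_FORMAT =
            Pattern.compile("\\d+\\.\\d+_\\d{12}_\\d{12}");

    public static boolean isValid(String buildId) {
        return BASELINE_FORMAT.matcher(buildId).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("3.0_200808010000_200808010000")); // true
        System.out.println(isValid("I20080915-0000"));                // false
    }
}
```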

How to run tests compared to a baseline?

This is another run of the performance test suite, this time compared against a baseline already stored in the DB. The run configuration is similar to that for creating a baseline - a JUnit Plug-in Test configuration with Program arguments:

-os ${target.os} -ws ${target.ws} -arch ${target.arch} -nl ${target.nl}

and VM arguments that could look like:

-Xms256M -Xmx256M
-Declipse.perf.dbloc=C:\eclipse\perf\wtp-perf-db
-Declipse.perf.config=build=I20080915-0000;host=my_host;jvm=sun
-Declipse.perf.assertAgainst=build=3.0_200808010000_200808010000;host=my_host;jvm=sun

Note that this time the format of the build part in the eclipse.perf.config VM argument is different: TYYYYMMDD-HHMM, where

  • T is the build type: R, M, S, I or N, and
  • YYYYMMDD-HHMM is the timestamp of the build.

This format is again important for the graph generation tool.
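The same kind of sanity check can be applied to the current build id. Again a hedged sketch of our own, not part of the Eclipse tooling:

```java
import java.util.regex.Pattern;

public class CurrentBuildId {
    // TYYYYMMDD-HHMM where T is the build type (R, M, S, I or N),
    // e.g. I20080915-0000
    private static final Pattern CURRENT_FORMAT =
            Pattern.compile("[RMSIN]\\d{8}-\\d{4}");

    public static boolean isValid(String buildId) {
        return CURRENT_FORMAT.matcher(buildId).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("I20080915-0000"));                // true
        System.out.println(isValid("3.0_200808010000_200808010000")); // false
    }
}
```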

The value of the -Declipse.perf.assertAgainst VM argument must match the value of -Declipse.perf.config in the run configuration of the baseline we want to compare against.

How to create graph results?

After creating the baseline and running the performance tests against it, we can run the graph generation tool to create the comparison graphs. The tool documentation does not provide enough information on how to run the tool successfully, and the tool has a limitation when generating results for the WTP tests. The information below fills the gap.

First, the org.eclipse.test.performance plug-in must be checked out and the patch from bug 244986 needs to be applied.

Second, the org.eclipse.releng.basebuilder project must be checked out from CVS. It contains the graph generation tool, under /plugins/org.eclipse.test.performance.ui.

Finally, the tool can be executed as an Eclipse application. The run configuration must have Program arguments that look like:

-baseline 3.0_200808010000_200808010000 
-current I20080915-0000 
-jvm sun 
-config my_host 
-scenario.pattern org.eclipse.% 
-output C:\eclipse\perf\perfRoot\results\graph
-print

and VM arguments that look like:

-Xms40m
-Xmx512m
-XX:MaxPermSize=256M
-Dosgi.ws=win32
-Dosgi.os=win32
-Dosgi.arch=x86
-Declipse.perf.dbloc=C:\eclipse\perf\wtp-perf-db
-classpath C:\eclipse\perf\perfRoot\eclipse\plugins\org.eclipse.equinox.launcher_1.0.100.v20080509-1800.jar

The tool generates graphs comparing the builds specified by the -baseline and -current program arguments, for the JVM specified by -jvm and the host specified by -config. Graphs are generated only for scenarios matched by the -scenario.pattern argument; it is very important to give org.eclipse.% here. The tool reads data from the DB specified by the -Declipse.perf.dbloc argument and writes the results to the location specified by the -output argument.