WTP Performance Tests/2008-07-21
Revision as of 13:11, 21 July 2008
Attendees
Gary Karasiuk | Y
Angel Vera | Y
Mark Hutchinson | Y
Nick Sandonato | Y
Carl Anderson | Y
Kaloyan Raev | Y
Raghu Srinivasan | Y
Neil Hauge | Y
David Williams |
Agenda
- First meeting
- Explain current problems
- Discuss how we would like WTP performance tests to look.
Minutes
- Teams should own the perf tests, just as they do the JUnit tests. This responsibility should not be centralized.
- Tests should be run in a centrally controlled environment.
- There is good value in having tests run repeatedly, but they should be meaningful. Tests should be automated and run frequently enough.
- We want to reuse as much as we can. We should use the same framework as the Platform team does.
- Start with 1 test per component
- Let's start by looking at how the Platform team does their performance tests.
- At the moment we have one Windows Server 2003 machine for the performance tests (owned by Kaloyan).
- We can start running the automated perf tests on it. If we want to do this on other platforms we should make a list of possible machines the perf team can gather.
- The above is about automated testing. What about ad-hoc perf testing? Should it be incorporated into our smoke tests? Performance snapshots: the tester clicks a menu item to start recording a snapshot...?!
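Reusing the Platform's framework, as discussed above, typically means extending PerformanceTestCase from the org.eclipse.test.performance plug-in. Below is a minimal sketch, assuming that plug-in is on the test classpath; the class name and the measured operation are hypothetical placeholders, not an actual WTP test.

```java
import org.eclipse.test.performance.Dimension;
import org.eclipse.test.performance.PerformanceTestCase;

// Minimal sketch of a Platform-style performance test.
// MyComponentPerformanceTest and the measured loop body are placeholders.
public class MyComponentPerformanceTest extends PerformanceTestCase {

    public void testTypicalOperation() {
        // Optionally tag the test so its numbers show up in the summary graphs.
        tagAsSummary("My component: typical operation", Dimension.ELAPSED_PROCESS);

        // Repeat the measurement several times to reduce noise.
        for (int i = 0; i < 10; i++) {
            startMeasuring();
            // ... invoke the operation under test here ...
            stopMeasuring();
        }

        // Persist the collected data points and compare against the baseline.
        commitMeasurements();
        assertPerformance();
    }
}
```

Such a test runs like any other JUnit test in a plug-in test suite, which fits the goal of starting with one test per component.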
Action items
- Gary - will spend one hour looking at how the Platform team does their performance tests
- Everybody else - do the same as Gary, or at least spend some time reading the Performance wiki page and related links.
- Gary - put info on the wiki about how ad-hoc perf testing is done.
References
- Performance wiki page
- Performance results for WTP 2.0
- Performance results for WTP 2.0.1 (run on the new system as a basis for reference)