SCA Testing SCA Applications Principles


Reminder

Writing unit and integration tests for an SCA application consists in testing component services, the integration of components and external services, and composite services. In the following, we call "SCA element" a component or a composite.

Basic Principles

Writing and performing unit tests and integration tests for SCA applications are very similar activities.

The only difference on the tooling side is the number of mocked-up references. For the user, what changes from one to the other is the interaction with the tooling (dialogs, right-clicks...) and the TestCases to write.


Writing tests for an SCA element consists in:

  • Generating testing code to run these tests, distinct from the business code, which should not be modified to run tests. This can be done by the tooling.
  • Completing a JUnit TestCase skeleton with test methods. This has to be done by the user.

This is common to unit and integration tests; only the number of mocked-up references changes. In unit tests, all the references are mocked up. In integration tests, this number varies from 0 to n-1 (where n is the number of references of the element).
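
For illustration, here is a minimal sketch of such a TestCase, in the plain JUnit / EasyMock style. All business names (LoggerService, ProcessorService, ProcessorImpl) are invented for the example and are not part of the generated code; only the mocking pattern matters.

    import junit.framework.TestCase;
    import org.easymock.EasyMock;

    // Hypothetical business interfaces and implementation, invented for the example.
    interface LoggerService { void log(String message); }
    interface ProcessorService { boolean process(int orderId); }

    class ProcessorImpl implements ProcessorService {
        private LoggerService logger;                   // the component's only reference
        public void setLogger(LoggerService logger) { this.logger = logger; }
        public boolean process(int orderId) {
            logger.log("processing order " + orderId);
            return true;
        }
    }

    // TestCase skeleton the user completes: the reference is mocked up with EasyMock.
    public class ProcessorComponentTest extends TestCase {

        public void testProcessNotifiesTheLogger() {
            // Unit test: the only reference of the component is mocked up.
            LoggerService logger = EasyMock.createMock(LoggerService.class);
            logger.log("processing order 42");          // record the expected call
            EasyMock.replay(logger);

            // Inject the mock where the SCA runtime would normally wire the reference.
            ProcessorImpl processor = new ProcessorImpl();
            processor.setLogger(logger);

            assertTrue(processor.process(42));           // exercise the service
            EasyMock.verify(logger);                     // check the reference interaction
        }
    }

In an integration test, the same skeleton applies, but only some of the references are replaced by mocks; the others keep their real wiring.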


Roughly, here are the usage steps:

  1. You have an SCA application in your workspace (composite, interfaces, implementations).
  2. You ask to generate a testing project.
  3. For every element to test:
    1. You ask to generate a TestCase.
    2. The tooling generates a new SCA application (composite, interfaces, implementations) from the original SCA application. This means that several SCA applications co-exist in your workspace and share common elements. This raises coherence issues that are discussed further below.
    3. The user completes a JUnit TestCase by using EasyMock features to mock references.
    4. The user can run these tests from Eclipse or keep them for later.


Generated testing applications have the following properties:

  • They are SCA projects with Java implementations only.
  • They are made up of two components: the first one's implementation contains the TestCase; the second one corresponds to the element to test. The references of this second component are all wired to services of the first component, which defines the references' behavior during tests (in the test methods). A rough Java sketch of this wiring is given after this list.
  • In any case, they follow one of the two following patterns.
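
To make the second point more concrete, here is a rough Java sketch of the first component, reusing the hypothetical LoggerService interface from the earlier example: it exposes a service with the interface of the mocked reference and forwards every call to an EasyMock mock that the test methods configure.

    import junit.framework.TestCase;
    import org.easymock.EasyMock;

    // Hypothetical interface of the reference to mock (same as in the earlier sketch).
    interface LoggerService { void log(String message); }

    // First component of the generated testing application: its implementation
    // contains the TestCase and exposes the service the tested component's
    // reference is wired to.
    public class LoggerMockComponent extends TestCase implements LoggerService {

        // The mock whose behavior the test methods define.
        static LoggerService delegate = EasyMock.createMock(LoggerService.class);

        // Every call arriving through the SCA wire is forwarded to the mock.
        public void log(String message) {
            delegate.log(message);
        }

        public void testLoggerIsCalled() {
            delegate.log("hello");                      // record the expected interaction
            EasyMock.replay(delegate);

            // In a real run, invoking the tested component's service would make its
            // reference call arrive here; the call is simulated directly in this sketch.
            log("hello");

            EasyMock.verify(delegate);
        }
    }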


Testing a component

[Figure: SCAtestPrinciples ComponentPattern.gif]

In this case, we copy-paste the component and its elements into the testing application.


Testing a composite

[Figure: SCAtestPrinciples CompositePattern.gif]

In this case, we generate a component with the same services and references as the composite, with the composite as its implementation (which means we copy the entire business application into the testing application). To avoid this situation, we should allow the definition of dependencies between SCA projects, in the same way as is done between Java projects.


This implicitly means:

  • One test => one SCA application. This is a strong constraint, but it allows you to keep your tests separate and to leave your business application unmodified. Besides, it gives you the possibility of re-running these tests in a future iteration (non-regression tests).
  • An excellent organization of the test project, to keep it usable and manageable.
  • Coherence constraints must be checked so that when the tested project evolves, the tester project evolves accordingly. For example, if an implementation is changed and a test uses it, then this implementation should be reimported into the tester project. This could be done using a builder (a possible sketch is given after this list).
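
As an illustration of the last point, here is a rough sketch of what such an Eclipse builder could look like, assuming it is attached to the tester project; the tested project name ("business-sca-project") and the reimportImplementation helper are hypothetical.

    import java.util.Map;

    import org.eclipse.core.resources.IProject;
    import org.eclipse.core.resources.IResource;
    import org.eclipse.core.resources.IResourceDelta;
    import org.eclipse.core.resources.IResourceDeltaVisitor;
    import org.eclipse.core.resources.IncrementalProjectBuilder;
    import org.eclipse.core.runtime.CoreException;
    import org.eclipse.core.runtime.IProgressMonitor;

    // Sketch of a builder attached to the tester project: when an implementation of
    // the tested (business) project changes, it is reimported into the tester project.
    public class ScaTestCoherenceBuilder extends IncrementalProjectBuilder {

        protected IProject[] build(int kind, Map args, IProgressMonitor monitor)
                throws CoreException {

            // "business-sca-project" stands for the tested SCA project.
            IProject businessProject = getProject().getWorkspace()
                    .getRoot().getProject("business-sca-project");

            IResourceDelta delta = getDelta(businessProject);
            if (delta != null) {
                delta.accept(new IResourceDeltaVisitor() {
                    public boolean visit(IResourceDelta d) {
                        // Reimport any changed Java implementation used by a test.
                        if (d.getResource().getName().endsWith(".java")) {
                            reimportImplementation(d.getResource());
                        }
                        return true;
                    }
                });
            }

            // Returning the business project asks Eclipse to report its changes
            // to this builder on the next build.
            return new IProject[] { businessProject };
        }

        void reimportImplementation(IResource resource) {
            // Hypothetical helper: copy the changed implementation into the tester project.
        }
    }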


Pros and Cons / Test Philosophy

Pros

  • We do not touch the business code.
  • Tests are saved from one iteration to another, so that we can run non-regression tests.
  • Most of the testing code is generated, so the tester can focus on writing TestCases.
  • We can use JUnit and EasyMock even when the business implementations are not written in Java (the SCA platform deals with that for us).


Cons

  • We do not simply test services: we test services by defining the behavior of their references. This requires the tester to know how references are used by the implementations and under which conditions. It means we cannot have the same approach with SCA elements as with Java interfaces / classes (i.e. being able to write tests from the interfaces only). This might be solved by providing more flexibility in the use of EasyMock (e.g. letting the implementation define the behavior of the mock at runtime, so that the user can avoid defining it himself); a possible direction is sketched below. This is an issue if we want to do "Test Driven Development" with SCA projects.
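
As a hint of what such flexibility could look like with the existing EasyMock API (a possible direction only, not something the current tooling provides), "nice" mocks accept calls that were never recorded and return default values, so the tester does not have to describe every interaction. The LoggerService interface below is again hypothetical.

    import org.easymock.EasyMock;

    // Hypothetical reference interface, invented for the example.
    interface LoggerService { void log(String message); }

    public class NiceMockExample {
        public static void main(String[] args) {
            // A nice mock answers unrecorded calls with default values instead of failing.
            LoggerService logger = EasyMock.createNiceMock(LoggerService.class);
            EasyMock.replay(logger);

            logger.log("not recorded, yet accepted");   // does not make the test fail
        }
    }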
