

Revision as of 02:24, 13 September 2010

The Object Teams Development Tooling is built in two main stages:

Building and Testing with PDE/Build

This is basically a plain vanilla PDE/Build setup with the following customizations:

  • a top-level build.xml ant file initializes a myriad of properties and then
    1. prepare the build:
      • unzip a drop of the Eclipse SDK
      • put a patched version into dropins (to handle trunk as expected, bug 301045)
    2. create the OT/J compiler by a first invocation of PDE/Build
    3. build the OTDT and tests by a second invocation of PDE/Build
      • preparations before this stage
        • unpack eclipse-test-framework to make it available while compiling tests
        • pre-load the output repository location from the previous build including category-less metadata
      • polish after this stage
        • patch content.xml to adjust the version range in the feature reference from the patch feature (see also Publishing, step 5, below)
    4. delegate to the next level for testing
  • the main build.xml is a fairly thin layer for steps 2 & 3 above: it handles some properties and then delegates to one of two final PDE/Build stages:
    • OT-Compiler: a PDE/Build with a generated build.xml and these customizations:
      • fetch from svn
      • install the special Bootstrap_MANIFEST.MF in order to create org.eclipse.jdt.core with exactly the same version number as in our base install
      • at the end of the build, insert the resulting org.eclipse.jdt.core into the base install (simply replacing the existing plugin jar)
    • OTDT: a second PDE/Build with a generated build.xml and these customizations:
      • at this stage, the jdt.core is our version, so we can actually compile OT/J code
      • fetch from svn
      • setting p2.gathering=true triggers a p2-based build
  • the final test.xml drives setup and execution of tests:
    • create the software-under-test:
      • unzip the base Eclipse SDK
      • call p2 to install the OTDT from testrun/otdtUpdatesDir
    • call p2 to install the tests from testrun/updateSiteTests/eclipse
    • invoke all tests in sequence, with support for parallelization (currently unused)
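The test setup above can be sketched in shell form. Everything here is an illustrative assumption (the real logic lives in test.xml, and the installable-unit names are invented); the sketch is a dry run that only prints the p2 director invocations it would issue:

```shell
#!/bin/sh
# Dry-run sketch of the test setup driven by test.xml.
# Paths and installable-unit (IU) names are illustrative assumptions.

TESTRUN=testrun

# Compose the p2 director call that installs an IU from a repository.
p2_install() {
  repo=$1; iu=$2
  # In the real build this command runs against the unzipped base SDK;
  # here we only print it.
  echo "$TESTRUN/eclipse/eclipse -nosplash" \
       "-application org.eclipse.equinox.p2.director" \
       "-repository $repo -installIU $iu"
}

# install the OTDT from the build's own repository
p2_install "file:$PWD/$TESTRUN/otdtUpdatesDir" \
           org.eclipse.objectteams.otdt.feature.group

# install the tests from the test update site
p2_install "file:$PWD/$TESTRUN/updateSiteTests/eclipse" \
           org.eclipse.objectteams.otdt.tests.feature.group
```

The director application and its -repository/-installIU arguments are standard p2; only the repository paths and feature ids above are guesses.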

Publishing a p2 Repository

After building and testing have succeeded, a shell script performs these steps:

  1. Jar signing
    • zip all jars into a single archive - skip symlinks, which only point to artifacts from a previous build
    • send it to the signing daemon and wait for the result to appear
  2. Populate a new repository stagingRepo
    • symlink artifacts from the existing update site -- this ensures the exact same bits, so unchanged plugins can be re-used during updating
      if we copied these files instead, they would be harder to distinguish/skip in subsequent phases
    • physically copy the bcel jar
      how can we ensure that clients in all update scenarios will find bcel from the original Orbit repo?
  3. Pack200 all jar files
    apparently conditioning is already performed by PDE/Build for p2?
  4. Generate p2 metadata
  5. Patch content.xml so as to widen the version range of references to the patched feature
    The patch feature org.eclipse.objectteams.otdt.core.patch must refer to an exact version of the feature it patches. However, even identical jdt bits receive different qualifiers in subsequent builds, so a build against jdt RC3 would not be compatible with the final release even if jdt has no changes - see bug 304156.
    This is done using patch-content-xml.xsl.
  6. Archive category-less metadata as the basis for future builds
    cumulative repositories and categories don't play well together, because a category, once created, cannot be augmented - see bug 251888#c11
  7. Generate the category
  8. Add download statistics capability
    use XSL transformation addDownloadStats.xsl, which may not be a very safe way of doing this
  9. Jar-up metadata
  10. Remove symlinks which were needed to resolve dependencies on previous builds
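Steps 1 and 2 above hinge on the symlink-vs-copy distinction; a minimal runnable sketch of that idea, with all file and directory names invented for the demo:

```shell
#!/bin/sh
# Demo of populating stagingRepo by symlinking unchanged artifacts
# (step 2) and skipping those symlinks when collecting jars for
# signing (step 1). All paths are made up for this demo.
set -e
work=$(mktemp -d)

# previously published repository with one unchanged plugin
mkdir -p "$work/published/plugins"
echo old > "$work/published/plugins/unchanged_1.0.0.jar"

# staging repository: symlink unchanged bits, physically add one new jar
mkdir -p "$work/stagingRepo/plugins"
ln -s "$work/published/plugins/unchanged_1.0.0.jar" \
      "$work/stagingRepo/plugins/unchanged_1.0.0.jar"
echo new > "$work/stagingRepo/plugins/changed_1.0.1.jar"

# collect only regular files for signing -- symlinks point at
# artifacts of a previous build, which were already signed then
to_sign=$(find "$work/stagingRepo" -name '*.jar' -type f)
echo "$to_sign"
```

Because `find -type f` never matches symlinks, only the genuinely new jar is sent to the signing daemon, while the symlinked artifact keeps its byte-identical, already-signed content.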

Now the stagingRepo is ready to be copied over the previously published repository.
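The final copy-over can be sketched as follows; whether the real script uses cp, rsync, or something else is not stated here, so this is only an assumed shape with invented names:

```shell
#!/bin/sh
# Demo of replacing the published repository with stagingRepo after
# the helper symlinks (step 10) are gone. Names are invented.
set -e
d=$(mktemp -d)
mkdir -p "$d/stagingRepo" "$d/published"
echo fresh > "$d/stagingRepo/content.jar"
echo stale > "$d/published/content.jar"
echo gone  > "$d/published/obsolete.jar"

# replace wholesale so that stale leftovers cannot survive
rm -rf "$d/published"
cp -R "$d/stagingRepo" "$d/published"
ls "$d/published"
```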

The script accepts these parameters: updateMasterRelativePath [ -nosign ] [ statsRepoId statsVersionId ]
  • updateMasterRelativePath: relative path under downloads/objectteams/updates pointing to a repository from which to resolve unchanged bundles/features; specify none for an un-parented repository
  • -nosign: skip signing
  • statsRepoId: identifies the repository in the download statistics, currently either 0.7 or unstable
  • statsVersionId: identifies the version in the download statistics, currently 0.7.1
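The parameter contract can be illustrated with a small parsing sketch. The publishing script itself is not shown on this page, so the function below is an assumed reconstruction of its argument handling, not the real code:

```shell
#!/bin/sh
# Assumed reconstruction of the publishing script's argument handling:
#   script updateMasterRelativePath [ -nosign ] [ statsRepoId statsVersionId ]
parse_args() {
  MASTER_PATH=$1; shift
  SIGN=true
  if [ "$1" = "-nosign" ]; then SIGN=false; shift; fi
  STATS_REPO=${1:-}
  STATS_VERSION=${2:-}
}

# example invocation: parent repo "0.7", skip signing,
# download-statistics ids "0.7" / "0.7.1"
parse_args 0.7 -nosign 0.7 0.7.1
echo "master=$MASTER_PATH sign=$SIGN stats=$STATS_REPO/$STATS_VERSION"
```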

Updating to a new Eclipse version

Throughout the build process various version dependencies must be observed, thus moving to a new version of Eclipse to build against involves a number of adjustments:

  • First our SVN branch/tag needs to be selected
    • enter this path in (for bootstrapping our map files via variable mapVersionTag)
    • if not building from trunk, adjust the map files, inserting the branch using the tag property (this requires a further patched version of to avoid qualifier replacement using the branch name)
  • Next the exact version of a downloadable SDK build must be selected
    • this version goes into file, variables EVERSION and DROP
  • on I have a script bin/extractVersions that extracts these numbers from the SDK tar ball:
    • versions of and equinox.launcher: insert in
    • version of the jdt feature:
      • short format into (incl. .next: incremented by one for version range)
      • long format into feature.xml of otdt.core.patch
    • versions of the jdt.core plugin
      • original version into {EOT}.otdt.feature/ot-compiler-feature/feature.xml and org.eclipse.jdt.core/META-INF/Bootstrap_MANIFEST.MF
      • actual OT version in otdt.core.patch
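Extracting those version numbers from an SDK tar ball might be sketched as below. The archive listing and version strings are fabricated stand-ins, and the real bin/extractVersions script may work quite differently:

```shell
#!/bin/sh
# Demo: pull plugin/feature versions out of an SDK archive listing.
# The listing is fabricated; a real script would inspect the actual
# eclipse-SDK tar ball (e.g. via `tar tzf`).
set -e
listing='eclipse/plugins/org.eclipse.jdt.core_3.6.1.v_A68_R36x.jar
eclipse/features/org.eclipse.jdt_3.6.1.r361_v20100714-0800/
eclipse/plugins/org.eclipse.equinox.launcher_1.1.1.R36x_v20100810.jar'

# full version of the jdt.core plugin (goes into Bootstrap_MANIFEST.MF
# and the compiler feature)
jdt_core=$(echo "$listing" | sed -n 's|.*org\.eclipse\.jdt\.core_\(.*\)\.jar|\1|p')
echo "jdt.core: $jdt_core"

# short version of the jdt feature (the ".next" upper bound for the
# version range would be this, incremented by one)
jdt_short=$(echo "$listing" | sed -n 's|.*org\.eclipse\.jdt_\([0-9.]*\)\..*|\1|p')
echo "jdt feature: $jdt_short"
```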
