
Babel / Large Contribution Import Process

This document describes the process the Babel committer uses to import large contributions into the Babel server.

Submitting such a contribution is documented here:


  1. Make sure the contribution follows the spec, and that it has been approved by Eclipse Legal by submitting it to IPZilla
  2. Unzip the file to
    1. To copy the files to the import directory you will need to su as genie
  3. Inspect some files in each language to ensure they are properly encoded as Unicode. Run or adapt a conversion tool as needed to convert the files from their current encoding to Unicode.
    1. The unix command 'find' can be used to help determine the encoding.
    2. example: find /tmp/tmp-import/ -type f -exec file {} \;
  4. Determine the accuracy of the contribution.
    1. Reviewed: Translations were done by professionals, and were reviewed and tested in context by loading them up in Eclipse
    2. Fuzzy: Translations were not done by professional translators, were produced using software and dictionaries, or were done by professional translators but were not reviewed and tested in context.
    3. If unsure, ask the contributor; if still unsure, choose Fuzzy.
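The encoding check and conversion in step 3 can be sketched as follows. This is a sketch only: the sample directory, file name, and source encoding (ISO-8859-1) are assumptions for illustration; use the output of `file` to pick the actual `-f` encoding for `iconv`.

```shell
# Sketch: create a sample ISO-8859-1 properties file (assumption for
# illustration only), detect its encoding, and convert it to UTF-8.
mkdir -p /tmp/tmp-import
printf 'greeting=caf\xe9\n' > /tmp/tmp-import/messages.properties  # 0xE9 = é in ISO-8859-1

# Report each file's detected encoding
find /tmp/tmp-import/ -type f -exec file {} \;

# Convert a file reported as ISO-8859-1 to UTF-8, then replace the original
iconv -f ISO-8859-1 -t UTF-8 /tmp/tmp-import/messages.properties \
  > /tmp/tmp-import/messages.properties.utf8 &&
  mv /tmp/tmp-import/messages.properties.utf8 /tmp/tmp-import/messages.properties

# Should now be reported as UTF-8 text
file /tmp/tmp-import/messages.properties
```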

Import on Staging

  1. su - genie
  2. copy the CVS version of import_translation_zip.php to the STAGING web root: cp /home/data/httpd/ /home/data/httpd/
  3. Edit /home/data/httpd/ to ensure the target project_id and version are set (see note, below)
  4. Set the Fuzzy factor according to the accuracy of the contribution.
  5. 'run' the above file using wget: wget <-- file name fudged to avoid crawlers
  6. Check the translations using the
  7. If something went wrong, you can still delete what you imported by logging into the MySQL database and running a query such as the following (it deletes all the records contributed during the current day):

delete from translations WHERE UNIX_TIMESTAMP(created_on) BETWEEN UNIX_TIMESTAMP(CURDATE()) AND UNIX_TIMESTAMP(DATE_ADD(CURDATE(), INTERVAL 86399 SECOND));

Or, if you need to delete all the records for a specific project:

delete from translations where string_id IN (select string_id from strings, files where strings.file_id = files.file_id and files.project_id = "birt");

Manipulate those statements with care, and make sure you run them against the staging database exclusively.
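As a sanity check on the first statement's date window: midnight today plus 86399 seconds is 23:59:59 today, so the BETWEEN clause covers exactly the current day. A quick shell illustration (GNU date assumed; computed in UTC here, whereas MySQL's CURDATE() uses the server's time zone):

```shell
# Illustration of the rollback window used in the DELETE above.
start=$(date -ud "$(date -u +%F) 00:00:00" +%s)  # UNIX_TIMESTAMP(CURDATE())
end=$(( start + 86399 ))                         # DATE_ADD(CURDATE(), INTERVAL 86399 SECOND)
echo "window: $(date -ud @"$start" +%T) .. $(date -ud @"$end" +%T)"
```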

Import on Live

  1. su - genie
  2. copy the STAGING version of import_translation_zip.php to the LIVE web root: cp /home/data/httpd/ /home/data/httpd/
  3. 'run' the above file using wget: wget <-- file name fudged to avoid crawlers
  4. Check the translations using the
  5. Delete the import script: rm /home/data/httpd/
  6. Delete the translation files.

Note: If a contribution crosses many projects, such as those from the JapanWG, simply delete the following line in the SQL statements (there are currently two occurrences of this line):

   AND F.project_id = '" . $PROJECT_ID . "' AND F.version = '" . $VERSION . "'
