SMILA/Documentation/HowTo/How to implement a crawler

This How To is deprecated for SMILA 1.0: the connectivity framework is still functional, but it is going to be replaced by a scalable import based on SMILA's job management.

This page explains how to implement a crawler and add its functionality to SMILA.

Prepare bundle and manifest

  • Create a new bundle that will contain your crawler. Follow the instructions on How to create a bundle. In this sample we use the prefix myplugin.crawler.mock as the project name.
  • For the crawler's JAXB code generation we need to import the SMILA.builder project into our workspace.
  • Edit the manifest file and add at least the following packages to the Import-Package section.
    • org.eclipse.smila.connectivity;version="1.0.0"
    • org.eclipse.smila.connectivity.framework;version="1.0.0"
    • org.eclipse.smila.connectivity.framework.performancecounters;version="1.0.0"
    • org.eclipse.smila.connectivity.framework.schema;version="1.0.0"
    • org.eclipse.smila.connectivity.framework.schema.config;version="1.0.0"
    • org.eclipse.smila.connectivity.framework.schema.config.interfaces;version="1.0.0"
    • org.eclipse.smila.connectivity.framework.util;version="1.0.0"
    • org.eclipse.smila.datamodel;version="1.0.0"
  • You will have to import additional packages to fill your crawler with business logic!
  • Now your MANIFEST.MF file should look like this:
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Mock Crawler
Bundle-SymbolicName: myplugin.crawler.mock
Bundle-Version: 1.0.0
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
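
The stub above does not yet show the imports; combining it with the packages listed earlier, the Import-Package section of the finished manifest might look like the following sketch (versions as given above, plus whatever additional packages your business logic needs):

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Mock Crawler
Bundle-SymbolicName: myplugin.crawler.mock
Bundle-Version: 1.0.0
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Import-Package: org.eclipse.smila.connectivity;version="1.0.0",
 org.eclipse.smila.connectivity.framework;version="1.0.0",
 org.eclipse.smila.connectivity.framework.performancecounters;version="1.0.0",
 org.eclipse.smila.connectivity.framework.schema;version="1.0.0",
 org.eclipse.smila.connectivity.framework.schema.config;version="1.0.0",
 org.eclipse.smila.connectivity.framework.schema.config.interfaces;version="1.0.0",
 org.eclipse.smila.connectivity.framework.util;version="1.0.0",
 org.eclipse.smila.datamodel;version="1.0.0"
```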

Prepare DataSourceConnection schema and classes

  • create an additional source folder code/gen to contain the generated schema sources
    • Right-click your bundle and click New > Source Folder.
    • Enter "code/gen" as the folder name.
    • edit build.properties and add folder code/gen to the source folders:
source.. = code/src/,\
           code/gen/
output.. = code/bin/

  • create the schema definition
    • create a folder schemas in your bundle
    • create the file schemas/MockCrawlerSchema.xsd to contain the XSD schema for the crawler configuration, based on the abstract XSD schema "RootDataSourceConnectionConfigSchema"
    • therein you have to provide definitions of the "Process" and "Attribute" nodes for crawler-specific information
    • the following code snippet can be used as a template:
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema elementFormDefault="qualified" attributeFormDefault="unqualified"
    xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:redefine schemaLocation="../../org.eclipse.smila.connectivity.framework.schema/schemas/RootDataSourceConnectionConfigSchema.xsd">
    <xs:complexType name="Process">
      <xs:annotation>
        <xs:documentation>Process Specification</xs:documentation>
      </xs:annotation>
      <xs:complexContent>
        <xs:extension base="Process">
          <!-- define crawler-specific process elements here -->
        </xs:extension>
      </xs:complexContent>
    </xs:complexType>
    <xs:complexType name="Attribute">
      <xs:complexContent>
        <xs:extension base="Attribute">
          <!-- define crawler-specific attributes here -->
        </xs:extension>
      </xs:complexContent>
    </xs:complexType>
  </xs:redefine>
</xs:schema>
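
To see how the redefined types end up being used: a data-source configuration validated against such a schema could look roughly like the sketch below. The element names (DataSourceConnectionConfig, DataSourceID, SchemaID, Attributes, Process) follow the usual layout of SMILA crawler configurations, but the exact structure is determined by RootDataSourceConnectionConfigSchema and your redefinitions, so treat this purely as an illustration:

```xml
<DataSourceConnectionConfig>
  <DataSourceID>mock</DataSourceID>
  <SchemaID>myplugin.crawler.mock</SchemaID>
  <Attributes>
    <!-- attributes as allowed by your redefined "Attribute" type -->
  </Attributes>
  <Process>
    <!-- crawler-specific settings as allowed by your redefined "Process" type -->
  </Process>
</DataSourceConnectionConfig>
```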
  • create the JAXB mapping
    • create the file schemas/MockCrawlerSchema.jxb to contain the JAXB mappings used for generating the configuration classes.
    • Here is an example of the MockCrawler JXB file you can use as a template; just rename the "schemaLocation" and "package name":
<?xml version="1.0" encoding="UTF-8"?>
<jxb:bindings version="1.0"
    xmlns:jxb="http://java.sun.com/xml/ns/jaxb"
    xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <jxb:bindings schemaLocation="MockCrawlerSchema.xsd" node="/xs:schema">
    <jxb:schemaBindings>
      <jxb:package name="mypackage.crawler.mock.messages"/>
    </jxb:schemaBindings>
    <jxb:globalBindings>
      <jxb:javaType name="java.util.Date" xmlType="xs:dateTime" printMethod="" parseMethod=""/>
      <jxb:javaType name="org.eclipse.smila.connectivity.framework.schema.config.MimeTypeAttributeType" xmlType="MimeTypeAttributeType" parseMethod="org.eclipse.smila.connectivity.framework.schema.config.MimeTypeAttributeType.fromValue" printMethod="org.eclipse.smila.connectivity.framework.schema.config.MimeTypeAttributeType.toValue"/>
      <jxb:serializable uid="1"/>
    </jxb:globalBindings>
  </jxb:bindings>
</jxb:bindings>

  • Add a schema location reference in the plug-in implementation
    • Create a new class (DataSourceConnectionConfigPluginImpl) which implements the interface DataSourceConnectionConfigPlugin.
    • Use the method String getSchemaLocation() to return "schemas/MockCrawlerSchema.xsd".
    • Use the method String getMessagesPackage() to return the package name "mypackage.crawler.mock.messages".
Here is an example implementation for the MockCrawler you can use as a template:

package mypackage.crawler.mock;

import org.eclipse.smila.connectivity.framework.schema.DataSourceConnectionConfigPlugin;

/**
 * The Class DataSourceConnectionConfigPluginImpl.
 */
public class DataSourceConnectionConfigPluginImpl implements DataSourceConnectionConfigPlugin {

  /** {@inheritDoc}
   * @see org.eclipse.smila.connectivity.framework.schema.DataSourceConnectionConfigPlugin#getSchemaLocation()
   */
  public String getSchemaLocation() {
    return "schemas/MockCrawlerSchema.xsd";
  }

  /** {@inheritDoc}
   * @see org.eclipse.smila.connectivity.framework.schema.DataSourceConnectionConfigPlugin#getMessagesPackage()
   */
  public String getMessagesPackage() {
    return "mypackage.crawler.mock.messages";
  }
}
  • create a new file plugin.xml
    • define the extension for org.eclipse.smila.connectivity.framework.schema.extension, using the bundle name as ID and NAME.
    • set the schema class to your implementation of the interface DataSourceConnectionConfigPlugin.
    • Here is an example for the MockCrawler plugin.xml file you can use as a template:
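The following plugin.xml is a sketch assembled from the two requirements above; the element carrying the schema class (<schema class="…"/> here) is an assumption inferred from the wording and should be checked against the extension point's definition:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<plugin>
  <extension
      id="myplugin.crawler.mock"
      name="myplugin.crawler.mock"
      point="org.eclipse.smila.connectivity.framework.schema.extension">
    <!-- hypothetical element name: verify against the extension point schema -->
    <schema class="mypackage.crawler.mock.DataSourceConnectionConfigPluginImpl"/>
  </extension>
</plugin>
```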

  • Compile schema into JAXB classes by using ant
    • See SMILA/Development Guidelines/Setup for JAXB code generation for instructions on how to set up the JAXB generation tools. It is advised to keep the lib folder outside the workspace, for example one directory level further up (adjusting the lib.dir property accordingly, e.g. -Dlib.dir=../../lib).
    • create a new file build.xml to contain the JAXB build information. Use the following template as the content for build.xml and rename the property value accordingly:
<project name="sub-build" default="compile-schema-and-decorate" basedir=".">
  <property name="" value="MockCrawlerSchema" />
  <import file="../SMILA.builder/xjc/build.xml" />
</project>
    • Launch ant -Dlib.dir=../lib from a command console to generate the Java files or to see any error messages.

Note: If you rename the schema file name, make sure to update the following locations:

  • Plug-in implementation classes
  • MockCrawlerSchema.jxb (it should also be renamed to match the schema file name)
  • build.xml

OSGi and Declarative Service requirements

  • It is not required to implement a BundleActivator.
  • Create the top level folder OSGI-INF.
  • Create a Component Description file in OSGI-INF. You can name the file as you like, but it is good practice to name it after the crawler. Therein you have to provide a unique component name; it should be the same as the crawler's class name. Then you have to provide your implementation class and the service interface class, which is always org.eclipse.smila.connectivity.framework.Crawler. Here is an example for the MockCrawler component description file you can use as a template:
<?xml version="1.0" encoding="UTF-8"?>
<component name="MockCrawler" immediate="false" factory="CrawlerFactory">
  <implementation class="mypackage.crawler.mock.MockCrawler" />
  <service>
    <provide interface="org.eclipse.smila.connectivity.framework.Crawler"/>
  </service>
</component>
  • Add a Service-Component entry to your manifest file, e.g.:
Service-Component: OSGI-INF/mockcrawler.xml
  • Open build.properties and change the binary build: add the folders OSGI-INF and schemas as well as the file plugin.xml.
bin.includes = META-INF/,\
               OSGI-INF/,\
               schemas/,\
               plugin.xml

Implement your crawler

  • Implement your crawler in a new class extending org.eclipse.smila.connectivity.framework.AbstractCrawler.
  • Follow the example of the FileSystemCrawler.
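
SMILA's own interfaces are not reproduced here, but the contract a crawler implements is essentially an iterator that hands out data references in batches until it signals the end with null. The following self-contained Java sketch only illustrates that pattern; all names (SimpleCrawler, DataRef, and so on) are stand-ins invented for illustration, not SMILA's actual API:

```java
import java.util.ArrayList;
import java.util.List;

/** Stand-in for a data reference: an id plus a hash for delta indexing. */
class DataRef {
  final String id;
  final String hash;
  DataRef(String id, String hash) { this.id = id; this.hash = hash; }
}

/**
 * Illustrates the batch-wise iteration pattern of a crawler:
 * getNext() returns one batch of references per call and null when done.
 */
class SimpleCrawler {
  private final List<String> items;
  private final int batchSize;
  private int position;

  SimpleCrawler(List<String> items, int batchSize) {
    this.items = items;
    this.batchSize = batchSize;
  }

  /** Returns the next batch of data references, or null if exhausted. */
  DataRef[] getNext() {
    if (position >= items.size()) {
      return null; // signals the controller that crawling is finished
    }
    final List<DataRef> batch = new ArrayList<>();
    while (position < items.size() && batch.size() < batchSize) {
      final String item = items.get(position++);
      batch.add(new DataRef(item, Integer.toHexString(item.hashCode())));
    }
    return batch.toArray(new DataRef[0]);
  }
}

public class CrawlerPatternDemo {
  public static void main(String[] args) {
    SimpleCrawler crawler =
        new SimpleCrawler(List.of("a.txt", "b.txt", "c.txt"), 2);
    int batches = 0;
    int total = 0;
    for (DataRef[] batch = crawler.getNext(); batch != null; batch = crawler.getNext()) {
      batches++;
      total += batch.length;
    }
    System.out.println(batches + " batches, " + total + " refs"); // 2 batches, 3 refs
  }
}
```

In the real framework the controller drives this loop, calling getNext() repeatedly and pushing the returned references into the processing chain.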


Activate your crawler

Activating SMILA in Eclipse

  • Open the Run dialog, switch to the configuration page of Bundles, select your bundle and set the parameter Default Auto-Start to true.
  • Launch SMILA.launch.

Activating the SMILA application

  • Insert your bundle, e.g. myplugin.crawler.mock@4:start, into the config.ini file.
  • Launch SMILA by calling either SMILA.exe or eclipse.exe -console
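
The bundle entry goes into the osgi.bundles list of config.ini. With the surrounding bundles elided, the relevant fragment might look like this (the ellipses stand for the other entries already in the file):

```ini
# configuration/config.ini (fragment; other bundles elided)
osgi.bundles=..., \
 myplugin.crawler.mock@4:start, \
 ...
```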

Run your crawler

Information on how to start and run a crawler can be found in the CrawlerController documentation.
