SMILA/Documentation/Management

SMILA is a framework with a lot of functionality. Most of it is invoked automatically by internal operations; nevertheless, the user has to configure and start an initial operation. All functions a user can execute are accessible from the JMX Management Agent. On the following pages you will learn how to use SMILA with the aid of Java's built-in JConsole and how to handle the JMXClient, which provides access to SMILA commands via batch files.

Management with the aid of jconsole

JConsole is a small tool for monitoring Java applications that is bundled with the JDK. Over a JMX connection it is possible to connect an application to JConsole's Swing UI. If you start up the SMILA engine and open JConsole, you can connect it to SMILA immediately.

jconsole

After connecting, you can find the SMILA operations on the MBeans tab in the tree on the left side.
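
The same JMX agent can also be reached programmatically. The following minimal Java sketch uses the standard JMX RMI connector; host and port (localhost:9004) are taken from the JMX Client connection configuration shown further down this page, and the service URL format is the standard RMI one, so adjust both if your setup differs:

  import javax.management.MBeanServerConnection;
  import javax.management.remote.JMXConnector;
  import javax.management.remote.JMXConnectorFactory;
  import javax.management.remote.JMXServiceURL;

  public class SmilaJmxConnect {
    public static void main(String[] args) throws Exception {
      // Host and port are assumptions taken from the JMX Client
      // configuration below (localhost:9004); the URL format is the
      // standard JMX RMI connector form.
      JMXServiceURL url = new JMXServiceURL(
          "service:jmx:rmi:///jndi/rmi://localhost:9004/jmxrmi");
      try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
        MBeanServerConnection mbeans = connector.getMBeanServerConnection();
        System.out.println(mbeans.getMBeanCount() + " MBeans visible");
      }
    }
  }

The MBeanServerConnection obtained here (mbeans) is reused in the sketches below.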

SMILA manageable Components

There are four components of SMILA that you can access via JConsole.

CrawlerController

Here you can manage the crawl jobs. The following commands are available (a Java invocation sketch follows the list):

  • startCrawl(String dataSourceID): starts a crawling job with the given dataSourceID, for example file or web.
  • stopCrawl(String dataSourceID): stops the crawling job for the given dataSourceID. Note: the crawler is only signaled to stop and may do so at its own discretion. In other words: depending on the implementation it might take a while until it actually stops crawling. It thus gives the crawler the chance to clean up all open resources and finish whatever business it needs to.
  • getActiveCrawls(): opens a dialog which shows a list containing the dataSourceIDs of all active crawl jobs. If no job is running, the dialog shows null.
  • getActiveCrawlsStatus(): opens a dialog telling you how many crawl jobs are active at the moment.
  • getStatus(String dataSourceID): opens a dialog which shows you the status of the crawl job for the given dataSourceID. Possible states are: RUNNING, FINISHED, STOPPED or ABORTED.
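
All of these operations can be invoked from plain Java through the MBeanServerConnection obtained above. A minimal sketch; the ObjectName SMILA:key=CrawlerController is an assumption derived from the domain/key scheme described in the configuration section below, so verify the exact name on the MBeans tab:

  import javax.management.ObjectName;

  // Assumed ObjectName; check the MBeans tab for the exact form.
  ObjectName controller = new ObjectName("SMILA:key=CrawlerController");

  // startCrawl(String dataSourceID) -- note that the configuration
  // example further down uses the operation name "startCrawling",
  // so check which name your SMILA version exposes.
  mbeans.invoke(controller, "startCrawl",
      new Object[] { "file" },                  // parameter values
      new String[] { String.class.getName() }); // parameter signature

  // getStatus(String dataSourceID) returns e.g. RUNNING or FINISHED.
  Object status = mbeans.invoke(controller, "getStatus",
      new Object[] { "file" },
      new String[] { String.class.getName() });
  System.out.println("Crawl status: " + status);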

RecordRecycler

The RecordRecycler gives you the possibility to push already crawled records into the Data Flow Process again. This can be useful, for example, if you want to modify the records in the index with another pipeline. To control the RecordRecycler, the following operations are available (see the sketch after the list):

  • startRecycling(String configurationID, String dataSourceId): fires a recycling event with the given configurationID (the configurationID must match the name of a configuration file located at configuration/org.eclipse.smila.connectivity.framework.queue.worker/recyclers) and dataSourceID (records with this dataSourceID are fetched from the RecordStorage). See the QueueWorker documentation for further details on the Recycler.
  • stopRecycling(String dataSourceID): stops the recycle event for the given dataSourceID.
  • getRecordsRecycled(String dataSourceID): opens a dialog showing how many records have been recycled.
  • getConfigurations(String dataSourceID): shows a list containing all available recycler configuration files.
  • getStatus(String dataSourceID): opens a dialog showing the status of the recycling event for the given dataSourceID. Possible states are: STARTED, IN_PROCESS, STOPPING, STOPPED, FINISHED.
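
Two-parameter operations follow the same invoke pattern; the parameter values and their type signature are passed as arrays. A sketch reusing mbeans and the imports from above; the ObjectName is again an assumption, and "recycler1" is a hypothetical configurationID that must match one of the recycler configuration files:

  // Assumed ObjectName, analogous to the CrawlerController example.
  ObjectName recycler = new ObjectName("SMILA:key=RecordRecycler");

  // startRecycling(String configurationID, String dataSourceId).
  // "recycler1" is a hypothetical configurationID; it must match a
  // recycler configuration file as described above.
  mbeans.invoke(recycler, "startRecycling",
      new Object[] { "recycler1", "file" },
      new String[] { String.class.getName(), String.class.getName() });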

DeltaIndexingManager

The DeltaIndexingManager stores a hash value for each record. It is part of the Connectivity Framework and signals to a crawler whether a given record has changed since the last crawl. See the DeltaIndexing documentation. Within JConsole you can use the following commands (see the sketch after the list):

  • clearAll(): clears all hashes, thus enabling all records to be reprocessed
  • unlockAll(): unlocks all data sources
  • clear(String dataSourceID): same as clearAll, but limited to one data source
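
Parameterless operations such as clearAll() are invoked with empty arrays. A sketch, again assuming the same ObjectName scheme:

  // Assumed ObjectName, analogous to the examples above.
  ObjectName deltaIndexing = new ObjectName("SMILA:key=DeltaIndexingManager");

  // clearAll() takes no parameters, so both arrays are empty.
  mbeans.invoke(deltaIndexing, "clearAll", new Object[0], new String[0]);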

Lucene

With Lucene you have the possibility to invoke several methods concerning the index. The following operations are available (see the sketch after the list):

  • deleteIndex(String indexName): removes the index with the given name if available. Otherwise an error dialog is shown.
  • indexExists(String indexName): asks the framework whether the given index exists. Returns true or false.
  • createIndex(String indexName): creates an index with the given name.
  • reorganizeIndex(String indexName): reorganizes the index with the given name. This cleans up the index: deleted entries are physically removed, resulting in a smaller index size.
  • renameIndex(String currentIndexName, String newIndexName): renames the index with the given name (currentIndexName) to the value of newIndexName.
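
Return values come back from invoke() as plain Objects and can be cast. A sketch under the same ObjectName assumption, using test_index, the default index name mentioned in the batch-file list below:

  // Assumed ObjectName, analogous to the examples above.
  ObjectName lucene = new ObjectName("SMILA:key=Lucene");

  // indexExists(String indexName) returns true or false.
  Boolean exists = (Boolean) mbeans.invoke(lucene, "indexExists",
      new Object[] { "test_index" },
      new String[] { String.class.getName() });
  if (Boolean.TRUE.equals(exists)) {
    // reorganizeIndex(String indexName) cleans up deleted entries.
    mbeans.invoke(lucene, "reorganizeIndex",
        new Object[] { "test_index" },
        new String[] { String.class.getName() });
  }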

PerformanceCounter

A PerformanceCounter monitors the activity of a component. Currently two kinds of PerformanceCounters are available in SMILA: one for crawlers and another for processing within the Data Flow Process. With the aid of JConsole you can inspect the counters SMILA exposes; a number of views allow you to get information about different situations.

Crawlers performance counters

Immediately after you start a crawl job, a new branch appears in the MBeans tree with the following nodes/values:

Crawler counters tree
  • Crawlers
    • FileSystem - Crawler type
      • Launches
        • file - Data source Id
          • 19786841 - Crawler instance, one node for every crawl job
      • Total - Crawler type sub-total
    • Web - Crawler type
      • Launches
        • web - Data source Id
          • 2611152 - Crawler instance, one node for every crawl job
      • Total - Crawler type sub-total
    • Total

The nodes contain a subset of these possible counters (a reading sketch in Java follows the list):

  • AttachmentBytesTransfered: total record attachment bytes transferred
  • AttachmentTransferRate: record attachment byte transfer rate (bytes/sec)
  • AverageAttachmentTransferRate: overall (whole crawl job) record attachment byte transfer rate (bytes/sec)
  • AverageDeltaIndicesProcessingTime: average delta index processing time (sec)
  • AverageRecordsProcessingTime: average record processing time (sec)
  • OverallAverageDeltaIndicesProcessingTime: overall average delta index processing time (sec)
  • OverallAverageRecordsProcessingTime: overall average record processing time (sec)
  • Error: contains a collection of all errors that occurred. On the Operations tab you can find a method to show all errors in a dialog box.
  • Delta-indices: number of delta indices created in LuceneIndex.
  • Exceptions(critical): number of critical exceptions.
  • Exceptions(non-critical): number of non-critical exceptions.
  • Exceptions(producer): number of producer exceptions.
  • Files: number of files which were crawled. (only FileSystemCrawler)
  • Folder: number of folders walked through. (only FileSystemCrawler)
  • Records: number of records created.
  • Bytes: number of bytes downloaded
  • http-fetch-time: average HTTP fetch time, i.e. the time it takes to download a web page.
  • Pages: how many pages were visited.
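
Counters are exposed as JMX attributes rather than operations, so they are read with getAttribute(). A sketch, assuming the tree path maps into the ObjectName key the way the wait-task example below suggests (Crawlers/FileSystem/Total); verify the exact name on the MBeans tab:

  // Assumed ObjectName for the FileSystem crawler sub-total node.
  ObjectName total = new ObjectName("SMILA:key=Crawlers/FileSystem/Total");

  // Read two of the counters listed above.
  Object records = mbeans.getAttribute(total, "Records");
  Object files = mbeans.getAttribute(total, "Files");
  System.out.println("Records: " + records + ", Files: " + files);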

Processing performance counters

As soon as the Router puts records into the MQ, the Listener pushes them into the Data Flow Process. At that point a new section with the following hierarchy appears in the MBeans tree (only an example, because the PerformanceCounters vary according to your personal usage of SMILA):

  • Pipeline: lists all invoked pipelines.
    • AddPipeline
    • DeletePipeline
  • Processing Service: lists all processing services which were invoked, sorted by pipelines
    • AddPipeline
      • LuceneIndexService
      • SimpleMimeTypeIdentifier
    • DeletePipeline
      • LuceneIndexService
  • Simple Pipelet: lists all pipelets which were used, sorted by pipelines
    • AddPipeline
      • HtmlToTextPipelet

JMX Client

The JMX Client is a lightweight, easy-to-use, command-line driven component for accessing the most important JMX management operations. It works without JConsole and provides only a few commands. If you want full control over the SMILA framework, you have to use JConsole as described in the chapter above; but if you only want to start a crawl job or check whether a crawl job is still active, you do not need JConsole. Furthermore, you can extend the functionality of the JMX Client: it is highly configurable through a single configuration file.

Pre-defined commands (batch-files)

  • crawlFILE: start a crawl job with the data source "file"
  • crawlFILEandWait: same as the command above, except that the console window stays open until the crawl job has finished.
  • crawlFILEstop: stop the crawl job for data source "file", if one is active.
  • crawlWEB: start a crawl job for data source "web".
  • crawlWEBandWait: same as the command above, except that the console window stays open until the crawl job has finished.
  • crawlWEBstop: stop the crawl job for data source "web", if one is active.
  • getActiveCrawls: show all active crawl jobs on console.
  • agentFEEDS: start an agent for the data source "feeds"
  • agentFEEDSStop: stop the agent for the data source "feeds", if one is active
  • getActiveAgents: show all active agent jobs on console.
  • indexOptimize: invoke a reorganize of LuceneIndex (default test_index).
  • recycleConfigurations: display all available recycler configurations.
  • recycleFILE: start a recycle event for dataSourceID "file" and the default recycler configuration (recycler1.xml)
  • recycleFILEandWait: same as the command above, except that the console window stays open until the recycle event has finished.
  • recycleFILEstatus: show the current status of the recycle event for dataSourceID "file".

Usage

If you open a command window in the folder SMILA/jmxclient and execute run.bat, you will get a helpful usage overview.

JMX Client

The JMX Client simplifies JMX management by providing batch files for the most important functions. But that's not all: with the aid of the JMX Client you can use SMILA completely from your console, or write your own batch files which invoke, for example, one method after another. The client works with commands, which are managed in a single configuration file. In addition to the pre-defined commands you can create your own; you only need to know the fully qualified class name and the name of the method you want to invoke. To execute a command, simply use this pattern: run.bat commandName commandParameters. The JMX Client can execute any JMX operation, read any JMX attribute, and do it all in one batch, reusing previous results.

Configuration

The configuration schema is located at org.eclipse.smila.management.jmx.client/schemas/jmxclient.xsd (source) and jmxclient/schemas/jmxclient.xsd (build). The default configuration file can be found at org.eclipse.smila.management.jmx.client/config.xml (source) and jmxclient/config.xml (build).


Configuration explanation

To use commands that interact with JMX, a connection is needed:
<connection id="local" host="localhost" port="9004"/>
To create your own command, use a cmd element following this schema:
  • cmd:
    • id: name of the command.
    • echo: text to display on the console when the command is executed.
      • operation
        • domain: the JMX domain. The root is the DOMAIN; inside each domain, the paths (to the leaves) are identified by a KEY, and the leaves provide the operations and attributes.
        • key: the key identifying the MBean that contains the method.
        • name: name of the method to invoke.
        • echo: text to display on the console when the method is invoked.
          • parameter: one tag for each parameter.
            • echo: description for the parameter.
To keep the console open and stay informed about the current status, you can use the wait tag, as the following three-step example shows:
  • STEP 1:
  <cmd id="crawl" echo="Starting crawler by datasource id">
    <operation
      key="CrawlerController"
      name="startCrawling"
      echo="Starting crawl [%1]"
    >
      <parameter echo="data source id"/>
    </operation>
  </cmd>
For the domain "SMILA" and the leaf identified by the key "CrawlerController", the operation "startCrawling" is executed with one input parameter (of type String, the default). JMX returns the result to the client, e.g. "Crawl with the dataSourceId = file and hashcode [595826] successfully started!"
  • STEP 2:
We have to extract the hash code from this result to track the crawler's activities; it is used by the next task. Here the extracted result is "595826":
  <regexp pattern="^.*\[(\d+)\].*$" group="1" echo="Extracting crawler hash code"/>
  • STEP 3 is a wait task, the most complex one: we wait until the crawl has finished. It is defined by two subnodes:
    <wait echo="Waiting while crawl ends" pause="1000">
      <in>
        <cmd id="-" echo="Getting crawler status by datasource id">
          <operation
            key="CrawlerController"
            name="getStatus"
            echo="Crawl [%1] status"
          >
            <!--  value="%1" -->
            <parameter echo="data source id"/>
          </operation>
        </cmd>
        <const value="Finished" echo="Crawling finished status"/>
        <const value="Stopped" echo="Crawling stopped status"/>
      </in>
      <cmd id="-" echo="Reading crawler performance counters">
        <attribute
          key="Crawlers/%2/Total"
          name="Records"
          echo="Total records"
        />
      ...
      </cmd>
   </wait>
The first subnode (here the logical IN) is the condition for exiting the WAIT task. The second subnode is a batch to execute as long as we have not exited. So every 1000 ms the client asks for the status and checks whether it is IN ("Finished", "Stopped"):
  • if it is, the crawl has finished; otherwise:
  • the three performance counters defined in the cmd with id="-" are printed.
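
The same start/extract/poll sequence can be written directly in Java against the MBeanServerConnection, which may help to understand what the three tasks do. A sketch under the same assumptions as the earlier snippets (ObjectName layout and operation names):

  import java.util.regex.Matcher;
  import java.util.regex.Pattern;

  ObjectName controller = new ObjectName("SMILA:key=CrawlerController");

  // STEP 1: start the crawl; the result message contains the hash code.
  String result = (String) mbeans.invoke(controller, "startCrawling",
      new Object[] { "file" }, new String[] { String.class.getName() });

  // STEP 2: extract the hash code, mirroring the regexp task.
  Matcher m = Pattern.compile("^.*\\[(\\d+)\\].*$").matcher(result);
  String hashcode = m.matches() ? m.group(1) : null;
  System.out.println("Crawler instance: " + hashcode);

  // STEP 3: poll the status every 1000 ms until it is Finished or
  // Stopped, printing a performance counter on each iteration.
  while (true) {
    String status = (String) mbeans.invoke(controller, "getStatus",
        new Object[] { "file" }, new String[] { String.class.getName() });
    if ("Finished".equals(status) || "Stopped".equals(status)) {
      break;
    }
    Object records = mbeans.getAttribute(
        new ObjectName("SMILA:key=Crawlers/FileSystem/Total"), "Records");
    System.out.println("Total records: " + records);
    Thread.sleep(1000);
  }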

JMX Client in OSGi console

Since rev 464 (after M2), the JMX client is also added to the Equinox OSGi console as a command provider. Thus you can now invoke the same configured actions from the OSGi console without having to open a separate window. The command name is smila, followed by the same arguments used with the run script in SMILA/jmxclient. Use help to get a description of the supported commands; the output should look like this:

SMILA-osgiconsole-help.png

Usually a lot more help output for the standard Equinox commands follows, so you may need to scroll back quite a bit to find the description of the smila command.

See the next screenshot for an example session in which an agent and a crawler are controlled using only the OSGi JMX client. You will see that the commands are exactly the same as with the run script; only the command name is smila instead of run.

SMILA-osgiconsole-commands.png
