Added by Jens Bleuel, last edited by Jens Bleuel on Nov 21, 2012  (view change)


In General

Pentaho Kettle Component

This component is a good example of how to call a transformation or a job from another Java program.

It can be found in the package org.pentaho.plugin.kettle in the Pentaho-BI-Server project.

Get the source code here.

Program your own Kettle transformation

Recommendation for upward compatibility: If you want to create your own transformation dynamically (e.g. from meta-data), generate an XML file (KTR) instead of using the API directly. The XML files have been compatible from the first version of Kettle until now. The same applies to jobs.

The example described below performs the following actions:

  1. Create a new transformation
  2. Save the transformation as an XML file
  3. Generate the SQL for the target table
  4. Execute the transformation
  5. Drop the target table to make this program repeatable

The complete source code for the example is included in the distribution zip file, which you can find in the downloads section (Kettle version 2.1.3 or higher). After unzipping this file, you can find the source code in the file "TransBuilder.java" in the "extra" directory.

The Javadoc for the Kettle Java API can be generated with Javadoc, e.g. in Eclipse via Project / Generate Javadoc or by exporting to Javadoc. When the wizard opens, set the location of the javadoc command by entering its absolute path (e.g. C:\Program Files\Java\jdk1.5.0_12\bin\javadoc.exe).

// Generate the transformation.
TransMeta transMeta = TransBuilder.buildCopyTable(
   transformationName,
   sourceDatabaseName,
   sourceTableName,
   sourceFields,
   targetDatabaseName,
   targetTableName,
   targetFields
   );

// Save it as a file:
String xml = transMeta.getXML();
DataOutputStream dos = new DataOutputStream(new FileOutputStream(new File(fileName)));
dos.write(xml.getBytes("UTF-8"));
dos.close();
System.out.println("Saved transformation to file: "+fileName);

// OK, What's the SQL we need to execute to generate the target table?
String sql = transMeta.getSQLStatementsString();

// Execute the SQL on the target table:
Database targetDatabase = new Database(transMeta.findDatabase(targetDatabaseName));
targetDatabase.connect();
targetDatabase.execStatements(sql);

// Now execute the transformation...
Trans trans = new Trans(transMeta);
trans.execute(null);
trans.waitUntilFinished();

// For testing/repeatability, we drop the target table again
targetDatabase.execStatement("drop table "+targetTableName);
targetDatabase.disconnect();







Below is the source code for the method that creates the transformation:

/**
* Creates a new Transformation using input parameters such as the tablename to read from.
* @param transformationName The name of the transformation
* @param sourceDatabaseName The name of the database to read from
* @param sourceTableName The name of the table to read from
* @param sourceFields The field names we want to read from the source table
* @param targetDatabaseName The name of the target database
* @param targetTableName The name of the target table we want to write to
* @param targetFields The names of the fields in the target table (same number of fields as sourceFields)
* @return A new transformation metadata object
* @throws KettleException In the rare case something goes wrong
*/

public static final TransMeta buildCopyTable(String transformationName,
		String sourceDatabaseName, String sourceTableName,
		String[] sourceFields, String targetDatabaseName,
		String targetTableName, String[] targetFields)
		throws KettleException {

	EnvUtil.environmentInit();

	try
	{
		// Create a new transformation...
		//

		TransMeta transMeta = new TransMeta();
		transMeta.setName(transformationName);

		// Add the database connections
		for (int i = 0; i < databasesXML.length; i++) {
			DatabaseMeta databaseMeta = new DatabaseMeta(databasesXML[i]);
			transMeta.addDatabase(databaseMeta);
		}

		DatabaseMeta sourceDBInfo = transMeta.findDatabase(sourceDatabaseName);
		DatabaseMeta targetDBInfo  = transMeta.findDatabase(targetDatabaseName);

		//
		// Add a note
		//

		String note = "Reads information from table [" + sourceTableName+ "] on database [" + sourceDBInfo + "]" + Const.CR;
		note += "After that, it writes the information to table ["+ targetTableName + "] on database [" + targetDBInfo + "]";
		NotePadMeta ni = new NotePadMeta(note, 150, 10, -1, -1);
		transMeta.addNote(ni);

		//
		// create the source step...
		//

		String fromstepname = "read from [" + sourceTableName + "]";
		TableInputMeta tii = new TableInputMeta();
		tii.setDatabaseMeta(sourceDBInfo);
		String selectSQL = "SELECT " + Const.CR;

		for (int i = 0; i < sourceFields.length; i++) {
			if (i > 0) selectSQL += ", "; else selectSQL += " ";
			selectSQL += sourceFields[i] + Const.CR;
		}
		selectSQL += "FROM " + sourceTableName;
		tii.setSQL(selectSQL);

		StepLoader steploader = StepLoader.getInstance();

		String fromstepid = steploader.getStepPluginID(tii);
		StepMeta fromstep = new StepMeta(log, fromstepid, fromstepname,(StepMetaInterface) tii);
		fromstep.setLocation(150, 100);
		fromstep.setDraw(true);
		fromstep.setDescription("Reads information from table [" + sourceTableName + "] on database [" + sourceDBInfo + "]");
		transMeta.addStep(fromstep);

		//
		// add logic to rename fields
		// Use metadata logic in SelectValues, use SelectValueInfo...
		//

		SelectValuesMeta svi = new SelectValuesMeta();
		svi.allocate(0, 0, sourceFields.length);

		for (int i = 0; i < sourceFields.length; i++) {
			svi.getMetaName()[i] = sourceFields[i];
			svi.getMetaRename()[i] = targetFields[i];
		}

		String selstepname = "Rename field names";
		String selstepid = steploader.getStepPluginID(svi);
		StepMeta selstep = new StepMeta(log, selstepid, selstepname, (StepMetaInterface) svi);
		selstep.setLocation(350, 100);
		selstep.setDraw(true);
		selstep.setDescription("Rename field names");
		transMeta.addStep(selstep);

		TransHopMeta shi = new TransHopMeta(fromstep, selstep);
		transMeta.addTransHop(shi);
		fromstep = selstep;

		//
		// Create the target step...
		//

		//
		// Add the TableOutputMeta step...
		//

		String tostepname = "write to [" + targetTableName + "]";
		TableOutputMeta toi = new TableOutputMeta();
		toi.setDatabase(targetDBInfo);
		toi.setTablename(targetTableName);
		toi.setCommitSize(200);
		toi.setTruncateTable(true);

		String tostepid = steploader.getStepPluginID(toi);
		StepMeta tostep = new StepMeta(log, tostepid, tostepname, (StepMetaInterface) toi);
		tostep.setLocation(550, 100);

		tostep.setDraw(true);
		tostep.setDescription("Write information to table [" + targetTableName + "] on database [" + targetDBInfo + "]");
		transMeta.addStep(tostep);

		//
		// Add a hop between the two steps...
		//

		TransHopMeta hi = new TransHopMeta(fromstep, tostep);
		transMeta.addTransHop(hi);

		// The transformation is complete, return it...
		return transMeta;
	} catch (Exception e) {
		throw new KettleException("An unexpected error occurred creating the new transformation", e);
	}
}







Running an existing transformation

If you have already created a transformation and saved it in a .ktr or .xml file, you can use code like the following to run it.

public static void runTransformation(String filename) {
  try {
    StepLoader.init();
    EnvUtil.environmentInit();
    TransMeta transMeta = new TransMeta(filename);
    Trans trans = new Trans(transMeta);

    trans.execute(null); // You can pass arguments instead of null.
    trans.waitUntilFinished();
    if ( trans.getErrors() > 0 )
    {
      throw new RuntimeException( "There were errors during transformation execution." );
    }
  }
  catch ( KettleException e ) {
    // TODO Put your exception-handling code here.
    System.out.println(e);
  }
}

The easiest way to change parameters while keeping a transformation or job unchanged is with the use of variables.

Comment: Posted by Matt Casters at Aug 20, 2008 06:39
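The variable approach mentioned in the comment above can be sketched roughly as follows. This is only a hedged illustration: the variable name TABLE_NAME and its value are made up for the example, and it assumes a PDI version (3.x or later) in which TransMeta implements VariableSpace so that setVariable() is available.

```java
// Sketch only: run a saved .ktr while overriding a variable, assuming the
// transformation references ${TABLE_NAME} (a hypothetical variable) internally.
// API calls follow the 3.x style used elsewhere on this page.
public static void runWithVariable(String filename) throws KettleException {
    StepLoader.init();
    EnvUtil.environmentInit();

    TransMeta transMeta = new TransMeta(filename);
    // Override the variable without editing the transformation itself.
    transMeta.setVariable("TABLE_NAME", "customers_copy"); // hypothetical value

    Trans trans = new Trans(transMeta);
    trans.execute(null);
    trans.waitUntilFinished();
}
```

The transformation file stays unchanged; only the variable value differs between runs.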

Hello, I can't find the class StepLoader in the Java API (4.0), but it is there in 3.2. Can anyone tell me where it is and in which jar? Thank you!

Comment: Posted by xieyujun at Aug 03, 2010 21:41 Updated by xieyujun

This example should be updated for 4.0.  StepLoader.init should be KettleEnvironment.init.

Comment: Posted by Connor Garvey at Aug 18, 2010 14:14
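Following the comment above, the runTransformation() example would then look roughly like this for PDI 4.x. This is a sketch only, based on the renamed initialization call; check it against your own PDI version.

```java
// Hedged sketch for PDI 4.x: KettleEnvironment.init() replaces the older
// StepLoader.init() / EnvUtil.environmentInit() pair used earlier on this page.
public static void runTransformation(String filename) {
  try {
    KettleEnvironment.init();
    TransMeta transMeta = new TransMeta(filename);
    Trans trans = new Trans(transMeta);

    trans.execute(null); // You can pass arguments instead of null.
    trans.waitUntilFinished();
    if (trans.getErrors() > 0) {
      throw new RuntimeException("There were errors during transformation execution.");
    }
  } catch (KettleException e) {
    // TODO Put your exception-handling code here.
    System.out.println(e);
  }
}
```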

Check the below link for the steps for running transformation stored in kettle DB repository from JAVA with version 4.0.x

http://wiki.pentaho.com/display/COM/Running+kettle+transformation+using+Java

Comment: Posted by Ameeth Paatil at Nov 21, 2010 22:45

Where do I download the API for Java?

Comment: Posted by Miguel Angel Reyes at Jul 12, 2012 11:23

Answering my own question: I downloaded it from here; the stable version at the moment is 4.3.0:
http://repository.pentaho.org/artifactory/pentaho/

Comment: Posted by Miguel Angel Reyes at Jul 16, 2012 17:37

Dear all,

I am trying to create a transformation like this one. Can you please give me the link to the source code of the above example?

I tried but could not find it in my Kettle or BI Server samples folders.

Please help me; I need it rather urgently.

Thanks,

Siva (kssraju@gmail.com)

Comment: Posted by Siva at Oct 22, 2013 05:57

Can someone provide the same Java API example for the latest version of Pentaho Data Integration?

It seems I couldn't find an updated doc/example anywhere.

Comment: Posted by Gaurav Jain at Nov 29, 2013 05:04

I can concur that this needs updating for PDI 5.0 and, going by the comments above, for 4.0 too.

For me the main change was that StepLoader no longer exists, so lines like this are not required: 

var steploader = StepLoader.getInstance();

var fromstepid = steploader.getStepPluginID(tii);

Instead you just call it like this:

var dummyStepMeta = new Packages.org.pentaho.di.trans.steps.dummytrans.DummyTransMeta();

var fromstep = new Packages.org.pentaho.di.trans.step.StepMeta("My Step", dummyStepMeta);

Figuring that out from the code wasn't too hard; anyone using the API should have the code checked out anyway. But it would be nice if the documentation were up to date!

Comment: Posted by Dan Keeley at Jan 08, 2014 04:20
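In plain Java (rather than the JavaScript shown in the comment above), the step creation Dan describes would look roughly like this. This is a hedged sketch: the step name "My Step" is just an illustration, and transMeta is assumed to be an existing TransMeta instance as built earlier on this page.

```java
// Hedged sketch for PDI 4.x/5.x: StepMeta can be constructed directly from a
// step name and a StepMetaInterface; no StepLoader / plugin-ID lookup is needed.
DummyTransMeta dummyMeta = new DummyTransMeta();
StepMeta fromStep = new StepMeta("My Step", dummyMeta); // "My Step" is illustrative
fromStep.setLocation(150, 100);
fromStep.setDraw(true);
transMeta.addStep(fromStep); // transMeta built as in buildCopyTable() above
```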