The thin Kettle JDBC driver allows a Java client to query a Kettle transformation remotely using JDBC and SQL.
It is available from Pentaho Data Integration Enterprise Edition 5.0 or higher.
As with most JDBC drivers, the driver consists of a server component and a client component.
The server is designed to run as a Servlet on the Carte server or the Pentaho Data Integration server.
You can configure a transformation step to serve as a data service in the "Data Service" tab of the transformation settings dialog:
When such a transformation is saved, a new entry is created either in the local metastore (under the .pentaho/metastore folder) or in the Enterprise repository on the DI server (/etc/metastore).
As such, Carte or the DI Server will automatically and immediately pick up newly configured data services from the moment your service transformation is saved.
The carte configuration file accepts a <repository> which will also be scanned in addition to the local metastore.
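For reference, a minimal Carte configuration with such a repository entry might look like the sketch below. The element names and values here are assumptions based on a typical Carte slave configuration file; verify them against your own carte-config XML:

```xml
<slave_config>
  <slaveserver>
    <name>master</name>
    <hostname>localhost</hostname>
    <port>8082</port>
  </slaveserver>
  <!-- This repository is scanned for data services in addition to the local metastore -->
  <repository>
    <name>EE-repo</name>
    <username>admin</username>
    <password>password</password>
  </repository>
</slave_config>
```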
Reminder: the Carte users and passwords are stored in the pwd/kettle.pwd file; the default user is "cluster" with password "cluster". For the DI server, you can test with the standard admin or suzy accounts (password "password").
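The kettle.pwd file itself is a simple list of "user: password" pairs, one per line. A sketch of what it contains (the comment line is illustrative; passwords can also be obfuscated with the Encr tool instead of being stored in plain text):

```
# Carte user accounts, one "user: password" pair per line
cluster: cluster
```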
During the execution of a query, two transformations will be executed on the server:
These two transformations are visible on Carte, or in Spoon in the slave server monitor, and can be tracked, sniff-tested, paused and stopped just like any other transformation. However, they cannot be restarted manually, since both transformations are programmatically linked.
For this example we open the "Getting Started Transformation" (see the sample/transformations folder of your PDI distribution) and configure a Data Service called "gst" for the "Number Range" step (comparable to the screenshot above).
Then we can launch Carte or the Data Integration Server to execute a query against that new virtual database table:
SELECT dealsize, sum(sales) as total_sales, count(*) AS nr FROM gst GROUP BY dealsize HAVING count(*) > 20 ORDER BY sum(sales) DESC
The server parses this query and generates a transformation that converts the service transformation's data into the requested format:
The injected data originates from the service transformation:
So for each executed query, you will see two transformations listed on the server.
The JDBC driver uses the following class:
The URL is in the following format:
For Carte, this is an example:
For the Data Integration server:
This example matches the Carte configuration file shown above.
The following standard options are available:
Parameters for the service transformation can be set with options of the form PARAMETER_name=value (i.e. the parameter name prefixed with "PARAMETER_").
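As an illustration of this option format, the sketch below assembles a thin-driver JDBC URL with one transformation parameter. The jdbc:pdi:// scheme, the /kettle path, the port and the REGION parameter name are assumptions for illustration only; verify them against the URL examples for your own Carte or DI server setup:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ThinDriverUrl {

    // Assembles a URL of the form jdbc:pdi://host:port/kettle?option=value&...
    // (scheme and path are assumptions; check your own Carte/DI server URL).
    static String buildUrl(String host, int port, Map<String, String> options) {
        StringBuilder url = new StringBuilder("jdbc:pdi://")
                .append(host).append(':').append(port).append("/kettle");
        String sep = "?";
        for (Map.Entry<String, String> e : options.entrySet()) {
            url.append(sep).append(e.getKey()).append('=').append(e.getValue());
            sep = "&";
        }
        return url.toString();
    }

    public static void main(String[] args) {
        Map<String, String> options = new LinkedHashMap<>();
        // A service transformation parameter named REGION (hypothetical),
        // passed by prefixing the name with "PARAMETER_":
        options.put("PARAMETER_REGION", "EMEA");
        String url = buildUrl("localhost", 8082, options);
        System.out.println(url);
        // prints: jdbc:pdi://localhost:8082/kettle?PARAMETER_REGION=EMEA
        // A client would then connect with something like:
        // Connection conn = DriverManager.getConnection(url, "cluster", "cluster");
    }
}
```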
SQL support is currently minimal. The following constructs are supported; please consider everything else unsupported:
Dates are enclosed in square brackets, and the following formats are supported: [yyyy/MM/dd HH:mm:ss.SSS], [yyyy/MM/dd HH:mm:ss] and [yyyy/MM/dd]
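For example, a query filtering on a date column might look like this (the order_date column is hypothetical; dealsize and sales are from the "gst" example above):

```sql
SELECT dealsize, sum(sales) AS total_sales
FROM gst
WHERE order_date >= [2004/01/01] AND order_date < [2005/01/01 00:00:00]
GROUP BY dealsize
```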
Beyond the obvious limitations in SQL standard support, a few things are worth noting:
Clients typically need the following libraries to work:
These libraries can be found in your data-integration/lib folder with the appropriate version number (e.g. kettle-core-5.0.0.jar).
Since SQuirreL already contains most of the needed jar files, configuring it is simply a matter of adding kettle-core.jar as a new driver jar file, along with Apache Commons VFS 1.0 and scannotation.jar.
The following jar files need to be added:
Simply replace the kettle-*.jar files in the lib/ folder with new files from Kettle v5.0-M1 or higher.
Interactive reporting runs off Pentaho Metadata so this advice also works there.
You need a BI Server that uses the PDI 5.0 jar files. Alternatively, you can use an older version and update the kettle-core, kettle-db and kettle-engine jar files in the /tomcat/webapps/pentaho/WEB-INF/lib/ folder.
See Pentaho Interactive reporting: simply update the kettle-*.jar files in your Pentaho BI Server (tested with 4.1.0 EE and 4.5.0 EE) to get it to work.
Example of patching:

matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ rm ./tomcat/webapps/pentaho/WEB-INF/lib/kettle-core-4.3.0-GA.jar
matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ rm ./tomcat/webapps/pentaho/WEB-INF/lib/kettle-engine-4.3.0-GA.jar
matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ rm ./tomcat/webapps/pentaho/WEB-INF/lib/kettle-db-4.3.0-GA.jar
matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ cp /kettle/5.0/lib/kettle-core.jar ./tomcat/webapps/pentaho/WEB-INF/lib/
matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ cp /kettle/5.0/lib/kettle-db.jar ./tomcat/webapps/pentaho/WEB-INF/lib/
matt@kettle:~/pentaho/4.5.0-ee/server/biserver-ee$ cp /kettle/5.0/lib/kettle-engine.jar ./tomcat/webapps/pentaho/WEB-INF/lib/
Fun fact: Mondrian generates the following SQL for the report shown above:
select "Service"."Category" as "c0", "Service"."Country" as "c1", sum("Service"."sales_amount") as "m0"
from "Service" as "Service"
group by "Service"."Category", "Service"."Country"
You can query a remote service transformation with any Kettle v5 or higher client. You can query the service through the database explorer and the various database steps (for example the Table Input step).
TODO: ask project owners to change the current old driver class to the new thin one.
Partial success, as I'm getting some XML parsing errors. However, adding the aforementioned jar files at least allows you to get the query fields back:
To be investigated.
The following things are next on the agenda: