Data Services via the Thin Kettle JDBC driver

The Thin Kettle JDBC driver provides a means for a Java-based client to query the results of a Kettle transformation remotely using JDBC and SQL.

Architecture

As with most JDBC drivers, there are server and client components.
The server is designed to run as a servlet on the Carte server, the Pentaho Data Integration server or the Pentaho Business Analytics platform.  At the time of writing, only Carte is supported.

The client side of the JDBC driver consists of the kettle-core.jar library, which depends only on Apache Commons HttpClient and Apache Commons VFS.

Server configuration

The Carte configuration file accepts a <services> block that can contain <service> elements with the following sub-elements:

  • name: the name of the service.  Only alphanumeric characters are supported at the moment, no spaces.
  • filename: the filename of the service transformation (.ktr) that will provide the data for the service.
  • service_step: the name of the step that will provide data during querying.

For example:

<services>
  <service>
    <name>Service</name>
    <filename>/home/matt/svn/kettle/trunk/testfiles/sql-transmeta-test-data.ktr</filename>
    <service_step>Output</service_step>
  </service>
</services>

Monitoring

During the execution of a query, two transformations are executed on the server:

  1. The service transformation, designed by a user in Spoon, which provides the service data
  2. An automatically generated transformation that aggregates, sorts and filters the data according to the SQL query

These two transformations are visible on Carte or in Spoon in the slave server monitor and can be tracked, sniff-tested, paused and stopped just like any other transformation.  However, it is not possible to restart them manually since both transformations are programmatically linked.

The JDBC Client

The JDBC driver uses the following class:

org.pentaho.di.core.jdbc.ThinDriver

The URL is in the following format:

jdbc:pdi://hostname:port/kettle?option=value&option=value

The following standard options are available:

  • webappname: the name of the web app (future feature to support running on the DI server)
  • proxyhostname: the proxy server for the HTTP connection(s)
  • proxyport: the port of the proxy server
  • nonproxyhosts: a comma-separated list of hosts for which no proxy should be used
  • debugtrans: the optional name of a file in which the generated transformation will be stored for debugging purposes (example: /tmp/debug.ktr)

Parameters for the service transformation can be set with the following format: PARAMETER_name=value (that is, the parameter name prefixed with "PARAMETER_").
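
For illustration, a minimal sketch of a Java client using this driver is shown below.  It queries the "Service" data service from the configuration example above; the hostname, port, credentials, the REGION parameter and the field names are assumptions made for this example and will differ per installation.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ThinDriverExample {
  public static void main(String[] args) throws Exception {
    // Register the thin Kettle JDBC driver.
    Class.forName("org.pentaho.di.core.jdbc.ThinDriver");

    // Hostname, port and the REGION parameter are placeholders for this sketch;
    // debugtrans writes the generated transformation to /tmp/debug.ktr for inspection.
    String url = "jdbc:pdi://localhost:8080/kettle?debugtrans=/tmp/debug.ktr&PARAMETER_REGION=Europe";

    // Credentials are placeholders as well; use the ones configured on your Carte server.
    try (Connection connection = DriverManager.getConnection(url, "cluster", "cluster");
         Statement statement = connection.createStatement();
         // "Service" is the service name from the example above; "category" and "sales"
         // are hypothetical fields provided by that service.
         ResultSet resultSet = statement.executeQuery(
             "SELECT category, SUM(sales) AS TotalSales FROM Service GROUP BY category")) {
      while (resultSet.next()) {
        System.out.println(resultSet.getString("category") + " : " + resultSet.getDouble("TotalSales"));
      }
    }
  }
}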

SQL Support

Support for SQL is minimal at the moment.

The following constructs are supported; please consider everything else unsupported.  A combined example is shown below, after the list of literals:

  • SELECT:
    • * is expanded to include all fields of the service
    • COUNT(field)
    • COUNT(*)
    • COUNT(DISTINCT field)
    • IIF( condition, true-value or field, false-value or field)
    • Aggregates: SUM, AVG, MIN, MAX
    • Aliases, both with the "AS" keyword and separated by one or more spaces, for example SUM(sales) AS "Total Sales" or SUM(sales) TotalSales
    • NOTE: the construct DISTINCT col1, col2, col3 is NOT yet supported
  • FROM
    • Strictly one service name, no alias
  • WHERE
    • nested brackets
    • AND, OR, NOT if preceded by brackets, for example: NOT ( A = 5 OR C = 3 )
    • precedence taken into account
    • Literals (String, Integer)
    • PARAMETER('parameter-name')='value'  (always evaluates to TRUE in the condition)
    • =
    • <
    • >
    • <=, =<
    • >=, =>
    • <>
    • LIKE (standard % and ? wildcards are converted to .* and . regular expressions)
    • REGEX (matches regular expression)
    • IS NULL
    • IS NOT NULL
    • IN ( value, value, value )
    • You can put a condition on the IIF expression or its alias if one is used. (please use identical string literals for expressions)
  • GROUP BY
    • Group on fields, not on the IIF() function
  • HAVING
    • Conditions should be placed on the aggregate construct, not the alias
    • Please use identical strings for the expressions; the algorithm is not yet that smart.  In other words, if you use "COUNT(*)" in the SELECT clause you should use the same "COUNT(*)" expression in the HAVING clause, not "COUNT( * )" or any other variant of it.
  • ORDER BY
    • You can order on any column in the result. (to be fixed later to allow you to also sort on non-selected columns in the service)

Literals: 

  • Strings have single quotes around them; escaping quotes is not yet supported.
  • Dates have square brackets around them and the following formats are supported: [yyyy/MM/dd HH:mm:ss.SSS], [yyyy/MM/dd HH:mm:ss] and [yyyy/MM/dd]
  • Number and BigNumber values should have no grouping symbol and use "." as the decimal symbol (example: 123.45)
  • Integers contain only digits
  • Boolean values can be TRUE or FALSE
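
To illustrate how these constructs combine, the sketch below runs a query that uses aggregates, a date literal, GROUP BY, HAVING and ORDER BY against the "Service" data service from the configuration example.  The field names (region, sales, orderdate) are assumptions made for this example, and the connection is assumed to have been opened via the thin driver as shown in the client example above.

// "connection" is an open java.sql.Connection obtained via the thin driver (see above).
// Note that the HAVING clause repeats the exact "COUNT(*)" string used in the SELECT clause.
String sql =
    "SELECT region, COUNT(*) AS NrOfRows, SUM(sales) AS TotalSales "
  + "FROM Service "
  + "WHERE orderdate >= [2012/01/01] AND region IS NOT NULL "
  + "GROUP BY region "
  + "HAVING COUNT(*) > 100 "
  + "ORDER BY TotalSales";

try (java.sql.Statement statement = connection.createStatement();
     java.sql.ResultSet resultSet = statement.executeQuery(sql)) {
  while (resultSet.next()) {
    System.out.println(resultSet.getString("region") + " : " + resultSet.getLong("NrOfRows"));
  }
}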

Any Java-based, JDBC-compliant tool, including third-party reporting systems, can use this driver to query a Kettle
transformation using a SQL string via JDBC.

Just-in-time blending of data from multiple sources for a complete picture:

  • Connect, combine and transform data from multiple sources
  • Query data directly from any transformation
  • Access architected blends with the full spectrum of Pentaho Analytics
  • Manage governance and security of data for on-going accuracy


This just in time, architected blending delivers accurate big data analytics based on the blended data. You can connect to, combine, and even transform data from any of the multiple data stores in your hybrid data ecosystem into these blended views, then query the data directly via that view using the full spectrum of analytics in the Pentaho Analytics platform, including predictive analytics.


Available since Pentaho Data Integration Version 5.0 GA (Enterprise Edition)

Please see the child pages of this document for more information.
