Pentaho Big Data Plugin
The Pentaho Big Data Plugin Project provides support for an ever-expanding Big Data community within the Pentaho ecosystem. It is a plugin for the Pentaho Kettle engine which can be used within Pentaho Data Integration (Kettle), Pentaho Reporting, and the Pentaho BI Platform.
Pentaho Big Data Plugin Features
This project contains the implementations for connecting to or performing the following:
- Pentaho MapReduce: visually design MapReduce jobs as Kettle transformations
- HDFS File Operations: Read/write directly from any Kettle step. All made possible by the ubiquitous use of Apache VFS throughout Kettle (a sample HDFS path follows this list)
- Data Sources
  - JDBC connectivity
    - Apache Hive
  - Native RPC connectivity for reading/writing
    - Apache HBase
    - Cassandra
    - MongoDB
  - CouchDB
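Because Kettle resolves file locations through Apache VFS, HDFS files can be addressed with URL-style paths from any step that accepts a file location. The host, port, and path below are hypothetical placeholders, not values from this project; substitute the details of your own cluster:

    # a hypothetical HDFS location, usable wherever a Kettle step accepts a VFS path
    hdfs://namenode.example.com:8020/user/pentaho/weblogs/part-0000.txt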
Key Links
- Git Repository: https://github.com/pentaho/big-data-plugin
- CI: ...
- Download the latest development build: pentaho-big-data-plugin-TRUNK-SNAPSHOT.tar.gz
Community and where to find help
The Big Data Forum exists for both users and developers. The community also manages the ##pentaho IRC channel on irc.freenode.net.
Quick Start: Building the project
The Pentaho Big Data Plugin is now a Maven project. Please refer to the project README for build information.
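Since the section above defers to the project README, the exact build commands may differ; as a rough sketch, a standard Maven build from the project root would look like this (the -DskipTests flag is optional and only shown for illustration):

    # build the plugin and install its artifacts into the local Maven repository
    mvn clean install
    # optionally skip unit tests for a faster turnaround
    mvn clean install -DskipTests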
Debugging
We recommend providing unit tests where possible and debugging your code through them.
Remote Debugging
If you want to see your code executing within Spoon, we recommend remote debugging. This approach can also be used with Pan, Kitchen, or the BA/DI Server. The workflow is as follows:
- Download/Checkout Kettle (currently at 4.4.0-SNAPSHOT)
  - CI Build: ...
  - SVN Source: svn://source.pentaho.org/svnkettleroot/Kettle/branches/4.4.0
- Configure the big data plugin's kettle.dist.dir property via override.properties:
  - Create override.properties in the root of the big-data-plugin. This file is a local override for any properties defined in build.properties.
  - Add the property kettle.dist.dir and point it to your Kettle install dir, based on whether you're using the CI download or building from source:
    - CI Download: `kettle.dist.dir=../data-integration`
    - Building from source: `kettle.dist.dir=../Kettle/distrib` (Note: You must build Kettle with `ant distrib` before being able to launch it when using the source. This will build Kettle into Kettle/distrib. For more information see ...)
- Build and "install"
...
- the
...
- plugin
...
- into
...
- Kettle
...
- with
...
- ant:
...
ant
...
resolve
...
install-plugin
...
- (you
...
- can
...
- drop
...
- the
...
- resolve
...
- after
...
- the
...
- first
...
- build
...
- unless
...
- the
...
- dependencies
...
- change)
- Launch Kettle with remote debugging and attach Eclipse to the process
  - Configure the script you're using to start Spoon (Mac OS X uses the Data Integration 64-bit/Contents/Info.plist):
    - Add `-Xdebug`
    - Add `-Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005` to the JVM arguments
    - `spoon.sh` or `Spoon.bat`: Update line 158 and add the above JVM arguments to the `OPT` variable
    - Data Integration 64-bit/Contents/Info.plist: Update the VMOptions property and append the above JVM arguments
  - Start Spoon
  - Connect to the JVM with Eclipse's Remote Java Application debug configuration, using the socket attach method and port 5005 (as configured above)
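As an illustration of the spoon.sh change described above, appending the debug flags to the `OPT` variable might look like the sketch below; the exact line number and variable layout vary between Kettle versions, so adapt it to the script you actually have:

    # spoon.sh (near the line that assembles the JVM options, line 158 in the version referenced above)
    # append the remote debugging flags to the existing OPT variable
    OPT="$OPT -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005"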
Contributing Changes
We use the Fork + Pull Model to manage community contributions. Please fork the repository and submit a pull request with your changes.
Here's a sample git workflow to get you started:
- Install Git
- Set up Git to auto-correct line endings:
  `git config --global core.autocrlf input`
- Create a GitHub account
- Fork the project from https://github.com/pentaho/big-data-plugin
- Clone your repository:
  `git clone git@github.com:USERNAME/big-data-plugin.git`
- * Hack away *
- Stage and commit changes. Please make sure your commit messages include the JIRA case for your changes, in the format: [JIRA-CASE] Short description of fixes.
  `git add . && git commit`
- Push changes back up to GitHub:
  `git push`
- Submit a pull request from your project page. Please include a brief summary of what you changed and why.
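Putting the steps above together, an end-to-end session might look like the following sketch. USERNAME is a placeholder for your GitHub account and PDI-1234 is a hypothetical JIRA case; substitute your own values:

    # fork https://github.com/pentaho/big-data-plugin on GitHub first, then:
    git config --global core.autocrlf input          # normalize line endings
    git clone git@github.com:USERNAME/big-data-plugin.git
    cd big-data-plugin
    # ... hack away ...
    git add .
    git commit -m "[PDI-1234] Short description of fixes"   # include the JIRA case
    git push                                         # push the commits back to your fork
    # finally, submit a pull request from your fork's project page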
Git Resources
Here's a short list of resources to help you learn and master Git:
...
Documentation
- Kettle Plugin Development
- Getting started with the Pentaho Data Integration Java API
- Step Documentation
- Job Entry Documentation
- Hadoop Configuration
Community Plugins
Here's a list of known community plugins that fall into the "big data" category: