{div:style=position: absolute; top: 100px; right: 100px}[!http://ci.pentaho.com/job/pentaho-big-data-plugin/lastBuild/buildStatus!|http://ci.pentaho.com/job/pentaho-big-data-plugin/]{div}

h1. Pentaho Big Data Plugin

The Pentaho Big Data Plugin project provides support for an ever-expanding Big Data community within the Pentaho ecosystem. It is a plugin for the Pentaho Kettle engine that can be used within Pentaho Data Integration (Kettle), Pentaho Reporting, and the Pentaho BI Platform.

h2. Pentaho Big Data Plugin Features

This project contains the implementations for connecting to, and performing operations on, the following:
- *Pentaho MapReduce*: Visually design MapReduce jobs as Kettle transformations
- *HDFS File Operations*: Read from and write to HDFS directly from any Kettle step, made possible by the ubiquitous use of Apache VFS throughout Kettle
- *Data Sources*
-- *Apache Hive*: JDBC connectivity (see the sketch after this list)
-- *Apache HBase*: Native RPC connectivity for reading/writing
-- *Cassandra*: Native RPC connectivity for reading/writing
-- *MongoDB*: Native RPC connectivity for reading/writing
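
To make the Hive item above concrete, here is a minimal sketch of plain JDBC connectivity to Hive outside of Kettle. It assumes a HiveServer instance listening on localhost:10000, a database named default, and the Hive JDBC driver on the classpath; the host, port, database, and query are illustrative assumptions, not project defaults.

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
  public static void main(String[] args) throws Exception {
    // Register the Hive JDBC driver (HiveServer1-era driver class).
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

    // Host, port, and database are assumptions; adjust for your cluster.
    Connection conn = DriverManager.getConnection(
        "jdbc:hive://localhost:10000/default", "", "");
    try {
      Statement stmt = conn.createStatement();
      // List the tables visible to this connection.
      ResultSet rs = stmt.executeQuery("SHOW TABLES");
      while (rs.next()) {
        System.out.println(rs.getString(1));
      }
    } finally {
      conn.close();
    }
  }
}
{code}

Within Kettle itself, the same connectivity is configured as a Hive database connection in the UI rather than written as JDBC code by hand.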

h1. Key Links

- SVN Repository: [svn://source.pentaho.org/svnkettleroot/pentaho-big-data-plugin]
- Documentation: (TODO: add dev doc page and aggregate links to wiki pages such as [Cassandra Input|http://wiki.pentaho.com/display/EAI/Cassandra+Input], [Cassandra Output|http://wiki.pentaho.com/display/EAI/Cassandra+Output], [MongoDB Input|http://wiki.pentaho.com/display/EAI/MongoDB+Input], [MongoDB Output|http://wiki.pentaho.com/display/EAI/MongoDB+Output])
-- Link to Kettle plugin development
- CI: [pentaho-big-data-plugin|http://ci.pentaho.com/job/pentaho-big-data-plugin]
- Download: the latest development build, [pentaho-big-data-plugin-TRUNK-SNAPSHOT.tar.gz|http://ci.pentaho.com/job/pentaho-big-data-plugin/lastSuccessfulBuild/artifact/pentaho-big-data-plugin/dist/pentaho-big-data-plugin-TRUNK-SNAPSHOT.tar.gz]

h1. Community and where to find help

The [Big Data Forum|http://forums.pentaho.com/forumdisplay.php?301-Big-Data] exists for both users and developers. The community also manages the #pentaho IRC channel on irc.freenode.net.

h1. Quick Start: Building the project

The Pentaho Big Data Plugin is built with [Apache Ant|http://ant.apache.org/] and uses [Apache Ivy|http://ant.apache.org/ivy/] for dependency management. All you need to build the project is Ant 1.8.0 or newer; the build scripts will download Ivy if it is not already installed.

{code}svn co svn://source.pentaho.org/svnkettleroot/pentaho-big-data-plugin/trunk pentaho-big-data-plugin
cd pentaho-big-data-plugin
ant{code}
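
A successful build produces the plugin package under dist/, the same artifact that CI publishes as pentaho-big-data-plugin-TRUNK-SNAPSHOT.tar.gz (see the download link above). To try a local build, the archive is typically extracted into your Kettle installation's plugins directory; the exact target path depends on your Kettle install and is an assumption here.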

h1. Developing with Eclipse

We recommend [Apache IvyDE|http://ant.apache.org/ivy/ivyde/] to manage your Ivy dependencies within Eclipse.

# Import pentaho-big-data-plugin into Eclipse
# Resolve the project using IvyDE

If IvyDE is not an option, you can manually add the jars from lib/ and libswt/ to your classpath. This project, like all other Pentaho projects, uses the open-source [Subfloor|http://code.google.com/p/subfloor/] Ant build framework. Running the following targets will resolve dependencies and configure the Eclipse project to reference the required libraries:

{code}ant resolve create-dot-classpath{code}

Then import or refresh the project in Eclipse and add the SWT libraries for your architecture, e.g. for Mac OS X x64:
!osx-swt-jars.png|border=1!