Hitachi Vantara Pentaho Community Wiki

{include:COM:StyleInclude} *Welcome to the Big Data space in the Pentaho Community wiki.* This space is the community home and collection point for all things [Big Data|http://en.wikipedia.org/wiki/Big_data] within the Pentaho ecosystem. It is the place to find documentation, how-tos, best practices, use cases and other information about employing Pentaho technology as part of your overall Big Data strategy, and it is also where you can share your own information and experiences. We look forward to your participation and contribution!
{table:width=30%|align=right}
{tr}
{td}
{roundrect:width=100%|height=100%|bgcolor=#CADC99|title=Resources}
* *[Downloads|Community Edition Downloads|Download Released Builds]* - Get the code
* *[CI Builds|http://ci.pentaho.com/|Continuous Integration Server]* - Last Dev Build
* *[How-To's|How To's|Tutorials and samples]* - Get me started
* *[Community Home|http://community.pentaho.com|The rest of the Pentaho Community]*
{roundrect}
{td}
{tr}
{table}
*Expectations* \- If you are unfamiliar with open source, [this article|http://en.wikipedia.org/wiki/Open_source] is a good place to start. The open source community thrives on participation and cooperation. There are several communication channels available where people can help you, but they are not obligated to do so. You are responsible for your own success, which will require time, effort and a small amount of technical ability. If you prefer to have a relationship with a known vendor who will answer questions over the phone, help you during your evaluation and support you in production, please visit [www.pentaho.com|http://www.pentaho.com].

h1. Overview
Pentaho's Big Data story revolves around [Pentaho Data Integration, also known as Kettle|http://kettle.pentaho.com]. Kettle is a powerful Extraction, Transformation and Loading (ETL) engine that uses a metadata-driven approach. The Kettle engine provides data services for, and is embedded in, many of the applications within the Pentaho BI suite. Kettle also comes with a graphical, drag-and-drop design environment for designing and running Kettle Jobs and Transformations.
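
Because the Kettle engine is embeddable, a transformation designed in the graphical environment can also be launched from a few lines of Java. The sketch below is illustrative only: it assumes a transformation file named {{simple.ktr}} on the local filesystem and the PDI core libraries on the classpath, and exact class and method signatures can vary slightly between PDI versions.

{code:java}
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunSimpleTransformation {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle environment (core plugins, step and job entry registries).
        KettleEnvironment.init();

        // Load the transformation metadata from a .ktr file created in the design environment.
        // "simple.ktr" is a placeholder file name used only for this example.
        TransMeta transMeta = new TransMeta("simple.ktr");

        // Create the runtime transformation and execute it; each step runs in its own thread.
        Trans trans = new Trans(transMeta);
        trans.execute(null);        // no command-line arguments
        trans.waitUntilFinished();  // block until every step thread has completed

        if (trans.getErrors() > 0) {
            System.err.println("The transformation finished with errors.");
        }
    }
}
{code}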

{color:red}A quick 2-minute video of PDI in action{color}

*Kettle Transformations*
!Simple Transform.png|align=right, vspace=4!
A Kettle transformation consists of one or more _steps_ that perform core ETL work such as reading rows from a file or database, filtering rows, calculating new columns and sending the resulting data stream somewhere else. All steps in a transformation execute simultaneously (usually in separate threads), and data is passed from step to step in parallel. The data is operated on as a continuous stream without having to be fully read into memory or staged. The image to the right demonstrates a very simple Kettle transformation: read from a data source, apply a transformation (in this case a filter), and write the data stream to another data source.
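
To see this streaming behaviour from an embedding application, a row listener can be attached to one of the steps so each row is observed as it passes through, rather than after the whole data set has been loaded. The following is a hedged sketch under the same assumptions as the example above: the {{simple.ktr}} file and the step name "Filter rows" are placeholders, and API details may differ slightly between PDI versions.

{code:java}
import java.util.Arrays;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.row.RowMetaInterface;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.RowAdapter;

public class WatchRowsStream {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // "simple.ktr" and the step name "Filter rows" are placeholders for this example.
        TransMeta transMeta = new TransMeta("simple.ktr");
        Trans trans = new Trans(transMeta);

        // Prepare the execution first so the listener is in place before rows start flowing.
        trans.prepareExecution(null);

        // Print each row the filter step passes downstream, as it streams through the hop.
        trans.findRunThread("Filter rows").addRowListener(new RowAdapter() {
            @Override
            public void rowWrittenEvent(RowMetaInterface rowMeta, Object[] row) {
                System.out.println(Arrays.toString(Arrays.copyOf(row, rowMeta.size())));
            }
        });

        trans.startThreads();       // release the step threads
        trans.waitUntilFinished();  // block until the stream has been fully processed
    }
}
{code}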



{color:red}(IN WORK DM){color}


{note:title=This is a closed wiki space}
The only people with access are Pentaho employees and Dave Reinke (Chris will need to sign up for the wiki and send me his user ID).

This is a first attempt at an open source collaboration space for Big Data. It will eventually be open, but it is currently a work in progress and a place to put the use cases, demos, etc. I roughed out the structure and initial content on my own and am not attached to any of it. It is a round lump of clay, waiting to be molded by the brilliant minds of the Big Data team.
{note}