Authors: Alex Meadows, Adrián Sergio Pulvirenti and María Carina Roldán
Paperback: 462 pages [ 235mm x 191mm ]
Release Date: December 2013
Publisher: Packt Publishing
Pentaho Data Integration Cookbook Second Edition is written in a cookbook format, presenting examples in the style of recipes. This allows you to go directly to your topic of interest, or follow topics throughout a chapter to gain in-depth knowledge.
Pentaho Data Integration is the premier open source ETL tool, providing easy, fast, and effective ways to move and transform data. While PDI is relatively easy to pick up, it can take time to learn the best practices so you can design your transformations to process data faster and more efficiently. If you are looking for clear and practical recipes that will advance your skills in Kettle, then this is the book for you.
Pentaho Data Integration Cookbook Second Edition explains Kettle's features in detail and provides easy-to-follow recipes on file management and databases that can throw a curveball to even the most experienced developers.
Pentaho Data Integration Cookbook Second Edition provides updates to the material covered in the first edition as well as new recipes that show you how to use some of the key features of PDI that have been released since the publication of the first edition. You will learn how to work with various data sources, including relational and NoSQL databases, flat files, XML files, and more. The book also covers best practices that you can take advantage of immediately within your own solutions, such as building reusable code, ensuring data quality, and using plugins that add even more functionality.
Pentaho Data Integration Cookbook Second Edition provides recipes that cover the common pitfalls even seasoned developers can face. You will also learn how to use various data sources in Kettle, as well as its advanced features.
What you will learn from this book
- Configure Kettle to connect to relational and NoSQL databases and web applications like Salesforce, explore them, and perform CRUD operations
- Utilize plugins to get even more functionality into your Kettle jobs
- Embed Java code in your transformations to gain performance and flexibility
- Execute and reuse transformations and jobs in different ways
- Integrate Kettle with Pentaho Reporting, Pentaho Dashboards, Community Data Access, and the Pentaho BI Platform
- Interface Kettle with cloud-based applications
- Learn how to control and manipulate data flows
- Utilize Kettle to create datasets for analytics
Table of Contents (full version)
- Chapter 1: Working with Databases
- Chapter 2: Reading and Writing Files
- Chapter 3: Working with Big Data and Cloud Sources
- Chapter 4: Manipulating XML Structures
- Chapter 5: File Management
- Chapter 6: Looking for Data
- Chapter 7: Understanding and Optimizing Data Flows
- Chapter 8: Executing and Re-using Jobs and Transformations
- Chapter 9: Integrating Kettle and the Pentaho Suite
- Chapter 10: Getting the Most Out of Kettle
- Chapter 11: Utilizing Visualization Tools in Kettle
- Chapter 12: Data Analytics
- Appendix A: Data Structures
- Appendix B: References