There are a number of different places from which JAR files originate during execution of a transformation in the AEL engine:
- The set of JARs in data-integration/lib
- JARs from spark-install/jars
- JARs from the Hadoop classpath
- AEL JARs in Karaf
- JARs from kettle plugins (OSGi and otherwise)
In some cases, the library versions contained in these locations can and will conflict, causing general problems, such as Spark libraries clashing with Hadoop libraries, as well as AEL-specific problems.
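As a concrete check, overlapping libraries can be spotted by comparing version-stripped JAR names across two of these directories. The sketch below is a heuristic (the version-stripping pattern is approximate, and the paths in the usage comment are illustrative, not a guaranteed install layout):

```shell
#!/usr/bin/env bash
# Rough sketch: list library base names that appear in both JAR directories.
overlapping_jars() {
  # Emit version-stripped JAR base names from a directory, one per line, sorted
  names() {
    for f in "$1"/*.jar; do
      [ -e "$f" ] && basename "$f"
    done | sed 's/-[0-9][0-9.]*.*\.jar$//' | sort -u
  }
  comm -12 <(names "$1") <(names "$2")
}

# Example (paths depend on your installation):
# overlapping_jars /opt/pentaho/data-integration/lib /opt/spark-install/jars
```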
Library conflicts have produced several bugs, both within AEL code and in Spark generally:
OSGi is valuable precisely because it addresses these sorts of problems, and fortunately most AEL execution happens within Karaf. The remaining points of vulnerability, however, are:
- Execution that occurs outside the engine and does not leverage Karaf (such as SparkWebSocketMain).
- The set of packages specified by org.osgi.framework.system.packages.extra (from karaf/etc/custom.properties). That is, the set of packages exposed from the framework classloader.
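To see exactly which packages a given installation exposes, the property can be parsed out of custom.properties. A sketch follows; it handles the backslash line continuations typical of Karaf properties files, and the path in the usage comment is illustrative:

```shell
#!/usr/bin/env bash
# Sketch: print the packages listed in org.osgi.framework.system.packages.extra,
# one per line, following backslash line continuations in the properties file.
extract_extra_packages() {
  awk '
    /^org\.osgi\.framework\.system\.packages\.extra/ { found = 1 }
    found {
      line = $0
      continued = (line ~ /\\$/)          # does this line continue onto the next?
      sub(/\\$/, "", line)                # drop the trailing backslash
      buf = buf line
      if (!continued) found = 0
    }
    END {
      sub(/^[^=]*=/, "", buf)             # drop the property key
      gsub(/[ \t]/, "", buf)              # drop whitespace
      n = split(buf, pkgs, ",")
      for (i = 1; i <= n; i++) if (pkgs[i] != "") print pkgs[i]
    }
  ' "$1"
}

# Example (path depends on your installation):
# extract_extra_packages /opt/pentaho/data-integration/system/karaf/etc/custom.properties
```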
As of Pentaho 8.0, running AEL with Spark 2.1.0, the set of JARs in conflict between spark-install/jars and data-integration/lib comprises the following 24 libraries:
Of these libraries, the set of packages exposed from the framework classloader boils down to the following:
Since these packages are provided via the framework classloader and are loaded from indeterminate library versions, there is inherent risk of undesired and unpredictable behavior.
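The practical effect is that, on a flat classpath, whichever copy of a library appears first in the search order wins. The sketch below reports which directory's copy would be picked up first for a given search order (directory paths in the usage comment are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: print the first JAR matching a base name across the given directories,
# in order -- on a flat classpath this is the copy that gets loaded.
first_on_classpath() {
  base="$1"; shift
  for dir in "$@"; do
    for f in "$dir/$base"-[0-9]*.jar "$dir/$base.jar"; do
      if [ -e "$f" ]; then
        echo "$f"
        return 0
      fi
    done
  done
  return 1   # no copy found anywhere
}

# Example: which copy of guava wins if data-integration/lib precedes spark-install/jars?
# first_on_classpath guava /opt/pentaho/data-integration/lib /opt/spark-install/jars
```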
To reduce this risk, follow these guidelines:
- Test against specific Spark and Hadoop versions, and recommend sticking to that tested set.
- Minimize usage of classes within the above packages. Update the list of potentially conflicting packages as new releases come out.
- Wherever possible, leverage classes injected via blueprint within AEL.
- Avoid usage of libraries that overlap with Hadoop / Spark libraries for any packages retrieved via the framework classloader.
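For the Blueprint guideline above, a minimal sketch of the pattern follows. All class and interface names here are hypothetical, not actual AEL classes: one bundle resolves the dependency once and publishes it as a service, so consumers receive an injected instance rather than loading the class through an ambiguous classpath.

```xml
<!-- Hypothetical producer bundle: com.example.ael names are illustrative -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <bean id="fileSystemFactory" class="com.example.ael.HadoopFileSystemFactory"/>
  <service ref="fileSystemFactory" interface="com.example.ael.FileSystemFactory"/>
</blueprint>

<!-- Hypothetical consumer bundle: the factory arrives by injection,
     never via a direct classpath lookup -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <reference id="fileSystemFactory" interface="com.example.ael.FileSystemFactory"/>
  <bean id="stepRunner" class="com.example.ael.StepRunner">
    <argument ref="fileSystemFactory"/>
  </bean>
</blueprint>
```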