This change modifies the "assembly/" module to just copy the needed
dependencies to its build directory, and modifies the packaging
script to pick those up (and to remove duplicate jars packaged in the
examples module).

I also made some minor adjustments to the dependencies to remove some
test jars from the final packaging, and to remove jars that conflict with
each other when packaged separately (e.g. the servlet API).

Also note that this change restores Guava in applications' classpaths, even
though it is still shaded inside Spark. This is now needed for the Hadoop
libraries that are packaged with Spark, which are no longer processed by
the shade plugin.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #11796 from vanzin/SPARK-13579.
This is the assembly module for the Spark project.

It creates a single tar.gz file that includes all needed dependencies of the project
except for the org.apache.hadoop.* jars, which are expected to be available from the
deployed Hadoop cluster.

This module is off by default. To activate it, specify the profile on the command line:
  -Pbigtop-dist
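
For example, a full build of the assembly from the project root might look like
the following (a sketch only; the exact Maven flags depend on your environment,
and -DskipTests is just a common convenience):
  mvn -Pbigtop-dist -DskipTests package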

If you need to build an assembly for a different version of Hadoop, the
hadoop.version system property needs to be set, as in this example:
  -Dhadoop.version=2.0.6-alpha
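
Combining the profile with the Hadoop version override, a complete invocation
might look like this (again a sketch; use the Hadoop version that matches your
deployed cluster):
  mvn -Pbigtop-dist -Dhadoop.version=2.0.6-alpha -DskipTests package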