Commit de95c57a authored by Kousuke Saruta, committed by Patrick Wendell

[SPARK-3787][BUILD] Assembly jar name is wrong when we build with sbt omitting -Dhadoop.version

This PR is an alternative solution for SPARK-3787. When we build with sbt, specifying a Hadoop profile but omitting the hadoop.version property, for example:

    sbt/sbt -Phadoop-2.2 assembly

the assembly jar name always falls back to the default Hadoop version (1.0.4).

When we build with Maven under the same conditions, the default version defined for each profile is used instead. For instance, if we build like:

    mvn -Phadoop-2.2 package

the jar name uses 2.2.0, the default Hadoop version of the hadoop-2.2 profile.
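
To make the sbt-side problem concrete, here is a small sketch (not the actual build file; mName, v, and the version strings are placeholders) of the old naming expression, which ignores the active profile entirely:

    // Sketch of the old naming logic in project/SparkBuild.scala (placeholders only).
    // The hadoop-2.2 profile never sets the JVM system property, so the getOrElse
    // fallback wins and the jar is named ...-hadoop1.0.4.jar despite -Phadoop-2.2.
    val mName = "spark-assembly"                 // placeholder module name
    val v = "1.2.0-SNAPSHOT"                     // placeholder Spark version
    val hadoopVersion = Option(System.getProperty("hadoop.version")).getOrElse("1.0.4")
    val jarName = mName + "-" + v + "-hadoop" + hadoopVersion + ".jar"
    // => "spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar" when -Dhadoop.version is omitted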

Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>

Closes #3046 from sarutak/fix-assembly-jarname-2 and squashes the following commits:

41ef90e [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname-2
50c8676 [Kousuke Saruta] Merge branch 'fix-assembly-jarname-2' of github.com:sarutak/spark into fix-assembly-jarname-2
52a1cd2 [Kousuke Saruta] Fixed conflicts
dd30768 [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname2
f1c90bb [Kousuke Saruta] Fixed SparkBuild.scala in order to read `hadoop.version` property from pom.xml
af6b100 [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname
c81806b [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname
ad1f96e [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname
b2318eb [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into fix-assembly-jarname
5fc1259 [Kousuke Saruta] Fixed typo.
eebbb7d [Kousuke Saruta] Fixed wrong jar name
parent 534f24b2
project/SparkBuild.scala:

@@ -15,6 +15,8 @@
  * limitations under the License.
  */
 
+import java.io.File
+
 import scala.util.Properties
 import scala.collection.JavaConversions._
 
@@ -23,7 +25,7 @@ import sbt.Classpaths.publishTask
 import sbt.Keys._
 import sbtunidoc.Plugin.genjavadocSettings
 import sbtunidoc.Plugin.UnidocKeys.unidocGenjavadocVersion
-import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
+import com.typesafe.sbt.pom.{loadEffectivePom, PomBuild, SbtPomKeys}
 import net.virtualvoid.sbt.graph.Plugin.graphSettings
 
 object BuildCommons {
@@ -112,6 +114,15 @@ object SparkBuild extends PomBuild {
 
   override val userPropertiesMap = System.getProperties.toMap
 
+  val pom = loadEffectivePom(new File("pom.xml"),
+    profiles = profiles,
+    userProps = userPropertiesMap)
+
+  if (System.getProperty("hadoop.version") == null) {
+    System.setProperty("hadoop.version",
+      pom.getProperties.get("hadoop.version").asInstanceOf[String])
+  }
+
   lazy val MavenCompile = config("m2r") extend(Compile)
   lazy val publishLocalBoth = TaskKey[Unit]("publish-local", "publish local for m2 and ivy")
 
@@ -297,8 +308,7 @@ object Assembly {
       // This must match the same name used in maven (see network/yarn/pom.xml)
       "spark-" + v + "-yarn-shuffle.jar"
     } else {
-      mName + "-" + v + "-hadoop" +
-        Option(System.getProperty("hadoop.version")).getOrElse("1.0.4") + ".jar"
+      mName + "-" + v + "-hadoop" + System.getProperty("hadoop.version") + ".jar"
    }
   },
   mergeStrategy in assembly := {
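
Putting the hunks together, the resolution order after this change is: honor an explicit -Dhadoop.version if present, otherwise fall back to the hadoop.version property resolved from pom.xml for the active profiles. A minimal sketch of that order, assuming the sbt-pom-reader loadEffectivePom API used in the diff above (the profiles value is a stand-in):

    import java.io.File
    import scala.collection.JavaConversions._
    import com.typesafe.sbt.pom.loadEffectivePom

    // Sketch only, mirroring the patched SparkBuild logic rather than replacing it.
    val userPropertiesMap = System.getProperties.toMap
    val profiles = Seq("hadoop-2.2")             // stand-in for the active Maven profiles

    // 1. An explicit -Dhadoop.version still wins.
    // 2. Otherwise read hadoop.version from the effective pom, so the hadoop-2.2
    //    profile's default (2.2.0) ends up in the assembly jar name.
    val hadoopVersion: String =
      Option(System.getProperty("hadoop.version")).getOrElse {
        val pom = loadEffectivePom(new File("pom.xml"),
          profiles = profiles,
          userProps = userPropertiesMap)
        pom.getProperties.get("hadoop.version").asInstanceOf[String]
      }

In the actual patch this value is written back with System.setProperty, so the jar-name rule in the Assembly object (last hunk) picks it up and the sbt assembly name matches what Maven produces for the same profile.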