  1. Oct 07, 2015
    • [SPARK-10952] Only add hive to classpath if HIVE_HOME is set. · 9672602c
      Kevin Cox authored
      Currently, if it isn't set, the script scans `/lib/*` and adds every directory to the
      classpath, which makes the environment too large, and every command called
      afterwards fails.
      
      Author: Kevin Cox <kevincox@kevincox.ca>
      
      Closes #8994 from kevincox/kevincox-only-add-hive-to-classpath-if-var-is-set.
      9672602c
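The guard described above can be sketched in shell roughly as follows. This is an illustration only, not the actual Spark script; `spark-core.jar` and the function name are placeholders:

```shell
# Only add Hive jars to the classpath when HIVE_HOME is explicitly set;
# unconditionally scanning lib directories bloats the environment until
# later exec calls fail.
build_classpath() {
  local cp="spark-core.jar"
  if [ -n "${HIVE_HOME:-}" ]; then
    local jar
    for jar in "${HIVE_HOME}"/lib/*.jar; do
      # skip the literal glob pattern when no jars match
      if [ -e "$jar" ]; then cp="$cp:$jar"; fi
    done
  fi
  echo "$cp"
}
```

With `HIVE_HOME` unset, the classpath stays small instead of picking up every directory on the system.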
  2. Feb 09, 2015
  3. Dec 29, 2014
    • Added LICENSE Header to build/mvn, build/sbt and sbt/sbt · 4cef05e1
      Kousuke Saruta authored
      Recently build/mvn and build/sbt were added, and sbt/sbt was changed, but there are no license headers. Should we add license headers to these scripts?
      If this isn't right, please let me know and I will correct it.
      
      This PR doesn't affect the behavior of Spark, so I didn't file a JIRA issue.
      
      Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>
      
      Closes #3817 from sarutak/add-license-header and squashes the following commits:
      
      1abc972 [Kousuke Saruta] Added LICENSE Header
      4cef05e1
  4. Dec 27, 2014
    • [SPARK-4501][Core] - Create build/mvn to automatically download maven/zinc/scalac · a3e51cc9
      Brennon York authored
      Creates a top-level script (as `build/mvn`) to automatically download zinc and the specific version of Scala needed to easily build Spark. It will also download and install Maven if the user doesn't already have it, and all packages are hosted under the `build/` directory. Tested on both Linux and OS X; both work. All commands pass through to the Maven binary, so it acts exactly as a traditional Maven call would.
      
      Author: Brennon York <brennon.york@capitalone.com>
      
      Closes #3707 from brennonyork/SPARK-4501 and squashes the following commits:
      
      0e5a0e4 [Brennon York] minor incorrect doc verbiage (with -> this)
      9b79e38 [Brennon York] fixed merge conflicts with dev/run-tests, properly quoted args in sbt/sbt, fixed bug where relative paths would fail if passed in from build/mvn
      d2d41b6 [Brennon York] added blurb about leveraging zinc with build/mvn
      b979c58 [Brennon York] updated the merge conflict
      c5634de [Brennon York] updated documentation to overview build/mvn, updated all points where sbt/sbt was referenced with build/sbt
      b8437ba [Brennon York] set progress bars for curl and wget when not run on jenkins, no progress bar when run on jenkins, moved sbt script to build/sbt, wrote stub and warning under sbt/sbt which calls build/sbt, modified build/sbt to use the correct directory, fixed bug in build/sbt-launch-lib.bash to correctly pull the sbt version
      be11317 [Brennon York] added switch to silence download progress only if AMPLAB_JENKINS is set
      28d0a99 [Brennon York] updated to remove the python dependency, uses grep instead
      7e785a6 [Brennon York] added silent and quiet flags to curl and wget respectively, added single echo output to denote start of a download if download is needed
      14a5da0 [Brennon York] removed unnecessary zinc output on startup
      1af4a94 [Brennon York] fixed bug with uppercase vs lowercase variable
      3e8b9b3 [Brennon York] updated to properly only restart zinc if it was freshly installed
      a680d12 [Brennon York] Added comments to functions and tested various mvn calls
      bb8cc9d [Brennon York] removed package files
      ef017e6 [Brennon York] removed OS complexities, setup generic install_app call, removed extra file complexities, removed help, removed forced install (defaults now), removed double-dash from cli
      07bf018 [Brennon York] Updated to specifically handle pulling down the correct scala version
      f914dea [Brennon York] Beginning final portions of localized scala home
      69c4e44 [Brennon York] working linux and osx installers for purely local mvn build
      4a1609c [Brennon York] finalizing working linux install for maven to local ./build/apache-maven folder
      cbfcc68 [Brennon York] Changed the default sbt/sbt to build/sbt and added a build/mvn which will automatically download, install, and execute maven with zinc for easier build capability
      a3e51cc9
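The `install_app` approach the squash list mentions can be sketched as below. This is a hypothetical reconstruction, not the real `build/mvn`; the URL handling is illustrative, and the actual script pins exact tool versions:

```shell
# Fetch a build tool into build/ only when it is missing; silence the
# progress bar when AMPLAB_JENKINS is set, show it otherwise.
install_app() {
  local url="$1" dest="$2"
  if [ -f "$dest" ]; then return 0; fi   # already installed: nothing to do
  mkdir -p "$(dirname "$dest")"
  if [ -n "${AMPLAB_JENKINS:-}" ]; then
    curl -sSL -o "$dest" "$url"          # quiet on Jenkins
  else
    curl -L --progress-bar -o "$dest" "$url"
  fi
}
```

Because the download is skipped when the destination exists, repeated `build/mvn` invocations start immediately.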
  5. Dec 03, 2014
    • [SPARK-4701] Typo in sbt/sbt · 96786e3e
      Masayoshi TSUZUKI authored
      Fixed a typo.
      
      Author: Masayoshi TSUZUKI <tsudukim@oss.nttdata.co.jp>
      
      Closes #3560 from tsudukim/feature/SPARK-4701 and squashes the following commits:
      
      ed2a3f1 [Masayoshi TSUZUKI] Another whitespace position error.
      1af3a35 [Masayoshi TSUZUKI] [SPARK-4701] Typo in sbt/sbt
      96786e3e
  6. Sep 08, 2014
    • SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within. · e16a8e7d
      Prashant Sharma authored
      ...
      
      Tested! TBH, it isn't a great idea to have a directory with spaces in it, because Emacs doesn't like it, then Hadoop doesn't like it, and so on...
      
      Author: Prashant Sharma <prashant.s@imaginea.com>
      
      Closes #2229 from ScrapCodes/SPARK-3337/quoting-shell-scripts and squashes the following commits:
      
      d4ad660 [Prashant Sharma] SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
      e16a8e7d
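The quoting discipline the commit refers to can be demonstrated with a small sketch (the function names here are illustrative, not from the Spark scripts):

```shell
# "$@" preserves each argument exactly, while an unquoted $@ re-splits on
# whitespace -- which is what broke installs in directories with spaces.
count_args() { echo "$#"; }
quoted_call()   { count_args "$@"; }   # paranoid: keeps argument boundaries
unquoted_call() { count_args $@; }     # the old bug: re-splits on spaces
```

Passing `"/opt/spark home/bin"` through `quoted_call` keeps it as one argument; `unquoted_call` splits it in two.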
  7. Jul 10, 2014
    • [SPARK-1776] Have Spark's SBT build read dependencies from Maven. · 628932b8
      Prashant Sharma authored
      This patch introduces the new way of working while retaining the existing ways of doing things.
      
      For example, the build instruction for YARN in Maven is
      `mvn -Pyarn -Phadoop-2.2 clean package -DskipTests`
      in sbt it can become
      `MAVEN_PROFILES="yarn, hadoop-2.2" sbt/sbt clean assembly`
      Also supports
      `sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 clean assembly`
      
      Author: Prashant Sharma <prashant.s@imaginea.com>
      Author: Patrick Wendell <pwendell@gmail.com>
      
      Closes #772 from ScrapCodes/sbt-maven and squashes the following commits:
      
      a8ac951 [Prashant Sharma] Updated sbt version.
      62b09bb [Prashant Sharma] Improvements.
      fa6221d [Prashant Sharma] Excluding sql from mima
      4b8875e [Prashant Sharma] Sbt assembly no longer builds tools by default.
      72651ca [Prashant Sharma] Addresses code review comments.
      acab73d [Prashant Sharma] Revert "Small fix to run-examples script."
      ac4312c [Prashant Sharma] Revert "minor fix"
      6af91ac [Prashant Sharma] Ported oldDeps back. + fixes issues with prev commit.
      65cf06c [Prashant Sharma] Servlet API jars mess up with the other servlet jars on the class path.
      446768e [Prashant Sharma] minor fix
      89b9777 [Prashant Sharma] Merge conflicts
      d0a02f2 [Prashant Sharma] Bumped up pom versions, Since the build now depends on pom it is better updated there. + general cleanups.
      dccc8ac [Prashant Sharma] updated mima to check against 1.0
      a49c61b [Prashant Sharma] Fix for tools jar
      a2f5ae1 [Prashant Sharma] Fixes a bug in dependencies.
      cf88758 [Prashant Sharma] cleanup
      9439ea3 [Prashant Sharma] Small fix to run-examples script.
      96cea1f [Prashant Sharma] SPARK-1776 Have Spark's SBT build read dependencies from Maven.
      36efa62 [Patrick Wendell] Set project name in pom files and added eclipse/intellij plugins.
      4973dbd [Patrick Wendell] Example build using pom reader.
      628932b8
  8. Mar 26, 2014
    • [SQL] Un-ignore a test that is now passing. · 32cbdfd2
      Michael Armbrust authored
      Add golden answer for aforementioned test.
      
      Also, fix golden test generation from sbt/sbt by setting the classpath correctly.
      
      Author: Michael Armbrust <michael@databricks.com>
      
      Closes #244 from marmbrus/partTest and squashes the following commits:
      
      37a33c9 [Michael Armbrust] Un-ignore a test that is now passing, add golden answer for aforementioned test.  Fix golden test generation from sbt/sbt.
      32cbdfd2
  9. Mar 02, 2014
    • Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic. · 012bd5fb
      Michael Armbrust authored
      This allows developers to pass options (such as -D) to sbt. I also modified the SparkBuild to ensure Spark-specific properties are propagated to forked test JVMs.
      
      Author: Michael Armbrust <michael@databricks.com>
      
      Closes #14 from marmbrus/sbtScripts and squashes the following commits:
      
      c008b18 [Michael Armbrust] Merge the old sbt-launch-lib.bash with the new sbt-launcher jar downloading logic.
      012bd5fb
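The option-forwarding behavior the commit describes can be sketched like this. It is a hypothetical simplification: the real launcher script handles more flag forms than just `-D`:

```shell
# Split command-line arguments: -D options are forwarded to the JVM,
# everything else is treated as an sbt command.
parse_sbt_args() {
  java_args=()
  sbt_commands=()
  local arg
  for arg in "$@"; do
    case "$arg" in
      -D*) java_args+=("$arg") ;;
      *)   sbt_commands+=("$arg") ;;
    esac
  done
}
```

So `sbt/sbt -Dspark.testing=1 clean compile` would hand `-Dspark.testing=1` to the JVM and `clean compile` to sbt.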
  10. Feb 08, 2014
    • Merge pull request #454 from jey/atomic-sbt-download. Closes #454. · 78050805
      Jey Kottalam authored
      Make sbt download an atomic operation
      
      Modifies the `sbt/sbt` script to gracefully recover when a previous invocation died in the middle of downloading the SBT jar.
      
      Author: Jey Kottalam <jey@cs.berkeley.edu>
      
      == Merge branch commits ==
      
      commit 6c600eb434a2f3e7d70b67831aeebde9b5c0f43b
      Author: Jey Kottalam <jey@cs.berkeley.edu>
      Date:   Fri Jan 17 10:43:54 2014 -0800
      
          Make sbt download an atomic operation
      78050805
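The atomic-download pattern behind this fix can be sketched as follows; `cp` stands in for the actual `curl`/`wget` call, and the function name is illustrative:

```shell
# Write to a temporary file and mv it into place: rename within one
# filesystem is atomic, so a killed download never leaves a truncated
# jar at the final path.
atomic_fetch() {
  local src="$1" dest="$2"
  local tmp="${dest}.part.$$"
  cp "$src" "$tmp" && mv "$tmp" "$dest"
}
```

A later invocation either sees no file (and retries the download) or a complete one, never a partial jar.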
  11. Jan 09, 2014
    • Small typo fix · 49cbf48b
      Patrick Wendell authored
      49cbf48b
    • Don't delegate to the user's `sbt`. · 4d2e388e
      Patrick Wendell authored
      This changes our `sbt/sbt` script to not delegate to the user's `sbt`
      even if it is present. If users already have sbt installed and they
      want to use their own sbt, we'd expect them to just call sbt directly
      from within Spark. We no longer set any environment variables or anything
      from this script, so they should just launch sbt directly on their own.
      
      There are a number of hard-to-debug issues which can come from the
      current approach. One is if the user is unaware of an existing sbt
      installation, and now, without explanation, their build breaks because
      they haven't configured options correctly (such as permgen size)
      within their sbt. Another is if the user has a much older version
      of sbt hanging around, in which case some of the older versions
      don't actually work well when newer versions of sbt are specified
      in the build file (reported by @marmbrus). A third is if the user
      has done some other modification to their sbt script, such as
      setting it to delegate to sbt/sbt in Spark, and this causes
      that to break (also reported by @marmbrus).
      
      So to keep things simple let's just avoid this path and
      remove it. Any user who already has sbt and wants to build
      spark with it should be able to understand easily how to do it.
      4d2e388e
  12. Jan 07, 2014
  13. Jan 06, 2014
  14. Jan 04, 2014
  15. Jan 02, 2014
  16. Dec 15, 2013
    • Fix Cygwin support in several scripts. · f8ba89da
      Josh Rosen authored
      This allows the spark-shell, spark-class, run-example, make-distribution.sh,
      and ./bin/start-* scripts to work under Cygwin.  Note that this doesn't
      support PySpark under Cygwin, since that requires many additional `cygpath`
      calls from within Python and will be non-trivial to implement.
      
      This PR was inspired by, and subsumes, #253 (so close #253 after this is merged).
      f8ba89da
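The core of Cygwin support in shell scripts like these is path translation, which can be sketched as below (a minimal illustration, not the actual Spark change, and assuming the helper name `to_jvm_path`):

```shell
# Detect Cygwin from uname and run paths through cygpath -w before handing
# them to the JVM, which expects Windows-style paths; on other platforms
# the path passes through unchanged.
to_jvm_path() {
  case "$(uname)" in
    CYGWIN*) cygpath -w "$1" ;;   # e.g. /home/u/x -> C:\cygwin\home\u\x
    *)       echo "$1" ;;
  esac
}
```

On Linux or OS X the function is a no-op, so the same script works everywhere.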
  17. Sep 01, 2013
  18. Aug 29, 2013
  19. Aug 23, 2013
  20. Aug 21, 2013
  21. Jul 17, 2013
    • Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817) · 88a0823c
      Ubuntu authored
      Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817)
      88a0823c
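The reason `#!/usr/bin/env bash` is the portable form is that `env` searches `$PATH` for bash instead of hard-coding `/bin/bash`, which is not where bash lives on every system. A minimal sketch of the same lookup `env` performs:

```shell
# command -v resolves bash via $PATH, exactly as /usr/bin/env does in a
# shebang line; the result varies by system instead of being hard-coded.
resolve_bash() { command -v bash; }
```

On systems where bash is in `/usr/local/bin`, a `#!/bin/bash` shebang fails while the `env` form still works.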
    • [BUGFIX] Fix for sbt/sbt script SPARK_HOME setting · a1d2c343
      ctn authored
      In some environments, this command
      
          export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)
      
      echoes two paths: one from the `cd ..` and one from the `pwd`. Note the resulting
      erroneous -jar paths below:
      
          ctn@ubuntu:~/src/spark$ sbt/sbt
          + EXTRA_ARGS=
          + '[' '' '!=' '' ']'
          +++ dirname sbt/sbt
          ++ cd sbt/..
          ++ pwd
          + export 'SPARK_HOME=/home/ctn/src/spark
          /home/ctn/src/spark'
          + SPARK_HOME='/home/ctn/src/spark
          /home/ctn/src/spark'
          + export SPARK_TESTING=1
          + SPARK_TESTING=1
          + java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar
          Error: Invalid or corrupt jarfile /home/ctn/src/spark
      
      Committer: ctn <ctn@adatao.com>
      
      On branch master
      Changes to be committed:
      
      - Send output of the "cd .." part to /dev/null
      	modified:   sbt/sbt
      a1d2c343
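The fix can be sketched as below (an illustration of the pattern, not the literal sbt/sbt diff; the function name is hypothetical):

```shell
# In environments where `cd` itself prints the resolved directory (e.g.
# with CDPATH set), $(cd ...; pwd) captures two lines. Redirecting cd's
# stdout to /dev/null leaves only pwd's single line.
spark_home_of() {
  # $1 plays the role of $0, the path of the invoked sbt/sbt script
  (cd "$(dirname "$1")/.." > /dev/null; pwd)
}
```

The subshell keeps the `cd` from affecting the caller, and the redirect guarantees exactly one path comes back.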
  22. Jul 16, 2013
  23. Jul 13, 2013
  24. Jun 08, 2013
  25. Apr 16, 2013
  26. Feb 10, 2013
  27. Jan 27, 2013