  1. Nov 04, 2015
      [SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen) · 8aff36e9
      jerryshao authored
      This PR is based on the work of roji to support running Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot.
      
      For releases like HDP and others, the Spark executables are normally exposed as symlinks placed on `PATH`, but Spark's current scripts do not resolve the real path behind a symlink recursively, so Spark fails to execute when invoked through a symlink. This PR tries to solve the issue by finding the absolute path behind the symlink.
      
      Unlike this earlier PR (https://github.com/apache/spark/pull/2386), `readlink -f` is not used here because `-f` is not supported on Mac, so the path is resolved manually in a loop (see the sketch after this entry).
      
      I've tested on Mac and Linux (CentOS); it looks fine.
      
      This PR does not fix the scripts under the `sbin` folder; I'm not sure whether those need to be fixed as well.
      
      Please help to review; any comment is greatly appreciated.
      
      Author: jerryshao <sshao@hortonworks.com>
      Author: Shay Rojansky <roji@roji.org>
      
      Closes #8669 from jerryshao/SPARK-2960.
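
      A minimal sketch of the loop-based resolution described above, assuming Bash; the variable names and the final `SPARK_HOME` computation are illustrative, not the exact patch:

      ```bash
      # Resolve the real location of this script without `readlink -f`,
      # which is unavailable on macOS, by following symlinks in a loop.
      SOURCE="${BASH_SOURCE[0]}"
      while [ -h "$SOURCE" ]; do
        DIR="$(cd -P "$(dirname "$SOURCE")" && pwd)"   # directory holding the link
        SOURCE="$(readlink "$SOURCE")"                 # plain readlink works on macOS
        # a relative link target is resolved against the directory the link lives in
        [[ "$SOURCE" != /* ]] && SOURCE="$DIR/$SOURCE"
      done
      SPARK_HOME="$(cd -P "$(dirname "$SOURCE")/.." && pwd)"
      ```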
  2. Apr 13, 2015
      [Spark-4848] Allow different Worker configurations in standalone cluster · 435b8779
      Nathan Kronenfeld authored
      This refixes #3699 with the latest code.
      This fixes SPARK-4848
      
      I've changed the standalone cluster scripts to allow different workers to have different numbers of instances, with both the port and the web-UI port following along appropriately.
      
      I did this by moving the loop over instances from start-slaves and stop-slaves (on the master) to start-slave and stop-slave (on the worker); see the sketch after this entry.
      
      While I was at it, I changed SPARK_WORKER_PORT to work the same way as SPARK_WORKER_WEBUI_PORT, since the new methods work fine for both.
      
      Author: Nathan Kronenfeld <nkronenfeld@oculusinfo.com>
      
      Closes #5140 from nkronenfeld/feature/spark-4848 and squashes the following commits:
      
      cf5f47e [Nathan Kronenfeld] Merge remote branch 'upstream/master' into feature/spark-4848
      044ca6f [Nathan Kronenfeld] Documentation and formatting as requested by andrewor14
      d739640 [Nathan Kronenfeld] Move looping through instances from the master to the workers, so that each worker respects its own number of instances and web-ui port
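
      A rough sketch of the per-worker loop the commit describes, as it might appear in `start-slave.sh`; `start_instance` is a hypothetical helper and the port arithmetic is illustrative:

      ```bash
      # Each worker host consults its own SPARK_WORKER_INSTANCES and offsets the
      # worker port and web-UI port per instance, instead of the master looping.
      start_instance() {   # hypothetical helper: launch one Worker instance
        local num="$1"; shift
        echo "would launch worker #$num with: $*"   # placeholder for the real launch
      }

      if [ -z "$SPARK_WORKER_INSTANCES" ] || [ "$SPARK_WORKER_INSTANCES" -eq 1 ]; then
        start_instance 1 "$@"
      else
        for ((i = 0; i < SPARK_WORKER_INSTANCES; i++)); do
          start_instance $((i + 1)) "$@" \
            --port $((SPARK_WORKER_PORT + i)) \
            --webui-port $((SPARK_WORKER_WEBUI_PORT + i))
        done
      fi
      ```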
  3. Mar 25, 2014
      SPARK-1286: Make usage of spark-env.sh idempotent · 007a7334
      Aaron Davidson authored
      Various Spark scripts load spark-env.sh, sometimes more than once in the same process. This can cause variables that are appended to (SPARK_CLASSPATH, SPARK_REPL_OPTS) to grow, and it makes the precedence order for options specified in spark-env.sh less clear.
      
      One use case for the latter is that we want to set options from the command line of spark-shell, but those options would be overridden by a subsequent load of spark-env.sh. If we load spark-env.sh first and then set our command-line options, we can guarantee the correct precedence order.
      
      Note that we use SPARK_CONF_DIR if available, to support the sbin/ scripts, which always set this variable from sbin/spark-config.sh. Otherwise we default to ../conf/ as usual. (A sketch of the guarded load appears after this entry.)
      
      Author: Aaron Davidson <aaron@databricks.com>
      
      Closes #184 from aarondav/idem and squashes the following commits:
      
      e291f91 [Aaron Davidson] Use "private" variables in load-spark-env.sh
      8da8360 [Aaron Davidson] Add .sh extension to load-spark-env.sh
      93a2471 [Aaron Davidson] SPARK-1286: Make usage of spark-env.sh idempotent
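
      A minimal sketch of the load-once guard, assuming Bash; the guard variable name and paths are illustrative, not the exact patch:

      ```bash
      # Source spark-env.sh at most once per process by recording that it has
      # already been loaded; prefer SPARK_CONF_DIR when the sbin/ scripts set it.
      if [ -z "$SPARK_ENV_LOADED" ]; then
        export SPARK_ENV_LOADED=1

        parent_dir="$(cd "$(dirname "$0")/.." && pwd)"
        user_conf_dir="${SPARK_CONF_DIR:-"$parent_dir/conf"}"

        if [ -f "$user_conf_dir/spark-env.sh" ]; then
          set -a                           # export every variable the file defines
          . "$user_conf_dir/spark-env.sh"
          set +a
        fi
      fi
      ```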
  4. Mar 19, 2014
      Bundle tachyon: SPARK-1269 · a18ea00f
      Nick Lanham authored
      This should all work as expected with the current version of the Tachyon tarball (0.4.1). A sketch of the script-existence guard mentioned below appears after this entry.
      
      Author: Nick Lanham <nick@afternight.org>
      
      Closes #137 from nicklan/bundle-tachyon and squashes the following commits:
      
      2eee15b [Nick Lanham] Put back in exec, start tachyon first
      738ba23 [Nick Lanham] Move tachyon out of sbin
      f2f9bc6 [Nick Lanham] More checks for tachyon script
      111e8e1 [Nick Lanham] Only try tachyon operations if tachyon script exists
      0561574 [Nick Lanham] Copy over web resources so web interface can run
      4dc9809 [Nick Lanham] Update to tachyon 0.4.1
      0a1a20c [Nick Lanham] Add scripts using tachyon tarball
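
      A small sketch of that existence guard, with the bundle location and start-up left as labeled assumptions rather than the exact script:

      ```bash
      # Only attempt Tachyon operations when the bundled launcher is present.
      TACHYON_SCRIPT="${SPARK_HOME:-$(pwd)}/tachyon/bin/tachyon"   # assumed bundle location
      if [ -e "$TACHYON_SCRIPT" ]; then
        echo "Tachyon found at $TACHYON_SCRIPT; starting it before Spark"
        # "$TACHYON_SCRIPT" ...   # real start-up call omitted; depends on the 0.4.1 bundle
      else
        echo "Tachyon script not found; skipping Tachyon operations"
      fi
      ```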
  5. Jan 06, 2014
      Update stop-slaves.sh · dea4ba9d
      sproblvem authored
      The most recent version has changed the directory structure, but the script "sbin/stop-all.sh" was not updated accordingly. This mistake means "sbin/stop-all.sh" can't stop the slave nodes.
  6. Sep 22, 2013
  7. Sep 01, 2013
  8. Jul 16, 2013
  9. Mar 26, 2013
  10. Aug 02, 2012
  11. Aug 01, 2012