  1. Jun 20, 2014
    • Andrew Ash's avatar
      SPARK-1902 Silence stacktrace from logs when doing port failover to port n+1 · 08d0aca7
      Andrew Ash authored
      Before:
      
      ```
      14/06/08 23:58:23 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
      java.net.BindException: Address already in use
      	at sun.nio.ch.Net.bind0(Native Method)
      	at sun.nio.ch.Net.bind(Net.java:444)
      	at sun.nio.ch.Net.bind(Net.java:436)
      	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
      	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
      	at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
      	at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
      	at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
      	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
      	at org.eclipse.jetty.server.Server.doStart(Server.java:293)
      	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply$mcV$sp(JettyUtils.scala:192)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
      	at scala.util.Try$.apply(Try.scala:161)
      	at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:191)
      	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:205)
      	at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
      	at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
      	at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
      	at $line3.$read$$iwC$$iwC.<init>(<console>:8)
      	at $line3.$read$$iwC.<init>(<console>:14)
      	at $line3.$read.<init>(<console>:16)
      	at $line3.$read$.<init>(<console>:20)
      	at $line3.$read$.<clinit>(<console>)
      	at $line3.$eval$.<init>(<console>:7)
      	at $line3.$eval$.<clinit>(<console>)
      	at $line3.$eval.$print(<console>)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
      	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
      	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
      	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
      	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
      	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
      	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
      	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
      	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
      	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
      	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
      	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
      	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
      	at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
      	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
      	at org.apache.spark.repl.Main$.main(Main.scala:31)
      	at org.apache.spark.repl.Main.main(Main.scala)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      14/06/08 23:58:23 WARN AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@7439e55a: java.net.BindException: Address already in use
      java.net.BindException: Address already in use
      	at sun.nio.ch.Net.bind0(Native Method)
      	at sun.nio.ch.Net.bind(Net.java:444)
      	at sun.nio.ch.Net.bind(Net.java:436)
      	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
      	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
      	at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
      	at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
      	at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
      	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
      	at org.eclipse.jetty.server.Server.doStart(Server.java:293)
      	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply$mcV$sp(JettyUtils.scala:192)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
      	at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
      	at scala.util.Try$.apply(Try.scala:161)
      	at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:191)
      	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:205)
      	at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
      	at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
      	at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
      	at $line3.$read$$iwC$$iwC.<init>(<console>:8)
      	at $line3.$read$$iwC.<init>(<console>:14)
      	at $line3.$read.<init>(<console>:16)
      	at $line3.$read$.<init>(<console>:20)
      	at $line3.$read$.<clinit>(<console>)
      	at $line3.$eval$.<init>(<console>:7)
      	at $line3.$eval$.<clinit>(<console>)
      	at $line3.$eval.$print(<console>)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
      	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
      	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
      	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
      	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
      	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
      	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
      	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
      	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
      	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
      	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
      	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
      	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
      	at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
      	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
      	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
      	at org.apache.spark.repl.Main$.main(Main.scala:31)
      	at org.apache.spark.repl.Main.main(Main.scala)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
      14/06/08 23:58:23 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
      14/06/08 23:58:23 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
      14/06/08 23:58:23 INFO SparkUI: Started SparkUI at http://aash-mbp.local:4041
      ```
      
      After:
      ```
      14/06/09 00:04:12 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
      14/06/09 00:04:12 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
      14/06/09 00:04:12 INFO Server: jetty-8.y.z-SNAPSHOT
      14/06/09 00:04:12 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
      14/06/09 00:04:12 INFO SparkUI: Started SparkUI at http://aash-mbp.local:4041
      ```
      
      Lengthy logging comes from this line of code in Jetty: http://grepcode.com/file/repo1.maven.org/maven2/org.eclipse.jetty.aggregate/jetty-all/9.1.3.v20140225/org/eclipse/jetty/util/component/AbstractLifeCycle.java#210
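      The fix suppresses that stacktrace during port failover. A minimal hypothetical sketch of a retry-on-BindException loop (illustrative only, not the actual JettyUtils implementation):
      
      ```scala
      import java.net.BindException
      import scala.util.{Failure, Success, Try}
      
      // Try to bind on startPort; on BindException, retry on port n+1
      // (up to maxRetries) instead of surfacing the full stacktrace.
      def startWithRetry(startPort: Int, maxRetries: Int)(bind: Int => Unit): Int = {
        def attempt(port: Int, triesLeft: Int): Int =
          Try(bind(port)) match {
            case Success(_) => port
            case Failure(e: BindException) if triesLeft > 0 =>
              // Log a one-line notice rather than the whole trace.
              println(s"Failed to create UI at port, $port. Trying again.")
              attempt(port + 1, triesLeft - 1)
            case Failure(e) => throw e
          }
        attempt(startPort, maxRetries)
      }
      ```
      
      The one-line notice matches the "After" output above; the real change additionally adjusts log levels so Jetty's lifecycle logger stays quiet.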
      
      Author: Andrew Ash <andrew@andrewash.com>
      
      Closes #1019 from ash211/SPARK-1902 and squashes the following commits:
      
      0dd02f7 [Andrew Ash] Leave old org.eclipse.jetty silencing in place
      1e2866b [Andrew Ash] Address CR comments
      9d85eed [Andrew Ash] SPARK-1902 Silence stacktrace from logs when doing port failover to port n+1
      08d0aca7
  2. May 12, 2014
    • Andrew Or's avatar
      [SPARK-1753 / 1773 / 1814] Update outdated docs for spark-submit, YARN, standalone etc. · 2ffd1eaf
      Andrew Or authored
      YARN
      - SparkPi was updated to not take in master as an argument; we should update the docs to reflect that.
      - The default YARN build guide should be in maven, not sbt.
      - This PR also adds a paragraph on steps to debug a YARN application.
      
      Standalone
      - Emphasize spark-submit more. Right now it's one small paragraph preceding the legacy way of launching through `org.apache.spark.deploy.Client`.
      - The way the old docs describe setting configurations / environment variables is outdated. This needs to reflect the recent Spark configuration changes.
      
      In general, this PR also adds a little more documentation on the new spark-shell, spark-submit, spark-defaults.conf etc here and there.
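      For reference, a launch of the kind the updated docs recommend over the legacy `org.apache.spark.deploy.Client` path (the jar path and master URL below are placeholders; note that SparkPi no longer takes the master as an argument):
      
      ```
      ./bin/spark-submit \
        --class org.apache.spark.examples.SparkPi \
        --master spark://master.example.com:7077 \
        path/to/spark-examples.jar 100
      ```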
      
      Author: Andrew Or <andrewor14@gmail.com>
      
      Closes #701 from andrewor14/yarn-docs and squashes the following commits:
      
      e2c2312 [Andrew Or] Merge in changes in #752 (SPARK-1814)
      25cfe7b [Andrew Or] Merge in the warning from SPARK-1753
      a8c39c5 [Andrew Or] Minor changes
      336bbd9 [Andrew Or] Tabs -> spaces
      4d9d8f7 [Andrew Or] Merge branch 'master' of github.com:apache/spark into yarn-docs
      041017a [Andrew Or] Abstract Spark submit documentation to cluster-overview.html
      3cc0649 [Andrew Or] Detail how to set configurations + remove legacy instructions
      5b7140a [Andrew Or] Merge branch 'master' of github.com:apache/spark into yarn-docs
      85a51fc [Andrew Or] Update run-example, spark-shell, configuration etc.
      c10e8c7 [Andrew Or] Merge branch 'master' of github.com:apache/spark into yarn-docs
      381fe32 [Andrew Or] Update docs for standalone mode
      757c184 [Andrew Or] Add a note about the requirements for the debugging trick
      f8ca990 [Andrew Or] Merge branch 'master' of github.com:apache/spark into yarn-docs
      924f04c [Andrew Or] Revert addition of --deploy-mode
      d5fe17b [Andrew Or] Update the YARN docs
      2ffd1eaf
    • Andrew Or's avatar
      [SPARK-1780] Non-existent SPARK_DAEMON_OPTS is lurking around · ba96bb3d
      Andrew Or authored
      What they really mean is SPARK_DAEMON_***JAVA***_OPTS
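      A hypothetical `conf/spark-env.sh` entry using the variable the daemons actually read (the ZooKeeper option is just an example value):
      
      ```
      # Read by the standalone master/worker daemons; SPARK_DAEMON_OPTS is not.
      SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER"
      ```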
      
      Author: Andrew Or <andrewor14@gmail.com>
      
      Closes #751 from andrewor14/spark-daemon-opts and squashes the following commits:
      
      70c41f9 [Andrew Or] SPARK_DAEMON_OPTS -> SPARK_DAEMON_JAVA_OPTS
      ba96bb3d
  3. Apr 22, 2014
    • Patrick Wendell's avatar
      Assorted clean-up for Spark-on-YARN. · 995fdc96
      Patrick Wendell authored
      In particular when HADOOP_CONF_DIR is not specified.
      
      Author: Patrick Wendell <pwendell@gmail.com>
      
      Closes #488 from pwendell/hadoop-cleanup and squashes the following commits:
      
      fe95f13 [Patrick Wendell] Changes based on Andrew's feedback
      18d09c1 [Patrick Wendell] Review comments from Andrew
      17929cc [Patrick Wendell] Assorted clean-up for Spark-on-YARN.
      995fdc96
  4. Apr 21, 2014
    • Patrick Wendell's avatar
      Clean up and simplify Spark configuration · fb98488f
      Patrick Wendell authored
      Over time, as we've added more deployment modes, the user-facing configuration options in Spark have gotten a bit unwieldy. Going forward we'll advise all users to run `spark-submit` to launch applications. This is a WIP patch, but it makes the following improvements:
      
      1. Improved `spark-env.sh.template` which was missing a lot of things users now set in that file.
      2. Removes the shipping of SPARK_CLASSPATH, SPARK_JAVA_OPTS, and SPARK_LIBRARY_PATH to the executors on the cluster. This was an ugly hack. Instead it introduces config variables spark.executor.extraJavaOpts, spark.executor.extraLibraryPath, and spark.executor.extraClassPath.
      3. Adds ability to set these same variables for the driver using `spark-submit`.
      4. Allows you to load system properties from a `spark-defaults.conf` file when running `spark-submit`. This will allow setting both SparkConf options and other system properties utilized by `spark-submit`.
      5. Made `SPARK_LOCAL_IP` an environment variable rather than a SparkConf property. This is more consistent with it being set on each node.
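      A hypothetical `conf/spark-defaults.conf` illustrating the new executor properties (names taken from the description above; the values are placeholders, not defaults):
      
      ```
      spark.executor.extraJavaOpts      -XX:+UseG1GC
      spark.executor.extraClassPath     /opt/libs/custom.jar
      spark.executor.extraLibraryPath   /opt/native
      ```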
      
      Author: Patrick Wendell <pwendell@gmail.com>
      
      Closes #299 from pwendell/config-cleanup and squashes the following commits:
      
      127f301 [Patrick Wendell] Improvements to testing
      a006464 [Patrick Wendell] Moving properties file template.
      b4b496c [Patrick Wendell] spark-defaults.properties -> spark-defaults.conf
      0086939 [Patrick Wendell] Minor style fixes
      af09e3e [Patrick Wendell] Mention config file in docs and clean-up docs
      b16e6a2 [Patrick Wendell] Cleanup of spark-submit script and Scala quick start guide
      af0adf7 [Patrick Wendell] Automatically add user jar
      a56b125 [Patrick Wendell] Responses to Tom's review
      d50c388 [Patrick Wendell] Merge remote-tracking branch 'apache/master' into config-cleanup
      a762901 [Patrick Wendell] Fixing test failures
      ffa00fe [Patrick Wendell] Review feedback
      fda0301 [Patrick Wendell] Note
      308f1f6 [Patrick Wendell] Properly escape quotes and other clean-up for YARN
      e83cd8f [Patrick Wendell] Changes to allow re-use of test applications
      be42f35 [Patrick Wendell] Handle case where SPARK_HOME is not set
      c2a2909 [Patrick Wendell] Test compile fixes
      4ee6f9d [Patrick Wendell] Making YARN doc changes consistent
      afc9ed8 [Patrick Wendell] Cleaning up line limits and two compile errors.
      b08893b [Patrick Wendell] Additional improvements.
      ace4ead [Patrick Wendell] Responses to review feedback.
      b72d183 [Patrick Wendell] Review feedback for spark env file
      46555c1 [Patrick Wendell] Review feedback and import clean-ups
      437aed1 [Patrick Wendell] Small fix
      761ebcd [Patrick Wendell] Library path and classpath for drivers
      7cc70e4 [Patrick Wendell] Clean up terminology inside of spark-env script
      5b0ba8e [Patrick Wendell] Don't ship executor envs
      84cc5e5 [Patrick Wendell] Small clean-up
      1f75238 [Patrick Wendell] SPARK_JAVA_OPTS --> SPARK_MASTER_OPTS for master settings
      4982331 [Patrick Wendell] Remove SPARK_LIBRARY_PATH
      6eaf7d0 [Patrick Wendell] executorJavaOpts
      0faa3b6 [Patrick Wendell] Stash of adding config options in submit script and YARN
      ac2d65e [Patrick Wendell] Change spark.local.dir -> SPARK_LOCAL_DIRS
      fb98488f
  5. Mar 01, 2014
  6. Feb 22, 2014
    • CodingCat's avatar
      [SPARK-1041] remove dead code in start script, remind user to set that in spark-env.sh · 437b62fc
      CodingCat authored
      The lines in start-master.sh and start-slave.sh no longer work.
      
      In EC2, the host name has changed, e.g.
      
      ubuntu@ip-172-31-36-93:~$ hostname
      ip-172-31-36-93
      
      Also, the URL to fetch the public DNS name has changed, e.g.
      
      ubuntu@ip-172-31-36-93:~$ wget -q -O - http://instance-data.ec2.internal/latest/meta-data/public-hostname
      ubuntu@ip-172-31-36-93:~$  (returns nothing)
      
      Since we have the spark-ec2 project, we don't need such EC2-specific lines here; instead, users only need to set the value in spark-env.sh.
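      A hypothetical `conf/spark-env.sh` entry replacing the removed auto-detection (the variable is the standalone master's bind address; the value is a placeholder):
      
      ```
      # Set the master host explicitly instead of relying on EC2 metadata lookups.
      SPARK_MASTER_IP=ip-172-31-36-93
      ```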
      
      Author: CodingCat <zhunansjtu@gmail.com>
      
      Closes #588 from CodingCat/deadcode_in_sbin and squashes the following commits:
      
      e4236e0 [CodingCat] remove dead code in start script, remind user set that in spark-env.sh
      437b62fc
  7. Feb 14, 2014
    • Andrew Ash's avatar
      Typo: Standlone -> Standalone · eec4bd1a
      Andrew Ash authored
      Author: Andrew Ash <andrew@andrewash.com>
      
      Closes #601 from ash211/typo and squashes the following commits:
      
      9cd43ac [Andrew Ash] Change docs references to metrics.properties, not metrics.conf
      3813ff1 [Andrew Ash] Typo: mulitcast -> multicast
      873bd2f [Andrew Ash] Typo: Standlone -> Standalone
      eec4bd1a
  8. Jan 10, 2014
  9. Jan 09, 2014
  10. Jan 07, 2014
  11. Jan 05, 2014
  12. Nov 19, 2013
  13. Nov 11, 2013
  14. Nov 08, 2013
    • Russell Cardullo's avatar
      Add graphite sink for metrics · ef85a51f
      Russell Cardullo authored
      This adds a metrics sink for graphite.  The sink must
      be configured with the host and port of a graphite node
      and optionally may be configured with a prefix that will
      be prepended to all metrics that are sent to graphite.
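      Hypothetical `conf/metrics.properties` entries wiring up the sink as described (host, port, and prefix values are placeholders):
      
      ```
      *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
      *.sink.graphite.host=graphite.example.com
      *.sink.graphite.port=2003
      # Optional: prepended to every metric name sent to graphite.
      *.sink.graphite.prefix=spark
      ```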
      ef85a51f
  15. Oct 13, 2013
  16. Oct 11, 2013
  17. Sep 08, 2013
  18. Aug 31, 2013
  19. Aug 21, 2013
  20. Aug 15, 2013
  21. Aug 12, 2013
  22. Aug 09, 2013
  23. Aug 01, 2013
  24. Jul 24, 2013