  Dec 16, 2015
    • [SPARK-12390] Clean up unused serializer parameter in BlockManager · 97678ede
      Andrew Or authored
      No change in functionality is intended. This only changes internal API.
      
      Author: Andrew Or <andrew@databricks.com>
      
      Closes #10343 from andrewor14/clean-bm-serializer.
    • [SPARK-12386][CORE] Fix NPE when spark.executor.port is set. · d1508dd9
      Marcelo Vanzin authored
      Author: Marcelo Vanzin <vanzin@cloudera.com>
      
      Closes #10339 from vanzin/SPARK-12386.
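The commit message gives no detail on the root cause, but the general shape of a fix like this is making a code path tolerate an optional setting instead of dereferencing a missing value. A minimal, purely illustrative Python sketch of that pattern (the `get_port` helper and the plain-dict `conf` are hypothetical, not Spark's API):

```python
# Hypothetical sketch: read an optional numeric setting without blowing up
# when it is absent or present in string form.

def get_port(conf, key="spark.executor.port"):
    """Return the configured port as an int, or None when unset."""
    raw = conf.get(key)   # dict.get returns None instead of raising
    if raw is None:
        return None
    return int(raw)

print(get_port({"spark.executor.port": "7077"}))  # 7077
print(get_port({}))                               # None
```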
    • [SPARK-12186][WEB UI] Send the complete request URI including the query string when redirecting. · fdb38227
      Rohit Agarwal authored
      Author: Rohit Agarwal <rohita@qubole.com>
      
      Closes #10180 from mindprince/SPARK-12186.
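The idea behind the fix can be sketched as rebuilding the redirect target from the full request URI so the query string is carried over rather than dropped. A minimal Python sketch using the standard library; the `redirect_target` helper and the example host are assumptions for illustration, not Spark's code:

```python
# Sketch: when redirecting, preserve both the path and the query string of
# the original request instead of redirecting to the bare path.
from urllib.parse import urlsplit, urlunsplit

def redirect_target(request_uri: str, new_host: str) -> str:
    """Rebuild the redirect URL, keeping the path and query string."""
    parts = urlsplit(request_uri)
    return urlunsplit(("http", new_host, parts.path, parts.query, ""))

print(redirect_target("/history/app-1?attempt=2", "server:18080"))
# http://server:18080/history/app-1?attempt=2
```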
    • [SPARK-12365][CORE] Use ShutdownHookManager where Runtime.getRuntime.addShutdownHook() is called · f590178d
      tedyu authored
      SPARK-9886 fixed ExternalBlockStore.scala
      
      This PR fixes the remaining references to Runtime.getRuntime.addShutdownHook().
      
      Author: tedyu <yuzhihong@gmail.com>
      
      Closes #10325 from ted-yu/master.
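The pattern being applied here is centralizing shutdown logic in one manager that runs hooks in priority order, rather than registering ad-hoc JVM shutdown hooks at each call site. A Python sketch of that pattern; the class, method names, and priorities below are illustrative, not Spark's `ShutdownHookManager` API:

```python
# Sketch: one manager owns all shutdown hooks, registers a single real hook
# with the runtime, and runs the rest in priority order with isolation.
import atexit

class ShutdownHookManager:
    def __init__(self):
        self._hooks = []                 # list of (priority, fn)
        atexit.register(self._run_all)   # one real hook for the whole app

    def add(self, priority, fn):
        self._hooks.append((priority, fn))

    def _run_all(self):
        # Higher-priority hooks run first; one failing hook cannot
        # prevent the others from running.
        for _, fn in sorted(self._hooks, key=lambda p: -p[0]):
            try:
                fn()
            except Exception as e:
                print(f"shutdown hook failed: {e}")

manager = ShutdownHookManager()
order = []
manager.add(10, lambda: order.append("clean temp dirs"))
manager.add(50, lambda: order.append("stop executors"))
manager._run_all()   # normally invoked automatically at exit
print(order)         # ['stop executors', 'clean temp dirs']
```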
    • [SPARK-10248][CORE] track exceptions in dagscheduler event loop in tests · 38d9795a
      Imran Rashid authored
      `DAGSchedulerEventLoop` normally only logs errors, so that it can continue processing events from other jobs. However, this is not desirable in tests: a test should be able to detect any exception easily, and should not silently succeed when one occurs.
      
      This was suggested by mateiz on https://github.com/apache/spark/pull/7699.  It may have already turned up an issue in "zero split job".
      
      Author: Imran Rashid <irashid@cloudera.com>
      
      Closes #8466 from squito/SPARK-10248.
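The testing idea can be sketched as a test-only subclass of the event loop that records every exception instead of merely logging it, so assertions can fail on them. The class and method names below are illustrative Python, not Spark's Scala `EventLoop`:

```python
# Sketch: production loop logs and keeps going; the test variant records
# exceptions so the test harness can surface them.
import threading, queue

class EventLoop:
    def __init__(self):
        self._queue = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def on_receive(self, event):
        raise NotImplementedError

    def on_error(self, exc):
        # Production behavior: log and continue with further events.
        print(f"error in event loop: {exc}")

    def post(self, event):
        self._queue.put(event)

    def start(self):
        self._thread.start()

    def _run(self):
        while True:
            event = self._queue.get()
            if event is None:        # sentinel: stop the loop
                return
            try:
                self.on_receive(event)
            except Exception as exc:
                self.on_error(exc)

class RecordingEventLoop(EventLoop):
    """Test variant: remember every exception instead of only logging it."""
    def __init__(self):
        super().__init__()
        self.failures = []

    def on_error(self, exc):
        self.failures.append(exc)

class FailingLoop(RecordingEventLoop):
    def on_receive(self, event):
        if event == "bad":
            raise ValueError("boom")

loop = FailingLoop()
loop.start()
for e in ("ok", "bad", None):
    loop.post(e)
loop._thread.join(timeout=2)
print([str(e) for e in loop.failures])   # ['boom']
```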
    • MAINTENANCE: Automated closing of pull requests. · ce5fd400
      Andrew Or authored
      This commit exists to close the following pull requests on GitHub:
      
      Closes #1217 (requested by ankurdave, srowen)
      Closes #4650 (requested by andrewor14)
      Closes #5307 (requested by vanzin)
      Closes #5664 (requested by andrewor14)
      Closes #5713 (requested by marmbrus)
      Closes #5722 (requested by andrewor14)
      Closes #6685 (requested by srowen)
      Closes #7074 (requested by srowen)
      Closes #7119 (requested by andrewor14)
      Closes #7997 (requested by jkbradley)
      Closes #8292 (requested by srowen)
      Closes #8975 (requested by andrewor14, vanzin)
      Closes #8980 (requested by andrewor14, davies)
    • [MINOR] Add missing interpolation in NettyRPCEnv · 861549ac
      Andrew Or authored
      ```
      Exception in thread "main" org.apache.spark.rpc.RpcTimeoutException:
      Cannot receive any reply in ${timeout.duration}. This timeout is controlled by spark.rpc.askTimeout
      	at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:48)
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:63)
      	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
      	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
      ```
      
      Author: Andrew Or <andrew@databricks.com>
      
      Closes #10334 from andrewor14/rpc-typo.
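The bug is a Scala string that was meant to be interpolated but lacked the `s` prefix, so users saw the literal `${timeout.duration}` in the error message instead of the actual value. The exact Python analogue is a forgotten `f` prefix on an f-string:

```python
# Analogue of the bug: without the f-prefix the placeholder is emitted
# literally instead of being substituted.
duration = "120 seconds"

broken = "Cannot receive any reply in {duration}."    # missing f-prefix
fixed  = f"Cannot receive any reply in {duration}."

print(broken)  # Cannot receive any reply in {duration}.
print(fixed)   # Cannot receive any reply in 120 seconds.
```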
    • [SPARK-12380] [PYSPARK] use SQLContext.getOrCreate in mllib · 27b98e99
      Davies Liu authored
      MLlib should use SQLContext.getOrCreate() instead of creating a new SQLContext.
      
      Author: Davies Liu <davies@databricks.com>
      
      Closes #10338 from davies/create_context.
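The pattern this commit applies is reusing one shared context via a get-or-create accessor rather than constructing a fresh one at every call site. A minimal Python sketch of that singleton pattern; `SQLContext` here is a stand-in class, not the real PySpark one:

```python
# Sketch: get_or_create returns the one shared instance, creating it only
# on first use, so repeated callers all see the same context.
class SQLContext:
    _instance = None

    @classmethod
    def get_or_create(cls):
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

a = SQLContext.get_or_create()
b = SQLContext.get_or_create()
print(a is b)  # True
```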