  1. Apr 18, 2014
    • Fixed broken pyspark shell. · 81a152c5
      Reynold Xin authored
      Author: Reynold Xin <rxin@apache.org>
      
      Closes #444 from rxin/pyspark and squashes the following commits:
      
      fc11356 [Reynold Xin] Made the PySpark shell version checking compatible with Python 2.6.
      571830b [Reynold Xin] Fixed broken pyspark shell.
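      A minimal sketch of the kind of check commit fc11356 describes (the shipped shell code may differ): sys.version_info only gained named attributes such as .major in Python 2.7, so a check that must also run under Python 2.6 indexes the tuple positionally.

      ```python
      import sys

      # sys.version_info became a named tuple only in Python 2.7; indexing it
      # positionally keeps the check working on Python 2.6 as well.
      if sys.version_info[0] != 2:
          print("The PySpark shell of this era requires Python 2")
          sys.exit(1)
      ```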
  2. Apr 16, 2014
  3. Apr 10, 2014
    • Set spark.executor.uri from environment variable (needed by Mesos) · 5cd11d51
      Ivan Wick authored
      The Mesos backend uses this property when setting up a slave process. It is set similarly in the Scala repl (org.apache.spark.repl.SparkILoop), but I couldn't find anything analogous for pyspark.
      
      Author: Ivan Wick <ivanwick+github@gmail.com>
      
      This patch had conflicts when merged, resolved by
      Committer: Matei Zaharia <matei@databricks.com>
      
      Closes #311 from ivanwick/master and squashes the following commits:
      
      da0c3e4 [Ivan Wick] Set spark.executor.uri from environment variable (needed by Mesos)
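      A hedged sketch of the change described above, assuming the environment variable is named SPARK_EXECUTOR_URI (the name the Scala side uses) and that the shell copies it into a system property before any SparkContext is created:

      ```python
      import os

      from pyspark import SparkContext

      # Mirror the Scala repl: copy the Mesos executor tarball location from
      # the environment into the property the Mesos backend reads.
      if os.environ.get("SPARK_EXECUTOR_URI"):
          SparkContext.setSystemProperty(
              "spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
      ```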
  4. Apr 07, 2014
    • SPARK-1099: Introduce local[*] mode to infer number of cores · 0307db0f
      Aaron Davidson authored
      This is now the default mode for running spark-shell and pyspark. It lets users running Spark for the first time see the performance benefit of multiple cores, without breaking backwards compatibility for users who pass "local" and expect exactly one core.
      
      Author: Aaron Davidson <aaron@databricks.com>
      
      Closes #182 from aarondav/110 and squashes the following commits:
      
      a88294c [Aaron Davidson] Rebased changes for new spark-shell
      a9f393e [Aaron Davidson] SPARK-1099: Introduce local[*] mode to infer number of cores
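      For illustration, a small PySpark snippet using the new master string; the core count that local[*] resolves to should match what the OS reports, e.g. via multiprocessing.cpu_count():

      ```python
      from multiprocessing import cpu_count

      from pyspark import SparkContext

      # "local[*]" sizes the local scheduler to every available core, whereas
      # the legacy "local" master string still means exactly one core.
      sc = SparkContext(master="local[*]", appName="local-star-demo")
      print("local[*] on this machine ~ %d cores" % cpu_count())
      sc.stop()
      ```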
  5. Feb 08, 2014
    • Merge pull request #542 from markhamstra/versionBump. Closes #542. · c2341c92
      Mark Hamstra authored
      Version number to 1.0.0-SNAPSHOT
      
      Since 0.9.0-incubating is done and out the door, we shouldn't be building 0.9.0-incubating-SNAPSHOT anymore.
      
      @pwendell
      
      Author: Mark Hamstra <markhamstra@gmail.com>
      
      == Merge branch commits ==
      
      commit 1b00a8a7c1a7f251b4bb3774b84b9e64758eaa71
      Author: Mark Hamstra <markhamstra@gmail.com>
      Date:   Wed Feb 5 09:30:32 2014 -0800
      
          Version number to 1.0.0-SNAPSHOT
  6. Jan 02, 2014
  7. Dec 24, 2013
  8. Sep 24, 2013
  9. Sep 07, 2013
  10. Sep 06, 2013
  11. Sep 01, 2013
  12. Aug 12, 2013
  13. Jul 16, 2013
  14. Jan 30, 2013
  15. Jan 20, 2013
  16. Jan 01, 2013
  17. Dec 28, 2012
    • Simplify PySpark installation. · 665466df
      Josh Rosen authored
      - Bundle Py4J binaries, since Py4J is hard to install
      - Use Spark's `run` script to launch the Py4J gateway, inheriting the settings in spark-env.sh
      
      With these changes, (hopefully) nothing more than
      running `sbt/sbt package` will be necessary to run
      PySpark.
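      A minimal illustration of the Py4J pattern this change relies on (not the actual PySpark launch code): a Python process attaching to a JVM-side py4j.GatewayServer that something else, such as a launch script, has already started.

      ```python
      from py4j.java_gateway import JavaGateway

      # Connects to a GatewayServer already listening on Py4J's default port;
      # this assumes the JVM side was launched separately, e.g. by a script.
      gateway = JavaGateway()
      # gateway.jvm is the entry point for calling into the JVM's classes.
      print(gateway.jvm.java.lang.System.getProperty("java.version"))
      ```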
  18. Dec 27, 2012
  19. Oct 19, 2012