- Jul 21, 2013
Konstantin Boudnik authored
Matei Zaharia authored
Regression: default webui-port can't be set via command line "--webui-port" anymore
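
The commit above concerns the standalone daemons' --webui-port option. As a hypothetical illustration only (the ./run launcher and the spark.deploy.worker.Worker class name are assumptions about the 2013-era layout; only the --webui-port flag itself comes from the commit message), setting the worker web UI port from the command line looks roughly like:

    # Launch a standalone worker and pin its web UI to port 8081 instead of the default.
    # The flag is taken from the commit message above; the launcher and class name are assumed.
    ./run spark.deploy.worker.Worker --webui-port 8081 spark://master-host:7077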
- Jul 19, 2013
Konstantin Boudnik authored
Matei Zaharia authored
Move ML lib data generator files to util/
Matei Zaharia authored
Do not copy local jars given to SparkContext in yarn mode
Liang-Chi Hsieh authored
Liang-Chi Hsieh authored
Liang-Chi Hsieh authored
Liang-Chi Hsieh authored
Do not copy local jars given to SparkContext in yarn mode, since the context does not run on the local machine. This bug causes failures when jars cannot be found; example code (such as spark.examples.SparkPi) cannot run in yarn mode without this fix.
- Jul 18, 2013
Matei Zaharia authored
Updates to LogisticRegression
Shivaram Venkataraman authored
Shivaram Venkataraman authored
Matei Zaharia authored
[BUGFIX] Fix for sbt/sbt script SPARK_HOME setting
Matei Zaharia authored
fix a bug in build process that pulls in two versions of ASM.
Liang-Chi Hsieh authored
Matei Zaharia authored
Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817)
- Jul 17, 2013
Ubuntu authored
Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817)
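
For context on the SPARK-817 change above, a minimal sketch of the shebang convention it standardizes on; the script content below is illustrative and not taken from the repository:

    #!/usr/bin/env bash
    # Resolve bash through PATH instead of hard-coding /bin/bash, so the script
    # still works on systems where bash is installed elsewhere (for example,
    # BSDs that put it under /usr/local/bin).
    echo "running under: $BASH"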
Shivaram Venkataraman authored
Shivaram Venkataraman authored
Also ensure weights are initialized to a column vector.
Shivaram Venkataraman authored
multiple train methods.
Shivaram Venkataraman authored
Shivaram Venkataraman authored
Also move LogisticGradient to the LogisticRegression file and fix the unit tests log path.
ctn authored
In some environments, this command

    export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)

echoes two paths, one by the "cd ..", and one by the "pwd". Note the resulting erroneous -jar paths below:

    ctn@ubuntu:~/src/spark$ sbt/sbt
    + EXTRA_ARGS=
    + '[' '' '!=' '' ']'
    +++ dirname sbt/sbt
    ++ cd sbt/..
    ++ pwd
    + export 'SPARK_HOME=/home/ctn/src/spark /home/ctn/src/spark'
    + SPARK_HOME='/home/ctn/src/spark /home/ctn/src/spark'
    + export SPARK_TESTING=1
    + SPARK_TESTING=1
    + java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=128m -jar /home/ctn/src/spark /home/ctn/src/spark/sbt/sbt-launch-0.11.3-2.jar
    Error: Invalid or corrupt jarfile /home/ctn/src/spark

Committer: ctn <ctn@adatao.com>

On branch master
Changes to be committed:
- Send output of the "cd .." part to /dev/null

modified: sbt/sbt
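
A minimal sketch of the fix the commit describes (discarding the output of the "cd .." so that only pwd is captured); the exact redirection used in sbt/sbt may differ:

    # Problematic form: in some environments (typically when CDPATH is set) "cd"
    # prints the directory it changed to, so the command substitution captures the
    # path twice and SPARK_HOME becomes "/home/ctn/src/spark /home/ctn/src/spark".
    export SPARK_HOME=$(cd "$(dirname $0)/.."; pwd)

    # Fix described in the commit: send the cd output to /dev/null so only pwd's
    # output ends up in SPARK_HOME.
    export SPARK_HOME=$(cd "$(dirname $0)/.." > /dev/null; pwd)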
ctn authored
- Jul 16, 2013
Matei Zaharia authored
Dependency upgrade Akka 2.0.3 -> 2.0.5
Matei Zaharia authored
Conflicts: make-distribution.sh
Matei Zaharia authored
Matei Zaharia authored
Matei Zaharia authored
Matei Zaharia authored
Matei Zaharia authored
Matei Zaharia authored
Prashant Sharma authored
Matei Zaharia authored
Throw a more meaningful message when runJob is called to launch tasks on non-existent partitions.
Reynold Xin authored
- Jul 15, 2013
seanm authored
seanm authored
Matei Zaharia authored
Link to job UI from standalone deploy cluster web UI
Karen Feng authored
Karen Feng authored