  1. Mar 03, 2016
    • [MINOR] Fix typos in comments and testcase name of code · 941b270b
      Dongjoon Hyun authored
      ## What changes were proposed in this pull request?
      
      This PR fixes typos in code comments and in a test case name.
      
      ## How was this patch tested?
      
      Manual.
      
      Author: Dongjoon Hyun <dongjoon@apache.org>
      
      Closes #11481 from dongjoon-hyun/minor_fix_typos_in_code.
    • [SPARK-13583][CORE][STREAMING] Remove unused imports and add checkstyle rule · b5f02d67
      Dongjoon Hyun authored
      ## What changes were proposed in this pull request?
      
      After SPARK-6990, `dev/lint-java` keeps Java code healthy and helps PR review by saving much time.
      This issue aims to remove unused imports from Java/Scala code and to add an `UnusedImports` checkstyle rule to help developers.
      
      ## How was this patch tested?
      ```
      ./dev/lint-java
      ./build/sbt compile
      ```
      
      Author: Dongjoon Hyun <dongjoon@apache.org>
      
      Closes #11438 from dongjoon-hyun/SPARK-13583.
  2. Feb 09, 2016
    • [SPARK-13086][SHELL] Use the Scala REPL settings, to enable things like `-i file`. · e30121af
      Iulian Dragos authored
      Now:
      
      ```
      $ bin/spark-shell -i test.scala
      NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly.
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel).
      16/01/29 17:37:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      16/01/29 17:37:39 INFO Main: Created spark context..
      Spark context available as sc (master = local[*], app id = local-1454085459000).
      16/01/29 17:37:39 INFO Main: Created sql context..
      SQL context available as sqlContext.
      Loading test.scala...
      hello
      
      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
            /_/
      
      Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
      Type in expressions to have them evaluated.
      Type :help for more information.
      ```
      
      Author: Iulian Dragos <jaguarul@gmail.com>
      
      Closes #10984 from dragos/issue/repl-eval-file.
  3. Jan 30, 2016
    • [SPARK-6363][BUILD] Make Scala 2.11 the default Scala version · 289373b2
      Josh Rosen authored
      This patch changes Spark's build to make Scala 2.11 the default Scala version. To be clear, this does not mean that Spark will stop supporting Scala 2.10: users will still be able to compile Spark for Scala 2.10 by following the instructions on the "Building Spark" page; however, it does mean that Scala 2.11 will be the default Scala version used by our CI builds (including pull request builds).
      
      The Scala 2.11 compiler is faster than 2.10, so I think we'll be able to look forward to a slight speedup in our CI builds (it looks like it's about 2X faster for the Maven compile-only builds, for instance).
      
      After this patch is merged, I'll update Jenkins to add new compile-only jobs to ensure that Scala 2.10 compilation doesn't break.
      
      Author: Josh Rosen <joshrosen@databricks.com>
      
      Closes #10608 from JoshRosen/SPARK-6363.
  4. Dec 24, 2015
    • [SPARK-12311][CORE] Restore previous value of "os.arch" property in test... · 39204661
      Kazuaki Ishizaki authored
      [SPARK-12311][CORE] Restore previous value of "os.arch" property in test suites after forcing to set specific value to "os.arch" property
      
      Restore the original value of the os.arch property after each test.
      
      Since some tests force a specific value for the os.arch property, we need to restore the original value afterward.
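      
      A minimal sketch of that save-and-restore pattern, assuming a ScalaTest `FunSuite` (the suite name and forced value are illustrative, not the actual tests):
      
      ```scala
      import org.scalatest.FunSuite
      
      class OsArchSuite extends FunSuite {
        test("behavior under a forced os.arch") {
          val original = System.getProperty("os.arch")
          try {
            System.setProperty("os.arch", "amd64") // value forced by the test
            // ... assertions that depend on os.arch ...
          } finally {
            // restore (or clear) the previous value so later suites are unaffected
            if (original == null) System.clearProperty("os.arch")
            else System.setProperty("os.arch", original)
          }
        }
      }
      ```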
      
      Author: Kazuaki Ishizaki <ishizaki@jp.ibm.com>
      
      Closes #10289 from kiszk/SPARK-12311.
  5. Dec 18, 2015
    • [SPARK-12350][CORE] Don't log errors when requested stream is not found. · 27828182
      Marcelo Vanzin authored
      If a client requests a non-existent stream, just send a failure message
      back, without logging any error on the server side (since it's not a
      server error).
      
      On the executor side, avoid error logs by translating any errors during
      transfer to a `ClassNotFoundException`, so that loading the class is
      retried on the parent class loader. This can mask IO errors during
      transmission, but the most common cause is that the class is not
      served by the remote end.
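      
      A rough sketch of that executor-side translation (not the actual ExecutorClassLoader code; `fetch` stands in for the real transfer):
      
      ```scala
      // Any failure while fetching class bytes is rethrown as
      // ClassNotFoundException so that the parent loader is retried.
      class RemoteClassLoader(parent: ClassLoader, fetch: String => Array[Byte])
        extends ClassLoader(parent) {
      
        override def findClass(name: String): Class[_] =
          try {
            val bytes = fetch(name) // e.g. an RPC read from the driver
            defineClass(name, bytes, 0, bytes.length)
          } catch {
            case e: Exception =>
              // may hide a genuine IO error, but the common case is that
              // the remote end simply does not serve this class
              throw new ClassNotFoundException(name, e)
          }
      }
      ```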
      
      Author: Marcelo Vanzin <vanzin@cloudera.com>
      
      Closes #10337 from vanzin/SPARK-12350.
  6. Dec 10, 2015
    • [SPARK-11563][CORE][REPL] Use RpcEnv to transfer REPL-generated classes. · 4a46b885
      Marcelo Vanzin authored
      This avoids bringing up yet another HTTP server on the driver, and
      instead reuses the file server already managed by the driver's
      RpcEnv. As a bonus, the repl now inherits the security features of
      the network library.
      
      There's also a small change to create the directory for storing classes
      under the root temp dir for the application (instead of directly
      under java.io.tmpdir).
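      
      A hedged sketch of the wiring, with API names recalled from the Spark 2.x internals (`RpcEnv` is private to Spark, so this only illustrates the shape of the change):
      
      ```scala
      import java.io.File
      import org.apache.spark.SparkConf
      import org.apache.spark.rpc.RpcEnv
      
      // Serve the REPL's class output directory through the driver's
      // existing RpcEnv file server instead of a dedicated HTTP server.
      def publishReplClasses(rpcEnv: RpcEnv, conf: SparkConf, outputDir: File): Unit = {
        val classUri = rpcEnv.fileServer.addDirectory("/classes", outputDir)
        // Executors read this URI and fetch REPL-generated classes over
        // the network library, inheriting its security features.
        conf.set("spark.repl.class.uri", classUri)
      }
      ```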
      
      Author: Marcelo Vanzin <vanzin@cloudera.com>
      
      Closes #9923 from vanzin/SPARK-11563.
    • [SPARK-11832][CORE] Process arguments in spark-shell for Scala 2.11 · db516524
      Jakob Odersky authored
      Process arguments passed to the spark-shell. Fixes running the spark-shell from within a build environment.
      
      Author: Jakob Odersky <jodersky@gmail.com>
      
      Closes #9824 from jodersky/shell-2.11.
  7. Nov 24, 2015
    • [SPARK-11929][CORE] Make the repl log4j configuration override the root logger. · e6dd2374
      Marcelo Vanzin authored
      In the default Spark distribution, there are currently two separate
      log4j config files, with different default values for the root logger,
      so that when running the shell you have a different default log level.
      This makes the shell more usable, since the logs don't overwhelm the
      output.
      
      But if you install a custom log4j.properties, you lose that, because
      then it's going to be used no matter whether you're running a regular
      app or the shell.
      
      With this change, the overriding of the log level is done differently:
      the log level configured for the repl's main class (org.apache.spark.repl.Main)
      is used as the root logger's level when running the shell, defaulting
      to WARN if it's not set explicitly.
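      
      A hedged sketch of that override logic using the log4j 1.x API (the method is illustrative, not the exact Spark code):
      
      ```scala
      import org.apache.log4j.{Level, LogManager}
      
      def setShellLogLevel(): Unit = {
        // level configured for the repl's main class, if any
        val replLogger = LogManager.getLogger("org.apache.spark.repl.Main")
        val level = Option(replLogger.getLevel).getOrElse(Level.WARN)
        LogManager.getRootLogger.setLevel(level)
      }
      ```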
      
      In a somewhat related change, the shell output about the "sc" variable
      was changed a bit to contain a little more useful information about
      the application, since when the root logger's log level is WARN, that
      information is never shown to the user.
      
      Author: Marcelo Vanzin <vanzin@cloudera.com>
      
      Closes #9816 from vanzin/shell-logging.
    • [SPARK-11818][REPL] Fix ExecutorClassLoader to lookup resources from … · be9dd155
      Jungtaek Lim authored
      …parent class loader
      
      Without the patch, two additional tests in ExecutorClassLoaderSuite fail:
      
      - "resource from parent"
      - "resources from parent"
      
      A detailed explanation is here: https://issues.apache.org/jira/browse/SPARK-11818?focusedCommentId=15011202&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15011202
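      
      Since ExecutorClassLoader does not follow the standard parent-delegation model, the fix forwards resource lookups explicitly; roughly (a sketch, not the exact code):
      
      ```scala
      import java.net.URL
      import java.util.Enumeration
      
      class DelegatingLoader(parent: ClassLoader) extends ClassLoader(parent) {
        // Forward resource lookups to the parent so classpath resources
        // remain visible from REPL-defined code.
        override def getResource(name: String): URL =
          getParent.getResource(name)
        override def getResources(name: String): Enumeration[URL] =
          getParent.getResources(name)
      }
      ```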
      
      Author: Jungtaek Lim <kabhwan@gmail.com>
      
      Closes #9812 from HeartSaVioR/SPARK-11818.
  8. Sep 09, 2015
    • [SPARK-10227] fatal warnings with sbt on Scala 2.11 · c1bc4f43
      Luc Bourlier authored
      The bulk of the changes concern the `transient` annotation on class parameters. Often the compiler doesn't generate a field for these parameters, so the transient annotation would be unnecessary.
      But if the class parameters are used in methods, then fields are created. So it is safer to keep the annotations.
      
      The remainder are some potential bugs and deprecated syntax.
      
      Author: Luc Bourlier <luc.bourlier@typesafe.com>
      
      Closes #8433 from skyluc/issue/sbt-2.11.
  9. Jul 22, 2015
    • [SPARK-9180] fix spark-shell to accept --name option · 430cd781
      Kenichi Maehashi authored
      This patch fixes [[SPARK-9180]](https://issues.apache.org/jira/browse/SPARK-9180).
      Users can now set the app name of spark-shell using `spark-shell --name "whatever"`.
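      
      A hedged sketch of the fix, matching the `setIfMissing` commit note below: the shell applies its default app name only when the user didn't pass `--name`.
      
      ```scala
      import org.apache.spark.SparkConf
      
      val conf = new SparkConf()
      // does nothing if --name already populated spark.app.name
      conf.setIfMissing("spark.app.name", "Spark shell")
      ```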
      
      Author: Kenichi Maehashi <webmaster@kenichimaehashi.com>
      
      Closes #7512 from kmaehashi/fix-spark-shell-app-name and squashes the following commits:
      
      e24991a [Kenichi Maehashi] use setIfMissing instead of setAppName
      18aa4ad [Kenichi Maehashi] fix spark-shell to accept --name option
    • [SPARK-9244] Increase some memory defaults · fe26584a
      Matei Zaharia authored
      There are a few memory limits that people hit often and that we could
      make higher, especially now that memory sizes have grown.
      
      - spark.akka.frameSize: This defaults to 10 (MB) but is often hit for map
        output statuses in large shuffles. This memory is not fully allocated
        up-front, so we can just make this larger and still not affect jobs
        that never send a status that large. We increase it to 128.
      
      - spark.executor.memory: Defaults to 512m, which is really small. We
        increase it to 1g.
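      
      As an illustration only, jobs that prefer the previous limits can still set them explicitly rather than relying on the new defaults:
      
      ```scala
      import org.apache.spark.SparkConf
      
      val conf = new SparkConf()
        .set("spark.akka.frameSize", "10")    // old default (MB)
        .set("spark.executor.memory", "512m") // old default
      ```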
      
      Author: Matei Zaharia <matei@databricks.com>
      
      Closes #7586 from mateiz/configs and squashes the following commits:
      
      ce0038a [Matei Zaharia] [SPARK-9244] Increase some memory defaults
  10. Jul 16, 2015
    • [SPARK-9015] [BUILD] Clean project import in scala ide · b536d5dc
      Jan Prach authored
      Clean up Maven for a clean import in Scala IDE / Eclipse.
      
      * remove the groovy plugin, which is not needed at all
      * add-source from build-helper-maven-plugin is not needed, as recent versions of scala-maven-plugin do it automatically
      * add the lifecycle-mapping plugin to hide a few useless warnings from the IDE
      
      Author: Jan Prach <jendap@gmail.com>
      
      Closes #7375 from jendap/clean-project-import-in-scala-ide and squashes the following commits:
      
      c4b4c0f [Jan Prach] fix whitespaces
      5a83e07 [Jan Prach] Revert "remove java compiler warnings from java tests"
      312007e [Jan Prach] scala-maven-plugin itself add scala sources by default
      f47d856 [Jan Prach] remove spark-1.4-staging repository
      c8a54db [Jan Prach] remove java compiler warnings from java tests
      999a068 [Jan Prach] remove some maven warnings in scala ide
      80fbdc5 [Jan Prach] remove groovy and gmavenplus plugin
  11. Jul 14, 2015
    • [SPARK-8962] Add Scalastyle rule to ban direct use of Class.forName; fix existing uses · 11e5c372
      Josh Rosen authored
      This pull request adds a Scalastyle regex rule which fails the style check if `Class.forName` is used directly.  `Class.forName` always loads classes from the default / system classloader, but in a majority of cases, we should be using Spark's own `Utils.classForName` instead, which tries to load classes from the current thread's context classloader and falls back to the classloader which loaded Spark when the context classloader is not defined.
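      
      A sketch of the preferred pattern (Spark's actual helper is `Utils.classForName`; this standalone version is illustrative):
      
      ```scala
      object ClassLoading {
        def classForName(className: String): Class[_] = {
          // prefer the thread's context class loader; fall back to the
          // loader that loaded this class
          val loader = Option(Thread.currentThread().getContextClassLoader)
            .getOrElse(getClass.getClassLoader)
          Class.forName(className, /* initialize = */ true, loader)
        }
      }
      ```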
      
      
      Author: Josh Rosen <joshrosen@databricks.com>
      
      Closes #7350 from JoshRosen/ban-Class.forName and squashes the following commits:
      
      e3e96f7 [Josh Rosen] Merge remote-tracking branch 'origin/master' into ban-Class.forName
      c0b7885 [Josh Rosen] Hopefully fix the last two cases
      d707ba7 [Josh Rosen] Fix uses of Class.forName that I missed in my first cleanup pass
      046470d [Josh Rosen] Merge remote-tracking branch 'origin/master' into ban-Class.forName
      62882ee [Josh Rosen] Fix uses of Class.forName or add exclusion.
      d9abade [Josh Rosen] Add stylechecker rule to ban uses of Class.forName
  12. Jul 10, 2015
    • [SPARK-7944] [SPARK-8013] Remove most of the Spark REPL fork for Scala 2.11 · 11e22b74
      Iulian Dragos authored
      This PR removes most of the code in the Spark REPL for Scala 2.11 and leaves just a couple of overridden methods in `SparkILoop` in order to:
      
      - change welcome message
      - restrict available commands (like `:power`)
      - initialize Spark context
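      
      Roughly what remains, as a hedged sketch (the real class carries a bit more glue):
      
      ```scala
      import scala.tools.nsc.interpreter.ILoop
      
      // A thin subclass of the stock Scala 2.11 REPL.
      class SparkILoop extends ILoop {
        override def printWelcome(): Unit = echo("Welcome to Spark!")
        // plus: restricting the command list (e.g. :power) and creating
        // the SparkContext during interpreter initialization
      }
      ```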
      
      The two codebases have diverged and it's extremely hard to backport fixes from the upstream REPL. This somewhat radical step is absolutely necessary in order to fix other REPL tickets (like SPARK-8013 - Hive Thrift server for 2.11). BTW, the Scala REPL has fixed the serialization-unfriendly wrappers thanks to ScrapCodes's work in [#4522](https://github.com/scala/scala/pull/4522)
      
      All tests pass, and I tried the `spark-shell` on our Mesos cluster with some simple jobs (including with additional jars); everything looked good.
      
      As soon as Scala 2.11.7 is out we need to upgrade and get a shaded `jline` dependency, clearing the way for SPARK-8013.
      
      /cc pwendell
      
      Author: Iulian Dragos <jaguarul@gmail.com>
      
      Closes #6903 from dragos/issue/no-spark-repl-fork and squashes the following commits:
      
      c596c6f [Iulian Dragos] Merge branch 'master' into issue/no-spark-repl-fork
      2b1a305 [Iulian Dragos] Removed spaces around multiple imports.
      0ce67a6 [Iulian Dragos] Remove -verbose flag for java compiler (added by mistake in an earlier commit).
      10edaf9 [Iulian Dragos] Keep the jline dependency only in the 2.10 build.
      529293b [Iulian Dragos] Add back Spark REPL files to rat-excludes, since they are part of the 2.10 repl.
      d85370d [Iulian Dragos] Remove jline dependency from the Spark REPL.
      b541930 [Iulian Dragos] Merge branch 'master' into issue/no-spark-repl-fork
      2b15962 [Iulian Dragos] Change jline dependency and bump Scala version.
      b300183 [Iulian Dragos] Rename package and add license on top of the file, remove files from rat-excludes and removed `-Yrepl-sync` per reviewer’s request.
      9d46d85 [Iulian Dragos] Fix SPARK-7944.
      abcc7cb [Iulian Dragos] Remove the REPL forked code.
    • [SPARK-7977] [BUILD] Disallowing println · e14b545d
      Jonathan Alter authored
      Author: Jonathan Alter <jonalter@users.noreply.github.com>
      
      Closes #7093 from jonalter/SPARK-7977 and squashes the following commits:
      
      ccd44cc [Jonathan Alter] Changed println to log in ThreadingSuite
      7fcac3e [Jonathan Alter] Reverting to println in ThreadingSuite
      10724b6 [Jonathan Alter] Changing some printlns to logs in tests
      eeec1e7 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      0b1dcb4 [Jonathan Alter] More println cleanup
      aedaf80 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      925fd98 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      0c16fa3 [Jonathan Alter] Replacing some printlns with logs
      45c7e05 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      5c8e283 [Jonathan Alter] Allowing println in audit-release examples
      5b50da1 [Jonathan Alter] Allowing printlns in example files
      ca4b477 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      83ab635 [Jonathan Alter] Fixing new printlns
      54b131f [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
      1cd8a81 [Jonathan Alter] Removing some unnecessary comments and printlns
      b837c3a [Jonathan Alter] Disallowing println
  13. Jun 28, 2015
    • [SPARK-8683] [BUILD] Depend on mockito-core instead of mockito-all · f5100451
      Josh Rosen authored
      Spark's tests currently depend on `mockito-all`, which bundles Hamcrest and Objenesis classes. Instead, it should depend on `mockito-core`, which declares those libraries as Maven dependencies. This is necessary in order to fix a dependency conflict that leads to a NoSuchMethodError when using certain Hamcrest matchers.
      
      See https://github.com/mockito/mockito/wiki/Declaring-mockito-dependency for more details.
      
      Author: Josh Rosen <joshrosen@databricks.com>
      
      Closes #7061 from JoshRosen/mockito-core-instead-of-all and squashes the following commits:
      
      70eccbe [Josh Rosen] Depend on mockito-core instead of mockito-all.
  14. Jun 19, 2015
    • [SPARK-8461] [SQL] fix codegen with REPL class loader · e41e2fd6
      Davies Liu authored
      The ExecutorClassLoader for the REPL causes Janino to fail to find classes that live in java.lang, so switch Janino to the default class loader, which also helps performance.
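      
      A hedged sketch of the idea, using Janino's `ClassBodyEvaluator` (the generated class body is just an example):
      
      ```scala
      import org.codehaus.janino.ClassBodyEvaluator
      
      val evaluator = new ClassBodyEvaluator()
      // Use a regular class loader, not the REPL's ExecutorClassLoader,
      // so java.lang classes resolve quickly and correctly.
      evaluator.setParentClassLoader(getClass.getClassLoader)
      evaluator.cook("public int addOne(int x) { return x + 1; }")
      ```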
      
      cc liancheng yhuai
      
      Author: Davies Liu <davies@databricks.com>
      
      Closes #6898 from davies/fix_class_loader and squashes the following commits:
      
      24276d4 [Davies Liu] add regression test
      4ff0457 [Davies Liu] address comment, refactor
      7f5ffbe [Davies Liu] fix REPL class loader with codegen
  15. Jun 03, 2015
    • [SPARK-7801] [BUILD] Updating versions to SPARK 1.5.0 · 2c4d550e
      Patrick Wendell authored
      Author: Patrick Wendell <patrick@databricks.com>
      
      Closes #6328 from pwendell/spark-1.5-update and squashes the following commits:
      
      2f42d02 [Patrick Wendell] A few more excludes
      4bebcf0 [Patrick Wendell] Update to RC4
      61aaf46 [Patrick Wendell] Using new release candidate
      55f1610 [Patrick Wendell] Another exclude
      04b4f04 [Patrick Wendell] More issues with transient 1.4 changes
      36f549b [Patrick Wendell] [SPARK-7801] [BUILD] Updating versions to SPARK 1.5.0
  16. May 29, 2015
    • [SPARK-7558] Demarcate tests in unit-tests.log · 9eb222c1
      Andrew Or authored
      Right now `unit-tests.log` is not of much value because we can't easily tell where the test boundaries are. This patch adds log statements before and after each test to outline the test boundaries, e.g.:
      
      ```
      ===== TEST OUTPUT FOR o.a.s.serializer.KryoSerializerSuite: 'kryo with parallelize for primitive arrays' =====
      
      15/05/27 12:36:39.596 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO SparkContext: Starting job: count at KryoSerializerSuite.scala:230
      15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Got job 3 (count at KryoSerializerSuite.scala:230) with 4 output partitions (allowLocal=false)
      15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Final stage: ResultStage 3(count at KryoSerializerSuite.scala:230)
      15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Parents of final stage: List()
      15/05/27 12:36:39.597 dag-scheduler-event-loop INFO DAGScheduler: Missing parents: List()
      15/05/27 12:36:39.597 dag-scheduler-event-loop INFO DAGScheduler: Submitting ResultStage 3 (ParallelCollectionRDD[5] at parallelize at KryoSerializerSuite.scala:230), which has no missing parents
      
      ...
      
      15/05/27 12:36:39.624 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO DAGScheduler: Job 3 finished: count at KryoSerializerSuite.scala:230, took 0.028563 s
      15/05/27 12:36:39.625 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO KryoSerializerSuite:
      
      ***** FINISHED o.a.s.serializer.KryoSerializerSuite: 'kryo with parallelize for primitive arrays' *****
      
      ...
      ```
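      
      A hedged sketch of the mechanism (Spark's version lives in the shared base class SparkFunSuite mentioned below, and logs via logInfo rather than println):
      
      ```scala
      import org.scalatest.{FunSuite, Outcome}
      
      abstract class BannerSuite extends FunSuite {
        // Wrap every test with banner log lines so unit-tests.log shows
        // clear test boundaries.
        override protected def withFixture(test: NoArgTest): Outcome = {
          val id = s"${getClass.getSimpleName}: '${test.name}'"
          try {
            println(s"\n===== TEST OUTPUT FOR $id =====\n")
            test()
          } finally {
            println(s"\n***** FINISHED $id *****\n")
          }
        }
      }
      ```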
      
      Author: Andrew Or <andrew@databricks.com>
      
      Closes #6441 from andrewor14/demarcate-tests and squashes the following commits:
      
      879b060 [Andrew Or] Fix compile after rebase
      d622af7 [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
      017c8ba [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
      7790b6c [Andrew Or] Fix tests after logical merge conflict
      c7460c0 [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
      c43ffc4 [Andrew Or] Fix tests?
      8882581 [Andrew Or] Fix tests
      ee22cda [Andrew Or] Fix log message
      fa9450e [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
      12d1e1b [Andrew Or] Various whitespace changes (minor)
      69cbb24 [Andrew Or] Make all test suites extend SparkFunSuite instead of FunSuite
      bbce12e [Andrew Or] Fix manual things that cannot be covered through automation
      da0b12f [Andrew Or] Add core tests as dependencies in all modules
      f7d29ce [Andrew Or] Introduce base abstract class for all test suites
  17. May 13, 2015
    • [SPARK-6568] spark-shell.cmd --jars option does not accept the jar that has space in its path · 50c72708
      Masayoshi TSUZUKI authored
      Escape spaces in the arguments.
      
      Author: Masayoshi TSUZUKI <tsudukim@oss.nttdata.co.jp>
      Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>
      
      Closes #5447 from tsudukim/feature/SPARK-6568-2 and squashes the following commits:
      
      3f9a188 [Masayoshi TSUZUKI] modified some errors.
      ed46047 [Masayoshi TSUZUKI] avoid scalastyle errors.
      1784239 [Masayoshi TSUZUKI] removed Utils.formatPath.
      e03f289 [Masayoshi TSUZUKI] removed testWindows from Utils.resolveURI and Utils.resolveURIs. replaced SystemUtils.IS_OS_WINDOWS to Utils.isWindows. removed Utils.formatPath from PythonRunner.scala.
      84c33d0 [Masayoshi TSUZUKI] - use resolveURI in nonLocalPaths - run tests for Windows path only on Windows
      016128d [Masayoshi TSUZUKI] fixed to use File.toURI()
      2c62e3b [Masayoshi TSUZUKI] Merge pull request #1 from sarutak/SPARK-6568-2
      7019a8a [Masayoshi TSUZUKI] Merge branch 'master' of https://github.com/apache/spark into feature/SPARK-6568-2
      45946ee [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into SPARK-6568-2
      10f1c73 [Kousuke Saruta] Added a comment
      93c3c40 [Kousuke Saruta] Merge branch 'classpath-handling-fix' of github.com:sarutak/spark into SPARK-6568-2
      649da82 [Kousuke Saruta] Fix classpath handling
      c7ba6a7 [Masayoshi TSUZUKI] [SPARK-6568] spark-shell.cmd --jars option does not accept the jar that has space in its path
  18. May 08, 2015
    • [SPARK-7489] [SPARK SHELL] Spark shell crashes when compiled with scala 2.11 · 4e7360e1
      vinodkc authored
      Spark shell crashes when compiled with Scala 2.11 and SPARK_PREPEND_CLASSES=true.
      
      There is a similar resolved JIRA issue, SPARK-7470, with PR https://github.com/apache/spark/pull/5997, which handled the same issue only for Scala 2.10.
      
      Author: vinodkc <vinod.kc.in@gmail.com>
      
      Closes #6013 from vinodkc/fix_sqlcontext_exception_scala_2.11 and squashes the following commits:
      
      119061c [vinodkc] Spark shell crashes when compiled with scala 2.11
    • [SPARK-7470] [SQL] Spark shell SQLContext crashes without hive · 714db2ef
      Andrew Or authored
      This only happens if you have `SPARK_PREPEND_CLASSES` set. Then I built it with `build/sbt clean assembly compile` and just ran it with `bin/spark-shell`.
      ```
      ...
      15/05/07 17:07:30 INFO EventLoggingListener: Logging events to file:/tmp/spark-events/local-1431043649919
      15/05/07 17:07:30 INFO SparkILoop: Created spark context..
      Spark context available as sc.
      java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
      	at java.lang.Class.getDeclaredConstructors0(Native Method)
      	at java.lang.Class.privateGetDeclaredConstructors(Class.java:2493)
      	at java.lang.Class.getConstructor0(Class.java:2803)
      ...
      Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
      	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
      	at java.security.AccessController.doPrivileged(Native Method)
      	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
      	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
      	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
      	... 52 more
      
      <console>:10: error: not found: value sqlContext
             import sqlContext.implicits._
                    ^
      <console>:10: error: not found: value sqlContext
             import sqlContext.sql
                    ^
      ```
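      
      A hedged sketch of the shell-side fallback this PR introduced (the real code differs in details; per the squashed commits, NoClassDefFoundError is expected as well):
      
      ```scala
      import org.apache.spark.SparkContext
      import org.apache.spark.sql.SQLContext
      
      def createSqlContext(sc: SparkContext): SQLContext =
        try {
          // Build a HiveContext reflectively so the shell compiles
          // without the hive module on the classpath.
          Class.forName("org.apache.spark.sql.hive.HiveContext")
            .getConstructor(classOf[SparkContext])
            .newInstance(sc)
            .asInstanceOf[SQLContext]
        } catch {
          case _: ClassNotFoundException | _: NoClassDefFoundError =>
            new SQLContext(sc) // no Hive classes: fall back gracefully
        }
      ```
      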
      yhuai marmbrus
      
      Author: Andrew Or <andrew@databricks.com>
      
      Closes #5997 from andrewor14/sql-shell-crash and squashes the following commits:
      
      61147e6 [Andrew Or] Also expect NoClassDefFoundError
  19. Apr 25, 2015
    • [SPARK-7092] Update spark scala version to 2.11.6 · a11c8683
      Prashant Sharma authored
      Author: Prashant Sharma <prashant.s@imaginea.com>
      
      Closes #5662 from ScrapCodes/SPARK-7092/scala-update-2.11.6 and squashes the following commits:
      
      58cf4f9 [Prashant Sharma] [SPARK-7092] Update spark scala version to 2.11.6