Commit 5d0f58b2 authored by Michael Armbrust, committed by Matei Zaharia

Use scala deprecation instead of java.

This gets rid of a warning when compiling core (since we were depending on a deprecated interface from a non-deprecated function). I also tested with javac, and this does the right thing when compiling Java code.

Author: Michael Armbrust <michael@databricks.com>

Closes #452 from marmbrus/scalaDeprecation and squashes the following commits:

f628b4d [Michael Armbrust] Use scala deprecation instead of java.
parent 28238c81
@@ -114,7 +114,7 @@ class JavaSparkContext(val sc: SparkContext) extends JavaSparkContextVarargsWork
    * @deprecated As of Spark 1.0.0, defaultMinSplits is deprecated, use
    *             {@link #defaultMinPartitions()} instead
    */
-  @Deprecated
+  @deprecated("use defaultMinPartitions", "1.0.0")
   def defaultMinSplits: java.lang.Integer = sc.defaultMinSplits
 
   /** Default min number of partitions for Hadoop RDDs when not given by user */
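For reference, a minimal sketch (hypothetical, not part of this commit) of what the change buys: Scala's @deprecated annotation carries a message and a since-version, so scalac can report both at every call site, whereas Java's @Deprecated carries no such information. The names below (DeprecationSketch, oldApi, newApi) are made up for illustration only.

object DeprecationSketch {

  // Java-style annotation: marks the method deprecated at the JVM level,
  // but carries no message or version information for the compiler to report.
  @Deprecated
  def oldApiJavaStyle(): Int = 42

  // Scala-style annotation: when a call site is compiled with -deprecation,
  // scalac emits a warning that includes the message and the since-version.
  @deprecated("use newApi", "1.0.0")
  def oldApi(): Int = newApi()

  def newApi(): Int = 42

  def main(args: Array[String]): Unit = {
    // The deprecation warning is compile-time only; the call still works at runtime.
    println(oldApi())
  }
}

Compiling with scalac -deprecation DeprecationSketch.scala prints the full warning text at the call site; without the flag, Scala 2 only prints a summary note suggesting a re-run with -deprecation.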