Commit 2aeb84bc authored by Holden Karau's avatar Holden Karau Committed by Josh Rosen

replace awaitTransformation with awaitTermination in scaladoc/javadoc

Author: Holden Karau <holden@pigscanfly.ca>

Closes #2861 from holdenk/SPARK-4015-Documentation-in-the-streaming-context-references-non-existent-function and squashes the following commits:

081db8a [Holden Karau] fix pyspark streaming doc too
0e03863 [Holden Karau] replace awaitTransformation with awaitTermination
parent 85708168
@@ -79,7 +79,7 @@ class StreamingContext(object):
     L{DStream} various input sources. It can be from an existing L{SparkContext}.
     After creating and transforming DStreams, the streaming computation can
     be started and stopped using `context.start()` and `context.stop()`,
-    respectively. `context.awaitTransformation()` allows the current thread
+    respectively. `context.awaitTermination()` allows the current thread
     to wait for the termination of the context by `stop()` or by an exception.
     """
     _transformerSerializer = None
@@ -47,7 +47,7 @@ import org.apache.spark.streaming.ui.{StreamingJobProgressListener, StreamingTab}
  * The associated SparkContext can be accessed using `context.sparkContext`. After
  * creating and transforming DStreams, the streaming computation can be started and stopped
  * using `context.start()` and `context.stop()`, respectively.
- * `context.awaitTransformation()` allows the current thread to wait for the termination
+ * `context.awaitTermination()` allows the current thread to wait for the termination
  * of the context by `stop()` or by an exception.
  */
 class StreamingContext private[streaming] (
...@@ -46,7 +46,7 @@ import org.apache.spark.streaming.receiver.Receiver ...@@ -46,7 +46,7 @@ import org.apache.spark.streaming.receiver.Receiver
* org.apache.spark.api.java.JavaSparkContext (see core Spark documentation) can be accessed * org.apache.spark.api.java.JavaSparkContext (see core Spark documentation) can be accessed
* using `context.sparkContext`. After creating and transforming DStreams, the streaming * using `context.sparkContext`. After creating and transforming DStreams, the streaming
* computation can be started and stopped using `context.start()` and `context.stop()`, * computation can be started and stopped using `context.start()` and `context.stop()`,
* respectively. `context.awaitTransformation()` allows the current thread to wait for the * respectively. `context.awaitTermination()` allows the current thread to wait for the
* termination of a context by `stop()` or by an exception. * termination of a context by `stop()` or by an exception.
*/ */
class JavaStreamingContext(val ssc: StreamingContext) extends Closeable { class JavaStreamingContext(val ssc: StreamingContext) extends Closeable {
......
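The corrected docstrings describe a block-until-stopped lifecycle: `start()` launches the computation, `awaitTermination()` blocks the calling thread, and `stop()` (or an exception) releases it. A minimal plain-Python sketch of that pattern, using `threading.Event` as a stand-in for a real `StreamingContext` (the `ToyStreamingContext` class below is illustrative only, not Spark's implementation):

```python
import threading

class ToyStreamingContext:
    """Illustrative stand-in mimicking the start/stop/awaitTermination lifecycle."""

    def __init__(self):
        self._stopped = threading.Event()

    def start(self):
        # In real Spark this would launch the streaming computation.
        self._stopped.clear()

    def stop(self):
        # Signal any thread blocked in awaitTermination().
        self._stopped.set()

    def awaitTermination(self, timeout=None):
        # Block the current thread until stop() is called (or the timeout elapses).
        # Returns True if the context was stopped, False on timeout.
        return self._stopped.wait(timeout)

ctx = ToyStreamingContext()
ctx.start()
# Stop the context from another thread after 0.1 s, as stop() normally
# comes from a different thread than the one blocked in awaitTermination().
threading.Timer(0.1, ctx.stop).start()
terminated = ctx.awaitTermination(timeout=5)
print(terminated)  # True: the context was stopped, not timed out
```

This is exactly why the docstring typo mattered: callers blocking on the context would look for `awaitTermination()`, and no `awaitTransformation()` method exists.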