Commit cbd507d6 authored by Shixiong Zhu, committed by Tathagata Das

[SPARK-7799][STREAMING][DOCUMENT] Add the linking and deploying instructions for streaming-akka project

Since `actorStream` is an external project, we should add the linking and deploying instructions for it.

A follow-up PR of #10744.

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #10856 from zsxwing/akka-link-instruction.
Parent commit: 08c781ca
## Implementing and Using a Custom Actor-based Receiver

Custom [Akka Actors](http://doc.akka.io/docs/akka/2.3.11/scala/actors.html) can also be used to
receive data. Here are the instructions.

1. **Linking:** You need to add the following dependency to your SBT or Maven project (see the [Linking section](streaming-programming-guide.html#linking) in the main programming guide for further information).

        groupId = org.apache.spark
        artifactId = spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}
        version = {{site.SPARK_VERSION_SHORT}}

2. **Programming:**

    <div class="codetabs">
    <div data-lang="scala" markdown="1" >

    You need to extend [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
    so that the received data can be stored in Spark using the `store(...)` methods. The supervisor strategy of
    this actor can be configured to handle failures, etc.

        class CustomActor extends ActorReceiver {
          def receive = {
            case data: String => store(data)
          }
        }

        // A new input stream can be created with this custom actor as
        val ssc: StreamingContext = ...
        val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")

    See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.
    </div>
    <div data-lang="java" markdown="1">

    You need to extend [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
    so that the received data can be stored in Spark using the `store(...)` methods. The supervisor strategy of
    this actor can be configured to handle failures, etc.

        class CustomActor extends JavaActorReceiver {
          @Override
          public void onReceive(Object msg) throws Exception {
            store((String) msg);
          }
        }

        // A new input stream can be created with this custom actor as
        JavaStreamingContext jssc = ...;
        JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");

    See [JavaActorWordCount.java](https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaActorWordCount.java) for an end-to-end example.
    </div>
    </div>
3. **Deploying:** As with any Spark application, `spark-submit` is used to launch your application.
You need to package `spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}` and its dependencies into
the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}`
are marked as `provided` dependencies, as those are already present in a Spark installation. Then
use `spark-submit` to launch your application (see the [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
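For SBT users, the dependency scopes described in the Deploying step can be expressed in a `build.sbt` fragment. This is a sketch only: the project name is hypothetical and the version string is a placeholder to be matched to your Spark installation.

```scala
// build.sbt -- a sketch; name and versions are placeholders, not prescribed values
name := "custom-actor-receiver-app"

val sparkVersion = "2.0.0"  // placeholder: use the version of your Spark installation

libraryDependencies ++= Seq(
  // Already present in the Spark installation, so marked "provided"
  // and excluded from the application JAR:
  "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
  // Must be packaged into the application JAR together with its
  // transitive dependencies:
  "org.apache.spark" %% "spark-streaming-akka" % sparkVersion
)
```

A plugin such as sbt-assembly is typically used to build the fat JAR that `spark-submit` launches.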
<span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, AkkaUtils is not available in the Python API.
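The Programming step mentions that the supervisor strategy of the receiver actor can be configured to handle failures. The following untested Scala sketch illustrates one way to do that, assuming Akka 2.3.x on the classpath and the `supervisorStrategy` parameter of `AkkaUtils.createStream`; the retry count and time window are illustrative values, not recommendations.

```scala
import akka.actor.{OneForOneStrategy, Props, SupervisorStrategy}
import akka.actor.SupervisorStrategy.{Escalate, Restart}
import scala.concurrent.duration._
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.akka.{ActorReceiver, AkkaUtils}

class CustomActor extends ActorReceiver {
  def receive = {
    case data: String => store(data)
  }
}

// Restart the receiver actor on transient failures, escalate anything else.
// maxNrOfRetries = 10 and withinTimeRange = 1.minute are illustrative.
val restartingStrategy: SupervisorStrategy =
  OneForOneStrategy(maxNrOfRetries = 10, withinTimeRange = 1.minute) {
    case _: RuntimeException => Restart
    case _: Exception        => Escalate
  }

val ssc: StreamingContext = ...
val lines = AkkaUtils.createStream[String](
  ssc, Props[CustomActor](), "CustomReceiver",
  supervisorStrategy = restartingStrategy)
```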