Commit cbd507d

zsxwing authored and tdas committed
[SPARK-7799][STREAMING][DOCUMENT] Add the linking and deploying instructions for streaming-akka project
Since `actorStream` is an external project, we should add the linking and deploying instructions for it.

A follow up PR of #10744

Author: Shixiong Zhu <[email protected]>

Closes #10856 from zsxwing/akka-link-instruction.
1 parent 08c781c commit cbd507d

File tree

1 file changed: +44, -37 lines changed


docs/streaming-custom-receivers.md

Lines changed: 44 additions & 37 deletions
@@ -257,54 +257,61 @@ The following table summarizes the characteristics of both types of receivers
 
 ## Implementing and Using a Custom Actor-based Receiver
 
-<div class="codetabs">
-<div data-lang="scala" markdown="1" >
-
 Custom [Akka Actors](http://doc.akka.io/docs/akka/2.3.11/scala/actors.html) can also be used to
-receive data. Extending [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
-allows received data to be stored in Spark using `store(...)` methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.
+receive data. Here are the instructions.
 
-{% highlight scala %}
+1. **Linking:** You need to add the following dependency to your SBT or Maven project (see [Linking section](streaming-programming-guide.html#linking) in the main programming guide for further information).
 
-class CustomActor extends ActorReceiver {
-  def receive = {
-    case data: String => store(data)
-  }
-}
+        groupId = org.apache.spark
+        artifactId = spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}
+        version = {{site.SPARK_VERSION_SHORT}}
 
-// A new input stream can be created with this custom actor as
-val ssc: StreamingContext = ...
-val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
+2. **Programming:**
 
-{% endhighlight %}
+    <div class="codetabs">
+    <div data-lang="scala" markdown="1" >
 
-See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.
-</div>
-<div data-lang="java" markdown="1">
+    You need to extend [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
+    so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
+    this actor can be configured to handle failures, etc.
 
-Custom [Akka UntypedActors](http://doc.akka.io/docs/akka/2.3.11/java/untyped-actors.html) can also be used to
-receive data. Extending [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
-allows received data to be stored in Spark using `store(...)` methods. The supervisor strategy of
-this actor can be configured to handle failures, etc.
+        class CustomActor extends ActorReceiver {
+          def receive = {
+            case data: String => store(data)
+          }
+        }
 
-{% highlight java %}
+        // A new input stream can be created with this custom actor as
+        val ssc: StreamingContext = ...
+        val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
 
-class CustomActor extends JavaActorReceiver {
-  @Override
-  public void onReceive(Object msg) throws Exception {
-    store((String) msg);
-  }
-}
+    See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.
+    </div>
+    <div data-lang="java" markdown="1">
 
-// A new input stream can be created with this custom actor as
-JavaStreamingContext jssc = ...;
-JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
+    You need to extend [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
+    so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
+    this actor can be configured to handle failures, etc.
 
-{% endhighlight %}
+        class CustomActor extends JavaActorReceiver {
+          @Override
+          public void onReceive(Object msg) throws Exception {
+            store((String) msg);
+          }
+        }
 
-See [JavaActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/JavaActorWordCount.scala) for an end-to-end example.
-</div>
-</div>
+        // A new input stream can be created with this custom actor as
+        JavaStreamingContext jssc = ...;
+        JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
+
+    See [JavaActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/JavaActorWordCount.scala) for an end-to-end example.
+    </div>
+    </div>
+
+3. **Deploying:** As with any Spark applications, `spark-submit` is used to launch your application.
+You need to package `spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}` and its dependencies into
+the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}`
+are marked as `provided` dependencies as those are already present in a Spark installation. Then
+use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
 
 <span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, AkkaUtils is not available in the Python API.
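For illustration, the coordinates listed under the **Linking** step above correspond to a single SBT dependency roughly as follows. This is a minimal sketch assuming an SBT build; `sparkVersion` and `"x.y.z"` are placeholders for whatever `{{site.SPARK_VERSION_SHORT}}` resolves to in the published docs.

```scala
// build.sbt (sketch): the %% operator appends the Scala binary version,
// matching the spark-streaming-akka_{{site.SCALA_BINARY_VERSION}} artifact id.
val sparkVersion = "x.y.z"  // placeholder for {{site.SPARK_VERSION_SHORT}}

libraryDependencies += "org.apache.spark" %% "spark-streaming-akka" % sparkVersion
```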
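Likewise, the **Deploying** step's packaging rule (bundle `spark-streaming-akka` into the application JAR, mark `spark-core` and `spark-streaming` as `provided`) can be expressed by continuing the same build.sbt sketch. The sbt-assembly plugin and the class/JAR names in the comments are assumptions for illustration, one common setup rather than the only option.

```scala
// build.sbt (sketch, continued): modules already present in a Spark installation
// are scoped "provided", so only spark-streaming-akka and its transitive
// dependencies end up in the application JAR (e.g. built with sbt-assembly).
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"           % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"      % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming-akka" % sparkVersion
)

// Then launch the assembled JAR as usual (class and JAR names are placeholders):
//   spark-submit --class your.app.Main target/scala-2.xx/your-app-assembly.jar
```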
