[SPARK-7799][STREAMING][DOCUMENT] Add the linking and deploying instructions for streaming-akka project
Since `actorStream` is an external project, we should add the linking and deploying instructions for it.
A follow up PR of #10744
Author: Shixiong Zhu <[email protected]>
Closes #10856 from zsxwing/akka-link-instruction.
Custom [Akka UntypedActors](http://doc.akka.io/docs/akka/2.3.11/java/untyped-actors.html) can also be used to
receive data. Here are the instructions.

1. **Linking:** You need to add the following dependency to your SBT or Maven project (see [Linking section](streaming-programming-guide.html#linking) in the main programming guide for further information).
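The dependency coordinates themselves did not survive in this copy; based on the artifact name used in the Deploying step below, they would look like the following (the `{{site.SPARK_VERSION_SHORT}}` placeholder is an assumption, mirroring the other Spark streaming connector pages):

```
groupId = org.apache.spark
artifactId = spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}
version = {{site.SPARK_VERSION_SHORT}}
```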

2. **Programming:**

<div class="codetabs">
<div data-lang="scala" markdown="1" >
You need to extend [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
this actor can be configured to handle failures, etc.
{% highlight scala %}
class CustomActor extends ActorReceiver {
  def receive = {
    case data: String => store(data)
  }
}
{% endhighlight %}
{% highlight scala %}
// A new input stream can be created with this custom actor as
val ssc: StreamingContext = ...
val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
{% endhighlight %}
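As a sketch of how the resulting DStream might then be consumed (a hypothetical continuation, not part of the original guide; `lines` and `ssc` are the values created in the snippet above):

```scala
// Hypothetical word count over the actor-backed stream, using the standard DStream API
val words = lines.flatMap(_.split(" "))
val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
wordCounts.print()

ssc.start()             // start receiving data via the custom actor
ssc.awaitTermination()  // block until the streaming job is stopped
```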
See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.

</div>
<div data-lang="java" markdown="1">
You need to extend [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
this actor can be configured to handle failures, etc.
{% highlight java %}
class CustomActor extends JavaActorReceiver {
  @Override
  public void onReceive(Object msg) throws Exception {
    store((String) msg);
  }
}
{% endhighlight %}
{% highlight java %}
// A new input stream can be created with this custom actor as
JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
{% endhighlight %}

See [JavaActorWordCount.java](https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaActorWordCount.java) for an end-to-end example.
</div>
</div>
3. **Deploying:** As with any Spark application, `spark-submit` is used to launch your application.
You need to package `spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}` and its dependencies into
the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}`
are marked as `provided` dependencies as those are already present in a Spark installation. Then
use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
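For concreteness, a launch command might look like the following sketch (the JAR name, main class, and master URL are placeholders, not from the original doc):

```
./bin/spark-submit \
  --class com.example.MyAkkaStreamingApp \
  --master local[2] \
  my-streaming-app-assembly.jar
```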
<span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, AkkaUtils is not available in the Python API.