Include the following changes:

1. Add a "streaming-akka" project and `org.apache.spark.streaming.akka.AkkaUtils` for creating an actorStream
2. Remove `StreamingContext.actorStream` and `JavaStreamingContext.actorStream`
3. Update the ActorWordCount example and add the JavaActorWordCount example
4. Make "streaming-zeromq" depend on "streaming-akka" and update the code accordingly

Author: Shixiong Zhu <[email protected]>

Closes #10744 from zsxwing/streaming-akka-2.
allows received data to be stored in Spark using `store(...)` methods. The supervisor strategy of
this actor can be configured to handle failures, etc.

{% highlight scala %}
class CustomActor extends ActorReceiver {
  def receive = {
    case data: String => store(data)
  }
}

// A new input stream can be created with this custom actor as
val ssc: StreamingContext = ...
val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
{% endhighlight %}

See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.

</div>
<div data-lang="java" markdown="1">

Custom [Akka UntypedActors](http://doc.akka.io/docs/akka/2.3.11/java/untyped-actors.html) can also be used to
...

See [JavaActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/JavaActorWordCount.scala) for an end-to-end example.

</div>
</div>

<span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, AkkaUtils is not available in the Python API.
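The Java code for this section is collapsed in the diff above. A minimal sketch of what the Java-side equivalent looks like, assuming the `JavaActorReceiver` base class and the Java-friendly `AkkaUtils.createStream` overload this PR describes (details such as the exact generic signature are assumptions, not taken from the diff):

{% highlight java %}
// Sketch only: JavaActorReceiver and this AkkaUtils.createStream overload are
// assumed from the PR description, since the diff hunk is collapsed here.
class CustomActor extends JavaActorReceiver {
  @Override
  public void onReceive(Object msg) throws Exception {
    store((String) msg);
  }
}

// A new input stream can be created with this custom actor as
JavaDStream<String> lines = AkkaUtils.createStream(
    jssc, Props.create(CustomActor.class), "CustomReceiver");
{% endhighlight %}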
docs/streaming-programming-guide.md (+2, -2)
methods for creating DStreams from files and Akka actors as input sources.

<span class="badge" style="background-color: grey">Python API</span> `fileStream` is not available in the Python API, only `textFileStream` is available.

**Streams based on Custom Actors:** DStreams can be created with data streams received through Akka actors by using `AkkaUtils.createStream(ssc, actorProps, actor-name)`. See the [Custom Receiver Guide](streaming-custom-receivers.html) for more details.

<span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, `AkkaUtils.createStream` is not available in the Python API.

**Queue of RDDs as a Stream:** For testing a Spark Streaming application with test data, one can also create a DStream based on a queue of RDDs, using `streamingContext.queueStream(queueOfRDDs)`. Each RDD pushed into the queue will be treated as a batch of data in the DStream, and processed like a stream.
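The queue-based test stream described above can be sketched as follows. This is a minimal sketch modeled on Spark's bundled QueueStream example; the RDD contents, batch timing, and the word-count-style transformation are illustrative assumptions, not part of this PR:

{% highlight scala %}
// Sketch: feed a DStream from a mutable queue of RDDs for testing.
val ssc: StreamingContext = ...
val rddQueue = new scala.collection.mutable.Queue[RDD[Int]]()

// Each RDD pushed into the queue becomes one batch of the DStream.
val inputStream = ssc.queueStream(rddQueue)
inputStream.map(x => (x % 10, 1)).reduceByKey(_ + _).print()
ssc.start()

// Push one RDD per second into the queue as the "stream" source.
for (i <- 1 to 30) {
  rddQueue.synchronized {
    rddQueue += ssc.sparkContext.makeRDD(1 to 1000, 10)
  }
  Thread.sleep(1000)
}
{% endhighlight %}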