[SPARK-18741][STREAMING] Reuse or clean-up SparkContext in streaming tests #16174
```diff
@@ -18,6 +18,7 @@
 package org.apache.spark.streaming;

 import org.apache.spark.SparkConf;
+import org.apache.spark.SparkContext$;
 import org.apache.spark.streaming.api.java.JavaStreamingContext;
 import org.junit.After;
 import org.junit.Before;
@@ -28,6 +29,7 @@ public abstract class LocalJavaStreamingContext {

    @Before
    public void setUp() {
+        SparkContext$.MODULE$.stopActiveContext();
```
Contributor:
Can't you just call …? The line is also indented incorrectly.

Contributor (Author):
IntelliJ and SBT were both complaining, so I did this. I'll try to rebuild and see what happens.
```diff
        SparkConf conf = new SparkConf()
            .setMaster("local[2]")
            .setAppName("test")
```
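The hunk above makes `setUp()` defensive: whatever `SparkContext` a previous test left running is stopped before a new one is created. A minimal sketch of that lifecycle, using a toy context class in place of Spark's (`ToyContext` and `LifecycleSketch` are illustrative names, not Spark API; Spark's real single-active-context bookkeeping is more involved):

```java
// Toy stand-in for SparkContext: at most one instance may be active at a time,
// and creating a second one while the first is still live fails.
final class ToyContext {
    private static ToyContext active;

    ToyContext() {
        if (active != null) {
            throw new IllegalStateException("another ToyContext is already active");
        }
        active = this;
    }

    void stop() {
        if (active == this) {
            active = null;
        }
    }

    // The operation the PR's setUp() relies on: stop whatever is active, if anything.
    static void stopActiveContext() {
        if (active != null) {
            active.stop();
        }
    }
}

public class LifecycleSketch {
    public static void main(String[] args) {
        ToyContext leaked = new ToyContext();  // a careless earlier test "forgot" to stop this
        ToyContext.stopActiveContext();        // the defensive step added in setUp()
        ToyContext fresh = new ToyContext();   // would have thrown without the cleanup
        fresh.stop();
        System.out.println("fresh context created after stopping the leaked one");
    }
}
```

This is exactly why the reviewers below call it a sledgehammer: the cleanup succeeds regardless of which earlier test misbehaved, so the leak itself goes unnoticed.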
I don't know; I'm not a big fan of the approach you're taking here: calling this method before running tests. That feels like a sledgehammer to fix flaky tests. I think it would be better for test code to be more careful about cleaning up after itself, kind of like most tests in spark-core use `LocalSparkContext` to do that more or less automatically, without the need for these methods. The `ReuseableSparkContext` trait you have is a step in that direction. If you make sure all the streaming tests that need it are using it, and keep this state within that class, I think it would be a better change.
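The alternative suggested here is a shared base that owns the context and always tears it down, so individual tests cannot leak one. A rough sketch of that shape, again with a toy context standing in for Spark's (`FakeContext` and `ContextOwningTest` are hypothetical names; Spark's actual `LocalSparkContext` is a Scala trait built on ScalaTest, not this Java skeleton):

```java
// Toy stand-in for SparkContext that just counts live instances.
class FakeContext {
    static int activeCount = 0;  // number of contexts currently running
    FakeContext() { activeCount++; }
    void stop() { activeCount--; }
}

// Base class owning the context lifecycle: tests that extend it get a fresh
// context before the body and guaranteed teardown afterwards.
abstract class ContextOwningTest {
    protected FakeContext ctx;

    void setUp() {       // would be a JUnit @Before method
        ctx = new FakeContext();
    }

    void tearDown() {    // would be a JUnit @After method: guaranteed cleanup
        if (ctx != null) {
            ctx.stop();
            ctx = null;
        }
    }
}

public class CleanupSketch extends ContextOwningTest {
    void testBody() {
        // a real test would exercise ctx here
    }

    public static void main(String[] args) {
        CleanupSketch test = new CleanupSketch();
        test.setUp();
        test.testBody();
        test.tearDown();
        if (FakeContext.activeCount != 0) {
            throw new AssertionError("a context leaked");
        }
        System.out.println("no contexts leaked");
    }
}
```

With this shape, a leaked context shows up as a failure in the offending test suite instead of being silently cleaned up by the next one.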
+1. I don't like stopping the SparkContext before running tests either; it will hide mistakes in other tests.
Shouldn't this be unnecessary with more carefully written tests that always close the context when done?
I have to admit that the approach is far from subtle.
It seems that #16105 fixes this (also on my branch). I am closing this for now. Thanks for the feedback.