
Commit a222644

tianshizz authored and HyukjinKwon committed
[SPARK-31267][SQL] Flaky test: WholeStageCodegenSparkSubmitSuite.Generated code on driver should not embed platform-specific constant
### What changes were proposed in this pull request?

Allow customized timeouts for `runSparkSubmit`, which makes flaky tests more likely to pass by using a larger timeout value.

I was able to reproduce the test failure on my laptop, where the test took 1.5 to 2 minutes to finish. After increasing the timeout, the test now passes locally.

### Why are the changes needed?

This allows slow tests to use a larger timeout, so they are more likely to succeed.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

The test passed in my local environment after the change.

Closes #28438 from tianshizz/SPARK-31267.

Authored-by: Tianshi Zhu <[email protected]>
Signed-off-by: HyukjinKwon <[email protected]>
1 parent: 2fb85f6

File tree: 2 files changed (+5 −3 lines)

core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
Lines changed: 3 additions & 2 deletions

@@ -31,6 +31,7 @@ import org.apache.hadoop.conf.Configuration
 import org.apache.hadoop.fs.{FileStatus, FSDataInputStream, Path}
 import org.scalatest.{BeforeAndAfterEach, Matchers}
 import org.scalatest.concurrent.{Signaler, ThreadSignaler, TimeLimits}
+import org.scalatest.time.Span
 import org.scalatest.time.SpanSugar._

 import org.apache.spark._
@@ -1419,7 +1420,7 @@ object SparkSubmitSuite extends SparkFunSuite with TimeLimits {
   implicit val defaultSignaler: Signaler = ThreadSignaler

   // NOTE: This is an expensive operation in terms of time (10 seconds+). Use sparingly.
-  def runSparkSubmit(args: Seq[String], root: String = ".."): Unit = {
+  def runSparkSubmit(args: Seq[String], root: String = "..", timeout: Span = 1.minute): Unit = {
     val sparkHome = sys.props.getOrElse("spark.test.home", fail("spark.test.home is not set!"))
     val sparkSubmitFile = if (Utils.isWindows) {
       new File(s"$root\\bin\\spark-submit.cmd")
@@ -1432,7 +1433,7 @@ object SparkSubmitSuite extends SparkFunSuite with TimeLimits {
       Map("SPARK_TESTING" -> "1", "SPARK_HOME" -> sparkHome))

     try {
-      val exitCode = failAfter(1.minute) { process.waitFor() }
+      val exitCode = failAfter(timeout) { process.waitFor() }
       if (exitCode != 0) {
         fail(s"Process returned with exit code $exitCode. See the log4j logs for more detail.")
       }
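
To make the pattern in the commit message and in the diff above easier to follow, here is a minimal, self-contained sketch of the same idea: a helper whose time limit is a parameter with a default `Span`, so slow callers can widen it. `TimeoutSketch` and `runCommand` are hypothetical names, not Spark code; only the ScalaTest `failAfter`/`Signaler` usage mirrors the change above.

```scala
import org.scalatest.Assertions._
import org.scalatest.concurrent.{Signaler, ThreadSignaler, TimeLimits}
import org.scalatest.time.Span
import org.scalatest.time.SpanSugar._

// Hypothetical sketch of the configurable-timeout pattern, not the Spark helper itself.
object TimeoutSketch extends TimeLimits {
  // ThreadSignaler interrupts the thread blocked in waitFor() once the limit is exceeded.
  implicit val defaultSignaler: Signaler = ThreadSignaler

  // Callers get a 1-minute limit by default, but a slow test may pass a larger Span.
  def runCommand(cmd: Seq[String], timeout: Span = 1.minute): Unit = {
    val process = new ProcessBuilder(cmd: _*).inheritIO().start()
    try {
      // failAfter fails the enclosing test if waitFor() does not return within `timeout`.
      val exitCode = failAfter(timeout) { process.waitFor() }
      if (exitCode != 0) {
        fail(s"Process exited with code $exitCode.")
      }
    } finally {
      process.destroy()
    }
  }
}
```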

sql/core/src/test/scala/org/apache/spark/sql/execution/WholeStageCodegenSparkSubmitSuite.scala
Lines changed: 2 additions & 1 deletion

@@ -19,6 +19,7 @@ package org.apache.spark.sql.execution

 import org.scalatest.{Assertions, BeforeAndAfterEach, Matchers}
 import org.scalatest.concurrent.TimeLimits
+import org.scalatest.time.SpanSugar._

 import org.apache.spark.{SparkFunSuite, TestUtils}
 import org.apache.spark.deploy.SparkSubmitSuite
@@ -50,7 +51,7 @@ class WholeStageCodegenSparkSubmitSuite extends SparkFunSuite
       "--conf", "spark.executor.extraJavaOptions=-XX:+UseCompressedOops",
       "--conf", "spark.sql.adaptive.enabled=false",
       unusedJar.toString)
-    SparkSubmitSuite.runSparkSubmit(argsForSparkSubmit, "../..")
+    SparkSubmitSuite.runSparkSubmit(argsForSparkSubmit, "../..", 3.minutes)
   }
 }
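
For comparison with the call site above, a minimal usage sketch of the hypothetical `TimeoutSketch.runCommand` helper from the earlier sketch would widen the limit the same way this test now does with `runSparkSubmit`; the command and the 3-minute value here are illustrative only.

```scala
import org.scalatest.time.SpanSugar._

object TimeoutSketchUsage {
  def main(args: Array[String]): Unit = {
    // Hypothetical usage mirroring the 3-minute override above;
    // TimeoutSketch.runCommand comes from the earlier sketch.
    TimeoutSketch.runCommand(Seq("./bin/spark-submit", "--version"), timeout = 3.minutes)
  }
}
```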
