
Conversation

Contributor

@LuciferYang LuciferYang commented Aug 7, 2025

What changes were proposed in this pull request?

This PR changes the default value of the `level: Option[Level]` parameter of the `SparkFunSuite#withLogAppender` function from `None` to `Some(Level.INFO)`, in order to decouple the relevant tests from the `rootLogger.level` configuration in `log4j2.properties`.
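The resulting default can be sketched as follows (a paraphrase based on the diff discussed later in this PR, not the exact Spark source; the declaration is abridged):

```scala
// Paraphrased sketch of SparkFunSuite#withLogAppender after this change.
// When `level` is defined, the helper temporarily forces the target loggers
// to that level while `f` runs, so the events captured by the appender no
// longer depend on the rootLogger.level configured in log4j2.properties.
protected def withLogAppender(
    appender: AbstractAppender,
    loggerNames: Seq[String] = Seq.empty,
    level: Option[Level] = Some(Level.INFO))(  // previously: None
    f: => Unit): Unit
```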

Why are the changes needed?

Suppose, for some reason, we change the rootLogger.level configuration value in sql/core/src/test/resources/log4j2.properties from info to warn. Subsequently, when running unit tests, failures similar to the following may occur:

build/sbt clean "sql/testOnly org.apache.spark.sql.execution.QueryExecutionSuite"

[info] - Logging plan changes for execution *** FAILED *** (63 milliseconds)
[info]   testAppender.loggingEvents.exists(((x$10: org.apache.logging.log4j.core.LogEvent) => x$10.getMessage().getFormattedMessage().contains(expectedMsg))) was false (QueryExecutionSuite.scala:243)
[info]   org.scalatest.exceptions.TestFailedException:
...

Similar issues may also arise in other suites, such as AdaptiveQueryExecSuite. This PR therefore changes the default value of the `level` parameter to avoid this kind of test coupling.
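To illustrate the coupling, here is a hypothetical sketch of the failing pattern (names follow the QueryExecutionSuite failure above; `expectedMsg` and the query are placeholders). With the old default (`level = None`), the helper leaves the logger level untouched, so the effective level comes from `log4j2.properties`:

```scala
// Hypothetical sketch, assuming withLogAppender behaves as described in this PR.
val testAppender = new LogAppender("plan-change test")

// If rootLogger.level = warn and no explicit level is passed, INFO events
// are filtered out before they ever reach the appender.
withLogAppender(testAppender) {
  spark.range(1).groupBy("id").count().collect()
}

// Fails under rootLogger.level = warn, because no INFO events were captured:
assert(testAppender.loggingEvents.exists(
  _.getMessage.getFormattedMessage.contains(expectedMsg)))
```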

Does this PR introduce any user-facing change?

No

How was this patch tested?

  • Passed GitHub Actions
  • Manually verified that the aforementioned test cases no longer fail

Was this patch authored or co-authored using generative AI tooling?

No

Member

@dongjoon-hyun dongjoon-hyun left a comment


Oh, got it. We have test cases to check the log.

Member

@dongjoon-hyun dongjoon-hyun left a comment


Are these all we need to change? It seems we have many instances.

$ git grep withLogAppender sql | grep -v 'level ='
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala:        withLogAppender(logAppender) {
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ResolveHintsSuite.scala:    withLogAppender(logAppender) {
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CodeGenerationSuite.scala:    withLogAppender(appender, loggerNames = Seq(classOf[CodeGenerator[_, _]].getName),
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CodeGenerationSuite.scala:      withLogAppender(appender, loggerNames = Seq(classOf[CodeGenerator[_, _]].getName),
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/OptimizerLoggingSuite.scala:    withLogAppender(logAppender,
sql/core/src/test/scala/org/apache/spark/sql/CTEHintSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/CharVarcharTestSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/JoinHintSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/QueryExecutionSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:      withLogAppender(
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/DataSourceManagerSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala:    withLogAppender(testAppender1) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala:    withLogAppender(testAppender2) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala:          withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/v2/jdbc/JDBCTableCatalogSuite.scala:      withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/python/PythonDataSourceSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/python/PythonDataSourceSuite.scala:    withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala:    withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/streaming/FileStreamSinkSuite.scala:      withLogAppender(logAppender) {
sql/hive/src/test/scala/org/apache/spark/sql/hive/MetastoreDataSourcesSuite.scala:      withLogAppender(logAppender) {

@LuciferYang
Contributor Author

@dongjoon-hyun This is a good question. Let me try modifying the default value of `level` in the `withLogAppender` method to see what happens.

  appender: AbstractAppender,
  loggerNames: Seq[String] = Seq.empty,
- level: Option[Level] = None)(
+ level: Option[Level] = Some(Level.INFO))(
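Callers that assert on a different level can still opt out of the new default by passing `level` explicitly, e.g. (a hypothetical usage sketch; `debugAppender` is a placeholder):

```scala
// Hypothetical caller: a test asserting on DEBUG output overrides the
// new Some(Level.INFO) default by passing the level explicitly.
withLogAppender(debugAppender, level = Some(Level.DEBUG)) {
  // code whose DEBUG logs are under test
}
```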
Contributor Author


Let's check if there are any related test cases that need to be specified with a different log level.

Member

@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM (Pending CIs).

@LuciferYang LuciferYang changed the title [SPARK-53168][SQL][TESTS] Decouple the test cases in the sql module from the configuration values in log4j2.properties [SPARK-53168][CORE][TESTS] Decouple the test cases in the sql module from the configuration values in log4j2.properties Aug 7, 2025
@LuciferYang LuciferYang changed the title [SPARK-53168][CORE][TESTS] Decouple the test cases in the sql module from the configuration values in log4j2.properties [SPARK-53168][CORE][TESTS] Change default value of the input parameter level for the SparkFunSuite#withLogAppender from None to Some(Level.INFO) Aug 7, 2025
@LuciferYang LuciferYang changed the title [SPARK-53168][CORE][TESTS] Change default value of the input parameter level for the SparkFunSuite#withLogAppender from None to Some(Level.INFO) [SPARK-53168][CORE][TESTS] Change default value of the input parameter level for SparkFunSuite#withLogAppender from None to Some(Level.INFO) Aug 7, 2025
@LuciferYang
Contributor Author

(screenshot of passing GitHub Actions checks)

all passed

@LuciferYang
Contributor Author

Merged into master. Thank you @dongjoon-hyun
