[SPARK-53168][CORE][TESTS] Change default value of the input parameter level for SparkFunSuite#withLogAppender from None to Some(Level.INFO)
#51895
Conversation
dongjoon-hyun
left a comment
Oh, got it. We have test cases to check the log.
dongjoon-hyun
left a comment
Are these all we need to change? It seems we have many instances.
$ git grep withLogAppender sql | grep -v 'level ='
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/AnalysisSuite.scala: withLogAppender(logAppender) {
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/analysis/ResolveHintsSuite.scala: withLogAppender(logAppender) {
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CodeGenerationSuite.scala: withLogAppender(appender, loggerNames = Seq(classOf[CodeGenerator[_, _]].getName),
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CodeGenerationSuite.scala: withLogAppender(appender, loggerNames = Seq(classOf[CodeGenerator[_, _]].getName),
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/optimizer/OptimizerLoggingSuite.scala: withLogAppender(logAppender,
sql/core/src/test/scala/org/apache/spark/sql/CTEHintSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/CharVarcharTestSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/JoinHintSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/SparkSessionBuilderSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/QueryExecutionSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala: withLogAppender(
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/DataSourceManagerSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala: withLogAppender(testAppender1) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala: withLogAppender(testAppender2) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/csv/CSVSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/v2/jdbc/JDBCTableCatalogSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/python/PythonDataSourceSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/execution/python/PythonDataSourceSuite.scala: withLogAppender(testAppender) {
sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala: withLogAppender(logAppender) {
sql/core/src/test/scala/org/apache/spark/sql/streaming/FileStreamSinkSuite.scala: withLogAppender(logAppender) {
sql/hive/src/test/scala/org/apache/spark/sql/hive/MetastoreDataSourcesSuite.scala: withLogAppender(logAppender) {
@dongjoon-hyun This is a good question. Let me try modifying the default value of the `level` parameter.
This reverts commit b2dc5e7.
```diff
   appender: AbstractAppender,
   loggerNames: Seq[String] = Seq.empty,
-  level: Option[Level] = None)(
+  level: Option[Level] = Some(Level.INFO))(
```
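To illustrate what this default changes, here is a simplified, self-contained sketch, not Spark's actual `SparkFunSuite` implementation; `LogLevelSketch`, `FakeLogger`, and `withLevel` are invented stand-ins. With `level = Some(Level.INFO)` the wrapped block always runs at INFO regardless of what `log4j2.properties` configured, whereas `None` inherits the externally configured level:

```scala
// Simplified stand-ins; the real helper works with org.apache.logging.log4j types.
object LogLevelSketch {
  case class Level(name: String)
  object Level { val INFO = Level("INFO"); val WARN = Level("WARN") }

  // A logger whose effective level can be temporarily overridden.
  final class FakeLogger(var level: Level)

  // Mirrors the shape of the changed parameter: a Some(...) default pins the
  // level for the duration of the block and restores the old one afterwards;
  // None leaves whatever log4j2.properties configured in effect.
  def withLevel[T](logger: FakeLogger,
                   level: Option[Level] = Some(Level.INFO))(f: => T): T = {
    val saved = logger.level
    level.foreach(l => logger.level = l)
    try f finally logger.level = saved
  }
}
```

A test that genuinely needs a different level can still opt in explicitly, e.g. `withLevel(logger, level = Some(Level.WARN)) { ... }`, which is what the review comment below asks to check for across existing suites.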
Let's check if there are any related test cases that need to be specified with a different log level.
dongjoon-hyun
left a comment
+1, LGTM (Pending CIs).
Merged into master. Thank you, @dongjoon-hyun.

What changes were proposed in this pull request?
This PR changes the default value of the input parameter `level: Option[Level]` of the `SparkFunSuite#withLogAppender` function from `None` to `Some(Level.INFO)`, in order to decouple the relevant tests from the `rootLogger.level` configuration in `log4j2.properties`.

Why are the changes needed?
Suppose, for some reason, we change the `rootLogger.level` configuration value in `sql/core/src/test/resources/log4j2.properties` from `info` to `warn`. Subsequently, when running unit tests, failures similar to the following may occur:

Similar issues may also arise in other test cases such as `AdaptiveQueryExecSuite`. Therefore, this PR modifies the default value of the `level` parameter to avoid such test-coupling problems.

Does this PR introduce any user-facing change?
No
How was this patch tested?
Was this patch authored or co-authored using generative AI tooling?
No