@@ -2172,4 +2172,21 @@ class SQLQuerySuite extends QueryTest with SQLTestUtils with TestHiveSingleton {
}
}
}

  test("SPARK-19809 NullPointerException on zero-size ORC file") {
Member:
This test case should not be put in SQLQuerySuite.scala

Member Author:

BTW, @gatorsmile, which suite do you prefer? So far, this test case covers:

  • both the native and hive ORC implementations
  • STORED AS with CONVERT_METASTORE_ORC=true

Member Author:
I'll move this into HiveOrcQuerySuite.

Member:
Sure. That sounds fine to me.

    Seq("native", "hive").foreach { orcImpl =>
      withSQLConf(SQLConf.ORC_IMPLEMENTATION.key -> orcImpl) {
        withTempPath { dir =>
          withTable("spark_19809") {
            sql(s"CREATE TABLE spark_19809(a int) STORED AS ORC LOCATION '$dir'")
            Files.touch(new File(s"${dir.getCanonicalPath}", "zero.orc"))

            withSQLConf(HiveUtils.CONVERT_METASTORE_ORC.key -> "true") { // default since 2.3.0
Member:
Use both true and false

Member Author (dongjoon-hyun, Dec 13, 2017):
Ur, this test case is for convertMetastoreOrc=true, which became the default in Spark 2.3.0.

convertMetastoreOrc=false still has the issue in Hive 1.2.1's OrcInputFormat.getSplits, so I wrote the following in the PR description:

After SPARK-22279, Apache Spark with the default configuration doesn't have this bug. Although Hive 1.2.1 library code path still has the problem, we had better have a test coverage on what we have now in order to prevent future regression on it.

Member:
Add a comment in the test case?

Member Author:
Sure!
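The comment being discussed might read something like the sketch below. The wording is an assumption on my part, not the author's actual commit; it only restates what the thread above says about SPARK-22279 and the Hive 1.2.1 code path.

```scala
// Hypothetical sketch of the in-test comment discussed above (assumed wording):
// SPARK-19809: CONVERT_METASTORE_ORC=true is the default since Spark 2.3.0,
// and after SPARK-22279 the native ORC reader tolerates zero-size files.
// The Hive 1.2.1 path (CONVERT_METASTORE_ORC=false) still fails inside
// OrcInputFormat.getSplits, so only the `true` case is covered here.
withSQLConf(HiveUtils.CONVERT_METASTORE_ORC.key -> "true") {
  // ... query the zero-size-file table and check the result ...
}
```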

              checkAnswer(sql("SELECT * FROM spark_19809"), Seq.empty)
Member:
Use table("spark_19809")
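With that suggestion applied, the assertion would presumably read roughly as below. Whether the final code uses Spark's `spark.table(...)` or a suite-local `table(...)` helper is an assumption here; the thread only shows the reviewer's request.

```scala
// Sketch of the reviewer's suggestion (exact helper used is an assumption):
checkAnswer(spark.table("spark_19809"), Seq.empty)
```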

Member Author:
It's done.

            }
          }
        }
      }
    }
  }
}