Conversation

@gatorsmile
Member

What changes were proposed in this pull request?

Creating Hive source tables makes no sense when using InMemoryCatalog, since we are unable to operate on them afterwards. This PR blocks users from creating Hive source tables when Hive support is not enabled.

How was this patch tested?

Fixed the affected test cases.

|Hive support is required to insert into the following tables:
|${s.catalogTable.identifier}
""".stripMargin)

Member Author

This is not necessary once we block users from creating Hive source tables. When the table is a data source table, SimpleCatalogRelation will be replaced by LogicalRelation in the FindDataSourceTable rule.
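The idea behind that rule can be sketched as follows. This is a minimal, self-contained stand-in, not Spark's actual internals: the class names mirror Spark's, but the types are simplified for illustration.

```scala
// Simplified stand-ins for Spark's logical plan nodes.
sealed trait LogicalPlan
case class SimpleCatalogRelation(table: String, provider: Option[String]) extends LogicalPlan
case class LogicalRelation(table: String) extends LogicalPlan

object FindDataSourceTableSketch {
  // Data source tables (provider other than "hive") are resolved to a
  // LogicalRelation; Hive serde tables are left alone for Hive-specific
  // resolution. Once Hive tables cannot be created without Hive support,
  // the non-Hive branch is the only one InMemoryCatalog ever exercises.
  def apply(plan: LogicalPlan): LogicalPlan = plan match {
    case SimpleCatalogRelation(table, Some(provider)) if provider.toLowerCase != "hive" =>
      LogicalRelation(table)
    case other => other
  }
}
```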


test("alter table: set serde partition") {
// TODO: move this test to HiveDDLSuite.scala
ignore("alter table: set serde partition") {
Member Author

These two test cases should be moved to HiveDDLSuite. However, copying the code would be ugly, so my next plan is to improve DDLSuite and HiveDDLSuite by extracting an abstract base class.
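The refactoring described here could look roughly like this (the names below are hypothetical, and the real suites would extend ScalaTest rather than a plain class): shared DDL test logic lives in an abstract base class, and each concrete suite only supplies its catalog-specific configuration.

```scala
// Hypothetical sketch of the shared-suite refactoring.
abstract class DDLSuiteBase {
  // Each concrete suite declares which provider its test tables use.
  def defaultProvider: String

  // Shared test logic, written once, parameterized by the provider.
  def createTableStatement(name: String): String =
    s"CREATE TABLE $name (a INT) USING $defaultProvider"
}

// The in-memory-catalog suite runs the shared tests against a data source provider.
class InMemoryDDLSuite extends DDLSuiteBase {
  override def defaultProvider: String = "parquet"
}

// The Hive suite runs the same tests against Hive serde tables.
class HiveDDLSuite extends DDLSuiteBase {
  override def defaultProvider: String = "hive"
}
```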

@SparkQA

SparkQA commented Jan 15, 2017

Test build #71401 has finished for PR 16587 at commit 0d2d9f2.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member Author

The following test cases failed:

  • change-column.sql
  • describe.sql
  • show-tables.sql
  • show_columns.sql

All these test cases create Hive serde tables. We might need to create another SQLQueryTestSuite in the hive package.

@gatorsmile
Member Author

cc @cloud-fan @yhuai @hvanhovell Any comment?

@SparkQA

SparkQA commented Jan 16, 2017

Test build #71407 has finished for PR 16587 at commit 5e0cd26.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

withTable("my_ext_tab") {
(("a", "b") :: Nil).toDF().write.parquet(tempDir.getCanonicalPath)
(1 to 10).map { i => (i, i) }.toDF("a", "b").createTempView("my_temp_tab")
sql(s"CREATE TABLE my_ext_tab using parquet LOCATION '$tempDir'")
Member

@HyukjinKwon HyukjinKwon Jan 16, 2017

Ah, so this is why you asked me in #16586 (comment). I just ran a test on Windows to help.

 - truncate table - external table, temporary table, view (not allowed) *** FAILED *** (188 milliseconds)
   org.apache.spark.sql.AnalysisException: Path does not exist: file:/C:projectsspark	arget	mpspark-9e70280d-56dc-4063-8f40-8e62fec18394;
   at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:382)
   at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:370)
   at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
   at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)

Maybe it'd be okay to just use toURI if this test is not supposed to test path variants.
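To illustrate the toURI suggestion: File.toURI always produces a forward-slash "file:" URI, so the path survives interpolation into a SQL string even on Windows, where getCanonicalPath contains backslashes that can be swallowed as escape sequences (as in the failure above). A small demonstration:

```scala
import java.io.File

// Create a temp file standing in for the test's tempDir.
val tempDir = File.createTempFile("spark-test", ".tmp")

// toURI yields a well-formed file: URI with forward slashes on every
// platform, suitable for:
//   sql(s"CREATE TABLE t USING parquet LOCATION '${tempDir.toURI}'")
val uri = tempDir.toURI.toString
```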

val db = tableDefinition.identifier.database.get
requireDbExists(db)
val table = tableDefinition.identifier.table
if (tableDefinition.provider.isDefined && tableDefinition.provider.get.toLowerCase == "hive") {
Contributor

shall we put this in HiveOnlyCheck?
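Whether the guard lives in the catalog (as in the snippet above) or in an analysis-time check like HiveOnlyCheck, the core test is a case-insensitive provider comparison. A minimal sketch, with hypothetical helper names:

```scala
// Case-insensitive check for a Hive provider; None (no provider) is not Hive.
def isHiveTable(provider: Option[String]): Boolean =
  provider.exists(_.equalsIgnoreCase("hive"))

// Guard that rejects Hive source tables, mirroring the error message
// introduced by this PR (the exception type here is illustrative).
def assertNotHiveTable(provider: Option[String], identifier: String): Unit =
  if (isHiveTable(provider)) {
    throw new IllegalArgumentException(
      s"Hive support is required to create Hive source table: $identifier")
  }
```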

.add("HelLo", "int", nullable = false)
.add("WoRLd", "int", nullable = true),
provider = Some("hive"),
provider = Some("parquet"),
Contributor

shall we also use defaultProvider here?

@SparkQA

SparkQA commented Jan 17, 2017

Test build #71485 has started for PR 16587 at commit 49e6e81.

@gatorsmile
Member Author

retest this please

@SparkQA

SparkQA commented Jan 17, 2017

Test build #71495 has finished for PR 16587 at commit 49e6e81.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 17, 2017

Test build #71528 has finished for PR 16587 at commit c6d6a24.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member Author

retest this please

@SparkQA

SparkQA commented Jan 22, 2017

Test build #71797 has started for PR 16587 at commit c6d6a24.

@gatorsmile
Member Author

retest this please

@SparkQA

SparkQA commented Jan 22, 2017

Test build #3545 has finished for PR 16587 at commit c6d6a24.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan
Contributor

LGTM

@asfgit asfgit closed this in 772035e Jan 23, 2017
@gatorsmile
Member Author

Thanks! Merging to master.

uzadude pushed a commit to uzadude/spark that referenced this pull request Jan 27, 2017
…port is Not Enabled

### What changes were proposed in this pull request?
It is weird to create Hive source tables when using InMemoryCatalog. We are unable to operate it. This PR is to block users to create Hive source tables.

### How was this patch tested?
Fixed the test cases

Author: gatorsmile <[email protected]>

Closes apache#16587 from gatorsmile/blockHiveTable.
cmonkey pushed a commit to cmonkey/spark that referenced this pull request Feb 15, 2017