
Conversation

@cloud-fan
Contributor

What changes were proposed in this pull request?

After #15054, there is no place in Spark SQL that needs `SessionCatalog.tableExists` to check temp views, so this PR makes `SessionCatalog.tableExists` check only permanent tables/views and removes some hacks.

This PR also improves `getTempViewOrPermanentTableMetadata`, which was introduced in #15054, to make the code simpler.
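For illustration, here is a minimal sketch of the intended contract after this change, written against `SessionCatalog` as it exists around this version (the harness, helper name, and exact signatures below are assumptions for this sketch, not code from this PR):

```scala
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.catalog.SessionCatalog

// Assumes a temporary view named "tempView" is registered and no permanent
// table or view shares that name.
def illustrateLookups(catalog: SessionCatalog): Unit = {
  // After this PR, tableExists consults only permanent tables/views, so a bare
  // temp-view name is reported as non-existent.
  assert(!catalog.tableExists(TableIdentifier("tempView")))

  // Callers that need to resolve either kind of relation go through
  // getTempViewOrPermanentTableMetadata, which checks temp views first and
  // then falls back to the permanent catalog.
  val metadata = catalog.getTempViewOrPermanentTableMetadata(TableIdentifier("tempView"))
  println(metadata.tableType)
}
```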

How was this patch tested?

existing tests

@cloud-fan
Contributor Author

cc @yhuai @gatorsmile

@SparkQA

SparkQA commented Sep 20, 2016

Test build #65654 has finished for PR 15160 at commit d00860f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

```
      return Seq.empty[Row]
    } else {
-     throw new AnalysisException(s"Table ${tableIdentWithDB.unquotedString} already exists.")
+     throw new AnalysisException(s"Table ${table.identifier.unquotedString} already exists.")
```
Member

The test failure is caused by this change. After this PR, table.identifier does not always have the database name.
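For reference, a minimal sketch of one way to keep the database name in the message is to qualify the identifier with the session's current database when none is given (the helper name below is made up for illustration and is not necessarily the code that was ultimately merged):

```scala
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.catalog.SessionCatalog

// Illustrative helper: fill in the current database before reporting the table,
// so the message carries the database name regardless of how the table was
// referenced.
def tableAlreadyExistsError(catalog: SessionCatalog, table: TableIdentifier): Nothing = {
  val db = table.database.getOrElse(catalog.getCurrentDatabase)
  val tableIdentWithDB = table.copy(database = Some(db))
  throw new AnalysisException(s"Table ${tableIdentWithDB.unquotedString} already exists.")
}
```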

@gatorsmile
Member

LGTM, except for one comment about the test failure.

Also, I went over all the existing usages of `SessionCatalog.tableExists`; every call looks fine.

@SparkQA

SparkQA commented Sep 21, 2016

Test build #65715 has finished for PR 15160 at commit fa57ee8.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Sep 21, 2016

Test build #65719 has finished for PR 15160 at commit 0c06e6b.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile
Member

LGTM

@cloud-fan
Contributor Author

thanks for the review, merging to master!

@asfgit asfgit closed this in b50b34f Sep 22, 2016
asfgit pushed a commit that referenced this pull request Sep 23, 2016
…tements on Temporary Views

### What changes were proposed in this pull request?
This PR is to backport #15054 and #15160 to Spark 2.0.

- When the permanent table/view does not exist but a temporary view with the same name does, partition-related ALTER TABLE commands should fail with `NoSuchTableException`. Instead, they always report a confusing error message. For example,
```
Partition spec is invalid. The spec (a, b) must match the partition spec () defined in table '`testview`';
```
- When the permanent table/view does not exist but a temporary view with the same name does, `ALTER TABLE ... UNSET TBLPROPERTIES` should fail with `NoSuchTableException`. Instead, it reports a missing table property. For example,
```
Attempted to unset non-existent property 'p' in table '`testView`';
```
- When `ANALYZE TABLE` is called on a view or a temporary view, we should issue a clear error message. Instead, it reports a strange error:
```
ANALYZE TABLE is not supported for Project
```

- When inserting into a temporary view that is generated from `Range`, we will get the following error message:
```
assertion failed: No plan for 'InsertIntoTable Range (0, 10, step=1, splits=Some(1)), false, false
+- Project [1 AS 1#20]
   +- OneRowRelation$
```

This PR is to fix the above four issues.
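For illustration, the first issue can be reproduced with a sketch along these lines (the session setup and column names are assumptions; the actual test cases added by this PR may differ):

```scala
import org.apache.spark.sql.SparkSession

// Assumed setup: only a temporary view named "testview" exists; there is no
// permanent table or view with that name.
val spark = SparkSession.builder().master("local[1]").appName("repro").getOrCreate()
spark.range(10).selectExpr("id AS a", "id AS b").createOrReplaceTempView("testview")

// Before the fix this reported the confusing "Partition spec is invalid" error
// quoted above; after the fix it is expected to fail with NoSuchTableException.
try {
  spark.sql("ALTER TABLE testview ADD PARTITION (a='1', b='2')")
} catch {
  case e: Exception => println(e.getMessage)
}
```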

There is no place in Spark SQL that needs `SessionCatalog.tableExists` to check temp views, so this PR makes `SessionCatalog.tableExists` check only permanent tables/views and removes some hacks.

### How was this patch tested?
Added multiple test cases

Author: gatorsmile <[email protected]>

Closes #15174 from gatorsmile/PR15054Backport.