Conversation

@maropu
Member

@maropu maropu commented Jan 7, 2016

Rework from #9935 because it's stale.

@SparkQA

SparkQA commented Jan 7, 2016

Test build #48914 has finished for PR 10635 at commit c51d95b.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 7, 2016

Test build #48925 has finished for PR 10635 at commit fa45a41.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@maropu
Member Author

maropu commented Jan 7, 2016

retest this please

@SparkQA

SparkQA commented Jan 8, 2016

Test build #48988 has finished for PR 10635 at commit 8bdd481.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@marmbrus
Contributor

marmbrus commented Jan 8, 2016

Does this actually let you use one source to compile against both versions of Spark?

@maropu
Member Author

maropu commented Jan 8, 2016

@marmbrus @smola It seems to me that users can compile source code whose only compatibility issues involve these types.

@marmbrus
Contributor

marmbrus commented Jan 8, 2016

I could be missing something, but it seems like since you had to manually exclude things here to make it compile, users are going to run into problems. Can you actually make a program that uses these types and try and see if you can make it work with both Spark 1.5 and Spark 1.6?

@maropu
Member Author

maropu commented Jan 13, 2016

@marmbrus Okay, I'll try it within a day.

@maropu
Member Author

maropu commented Jan 13, 2016

@marmbrus I checked: the code below compiles against v1.5.2, but it cannot be compiled against v1.6.0.
https://github.com/maropu/spark-compat-test/blob/master/src/test/scala/test/SparkCompatTestSuite.scala

I also confirmed that this PR resolves the issue; that is, the code compiles against both versions.

@maropu
Member Author

maropu commented Jan 14, 2016

I found that aggregate functions such as Max and Min in org.apache.spark.sql.catalyst.expressions.aggregate have the same issue, because SPARK-11505 split these functions into separate files.

@SparkQA

SparkQA commented Jan 14, 2016

Test build #49355 has finished for PR 10635 at commit cc971a7.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@marmbrus
Contributor

I'm confused. The repo you link to compiles fine against Spark 1.6.0.

> compile
[info] Updating {file:/Users/marmbrus/workspace/spark-compat-test/}spark-compat-test...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] Scala version was updated by one of library dependencies:
[warn]  * org.scala-lang:scala-library:(2.10.4, 2.10.0) -> 2.10.5
[warn] To force scalaVersion, add the following:
[warn]  ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 1 Scala source to /Users/marmbrus/workspace/spark-compat-test/target/scala-2.10/classes...
[warn] there were 3 deprecation warning(s); re-run with -deprecation for details
[warn] one warning found

@maropu
Member Author

maropu commented Jan 14, 2016

Sorry for the confusion.
The code in the linked repo compiles because lib/ contains spark-catalyst_2.10-2.0.0-SNAPSHOT.jar built with this patch.
So, to reproduce the failure, run the commands below:

$ git clone https://github.com/maropu/spark-compat-test.git
$ cd spark-compat-test/
$ rm lib/spark-catalyst_2.10-2.0.0-SNAPSHOT.jar
$ ./bin/sbt test
...
[info] Compiling 1 Scala source to /Users/maropu/IdeaProjects/temps/spark-compat-test/target/scala-2.10/classes...
[info] Compiling 1 Scala source to /Users/maropu/IdeaProjects/temps/spark-compat-test/target/scala-2.10/test-classes...
[error] /Users/maropu/IdeaProjects/temps/spark-compat-test/src/test/scala/test/SparkCompatTestSuite.scala:9: not found: type GenericArrayData
[error]     val keyArray = new GenericArrayData(Array(1, 2, 3))
[error]                        ^
[error] /Users/maropu/IdeaProjects/temps/spark-compat-test/src/test/scala/test/SparkCompatTestSuite.scala:10: not found: type GenericArrayData
[error]     val valArray = new GenericArrayData(Array(0, 0, 0))
[error]                        ^
[error] /Users/maropu/IdeaProjects/temps/spark-compat-test/src/test/scala/test/SparkCompatTestSuite.scala:11: not found: type ArrayBasedMapData
[error]     val mapData = new ArrayBasedMapData(keyArray, valArray)
[error]                       ^
[error] three errors found
[error] (test:compileIncremental) Compilation failed
[error] Total time: 9 s, completed Jan 14, 2016 2:26:05 PM

smola and others added 3 commits January 21, 2016 21:11
Added type aliases in org.apache.spark.sql.types for classes
moved to org.apache.spark.sql.catalyst.util.
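The technique in this commit can be sketched without any Spark dependency: leaving a type alias behind in the old package lets source code that still references the moved class from its original location keep compiling. The names below are illustrative stand-ins, not the actual Spark sources.

```scala
// Sketch of the type-alias compatibility trick (illustrative names only).
object TypeAliasDemo {
  // New home of the class, standing in for org.apache.spark.sql.catalyst.util.
  object catalystUtil {
    class GenericArrayData(val values: Array[Int]) {
      def numElements(): Int = values.length
    }
  }

  // Old home, standing in for the org.apache.spark.sql.types package object:
  // the alias dealiases to a concrete class, so even `new` keeps working
  // for client code written against the old package.
  object types {
    type GenericArrayData = catalystUtil.GenericArrayData
  }

  def main(args: Array[String]): Unit = {
    // Client code written against the old location compiles unchanged.
    val arr = new types.GenericArrayData(Array(1, 2, 3))
    println(arr.numElements())
  }
}
```

In Spark the aliases went into a package object, so the fully qualified old names resolve exactly as before; the binary still contains only the class at its new location.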
@maropu
Member Author

maropu commented Jan 21, 2016

@marmbrus ping

@SparkQA

SparkQA commented Jan 21, 2016

Test build #49876 has finished for PR 10635 at commit 57a57fc.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@marmbrus
Contributor

Okay, this seems good, but can you reopen it against branch-1.6? I don't think that we want to continue to offer compatibility once the major version changes in 2.0.

@maropu
Member Author

maropu commented Jan 26, 2016

@marmbrus Okay, I'll do that.

@maropu maropu closed this Jan 26, 2016
asfgit pushed a commit that referenced this pull request Feb 1, 2016
Changed a target at branch-1.6 from #10635.

Author: Takeshi YAMAMURO <[email protected]>

Closes #10915 from maropu/pr9935-v3.
@maropu maropu deleted the pr9935 branch July 5, 2017 11:46