Suggested by @soronpo at
https://contributors.scala-lang.org/t/spark-as-a-scala-gateway-drug-and-the-2-12-failure/1747/44

I had listed Spark at #161 as "probably out of scope", but perhaps that could change now that Spark is on Scala 2.12.

Some concerns I can think of:
- build tool — I think Spark is built with Maven?
  - currently literally everything else in the community build is sbt-based
  - dbuild is supposed to support multiple build tools, but I'm not sure what the status of its Maven support is
  - does an sbt build exist?
- overall size, complexity, and length of the build? I've heard it's a big build; am I wrong? Two concerns here:
  - possible difficulty adding and maintaining it
  - possibly bloating community build runtimes if the tests take a long time to run
- would we even meaningfully be testing much unless we do cluster-based tests, which would be out of scope for the community build?
Note that an alternative approach would be for Spark to add "latest Scala nightly" to their own CI matrix. That might be more practical than taking this on at our end.
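For illustration, here's a minimal sketch of what the CI-side approach could look like in an sbt build, assuming the usual scala-integration Artifactory repository for nightly/merge artifacts; the `scala.nightly.version` property name and the fallback version string are hypothetical placeholders, not anything Spark actually defines:

```scala
// build.sbt — hypothetical sketch: let CI point the build at a Scala nightly.
// The resolver is the scala-integration repo where nightly/merge artifacts
// are published; the system property name and fallback version are made up.
ThisBuild / resolvers += "scala-integration" at
  "https://scala-ci.typesafe.com/artifactory/scala-integration/"
ThisBuild / scalaVersion :=
  sys.props.getOrElse("scala.nightly.version", "2.12.8")
```

A CI matrix entry could then pass `-Dscala.nightly.version=...` to one job while the rest of the matrix builds against released Scala versions.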