
Conversation

@srowen
Member

@srowen srowen commented May 15, 2017

What changes were proposed in this pull request?

Update Scala to 2.11.11 and zinc to 0.3.15

How was this patch tested?

Existing tests.

@SparkQA

SparkQA commented May 15, 2017

Test build #76933 has finished for PR 17982 at commit 2028dbd.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member Author

srowen commented May 15, 2017

@ScrapCodes it's good you're here because this might be something you can help with. (CC @dragos as well)

I'm updating to Scala 2.11.11, and it looks like the loadFiles method that SparkILoop hooks into was removed in 2.11.9. I think it's about the same to hook into process() instead and do Spark init after it's done. Do you know of any reason that's a bad idea? I'm pushing the change anyway to see.
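For context, the pre-2.11.9 hook amounted to roughly the following (a sketch of the idea, not the exact Spark source; initializeSpark stands in for Spark's REPL init helper):

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

class SparkILoop extends ILoop {
  def initializeSpark(): Unit = { /* bind `spark` and `sc` in the interpreter */ }

  // ILoop.process() called loadFiles() during startup in 2.11.8, so
  // overriding it ran Spark's init before -i files and the first prompt.
  // This override point disappeared in 2.11.9.
  override def loadFiles(settings: Settings): Unit = {
    initializeSpark()
    super.loadFiles(settings)
  }
}
```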

@ScrapCodes
Member

Hi Sean,
I took a quick look at the changes that removed loadFiles.

It seems startup is the correct alternative, i.e. hooking in just before the call to printWelcome. See startup.

@srowen
Member Author

srowen commented May 15, 2017

@ScrapCodes ah I should have pointed out the other constraint: we want this to still work with 2.11.8. Right now Spark would not run vs Scala 2.11.9+ at runtime, but if we tried to hook into startup it wouldn't work with 2.11.8 because that method didn't exist. That's why I was looking at process.

@ScrapCodes
Member

ScrapCodes commented May 15, 2017

Actually, now I am wondering how this would work before createInterpreter has been executed. Your constraint makes things a bit tricky. I will think more before I reply.

@ScrapCodes
Member

ScrapCodes commented May 15, 2017

If I am not able to come up with a better alternative, then the following not-so-ideal option exists:

  1. We can do a Scala version check and have two different code paths based on the Scala version.
    In Scala 2.11.8+ we can hook our init just before printWelcome(). [EDIT] This won't help either. :(
    In Scala 2.11.11 we can do it just before loopPostInit().
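A version check like the one proposed could be sketched with the standard library (illustrative only; the hook names in the comment are the ones discussed above):

```scala
import scala.util.Properties

// Parse the patch component of the running Scala version, e.g. 8 from "2.11.8".
val patch: Int = Properties.versionNumberString
  .split("\\.")
  .lift(2)
  .map(_.takeWhile(_.isDigit))
  .filter(_.nonEmpty)
  .map(_.toInt)
  .getOrElse(0)

// loadFiles was removed in 2.11.9, so only older patch levels can use it.
val useLoadFilesHook: Boolean = patch <= 8
```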

@dragos
Contributor

dragos commented May 15, 2017

If I'm not mistaken the current patch will fail on files passed via -i to spark-shell, since Spark is initialized after process is done (the SparkContext is not available during initialization, when -i files are loaded).

I'm not sure if there's a better way, especially one that works on all 2.11 series. Mea culpa for not adding a comment on loadFiles in the Scala REPL when I worked on the Spark Shell. Maybe @som-snytt has a suggestion for inserting initializeSpark in the initialization pipeline?

@SparkQA

SparkQA commented May 15, 2017

Test build #76936 has finished for PR 17982 at commit 794d18d.

  • This patch fails from timeout after a configured wait of `250m`.
  • This patch merges cleanly.
  • This patch adds no public classes.

@som-snytt

The contract was never well-defined.

sbt overrides createInterpreter and runs its initialCommands before returning. So that code runs before the REPL finishes init in loopPostInit: no $intp yet, power mode not entered, start code and -i files not yet run.

I can't tell just by looking if that suits your needs here, but it's the most ancient hook enshrined by sbt.

https://github.com/sbt/sbt/blob/0.13/compile/interface/src/main/scala/xsbt/ConsoleInterface.scala#L27
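That sbt hook amounts to something like the following (a rough sketch of the linked ConsoleInterface code, not sbt's exact source; initialCommands is supplied by sbt's configuration):

```scala
import scala.tools.nsc.interpreter.ILoop

val initialCommands = """println("hello from init")"""  // sbt supplies this

val loop = new ILoop {
  // Run the init commands as soon as the interpreter exists, i.e. before
  // loopPostInit, power mode, start code, and any -i files.
  override def createInterpreter(): Unit = {
    super.createInterpreter()
    intp.beQuietDuring { intp.interpret(initialCommands) }
  }
}
```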

@adriaanm is taking executive action to modularize the REPL and make an API for its reuse. Possibly you're aware, or he's only just started and is in stealth mode. That will address the issue in the future.

@srowen
Member Author

srowen commented May 15, 2017

Yeah, this was always a bit of a hack, as this isn't an API. createInterpreter sounds like a promising place to hook into for now, as it exists across these versions, and it would mean Spark is initialized a little earlier: before, instead of after, this middle block here:

  def process(settings: Settings): Boolean = savingContextLoader {
    this.settings = settings
    createInterpreter()

    // sets in to some kind of reader depending on environmental cues
    in = in0.fold(chooseReader(settings))(r => SimpleReader(r, out, interactive = true))
    globalFuture = future {
      intp.initializeSynchronous()
      loopPostInit()
      !intp.reporter.hasErrors
    }
    loadFiles(settings)
    printWelcome()
...
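Applied to SparkILoop, the createInterpreter hook would look roughly like this (a sketch under the assumptions above; initializeSpark stands in for Spark's existing REPL init helper):

```scala
import scala.tools.nsc.interpreter.ILoop

class SparkILoop extends ILoop {
  def initializeSpark(): Unit = { /* bind `spark` and `sc` in the interpreter */ }

  // createInterpreter exists in both 2.11.8 and 2.11.11, so one code path
  // covers both. Init runs right after the interpreter is created, hence
  // before loadFiles/-i scripts in 2.11.8 and before startup() in 2.11.11.
  override def createInterpreter(): Unit = {
    super.createInterpreter()
    intp.beQuietDuring(initializeSpark())
  }
}
```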

@SparkQA

SparkQA commented May 15, 2017

Test build #76946 has finished for PR 17982 at commit e1482c3.

  • This patch fails from timeout after a configured wait of `250m`.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented May 19, 2017

Test build #3728 has started for PR 17982 at commit e1482c3.

@SparkQA

SparkQA commented May 19, 2017

Test build #3741 has finished for PR 17982 at commit e1482c3.

  • This patch fails from timeout after a configured wait of `250m`.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented May 19, 2017

Test build #77096 has finished for PR 17982 at commit b5a7035.

  • This patch fails from timeout after a configured wait of `250m`.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member Author

srowen commented May 20, 2017

Darn. I don't know if this is going to work. I'm closing this for now.

@srowen srowen closed this May 20, 2017
@som-snytt

Thanks for the effort. I'll take a hack soon. If it's hopeless, I'll at least try to track developments with the new REPL API.

@tovbinm
Member

tovbinm commented Feb 9, 2018

@som-snytt @srowen what about 2.11.12?

@srowen
Member Author

srowen commented Feb 9, 2018

Unless it changed the internals of the interpreter to afford a different place to hack in the init, I don't think it is any different. However, the build and most tests already work with 2.12.

@tovbinm
Member

tovbinm commented Feb 9, 2018

2.12 would be even better. I haven't seen a ticket related to the 2.12 upgrade. I can give it a try.
