Conversation

@srowen (Member) commented Dec 31, 2014

Add servlet-api excludes from SPARK-1776 to avoid javax.servlet.FilterRegistration signer error in branch 1.0.
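
For reference, this is purely a build change. The snippet below is a hypothetical SBT-style sketch of the kind of servlet-api exclusion SPARK-1776 introduced, not the actual branch-1.0 patch (the real change edits the branch's build files, and the exact coordinates may differ): stray servlet-api jars pulled in transitively clash with the signed javax.servlet classes that Jetty provides, which is what produces the FilterRegistration signer error.

```scala
// Hypothetical build.sbt sketch, not the actual patch for this PR.
// Keep transitively-pulled servlet-api artifacts off the classpath so they
// cannot conflict with the signed javax.servlet classes shipped with Jetty.
libraryDependencies += ("org.apache.hadoop" % "hadoop-client" % "2.2.0")
  .excludeAll(
    ExclusionRule(organization = "javax.servlet", name = "servlet-api"),
    ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api")
  )
```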

@SparkQA commented Dec 31, 2014

Test build #24977 has started for PR 3864 at commit 4fcfd50.

  • This patch merges cleanly.

@SparkQA commented Jan 1, 2015

Test build #24977 has finished for PR 3864 at commit 4fcfd50.

  • This patch fails some tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24977/

@JoshRosen (Contributor) commented

Hmm, only one test failing now. Maybe it's flaky; let's see.

@JoshRosen (Contributor) commented

Jenkins, retest this please.

@SparkQA commented Jan 1, 2015

Test build #24980 has started for PR 3864 at commit 4fcfd50.

  • This patch merges cleanly.

@SparkQA commented Jan 1, 2015

Test build #24980 has finished for PR 3864 at commit 4fcfd50.

  • This patch fails some tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins commented

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24980/

@JoshRosen (Contributor) commented

It looks like a few core tests are failing; I'll check out this branch locally to see if I can reproduce the issue.

@JoshRosen (Contributor) commented

The ContextCleanerSuite failure reproduces locally on my machine, too:

[info] ContextCleanerSuite:
[info] - cleanup RDD
[info] - cleanup shuffle
[info] - cleanup broadcast
[info] - automatically cleanup RDD
[info] - automatically cleanup shuffle
[info] - automatically cleanup broadcast
[info] - automatically cleanup RDD + shuffle + broadcast
[info] - automatically cleanup RDD + shuffle + broadcast in distributed mode *** FAILED ***
[info]   org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
[info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1054)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1038)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1036)
[info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
[info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1036)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
[info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
[info]   at scala.Option.foreach(Option.scala:236)
[info]   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:635)
[info]   ...
[info] ScalaTest

It turns out that this test uses local-cluster mode, and it looks like it's failing because executors can't be launched.
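
For reference, unlike plain local mode, local-cluster mode starts a real standalone Master and Workers inside the test JVM, and each Worker forks executor processes via ExecutorRunner, which is where bin/compute-classpath.sh gets invoked. A minimal sketch of that kind of setup (hypothetical, not the suite's actual code):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// "local-cluster[2,1,512]" = 2 workers with 1 core and 512 MB each.
// The workers fork real executor JVMs, so they need a valid Spark home
// in order to find bin/compute-classpath.sh when building the command.
val conf = new SparkConf()
  .setMaster("local-cluster[2,1,512]")
  .setAppName("local-cluster-sketch")
val sc = new SparkContext(conf)
try {
  // Any distributed job fails fast if no executor can be launched.
  sc.parallelize(1 to 100, 4).map(_ * 2).count()
} finally {
  sc.stop()
}
```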

Here's a log from Jenkins:

14/12/31 18:23:31.752 ERROR ExecutorRunner: Error running executor
java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
    at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:759)
    at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:72)
    at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
    at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:110)
    at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:125)
    at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:58)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
    ... 6 more
14/12/31 18:23:31.753 INFO Worker: Asked to launch executor app-20141231182331-0000/1 for ContextCleanerSuite
14/12/31 18:23:31.756 INFO Master: Removing executor app-20141231182331-0000/0 because it is FAILED
14/12/31 18:23:31.756 INFO Worker: Executor app-20141231182331-0000/0 finished with state FAILED message java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
14/12/31 18:23:31.757 INFO Master: Launching executor app-20141231182331-0000/2 on worker worker-20141231182331-localhost-51737
14/12/31 18:23:31.757 ERROR ExecutorRunner: Error running executor
java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
    at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:759)
    at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:72)
    at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
    at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:110)
    at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:125)
    at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:58)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
    ... 6 more

In this case, it looks like the path to compute-classpath.sh is wrong, since it includes core/:

/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh

This suggests that SPARK_HOME may not be set properly for these tests; per the stack trace, the failing call is the one in CommandUtils.buildJavaOpts that shells out to compute-classpath.sh:

val classPath = Utils.executeAndGetOutput(

We hit this same error in #3850, which suggests that the SBT build for branch-1.0 may be broken.
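
To make the failure mode concrete, here is a hypothetical illustration (not Spark's actual resolution logic) of how a path like .../core/./bin/compute-classpath.sh can arise: if the Spark home falls back to the current working directory, and the forked test JVM's working directory is the core/ subproject rather than the repository root, the script path resolves under core/ and doesn't exist.

```scala
import java.io.File

// Hypothetical illustration only; the real logic lives in the standalone
// worker / CommandUtils code path shown in the stack trace above.
object BadSparkHomeDemo {
  def main(args: Array[String]): Unit = {
    // If SPARK_HOME is unset, fall back to the working directory, which for
    // a forked sbt test in the core subproject is .../core, not the repo root.
    val sparkHome = sys.env.getOrElse("SPARK_HOME", sys.props("user.dir"))
    val script = new File(sparkHome, "./bin/compute-classpath.sh")
    println(script.getPath)   // e.g. /home/jenkins/.../core/./bin/compute-classpath.sh
    println(script.exists())  // false, matching the IOException (error=2) in the log above
  }
}
```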

@JoshRosen (Contributor) commented

It turns out that we don't have Jenkins jobs to run the SBT tests for maintenance branches, so this bug could have been introduced weeks ago and we've only just noticed it now; I've opened https://issues.apache.org/jira/browse/SPARK-5053 to track progress towards adding SBT jobs for those branches.

@JoshRosen (Contributor) commented

It looks like all of the failing tests use local-cluster mode; this seems similar to a symptom of an issue fixed by #1734, so it could be due to issues with SPARK_HOME or unexpected changes to the current working directory.
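
One way to guard against this class of failure (a sketch only, not the fix applied here) is to pin the Spark home explicitly on the test's SparkConf so the standalone worker doesn't have to infer it from the environment or the working directory. Whether the worker honors an application-supplied home depends on the Spark version, so treat this as illustrative:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: /path/to/spark is a placeholder, and setSparkHome is only a
// hint; the worker may still rely on its own SPARK_HOME in some versions.
val sparkHome = sys.env.getOrElse("SPARK_HOME", "/path/to/spark")
val conf = new SparkConf()
  .setMaster("local-cluster[2,1,512]")
  .setAppName("local-cluster-test")
  .setSparkHome(sparkHome)
val sc = new SparkContext(conf)
```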

@JoshRosen (Contributor) commented

I'm going to merge this now, since it fixes the original issue in branch-1.0. There are still other things that are broken / flaky in that build, but I'll address them separately in a different PR / new JIRAs. By the way, we now have SBT Jenkins builds for the backport branches, which will make it easier to debug these issues in the future.

@JoshRosen (Contributor) commented

(You can close this PR now that I've merged it; GitHub won't auto-close against branch-1.0).

asfgit pushed a commit that referenced this pull request Jan 7, 2015
… to "javax.servlet.FilterRegistration's signer information" errors

Add servlet-api excludes from SPARK-1776 to avoid javax.servlet.FilterRegistration signer error in branch 1.0.

Author: Sean Owen <[email protected]>

Closes #3864 from srowen/SPARK-5039 and squashes the following commits:

4fcfd50 [Sean Owen] Add servlet-api excludes from SPARK-1776 to avoid javax.servlet.FilterRegistration signer error
@srowen closed this Jan 7, 2015
@srowen deleted the SPARK-5039 branch Jan 7, 2015