This repository was archived by the owner on Jan 9, 2020. It is now read-only.
forked from apache/spark
Launcher doesn't stop when SparkException thrown #143
Closed
Description
As reported by @kimoonkim at #120 (comment):
Found another case where the launcher did not stop. My k8s cluster uses an overlay network, which was broken because the overlay daemons crashed. This prevented the launcher from uploading local jars. The good news is that our `Client` code correctly recognized this and shut down the pod and service:

    Exception in thread "main" org.apache.spark.SparkException: Failed to submit the application to the driver pod.
        at org.apache.spark.deploy.kubernetes.Client$$anonfun$run$7$$anonfun$apply$6.apply(Client.scala:169)
        at org.apache.spark.deploy.kubernetes.Client$$anonfun$run$7$$anonfun$apply$6.apply(Client.scala:131)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2530)
        at org.apache.spark.deploy.kubernetes.Client$$anonfun$run$7.apply(Client.scala:131)
        at org.apache.spark.deploy.kubernetes.Client$$anonfun$run$7.apply(Client.scala:109)
        at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2530)
        at org.apache.spark.deploy.kubernetes.Client.run(Client.scala:109)
        at org.apache.spark.deploy.kubernetes.Client$.main(Client.scala:812)
        at org.apache.spark.deploy.kubernetes.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:747)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:178)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:117)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

However, `LoggingPodStatusWatcher` was not informed of this and kept going:

    2017-02-23 13:00:06 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:07 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:08 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:09 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:10 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:11 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:12 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    2017-02-23 13:00:13 INFO  LoggingPodStatusWatcher:54 - Application status for spark-pi-1487883497166 (phase: Running)
    ...
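For context, the failure mode is the usual "watcher thread is never signalled on the error path" pattern: the submission fails, but nothing tells the status watcher to stop, so the launcher JVM keeps logging. Below is a minimal, hypothetical Scala sketch of that signalling, not the fork's actual `Client`/`LoggingPodStatusWatcher` code; the names `PodStatusLogger` and `submitApplication` are placeholders. The point it illustrates is that the watcher must be stopped on every exit path, including the `SparkException` path, e.g. via try/finally.

```scala
import java.util.concurrent.{CountDownLatch, Executors, TimeUnit}

// Hypothetical sketch only: illustrates signalling a status-watcher thread to
// stop when submission fails, so the launcher process can exit.
object LauncherShutdownSketch {

  // Stands in for the status watcher: logs a status line every second
  // until stop() is called, instead of polling forever.
  class PodStatusLogger(appId: String) {
    private val stopped = new CountDownLatch(1)
    private val executor = Executors.newSingleThreadExecutor()

    def start(): Unit = executor.submit(new Runnable {
      override def run(): Unit = {
        // await() returns true once stop() has counted the latch down.
        while (!stopped.await(1, TimeUnit.SECONDS)) {
          println(s"Application status for $appId (phase: Running)")
        }
        println(s"Stopped watching $appId")
      }
    })

    def stop(): Unit = {
      stopped.countDown()
      executor.shutdown()
    }
  }

  // Stands in for the submission step that failed in the report above.
  private def submitApplication(): Unit =
    throw new RuntimeException("Failed to submit the application to the driver pod.")

  def main(args: Array[String]): Unit = {
    val watcher = new PodStatusLogger("spark-pi-1487883497166")
    watcher.start()
    try {
      submitApplication()
    } finally {
      // The key point: signal the watcher on failure as well as success,
      // otherwise its thread keeps the launcher alive and logging.
      watcher.stop()
    }
  }
}
```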