
Conversation

@sameeragarwal
Member

What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks if the code within the block throws an exception. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem by implementing a utility function `Utils.tryWithSafeCatch` that suppresses (`Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.
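A minimal sketch of the pattern the PR describes (a hypothetical standalone version; the merged `Utils.tryWithSafeCatch` lives in Spark's `Utils` object and may differ in detail):

```scala
object ExceptionUtils {
  // Runs `block`; if it throws, runs `catchBlock` for cleanup. Any exception the
  // cleanup itself throws is attached to the original via Throwable.addSuppressed,
  // and the original exception is rethrown, so the root cause is never lost.
  def tryWithSafeCatch[T](block: => T)(catchBlock: => Unit): T = {
    try {
      block
    } catch {
      case cause: Throwable =>
        try {
          catchBlock
        } catch {
          case suppressed: Throwable => cause.addSuppressed(suppressed)
        }
        throw cause
    }
  }
}
```

With this, the `WriterContainer` catch block can run `markTaskFailed`, close the writer, and abort the task inside `catchBlock`, and a failure in any of those steps surfaces as a suppressed exception on the original cause rather than replacing it.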

How was this patch tested?

No new functionality added

@SparkQA

SparkQA commented Apr 7, 2016

Test build #55196 has finished for PR 12234 at commit eebd2ef.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@sameeragarwal sameeragarwal changed the title [SPARK-14454] Better exception handling while marking tasks as failed [WIP][SPARK-14454] Better exception handling while marking tasks as failed Apr 7, 2016
Contributor


So the thing is, this exception doesn't make it to the driver. It would be great if the error message that reaches the driver contained both errors: the one from the catch block and the original exception's cause. Then users would know another exception occurred during close/callback, and they could go look up the full stacktrace on the executor.
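One way the reviewer's suggestion could look as a message builder (purely illustrative; this helper does not exist in Spark):

```scala
// Hypothetical helper: build a driver-facing message that mentions both the
// original cause and any failures that happened during cleanup/callbacks.
def taskFailureMessage(cause: Throwable): String = {
  val suppressed = cause.getSuppressed
  if (suppressed.isEmpty) {
    s"Task failed while writing rows: ${cause.getMessage}"
  } else {
    val cleanupErrors = suppressed.map(_.getMessage).mkString("; ")
    s"Task failed while writing rows: ${cause.getMessage} " +
      s"(additional failures during cleanup/callbacks: $cleanupErrors; " +
      "see executor logs for full stacktraces)"
  }
}
```

This keeps the original cause front and center while still telling the user there is more to find in the executor logs.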

Contributor


nice - didn't know this existed

@SparkQA

SparkQA commented Apr 7, 2016

Test build #55197 has finished for PR 12234 at commit 7964b2d.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 7, 2016

Test build #55198 has finished for PR 12234 at commit 94b37f2.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

```scala
 * fail as well. This would then suppress the original/likely more meaningful
 * exception from the original `out.write` call.
 */
def tryWithSafeCatchAndFailureCallbacks[T](block: => T)(catchBlock: => Unit): T = {
```
Member


This seems redundant with the method above; they can be unified right?

Member Author


Yes, thanks. Folded these changes into the method above.

@SparkQA

SparkQA commented Apr 7, 2016

Test build #55199 has finished for PR 12234 at commit c9aaff0.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@sameeragarwal
Member Author

@davies are there other occurrences of this pattern?

@sameeragarwal sameeragarwal changed the title [WIP][SPARK-14454] Better exception handling while marking tasks as failed [SPARK-14454] Better exception handling while marking tasks as failed Apr 8, 2016
@sameeragarwal
Member Author

test this please

@SparkQA

SparkQA commented Apr 8, 2016

Test build #55297 has finished for PR 12234 at commit e41cae8.

  • This patch fails from timeout after a configured wait of 250m.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Apr 8, 2016

Test build #55323 has finished for PR 12234 at commit e41cae8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@davies
Contributor

davies commented Apr 9, 2016

LGTM,
Merging this into master, thanks!

@asfgit asfgit closed this in 813e96e Apr 9, 2016
@rxin
Contributor

rxin commented Apr 9, 2016

@sameeragarwal can you create one for the 1.6 backport?

asfgit pushed a commit that referenced this pull request Apr 11, 2016
…failed

Backports #12234 to 1.6. Original description below:

## What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks if the code within the block throws an exception. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem by implementing a utility function `Utils.tryWithSafeCatch` that suppresses (`Throwable.addSuppressed`) any exceptions thrown within the catch block and rethrows the original exception.

## How was this patch tested?

No new functionality added

Author: Sameer Agarwal <[email protected]>

Closes #12272 from sameeragarwal/fix-exception-1.6.
zzcclp pushed a commit to zzcclp/spark that referenced this pull request Apr 12, 2016
…failed

Backports apache#12234 to 1.6; same description as the commit above.

Author: Sameer Agarwal <[email protected]>

Closes apache#12272 from sameeragarwal/fix-exception-1.6.

(cherry picked from commit c12db0d)