[SPARK-14454] Better exception handling while marking tasks as failed
## What changes were proposed in this pull request?

This patch adds support for better handling of exceptions inside catch blocks if the code within the block throws an exception. For instance, here is the code in a catch block before this change in `WriterContainer.scala`:

```scala
logError("Aborting task.", cause)
// call failure callbacks first, so we could have a chance to cleanup the writer.
TaskContext.get().asInstanceOf[TaskContextImpl].markTaskFailed(cause)
if (currentWriter != null) {
  currentWriter.close()
}
abortTask()
throw new SparkException("Task failed while writing rows.", cause)
```

If `markTaskFailed` or `currentWriter.close` throws an exception, we currently lose the original cause. This PR fixes the problem by adding a utility function, `Utils.tryWithSafeCatch`, that suppresses (via `Throwable.addSuppressed`) any exception thrown within the catch block and rethrows the original exception.

## How was this patch tested?

No new functionality added.

Author: Sameer Agarwal <sameer@databricks.com>

Closes #12234 from sameeragarwal/fix-exception.
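To illustrate the idea, here is a minimal, hypothetical sketch of how such a helper can preserve the original cause. This is not the implementation from the PR; the object name `SafeCatchSketch` and the parameter names `block` and `catchBlock` are illustrative:

```scala
// A minimal, hypothetical sketch of the exception-suppression pattern
// described above (not Spark's actual implementation).
object SafeCatchSketch {
  def tryWithSafeCatch[T](block: => T)(catchBlock: => Unit): T = {
    try {
      block
    } catch {
      case cause: Throwable =>
        try {
          // Run the cleanup/failure callbacks that would normally live
          // in the catch block (e.g. markTaskFailed, writer.close).
          catchBlock
        } catch {
          case t: Throwable =>
            // Attach the secondary failure instead of letting it mask
            // the original cause.
            cause.addSuppressed(t)
        }
        // Always rethrow the original exception.
        throw cause
    }
  }

  def main(args: Array[String]): Unit = {
    try {
      tryWithSafeCatch[Unit] {
        throw new RuntimeException("original failure")
      } {
        throw new IllegalStateException("failure during cleanup")
      }
    } catch {
      case e: Throwable =>
        println(e.getMessage)                                 // original failure
        e.getSuppressed.foreach(s => println(s.getMessage))   // failure during cleanup
    }
  }
}
```

With this shape, a cleanup failure surfaces in the stack trace as a suppressed exception attached to the original one, rather than replacing it.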
Showing 4 changed files with 65 additions and 57 deletions
- core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala (2 additions, 6 deletions)
- core/src/main/scala/org/apache/spark/scheduler/Task.scala (10 additions, 4 deletions)
- core/src/main/scala/org/apache/spark/util/Utils.scala (19 additions, 10 deletions)
- sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/WriterContainer.scala (34 additions, 37 deletions)