Commit b061bd51 authored by Yin Huai's avatar Yin Huai Committed by Cheng Lian

[SQL] In InsertIntoFSBasedRelation.insert, log cause before abort job/task.

We need to add a log entry before calling `abortTask`/`abortJob`. Otherwise, an exception from `abortTask`/`abortJob` will shadow the real cause.
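The shadowing problem the commit message describes can be sketched as follows. This is a minimal, self-contained illustration (the `abortTask` here is a hypothetical stand-in, not Spark's real writer container): when cleanup inside a `catch` block itself throws, the cleanup exception propagates and the original cause is lost unless it was logged first.

```scala
object ShadowDemo {
  // Hypothetical abort step that itself fails, e.g. because the task
  // has already lost its output connection.
  def abortTask(): Unit = throw new RuntimeException("abort failed")

  // Returns the exception that actually surfaces to the caller.
  def withoutLogging(): Throwable =
    try {
      try {
        throw new RuntimeException("real cause")
      } catch { case cause: Throwable =>
        abortTask()                       // throws, shadowing `cause`
        throw new RuntimeException("Task failed.", cause)  // never reached
      }
    } catch { case surfaced: Throwable => surfaced }

  def main(args: Array[String]): Unit = {
    // The surfaced exception is the abort failure, not the real cause.
    println(withoutLogging().getMessage)
  }
}
```

Running this prints `abort failed`: the `"real cause"` exception is gone, which is exactly why the patch logs the cause before calling the abort methods.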

cc liancheng

Author: Yin Huai <yhuai@databricks.com>

Closes #6105 from yhuai/logCause and squashes the following commits:

8dfe0d8 [Yin Huai] Log cause.
parent 10c546e9
@@ -121,6 +121,7 @@ private[sql] case class InsertIntoFSBasedRelation(
       writerContainer.commitJob()
       relation.refresh()
     } catch { case cause: Throwable =>
+      logError("Aborting job.", cause)
       writerContainer.abortJob()
       throw new SparkException("Job aborted.", cause)
     }
@@ -143,6 +144,7 @@ private[sql] case class InsertIntoFSBasedRelation(
       }
       writerContainer.commitTask()
     } catch { case cause: Throwable =>
+      logError("Aborting task.", cause)
       writerContainer.abortTask()
       throw new SparkException("Task failed while writing rows.", cause)
     }