Commit 04a49edf authored by zsxwing's avatar zsxwing Committed by Kay Ousterhout

[SPARK-9497] [SPARK-9509] [CORE] Use ask instead of askWithRetry

`RpcEndpointRef.askWithRetry` throws `SparkException` rather than `TimeoutException`. Use `ask` instead, because we don't need to retry here.
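The behavioral difference the commit relies on can be sketched in plain Scala, without Spark's RPC classes. Here `Await.result` plays the role of `RpcTimeout.awaitResult`, and a never-completed `Promise` is a hypothetical stand-in for a master that does not reply to `StopAppClient`; the point is that a timed wait on a single `ask` surfaces a catchable `TimeoutException` instead of retrying and wrapping the failure:

```scala
import java.util.concurrent.TimeoutException
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.duration._

object AskTimeoutSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for endpoint.ask[Boolean](StopAppClient)
    // against a master that never replies.
    val reply: Future[Boolean] = Promise[Boolean]().future

    try {
      // Mirrors timeout.awaitResult(...): block for at most the RPC
      // timeout, then let TimeoutException propagate rather than retry.
      Await.result(reply, 100.millis)
    } catch {
      case _: TimeoutException =>
        println("Stop request timed out; the master may already be shut down.")
    }
  }
}
```

With `askWithRetry`, the same scenario would instead fail only after several resends, and the caller's `case e: TimeoutException` handler would never match the resulting `SparkException`.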

Author: zsxwing <zsxwing@gmail.com>

Closes #7824 from zsxwing/SPARK-9497 and squashes the following commits:

7bfc2b4 [zsxwing] Use ask instead of askWithRetry
parent fc0e57e5
@@ -27,7 +27,7 @@ import org.apache.spark.deploy.{ApplicationDescription, ExecutorState}
 import org.apache.spark.deploy.DeployMessages._
 import org.apache.spark.deploy.master.Master
 import org.apache.spark.rpc._
-import org.apache.spark.util.{ThreadUtils, Utils}
+import org.apache.spark.util.{RpcUtils, ThreadUtils, Utils}
 
 /**
  * Interface allowing applications to speak with a Spark deploy cluster. Takes a master URL,
@@ -248,7 +248,8 @@ private[spark] class AppClient(
   def stop() {
     if (endpoint != null) {
       try {
-        endpoint.askWithRetry[Boolean](StopAppClient)
+        val timeout = RpcUtils.askRpcTimeout(conf)
+        timeout.awaitResult(endpoint.ask[Boolean](StopAppClient))
      } catch {
        case e: TimeoutException =>
          logInfo("Stop request to Master timed out; it may already be shut down.")