Commit e269c24d authored by Sandeep, committed by Patrick Wendell

SPARK-1469: Scheduler mode should accept lower-case definitions and have nicer error messages

There are two improvements to Scheduler Mode:
1. Made the built-in modes case-insensitive (fair/FAIR, fifo/FIFO).
2. If an invalid mode is given, a clearer error message is printed.
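For example (a usage sketch, e.g. in spark-shell; the application name below is made up), the lower-case spelling now selects the fair scheduler just like the upper-case one:

import org.apache.spark.SparkConf

// After this change the lower-case spelling works too; previously only the
// exact enumeration names "FAIR"/"FIFO" were accepted.
val conf = new SparkConf()
  .setAppName("SchedulerModeDemo")
  .set("spark.scheduler.mode", "fair")  // equivalent to "FAIR"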

Author: Sandeep <sandeep@techaddict.me>

Closes #388 from techaddict/1469 and squashes the following commits:

a31bbd5 [Sandeep] SPARK-1469: Scheduler mode should accept lower-case definitions and have nicer error messages. There are two improvements to Scheduler Mode: 1. Made the built-in modes case-insensitive (fair/FAIR, fifo/FIFO). 2. If an invalid mode is given, a clearer error message is printed.
parent 82349fbd
@@ -25,5 +25,5 @@ package org.apache.spark.scheduler
 object SchedulingMode extends Enumeration {
   type SchedulingMode = Value

-  val FAIR,FIFO,NONE = Value
+  val FAIR, FIFO, NONE = Value
 }
@@ -99,8 +99,13 @@ private[spark] class TaskSchedulerImpl(
   var schedulableBuilder: SchedulableBuilder = null
   var rootPool: Pool = null
   // default scheduler is FIFO
-  val schedulingMode: SchedulingMode = SchedulingMode.withName(
-    conf.get("spark.scheduler.mode", "FIFO"))
+  private val schedulingModeConf = conf.get("spark.scheduler.mode", "FIFO")
+  val schedulingMode: SchedulingMode = try {
+    SchedulingMode.withName(schedulingModeConf.toUpperCase)
+  } catch {
+    case e: java.util.NoSuchElementException =>
+      throw new SparkException(s"Unrecognized spark.scheduler.mode: $schedulingModeConf")
+  }

   // This is a var so that we can reset it for testing purposes.
   private[spark] var taskResultGetter = new TaskResultGetter(sc.env, this)
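In effect, "fair", "Fair", and "FAIR" all resolve to SchedulingMode.FAIR, and an unknown value now fails with a SparkException naming the offending setting instead of a bare NoSuchElementException. A self-contained sketch of the same pattern outside Spark (the object and helper names are hypothetical, and IllegalArgumentException stands in for SparkException):

object SchedulingModeExample {
  // Mirrors Spark's SchedulingMode enumeration for the purpose of this sketch.
  object SchedulingMode extends Enumeration {
    type SchedulingMode = Value
    val FAIR, FIFO, NONE = Value
  }

  // Hypothetical helper mirroring the patch: upper-case the raw config value
  // before the Enumeration lookup, and turn the lookup failure into an error
  // that names the offending value.
  def parseSchedulingMode(raw: String): SchedulingMode.SchedulingMode =
    try {
      SchedulingMode.withName(raw.toUpperCase)
    } catch {
      case _: java.util.NoSuchElementException =>
        throw new IllegalArgumentException(s"Unrecognized spark.scheduler.mode: $raw")
    }

  def main(args: Array[String]): Unit = {
    println(parseSchedulingMode("fifo"))  // FIFO
    println(parseSchedulingMode("Fair"))  // FAIR
    parseSchedulingMode("hello")          // throws: Unrecognized spark.scheduler.mode: hello
  }
}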