Commit 769aced9 authored by Kousuke Saruta's avatar Kousuke Saruta Committed by Patrick Wendell

[SPARK-5329][WebUI] UIWorkloadGenerator should stop SparkContext.

UIWorkloadGenerator doesn't stop its SparkContext. I ran UIWorkloadGenerator and tried to watch the result in the WebUI, but the jobs were not marked as finished.
This is because the SparkContext is never stopped.

Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>

Closes #4112 from sarutak/SPARK-5329 and squashes the following commits:

bcc0fa9 [Kousuke Saruta] Disabled scalastyle for a block comment
86a3b95 [Kousuke Saruta] Fixed UIWorkloadGenerator to stop SparkContext in it
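
For reference, a minimal, self-contained sketch of the pattern this change applies: run a bounded number of job sets and then stop the SparkContext so the application shuts down cleanly and is marked as finished. This is not the patch itself; the object name, the local[2] default master, the placeholder count job, and the try/finally wrapper are assumptions added for illustration (the actual change simply calls sc.stop() after the last job set).

import org.apache.spark.{SparkConf, SparkContext}

object StopContextSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative defaults; the real tool requires master and scheduling mode.
    val master = if (args.length > 0) args(0) else "local[2]"
    val nJobSet = if (args.length > 1) args(1).toInt else 1
    val conf = new SparkConf().setMaster(master).setAppName("StopContextSketch")
    val sc = new SparkContext(conf)
    try {
      // Run a bounded number of job sets instead of looping forever.
      (1 to nJobSet).foreach { _ =>
        sc.parallelize(1 to 1000, 10).count()  // stand-in for the generator's job mix
      }
    } finally {
      // Without this call the application never shuts down cleanly and is not
      // marked as finished in the WebUI / event logs.
      sc.stop()
    }
  }
}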
parent c93a57f0
@@ -22,20 +22,23 @@ import scala.util.Random
 import org.apache.spark.{SparkConf, SparkContext}
 import org.apache.spark.scheduler.SchedulingMode

+// scalastyle:off
 /**
  * Continuously generates jobs that expose various features of the WebUI (internal testing tool).
  *
- * Usage: ./bin/spark-class org.apache.spark.ui.UIWorkloadGenerator [master] [FIFO|FAIR]
+ * Usage: ./bin/spark-class org.apache.spark.ui.UIWorkloadGenerator [master] [FIFO|FAIR] [#job set (4 jobs per set)]
  */
+// scalastyle:on
 private[spark] object UIWorkloadGenerator {

   val NUM_PARTITIONS = 100
   val INTER_JOB_WAIT_MS = 5000

   def main(args: Array[String]) {
-    if (args.length < 2) {
+    if (args.length < 3) {
       println(
-        "usage: ./bin/spark-class org.apache.spark.ui.UIWorkloadGenerator [master] [FIFO|FAIR]")
+        "usage: ./bin/spark-class org.apache.spark.ui.UIWorkloadGenerator " +
+          "[master] [FIFO|FAIR] [#job set (4 jobs per set)]")
       System.exit(1)
     }

@@ -45,6 +48,7 @@ private[spark] object UIWorkloadGenerator {
     if (schedulingMode == SchedulingMode.FAIR) {
       conf.set("spark.scheduler.mode", "FAIR")
     }
+    val nJobSet = args(2).toInt
     val sc = new SparkContext(conf)

     def setProperties(s: String) = {
@@ -84,7 +88,7 @@ private[spark] object UIWorkloadGenerator {
       ("Job with delays", baseData.map(x => Thread.sleep(100)).count)
     )

-    while (true) {
+    (1 to nJobSet).foreach { _ =>
       for ((desc, job) <- jobs) {
         new Thread {
           override def run() {
@@ -101,5 +105,6 @@ private[spark] object UIWorkloadGenerator {
         Thread.sleep(INTER_JOB_WAIT_MS)
       }
     }
+    sc.stop()
   }
 }
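
With this change the generator expects a third argument giving the number of job sets to run (4 jobs per set) before the SparkContext is stopped. A typical invocation would look like the following, where the local[2] master and the count of 2 are just example values:

./bin/spark-class org.apache.spark.ui.UIWorkloadGenerator local[2] FAIR 2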