[SPARK-1740] [PySpark] kill the python worker
Kill only the Python worker related to cancelled tasks. The daemon starts a background thread to monitor the open sockets of all workers. If a socket is closed by the JVM, this thread kills the corresponding worker. When a task is cancelled, the socket to its worker is closed, and the worker is then killed by the daemon.

Author: Davies Liu <davies.liu@gmail.com>

Closes #1643 from davies/kill and squashes the following commits:

- 8ffe9f3 [Davies Liu] kill worker by deamon, because runtime.exec() is too heavy
- 46ca150 [Davies Liu] address comment
- acd751c [Davies Liu] kill the worker when task is canceled
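The socket-monitoring approach described above can be sketched as follows. This is a minimal illustration of the idea, not PySpark's actual daemon.py; the function name `monitor_worker_sockets` and the `socket_to_pid` mapping are hypothetical:

```python
import os
import select
import signal
import socket

def monitor_worker_sockets(socket_to_pid, poll_interval=0.1):
    """Background-thread loop: watch each worker's socket; when the peer
    (the JVM) closes it, the socket becomes readable and a peek returns
    b'', so send SIGKILL to that worker (Unix-only sketch)."""
    while socket_to_pid:
        readable, _, _ = select.select(list(socket_to_pid), [], [], poll_interval)
        for sock in readable:
            try:
                # Peek so any real payload stays in the buffer for the worker.
                data = sock.recv(1, socket.MSG_PEEK)
            except OSError:
                data = b""  # e.g. connection reset: treat as closed
            if not data:  # peer closed the socket -> task was cancelled
                pid = socket_to_pid.pop(sock)
                try:
                    os.kill(pid, signal.SIGKILL)
                except ProcessLookupError:
                    pass  # worker already exited on its own
                sock.close()
```

Because the daemon signals the worker process directly, this avoids spawning a helper process per kill (the `runtime.exec()` cost mentioned in the first squashed commit).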
Showing 5 changed files with 125 additions and 28 deletions
- core/src/main/scala/org/apache/spark/SparkEnv.scala (3 additions, 2 deletions)
- core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala (4 additions, 5 deletions)
- core/src/main/scala/org/apache/spark/api/python/PythonWorkerFactory.scala (49 additions, 15 deletions)
- python/pyspark/daemon.py (18 additions, 6 deletions)
- python/pyspark/tests.py (51 additions, 0 deletions)