[SPARK-3993] [PySpark] Fix bug when reusing worker after take()
After take(), there may be garbage left in the socket, and the next task assigned to this worker will hang because of the corrupted data. We should make sure the socket is clean before reusing it: write END_OF_STREAM at the end, and check for it after reading out all results from Python.

Author: Davies Liu <davies.liu@gmail.com>
Author: Davies Liu <davies@databricks.com>

Closes #2838 from davies/fix_reuse and squashes the following commits:

8872914 [Davies Liu] fix tests
660875b [Davies Liu] fix bug while reuse worker after take()
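The sketch below illustrates the idea behind the fix, not the actual PySpark protocol code: the worker appends a sentinel flag after all of its results, and the reader only marks the connection reusable if it actually consumes that sentinel. If the reader stops early (as take() does), the leftover bytes make the sentinel check fail and the socket is discarded instead of being handed back to the worker pool. The helper names, the END_OF_DATA_SECTION flag, and the demo harness are illustrative assumptions.

```python
import socket
import struct

END_OF_STREAM = -4        # sentinel flag; illustrative value in this sketch
END_OF_DATA_SECTION = -1  # illustrative "no more records" flag


def write_int(sock, value):
    sock.sendall(struct.pack("!i", value))


def recv_exact(sock, n):
    # Read exactly n bytes or fail; plain recv() may return short reads.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise EOFError("stream closed")
        buf += chunk
    return buf


def read_int(sock):
    return struct.unpack("!i", recv_exact(sock, 4))[0]


def worker_side(sock, records):
    # Write length-prefixed records, then the end-of-data flag, then the sentinel.
    for rec in records:
        write_int(sock, len(rec))
        sock.sendall(rec)
    write_int(sock, END_OF_DATA_SECTION)
    write_int(sock, END_OF_STREAM)


def reader_side(sock, limit=None):
    # Read records (possibly stopping early, as take() does), then decide
    # whether the socket is clean enough to reuse for the next task.
    results = []
    while True:
        flag = read_int(sock)
        if flag == END_OF_DATA_SECTION:
            break
        results.append(recv_exact(sock, flag))
        if limit is not None and len(results) >= limit:
            # Stopped early: leftover bytes (including the sentinel) are
            # still in the socket, so it must NOT be reused.
            return results, False
    reusable = read_int(sock) == END_OF_STREAM
    return results, reusable


if __name__ == "__main__":
    a, b = socket.socketpair()
    worker_side(a, [b"r1", b"r2", b"r3"])
    out, clean = reader_side(b, limit=2)  # simulate take(2)
    print(out, clean)                     # socket is dirty -> not reusable
```

In the real patch the same check happens on the JVM side as well: PythonRDD.scala only releases the worker back for reuse after it has seen the END_OF_STREAM marker.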
Showing 6 changed files with 44 additions and 5 deletions:
- core/src/main/scala/org/apache/spark/SparkEnv.scala (2 additions, 0 deletions)
- core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala (10 additions, 1 deletion)
- python/pyspark/daemon.py (4 additions, 1 deletion)
- python/pyspark/serializers.py (1 addition, 0 deletions)
- python/pyspark/tests.py (18 additions, 1 deletion)
- python/pyspark/worker.py (9 additions, 2 deletions)