[SPARK-11324][STREAMING] Flag for closing Write Ahead Logs after a write
Currently the Write Ahead Log in Spark Streaming flushes data as writes are made. S3 does not support flushing: data is only persisted once the stream is actually closed. In case of failure, the data from the last rolling interval (one minute by default) will not be properly written. Therefore we need a flag to close the stream after each write, so that we achieve read-after-write consistency. cc tdas zsxwing Author: Burak Yavuz <brkyvz@gmail.com> Closes #9285 from brkyvz/caw-wal.
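The behavior described above is exposed through the `closeFileAfterWrite` configuration keys introduced for the driver and receiver write-ahead logs. A minimal sketch of how a user would enable them when backing the WAL with S3 (the checkpoint path below is an illustrative placeholder):

```properties
# Close the WAL stream after every write so data is persisted on S3,
# which does not support flush; trades throughput for durability.
spark.streaming.driver.writeAheadLog.closeFileAfterWrite    true
spark.streaming.receiver.writeAheadLog.closeFileAfterWrite  true

# Illustrative checkpoint location on S3 (placeholder bucket/path).
# The WAL files are written under this checkpoint directory.
# spark.streaming.checkpoint.directory is set via
# StreamingContext.checkpoint("s3a://my-bucket/checkpoints") in code.
```

Because each write now opens and closes a file, this setting is intended for filesystems without a working flush (such as S3), not for HDFS, where the default flush-based behavior is cheaper.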
Showing 3 changed files:
- streaming/src/main/scala/org/apache/spark/streaming/util/FileBasedWriteAheadLog.scala (5 additions, 1 deletion)
- streaming/src/main/scala/org/apache/spark/streaming/util/WriteAheadLogUtils.scala (14 additions, 1 deletion)
- streaming/src/test/scala/org/apache/spark/streaming/util/WriteAheadLogSuite.scala (25 additions, 7 deletions)