Commit c1ff2103 authored by Mosharaf Chowdhury

Fixed some comments.

parent 8dc44bfa
- -Dspark.shuffle.class=spark.CustomBlockedInMemoryShuffle -Dspark.blockedLocalFileShuffle.maxRxConnections=2 -Dspark.blockedLocalFileShuffle.maxTxConnections=2 -Dspark.blockedLocalFileShuffle.blockSize=256 -Dspark.blockedLocalFileShuffle.minKnockInterval=50 -Dspark.blockedInMemoryShuffle.maxRxConnections=2 -Dspark.blockedInMemoryShuffle.maxTxConnections=2 -Dspark.blockedInMemoryShuffle.minKnockInterval=50 -Dspark.blockedInMemoryShuffle.maxKnockInterval=2000 -Dspark.parallelLocalFileShuffle.maxRxConnections=2 -Dspark.parallelLocalFileShuffle.maxTxConnections=2 -Dspark.parallelLocalFileShuffle.minKnockInterval=50 -Dspark.parallelLocalFileShuffle.maxKnockInterval=2000 -Dspark.parallelInMemoryShuffle.maxRxConnections=2 -Dspark.parallelInMemoryShuffle.maxTxConnections=2 -Dspark.parallelInMemoryShuffle.minKnockInterval=50 -Dspark.parallelInMemoryShuffle.maxKnockInterval=2000
+ -Dspark.shuffle.class=spark.CustomBlockedInMemoryShuffle -Dspark.blockedLocalFileShuffle.maxRxConnections=2 -Dspark.blockedLocalFileShuffle.maxTxConnections=2 -Dspark.blockedLocalFileShuffle.blockSize=256 -Dspark.blockedLocalFileShuffle.minKnockInterval=50 -Dspark.blockedInMemoryShuffle.maxRxConnections=2 -Dspark.blockedInMemoryShuffle.maxTxConnections=2 -Dspark.blockedInMemoryShuffle.minKnockInterval=50 -Dspark.blockedInMemoryShuffle.maxKnockInterval=2000 -Dspark.blockedInMemoryShuffle.blockSize=256 -Dspark.parallelLocalFileShuffle.maxRxConnections=2 -Dspark.parallelLocalFileShuffle.maxTxConnections=2 -Dspark.parallelLocalFileShuffle.minKnockInterval=50 -Dspark.parallelLocalFileShuffle.maxKnockInterval=2000 -Dspark.parallelInMemoryShuffle.maxRxConnections=2 -Dspark.parallelInMemoryShuffle.maxTxConnections=2 -Dspark.parallelInMemoryShuffle.minKnockInterval=50 -Dspark.parallelInMemoryShuffle.maxKnockInterval=2000
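The option string above is a list of JVM system properties; this commit adds `-Dspark.blockedInMemoryShuffle.blockSize=256`. A minimal launch sketch, assuming the convention of collecting extra JVM flags in a `SPARK_JAVA_OPTS` environment variable (that variable name is an assumption; the `-D` property names come from the run script above):

```shell
#!/bin/sh
# Sketch: collect the shuffle tuning flags into the environment variable
# a run script would pass to the JVM. SPARK_JAVA_OPTS is an assumption;
# the -D property names are taken from the run script diff above.
SPARK_JAVA_OPTS="-Dspark.shuffle.class=spark.CustomBlockedInMemoryShuffle"
SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.blockedInMemoryShuffle.maxRxConnections=2"
SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.blockedInMemoryShuffle.maxTxConnections=2"
# The flag added by this commit: cap the size of each map-output block.
SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.blockedInMemoryShuffle.blockSize=256"
export SPARK_JAVA_OPTS
echo "$SPARK_JAVA_OPTS"
```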
@@ -13,9 +13,9 @@ import scala.collection.mutable.{ArrayBuffer, HashMap}
  *
  * An implementation of shuffle using local memory served through custom server
  * where receivers create simultaneous connections to multiple servers by
- * setting the 'spark.blockedLocalFileShuffle.maxRxConnections' config option.
+ * setting the 'spark.blockedInMemoryShuffle.maxRxConnections' config option.
  *
- * By controlling the 'spark.blockedLocalFileShuffle.blockSize' config option
+ * By controlling the 'spark.blockedInMemoryShuffle.blockSize' config option
  * one can also control the largest block size to divide each map output into.
  * Essentially, instead of creating one large output file for each reducer, maps
  * create multiple smaller files to enable finer level of engagement.
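The comment above says `blockSize` caps how large each block of a map output may be, so one map output is divided into roughly ceil(outputSize / blockSize) smaller files. A quick arithmetic sketch (the units and the example output size are assumptions; the run script sets `blockSize=256`):

```shell
#!/bin/sh
# Sketch: how many block files one map output is split into, assuming
# blockSize is a per-block size cap (256, per the run script above) and
# OUTPUT is a hypothetical map-output size in the same units.
BLOCK_SIZE=256
OUTPUT=1000
NUM_BLOCKS=$(( (OUTPUT + BLOCK_SIZE - 1) / BLOCK_SIZE ))   # ceiling division
echo "$NUM_BLOCKS"   # 4: three full 256-unit blocks plus one 232-unit block
```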
@@ -13,7 +13,7 @@ import scala.collection.mutable.{ArrayBuffer, HashMap}
  *
  * An implementation of shuffle using local memory served through custom server
  * where receivers create simultaneous connections to multiple servers by
- * setting the 'spark.parallelLocalFileShuffle.maxRxConnections' config option.
+ * setting the 'spark.parallelInMemoryShuffle.maxRxConnections' config option.
  *
  * TODO: Add support for compression when spark.compress is set to true.
  */