Commit 1656aae2 authored by lewuathe, committed by Aaron Davidson

[SPARK-5073] spark.storage.memoryMapThreshold has two default values

The configuration key spark.storage.memoryMapThreshold was read with two different hard-coded defaults. Because major OS page sizes are about 4KB, the original proposal unified the default to 2 * 4096 bytes; per the squashed commits below, the default was ultimately unified to 2MB (2 * 1024 * 1024 = 2097152 bytes) in both DiskStore and docs/configuration.md.
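For illustration, a minimal sketch (assumed method names, not Spark source) of why duplicated fallback defaults are a problem: each call site that reads the key silently supplies its own value when the key is unset.

    import org.apache.spark.SparkConf

    // Hypothetical pre-fix state: two call sites read the same key but fall
    // back to different hard-coded defaults, so the effective "default"
    // depends on which code path does the read.
    def diskStoreThreshold(conf: SparkConf): Long =
      conf.getLong("spark.storage.memoryMapThreshold", 2 * 4096L)          // 8KB

    def otherThreshold(conf: SparkConf): Long =                            // assumed second call site
      conf.getLong("spark.storage.memoryMapThreshold", 2 * 1024L * 1024L)  // 2MB

    // After this commit the code and docs/configuration.md agree on 2MB.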

Author: lewuathe <lewuathe@me.com>

Closes #3900 from Lewuathe/integrate-memoryMapThreshold and squashes the following commits:

e417acd [lewuathe] [SPARK-5073] Update docs/configuration
834aba4 [lewuathe] [SPARK-5073] Fix style
adcea33 [lewuathe] [SPARK-5073] Integrate memory map threshold to 2MB
fcce2e5 [lewuathe] [SPARK-5073] spark.storage.memoryMapThreshold have two default value
parent 33132609
core/src/main/scala/org/apache/spark/storage/DiskStore.scala
@@ -31,7 +31,8 @@ import org.apache.spark.util.Utils
 private[spark] class DiskStore(blockManager: BlockManager, diskManager: DiskBlockManager)
   extends BlockStore(blockManager) with Logging {

-  val minMemoryMapBytes = blockManager.conf.getLong("spark.storage.memoryMapThreshold", 2 * 4096L)
+  val minMemoryMapBytes = blockManager.conf.getLong(
+    "spark.storage.memoryMapThreshold", 2 * 1024L * 1024L)

   override def getSize(blockId: BlockId): Long = {
     diskManager.getFile(blockId.name).length
...
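For context, a simplified sketch of the decision this value controls, modeled on how DiskStore.getBytes applies minMemoryMapBytes in Spark 1.x (simplified, not the exact source):

    import java.io.{File, IOException, RandomAccessFile}
    import java.nio.ByteBuffer
    import java.nio.channels.FileChannel.MapMode

    def readBlock(file: File, offset: Long, length: Long, minMemoryMapBytes: Long): ByteBuffer = {
      val channel = new RandomAccessFile(file, "r").getChannel
      try {
        if (length < minMemoryMapBytes) {
          // Small block: a plain read avoids the per-mapping OS overhead.
          val buf = ByteBuffer.allocate(length.toInt)
          channel.position(offset)
          while (buf.remaining() > 0) {
            if (channel.read(buf) == -1) throw new IOException("unexpected EOF")
          }
          buf.flip()
          buf
        } else {
          // Large block: memory-map the region instead of copying it on-heap.
          channel.map(MapMode.READ_ONLY, offset, length)
        }
      } finally {
        channel.close()
      }
    }

Raising the default to 2MB means blocks up to 2MB are now read with ordinary I/O; each memory mapping carries a fixed OS bookkeeping cost that is wasted on small blocks.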
docs/configuration.md
@@ -678,7 +678,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.storage.memoryMapThreshold</code></td>
-  <td>8192</td>
+  <td>2097152</td>
   <td>
     Size of a block, in bytes, above which Spark memory maps when reading a block from disk.
     This prevents Spark from memory mapping very small blocks. In general, memory
...
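For reference, the property takes a byte count and can be set programmatically as well as in spark-defaults.conf; a minimal example (app name is hypothetical):

    import org.apache.spark.SparkConf

    // Raise the threshold so only blocks of at least 8MB are memory-mapped.
    val conf = new SparkConf()
      .setAppName("mmap-threshold-example")
      .set("spark.storage.memoryMapThreshold", (8 * 1024 * 1024).toString)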