[SPARK-15796][CORE] Reduce spark.memory.fraction default to avoid overrunning old gen in JVM default config

## What changes were proposed in this pull request?

Reduce the `spark.memory.fraction` default to 0.6 so that the unified memory region fits within the default JVM old generation size (2/3 of the heap). See the JIRA discussion. This means a full cache doesn't spill into the new gen.

CC andrewor14

## How was this patch tested?

Jenkins tests.

Author: Sean Owen <sowen@cloudera.com>

Closes #13618 from srowen/SPARK-15796.
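As a rough sketch of the arithmetic behind the change (the 4 GB heap is an illustrative assumption; the 300 MB reserved-memory constant comes from Spark's `UnifiedMemoryManager`, and 2/3 is the old-generation share under the JVM default `-XX:NewRatio=2`):

```python
RESERVED_MB = 300  # UnifiedMemoryManager reserves ~300 MB of the heap

def unified_region_mb(heap_mb, fraction):
    """Size of Spark's unified execution + storage region."""
    return (heap_mb - RESERVED_MB) * fraction

def old_gen_mb(heap_mb):
    """Default JVM old generation: 2/3 of the heap (-XX:NewRatio=2)."""
    return heap_mb * 2 / 3

heap = 4096  # a 4 GB executor heap, for illustration
# Old default of 0.75: the unified region overruns the old gen.
print(unified_region_mb(heap, 0.75) <= old_gen_mb(heap))  # False
# New default of 0.6: a full cache fits inside the old gen.
print(unified_region_mb(heap, 0.6) <= old_gen_mb(heap))   # True
```

With 0.75, the unified region on a 4 GB heap is (4096 − 300) × 0.75 ≈ 2847 MB, which exceeds the ≈2731 MB old generation; at 0.6 it is ≈2278 MB, which fits, so long-lived cached blocks are not promoted out of a too-small old gen.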
Showing 4 changed files:

- core/src/main/scala/org/apache/spark/memory/UnifiedMemoryManager.scala (4 additions, 4 deletions)
- core/src/test/scala/org/apache/spark/DistributedSuite.scala (1 addition, 1 deletion)
- docs/configuration.md (4 additions, 3 deletions)
- docs/tuning.md (17 additions, 1 deletion)