SPARK-5393. Flood of util.RackResolver log messages after SPARK-1714
Previously I had tried to solve this by adding a line in Spark's log4j-defaults.properties. The issue with that approach was that the log4j.properties packaged inside Hadoop was getting picked up instead. While it would be ideal to fix that as well, we still want to quiet this in situations where a user supplies their own custom log4j properties.

Author: Sandy Ryza <sandy@cloudera.com>

Closes #4192 from sryza/sandy-spark-5393 and squashes the following commits:

4d5dedc [Sandy Ryza] Only set log level if unset
46e07c5 [Sandy Ryza] SPARK-5393. Flood of util.RackResolver log messages after SPARK-1714
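The "only set log level if unset" approach described above can be sketched as follows. This is a minimal illustration of the pattern using `java.util.logging` from the standard library rather than the log4j API the actual patch targets; the logger name for Hadoop's `RackResolver` is taken from its class's package path and is shown here only for illustration:

```scala
import java.util.logging.{Level, Logger}

object QuietRackResolver {
  def main(args: Array[String]): Unit = {
    // Logger corresponding to Hadoop's RackResolver class
    val logger = Logger.getLogger("org.apache.hadoop.yarn.util.RackResolver")

    // Only raise the threshold if the user has not configured a level
    // themselves (an unset level is reported as null), so a custom
    // logging configuration still wins over this default.
    if (logger.getLevel == null) {
      logger.setLevel(Level.WARNING)
    }
  }
}
```

Checking for an unset level before overriding it is what lets this coexist with user-supplied log4j properties: if the user has explicitly configured `RackResolver` logging, their choice is left untouched.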
- core/src/main/resources/org/apache/spark/log4j-defaults.properties (0 additions, 1 deletion)
- core/src/main/scala/org/apache/spark/SparkContext.scala (1 addition, 1 deletion)
- core/src/test/scala/org/apache/spark/SparkContextSchedulerCreationSuite.scala (1 addition, 1 deletion)
- yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala (7 additions, 0 deletions)
- yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnSparkHadoopUtil.scala (0 additions, 4 deletions)
- yarn/src/main/scala/org/apache/spark/scheduler/cluster/YarnClusterScheduler.scala (1 addition, 17 deletions)
- yarn/src/main/scala/org/apache/spark/scheduler/cluster/YarnScheduler.scala (8 additions, 4 deletions)