
Repository graph

Branches and tags:
  • baesline-eviction-with-logging
  • evict-by-size
  • master (default, protected)
  • working
  • v2.3.0
  • v2.3.0-rc4
  • v2.3.0-rc3
  • v2.3.0-rc2
  • v2.3.0-rc1
  • v2.2.1
  • v2.2.1-rc2
  • v2.2.1-rc1
  • v2.1.2
  • v2.1.2-rc4
  • v2.1.2-rc3
  • v2.1.2-rc2
  • v2.1.2-rc1
  • v2.2.0
  • v2.1.1
  • v2.1.0
  • v2.0.2
  • v1.6.3
  • v2.0.1
  • v2.0.0
(24 branches and tags)
[Commit graph (rendered with Raphaël 2.2.0); date axis spans 30 Mar – 1 May. Recent commit titles, each listed once (repeats across branches collapsed):]

  • [SPARK-20459][SQL] JdbcUtils throws IllegalStateException: Cause already initialized after getting SQLException
  • [SPARK-20540][CORE] Fix unstable executor requests.
  • [SPARK-20464][SS] Add a job group and description for streaming queries and fix cancellation of running jobs using the job group
  • [SPARK-20517][UI] Fix broken history UI download link
  • [SPARK-20534][SQL] Make outer generate exec return empty rows
  • [SPARK-20290][MINOR][PYTHON][SQL] Add PySpark wrapper for eqNullSafe
  • [SPARK-20541][SPARKR][SS] support awaitTermination without timeout
  • [SPARK-20490][SPARKR] Add R wrappers for eqNullSafe and ! / not
  • [MINOR][DOCS][PYTHON] Adding missing boolean type for replacement value in fillna
  • [SPARK-20535][SPARKR] R wrappers for explode_outer and posexplode_outer
  • [SPARK-20492][SQL] Do not print empty parentheses for invalid primitive types in parser
  • [SPARK-20521][DOC][CORE] The default of 'spark.worker.cleanup.appDataTtl' should be 604800 in spark-standalone.md
  • [SPARK-20442][PYTHON][DOCS] Fill up documentations for functions in Column API in PySpark
  • [SPARK-20493][R] De-duplicate parse logics for DDL-like type strings in R
  • [SPARK-20533][SPARKR] SparkR Wrappers Model should be private and value should be lazy
  • [SPARK-19791][ML] Add doc and example for fpgrowth
  • [SPARK-20477][SPARKR][DOC] Document R bisecting k-means in R programming guide
  • [SPARK-20487][SQL] Display `serde` for `HiveTableScan` node in explained plan
  • [SPARK-19525][CORE] Add RDD checkpoint compression support
  • [SPARK-20471] Remove AggregateBenchmark testsuite warning: Two level hashmap is disabled but vectorized hashmap is enabled
  • [SPARK-20514][CORE] Upgrade Jetty to 9.3.11.v20160721
  • [SPARK-20496][SS] Bug in KafkaWriter Looks at Unanalyzed Plans