
Repository graph

Branches
  • baesline-eviction-with-logging
  • evict-by-size
  • master (default, protected)
  • working

Tags
  • v2.3.0
  • v2.3.0-rc4
  • v2.3.0-rc3
  • v2.3.0-rc2
  • v2.3.0-rc1
  • v2.2.1
  • v2.2.1-rc2
  • v2.2.1-rc1
  • v2.1.2
  • v2.1.2-rc4
  • v2.1.2-rc3
  • v2.1.2-rc2
  • v2.1.2-rc1
  • v2.2.0
  • v2.1.1
  • v2.1.0
  • v2.0.2
  • v1.6.3
  • v2.0.1
  • v2.0.0
Commit graph (newest first; the same title may appear on multiple branches):
  • [SPARK-20244][CORE] Handle incorrect bytesRead metrics when using PySpark
  • [SPARK-20940][CORE] Replace IllegalAccessError with IllegalStateException
  • [SPARK-20940][CORE] Replace IllegalAccessError with IllegalStateException
  • [SPARK-20940][CORE] Replace IllegalAccessError with IllegalStateException
  • [SPARK-20894][SS] Resolve the checkpoint location in driver and use the resolved path in state store
  • [SPARK-20876][SQL][BACKPORT-2.2] If the input parameter is float type for ceil or floor,the result is not we expected
  • [SPARK-19236][SQL][FOLLOW-UP] Added createOrReplaceGlobalTempView method
  • [SPARK-20633][SQL] FileFormatWriter should not wrap FetchFailedException
  • [SPARK-20288] Avoid generating the MapStatus by stageId in BasicSchedulerIntegrationSuite
  • [SPARK-20790][MLLIB] Correctly handle negative values for implicit feedback in ALS
  • [SPARK-20790][MLLIB] Correctly handle negative values for implicit feedback in ALS
  • [DOCS][MINOR] Scaladoc fixes (aka typo hunting)
  • [SPARK-20877][SPARKR][WIP] add timestamps to test runs
  • [SPARK-20877][SPARKR][WIP] add timestamps to test runs
  • Revert "[SPARK-20392][SQL] Set barrier to prevent re-entering a tree"
  • [SPARK-20275][UI] Do not display "Completed" column for in-progress applications
  • [SPARK-20275][UI] Do not display "Completed" column for in-progress applications
  • [SPARK-20275][UI] Do not display "Completed" column for in-progress applications
  • [SPARK-20213][SQL] Fix DataFrameWriter operations in SQL UI tab
  • [SPARK-20883][SPARK-20376][SS] Refactored StateStore APIs and added conf to choose implementation
  • [SPARK-20924][SQL] Unable to call the function registered in the not-current database
  • [SPARK-20924][SQL] Unable to call the function registered in the not-current database
  • HOTFIX: fix Scalastyle break introduced in 4d57981cfb18e7500cde6c03ae46c7c9b697d064
  • [SPARK-20333] HashPartitioner should be compatible with num of child RDD's partitions.
  • [SPARK-19236][CORE] Added createOrReplaceGlobalTempView method
  • [SPARK-20899][PYSPARK] PySpark supports stringIndexerOrderType in RFormula
  • [SPARK-20916][SQL] Improve error message for unaliased subqueries in FROM clause
  • [MINOR] Fix some indent issues.
  • [SPARK-20909][SQL] Add build-int SQL function - DAYOFWEEK
  • [SPARK-19968][SS] Use a cached instance of `KafkaProducer` instead of creating one every batch.
  • [SPARK-19968][SS] Use a cached instance of `KafkaProducer` instead of creating one every batch.
  • [SPARK-8184][SQL] Add additional function description for weekofyear
  • [SPARK-8184][SQL] Add additional function description for weekofyear
  • [SPARK-20907][TEST] Use testQuietly for test suites that generate long log output
  • [SPARK-20907][TEST] Use testQuietly for test suites that generate long log output
  • [SPARK-20750][SQL] Built-in SQL Function Support - REPLACE
  • [SPARK-20758][SQL] Add Constant propagation optimization
  • [SPARK-20881][SQL] Clearly document the mechanism to choose between two sources of statistics
  • [SPARK-20841][SQL] Support table column aliases in FROM clause
  • [SPARK-20908][SQL] Cache Manager: Hint should be ignored in plan matching