cs525-sp18-g07 / spark
Repository graph
Selected revision: cb324f61150c962aeabf0a779f6a09797b3d5072
Branches (4)
baesline-eviction-with-logging
evict-by-size
master (default, protected)
working
Tags (20)
v2.3.0
v2.3.0-rc4
v2.3.0-rc3
v2.3.0-rc2
v2.3.0-rc1
v2.2.1
v2.2.1-rc2
v2.2.1-rc1
v2.1.2
v2.1.2-rc4
v2.1.2-rc3
v2.1.2-rc2
v2.1.2-rc1
v2.2.0
v2.1.1
v2.1.0
v2.0.2
v1.6.3
v2.0.1
v2.0.0
[Repository graph date axis: daily ticks spanning Aug 31 through Oct 18; commit subjects listed below]
[SPARK-17974] try 2) Refactor FileCatalog classes to simplify the inheritance tree
[SPARK-17711] Compress rolled executor log
[SPARK-17711] Compress rolled executor log
[SPARK-17388] [SQL] Support for inferring type date/timestamp/decimal for partition column
[SPARK-17899][SQL][FOLLOW-UP] debug mode should work for corrupted table
[SPARK-17751][SQL][BACKPORT-2.0] Remove spark.sql.eagerAnalysis and Output the Plan if Existed in AnalysisException
[SQL][STREAMING][TEST] Follow up to remove Option.contains for Scala 2.10 compatibility
[SQL][STREAMING][TEST] Follow up to remove Option.contains for Scala 2.10 compatibility
[SQL][STREAMING][TEST] Fix flaky tests in StreamingQueryListenerSuite
[SQL][STREAMING][TEST] Fix flaky tests in StreamingQueryListenerSuite
Revert "[SPARK-17974] Refactor FileCatalog classes to simplify the inheritance tree"
[SPARK-17974] Refactor FileCatalog classes to simplify the inheritance tree
[SPARK-17620][SQL] Determine Serde by hive.default.fileformat when Creating Hive Serde Tables
[SPARK-17731][SQL][STREAMING] Metrics for structured streaming for branch-2.0
[SPARK-17751][SQL] Remove spark.sql.eagerAnalysis and Output the Plan if Existed in AnalysisException
[SPARK-17839][CORE] Use Nio's directbuffer instead of BufferedInputStream in order to avoid additional copy from os buffer cache to user buffer
Fix example of tf_idf with minDocFreq
Fix example of tf_idf with minDocFreq
[SPARK-17892][SQL][2.0] Do Not Optimize Query in CTAS More Than Once #15048
[MINOR][SQL] Add prettyName for current_database function
[MINOR][SQL] Add prettyName for current_database function
Preparing development version 1.6.4-SNAPSHOT
Preparing Spark release v1.6.3
Prepare branch-1.6 for 1.6.3 release.
[SPARK-17819][SQL][BRANCH-2.0] Support default database in connection URIs for Spark Thrift Server
[SPARK-17947][SQL] Add Doc and Comment about spark.sql.debug
[SPARK-17819][SQL] Support default database in connection URIs for Spark Thrift Server
Revert "[SPARK-17637][SCHEDULER] Packed scheduling for Spark tasks across executors"
[SPARK-17637][SCHEDULER] Packed scheduling for Spark tasks across executors
[SPARK-17953][DOCUMENTATION] Fix typo in SparkSession scaladoc
[SPARK-17953][DOCUMENTATION] Fix typo in SparkSession scaladoc
[SPARK-16980][SQL] Load only catalog table partition metadata required to answer a query
[SPARK-17946][PYSPARK] Python crossJoin API similar to Scala
[SPARK-17900][SQL] Graduate a list of Spark SQL APIs to stable
[SPARK-11775][PYSPARK][SQL] Allow PySpark to register Java UDF
[SPARK-16063][SQL] Add storageLevel to Dataset
[SPARK-17863][SQL] should not add column into Distinct
[SPARK-17863][SQL] should not add column into Distinct
Revert "[SPARK-17620][SQL] Determine Serde by hive.default.fileformat when Creating Hive Serde Tables"
[SPARK-17620][SQL] Determine Serde by hive.default.fileformat when Creating Hive Serde Tables