cs525-sp18-g07 / spark
Repository graph
Selected revision: a3626ca333e6e1881e2f09ccae0fa8fa7243223e
Branches (4):
- baesline-eviction-with-logging
- evict-by-size
- master (default, protected)
- working
Tags (20):
- v2.3.0
- v2.3.0-rc4
- v2.3.0-rc3
- v2.3.0-rc2
- v2.3.0-rc1
- v2.2.1
- v2.2.1-rc2
- v2.2.1-rc1
- v2.2.0
- v2.1.2
- v2.1.2-rc4
- v2.1.2-rc3
- v2.1.2-rc2
- v2.1.2-rc1
- v2.1.1
- v2.1.0
- v2.0.2
- v2.0.1
- v2.0.0
- v1.6.3
[Commit graph: dates spanning 12 Jan – 15 Mar]

Commits shown in the graph (newest first):
[SPARK-18066][CORE][TESTS] Add Pool usage policies test coverage for FIFO & FAIR Schedulers
[MINOR][CORE] Fix a info message of `prunePartitions`
[SPARK-19960][CORE] Move `SparkHadoopWriter` to `internal/io/`
[SPARK-13450] Introduce ExternalAppendOnlyUnsafeRowArray. Change CartesianProductExec, SortMergeJoin, WindowExec to use it
[SPARK-19872] [PYTHON] Use the correct deserializer for RDD construction for coalesce/repartition
[SPARK-19872] [PYTHON] Use the correct deserializer for RDD construction for coalesce/repartition
[SPARK-19944][SQL] Move SQLConf from sql/core to sql/catalyst (branch-2.1)
[SPARK-19889][SQL] Make TaskContext callbacks thread safe
[SPARK-19877][SQL] Restrict the nested level of a view
[SPARK-19817][SS] Make it clear that `timeZone` is a general option in DataStreamReader/Writer
[SPARK-18112][SQL] Support reading data from Hive 2.1 metastore
[SPARK-19828][R] Support array type in from_json in R
[SPARK-19887][SQL] dynamic partition keys can be null or empty string
[SPARK-19918][SQL] Use TextFileFormat in implementation of TextInputJsonDataSource
[SPARK-19887][SQL] dynamic partition keys can be null or empty string
[SPARK-19817][SQL] Make it clear that `timeZone` option is a general option in DataFrameReader/Writer.
[SPARK-18966][SQL] NOT IN subquery with correlated expressions may return incorrect result
[SPARK-19933][SQL] Do not change output of a subquery
[SPARK-19933][SQL] Do not change output of a subquery
[SPARK-19923][SQL] Remove unnecessary type conversions per call in Hive
[SPARK-18961][SQL] Support `SHOW TABLE EXTENDED ... PARTITION` statement
[SPARK-11569][ML] Fix StringIndexer to handle null value properly
[SPARK-19940][ML][MINOR] FPGrowthModel.transform should skip duplicated items
[SPARK-19922][ML] small speedups to findSynonyms
[SPARK-18874][SQL] Fix 2.10 build after moving the subquery rules to optimization
[SPARK-19850][SQL] Allow the use of aliases in SQL function calls
[SPARK-19944][SQL] Move SQLConf from sql/core to sql/catalyst
[SPARK-18874][SQL] First phase: Deferring the correlated predicate pull up to Optimizer phase
[SPARK-19391][SPARKR][ML] Tweedie GLM API for SparkR
[SPARK-19921][SQL][TEST] Enable end-to-end testing using different Hive metastore versions.
[SPARK-19924][SQL] Handle InvocationTargetException for all Hive Shim
[MINOR][ML] Improve MLWriter overwrite error message
[SPARK-19916][SQL] simplify bad file handling
[SPARK-17495][SQL] Support date, timestamp and interval types in Hive hash
[SPARK-19853][SS] uppercase kafka topics fail when startingOffsets are SpecificOffsets
[SPARK-19853][SS] uppercase kafka topics fail when startingOffsets are SpecificOffsets
[SPARK-19282][ML][SPARKR] RandomForest Wrapper and GBT Wrapper return param "maxDepth" to R models
[SPARK-19831][CORE] Reuse the existing cleanupThreadExecutor to clean up the directories of finished applications to avoid the block
[DOCS][SS] fix structured streaming python example
[DOCS][SS] fix structured streaming python example