Commit a23ed25f
Authored 12 years ago by Matei Zaharia
Add a class comment to Accumulator
Parent: 61b6382a
Showing 1 changed file: python/pyspark/accumulators.py (+12 additions, −0 deletions)
...
@@ -76,6 +76,18 @@ def _deserialize_accumulator(aid, zero_value, accum_param):

class Accumulator(object):
    """
    A shared variable that can be accumulated, i.e., has a commutative and associative "add"
    operation. Worker tasks on a Spark cluster can add values to an Accumulator with the C{+=}
    operator, but only the driver program is allowed to access its value, using C{value}.
    Updates from the workers get propagated automatically to the driver program.

    While C{SparkContext} supports accumulators for primitive data types like C{int} and
    C{float}, users can also define accumulators for custom types by providing a custom
    C{AccumulatorParam} object with a C{zero} and C{addInPlace} method. Refer to the doctest
    of this module for an example.
    """
    def __init__(self, aid, value, accum_param):
        """Create a new Accumulator with a given initial value and AccumulatorParam object"""
        from pyspark.accumulators import _accumulatorRegistry
...
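
The docstring added here describes the accumulator contract: worker tasks contribute values with +=, only the driver program reads .value, and custom types plug in through an AccumulatorParam that supplies zero and addInPlace methods. A minimal usage sketch of that pattern follows, modeled on the doctest the docstring refers to; the local SparkContext, the VectorParam class, and the helper function names are illustrative assumptions, not part of this commit.

# Minimal sketch of the accumulator pattern (not part of the commit).
from pyspark import SparkContext
from pyspark.accumulators import AccumulatorParam

sc = SparkContext("local", "accumulator-demo")  # assumed master and app name

# Built-in numeric accumulator: worker tasks fold values in with +=,
# only the driver reads .value.
counter = sc.accumulator(0)

def count(x):
    global counter
    counter += x  # runs on the workers; updates propagate back to the driver

sc.parallelize([1, 2, 3, 4]).foreach(count)
print(counter.value)  # 10, readable only on the driver

# Custom type: supply an AccumulatorParam with zero() and addInPlace().
class VectorParam(AccumulatorParam):
    def zero(self, value):
        # Identity element shaped like the initial value.
        return [0.0] * len(value)

    def addInPlace(self, v1, v2):
        # Commutative, associative merge of two partial sums.
        return [a + b for a, b in zip(v1, v2)]

vec = sc.accumulator([0.0, 0.0], VectorParam())

def add_point(p):
    global vec
    vec += p  # each worker task contributes its local update

sc.parallelize([[1.0, 2.0], [3.0, 4.0]]).foreach(add_point)
print(vec.value)  # [4.0, 6.0] on the driver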