Commit 9c297df3 authored by Bryan Cutler, committed by Davies Liu

[MINOR] [PYSPARK] [EXAMPLES] Changed examples to use SparkSession.sparkContext instead of _sc

## What changes were proposed in this pull request?

Some PySpark examples need a SparkContext and get it by accessing `_sc` directly from the session. These examples should use the provided property `sparkContext` in `SparkSession` instead.
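
For context, a minimal sketch of the before/after pattern (the app name and the `parallelize` call are illustrative placeholders, not taken from any of the touched examples):

```python
from pyspark.sql import SparkSession

if __name__ == "__main__":
    # Builder chain mirrors the examples touched by this change;
    # the app name here is just a placeholder.
    spark = SparkSession\
        .builder\
        .appName("SparkContextFromSession")\
        .getOrCreate()

    # Before: sc = spark._sc  (reaches into a private attribute)
    # After: use the public property exposed by SparkSession
    sc = spark.sparkContext

    # Any RDD-based work can now go through the public handle.
    print(sc.parallelize(range(10)).count())

    spark.stop()
```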

## How was this patch tested?
Ran the modified examples.

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #13303 from BryanCutler/pyspark-session-sparkContext-MINOR.
parent 698ef762
@@ -67,7 +67,7 @@ if __name__ == "__main__":
         .appName("PythonALS")\
         .getOrCreate()
-    sc = spark._sc
+    sc = spark.sparkContext
     M = int(sys.argv[1]) if len(sys.argv) > 1 else 100
     U = int(sys.argv[2]) if len(sys.argv) > 2 else 500
...
@@ -70,7 +70,7 @@ if __name__ == "__main__":
         .appName("AvroKeyInputFormat")\
         .getOrCreate()
-    sc = spark._sc
+    sc = spark.sparkContext
     conf = None
     if len(sys.argv) == 3:
...
@@ -53,7 +53,7 @@ if __name__ == "__main__":
         .appName("ParquetInputFormat")\
         .getOrCreate()
-    sc = spark._sc
+    sc = spark.sparkContext
     parquet_rdd = sc.newAPIHadoopFile(
         path,
...
@@ -32,7 +32,7 @@ if __name__ == "__main__":
         .appName("PythonPi")\
         .getOrCreate()
-    sc = spark._sc
+    sc = spark.sparkContext
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     n = 100000 * partitions
...
@@ -46,7 +46,7 @@ if __name__ == "__main__":
         .appName("PythonTransitiveClosure")\
         .getOrCreate()
-    sc = spark._sc
+    sc = spark.sparkContext
     partitions = int(sys.argv[1]) if len(sys.argv) > 1 else 2
     tc = sc.parallelize(generateGraph(), partitions).cache()
...