PySpark SparkContext Explained

pyspark.SparkContext is an entry point to PySpark functionality, used to communicate with the cluster and to create RDDs, accumulators, and broadcast variables. In this article, you…
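As a rough illustration of the three objects the excerpt mentions, here is a minimal sketch, assuming local mode; the app name "SparkContextDemo" and the sample data are illustrative, not from the article:

```python
from pyspark import SparkConf, SparkContext

# Illustrative configuration; "local[2]" runs Spark locally with two threads
conf = SparkConf().setAppName("SparkContextDemo").setMaster("local[2]")
sc = SparkContext(conf=conf)

rdd = sc.parallelize([1, 2, 3, 4, 5])   # create an RDD from a local collection
acc = sc.accumulator(0)                 # shared counter, written by tasks
factor = sc.broadcast(10)               # read-only value shipped to executors

def scale(x):
    acc.add(1)                          # count each processed element
    return x * factor.value

print(rdd.map(scale).collect())         # [10, 20, 30, 40, 50]
print(acc.value)                        # 5, once the action has run

sc.stop()
```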


What is SparkContext? Explained

SparkContext has been available since Spark 1.x (JavaSparkContext for Java) and was the entry point to Spark and PySpark before SparkSession was introduced in Spark 2.0. Creating a SparkContext is…
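Since Spark 2.0, the usual pattern is to obtain the SparkContext from a SparkSession rather than construct it directly. A minimal sketch, with an illustrative app name:

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("SparkContextFromSession")  # illustrative name
         .master("local[1]")
         .getOrCreate())

sc = spark.sparkContext   # the underlying SparkContext
print(sc.appName)         # SparkContextFromSession
print(sc.version)         # e.g. 3.x

spark.stop()
```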


SparkSession vs SparkContext

SparkSession vs SparkContext - In earlier versions of Spark and PySpark, SparkContext (JavaSparkContext for Java) was the entry point to Spark programming with RDDs and for connecting to the Spark cluster,…
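To make the contrast concrete, a brief sketch, assuming local mode and invented sample data: SparkContext exposes the RDD API, while SparkSession layers the DataFrame and SQL APIs on top of it.

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("SessionVsContext")  # illustrative name
         .master("local[1]")
         .getOrCreate())

# RDD API, reached through the SparkContext
rdd = spark.sparkContext.parallelize([("Java", 1), ("Python", 2)])
print(rdd.collect())

# DataFrame API, reached through the SparkSession itself
df = spark.createDataFrame(rdd, ["language", "rank"])
df.show()

spark.stop()
```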


SparkSession vs SQLContext

In Spark, SparkSession is the entry point to a Spark application, and SQLContext is used to process structured data that contains rows and columns. Here, I will mainly focus on…
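A minimal sketch of the legacy SQLContext path, with an illustrative app name and sample rows: SQLContext predates SparkSession for row/column data and, since Spark 2.0, is kept mainly for backward compatibility, so SparkSession is generally preferred.

```python
from pyspark.sql import SparkSession, SQLContext

spark = (SparkSession.builder
         .appName("SQLContextDemo")  # illustrative name
         .master("local[1]")
         .getOrCreate())

# Legacy entry point for structured data; emits a deprecation
# warning on recent Spark versions
sqlContext = SQLContext(spark.sparkContext)

df = sqlContext.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.createOrReplaceTempView("t")
sqlContext.sql("SELECT id, value FROM t WHERE id > 1").show()

spark.stop()
```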
