Spark Session - Entry point

How to Create a Spark Session Using Scala Spark

Creating a Spark session is a crucial step in any big data processing task. This beginner's guide will walk you through setting up a Spark session using Scala.

A SparkSession is the main entry point to using Spark. When you start writing a Spark program, the first thing you typically do is create an instance of SparkSession. Before Spark 2.0, SparkContext was the entry point; since 2.0, SparkSession has taken its place, while the underlying SparkContext remains accessible for backward compatibility.
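For comparison, here is a minimal sketch of the pre-2.0 entry point; the app name is a placeholder, and the modern SparkSession equivalent appears in the full program further below.

import org.apache.spark.{SparkConf, SparkContext}

object LegacyEntryPoint {

    def main(args: Array[String]): Unit = {

        // Before Spark 2.0: configure and create a SparkContext directly.
        val conf = new SparkConf()
          .setAppName("Pre 2.0 entry point") // placeholder app name
          .setMaster("local")
        val sc = new SparkContext(conf)

        sc.stop()

    }

}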

A SparkSession also includes the following (each is shown in the sketch after this list):

1. SparkContext

2. SQLContext

3. StreamingContext

4. HiveContext
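As a rough sketch of how each of these is reached from a single SparkSession: the SparkContext and SQLContext are exposed as fields, Hive support is enabled on the builder (this needs the spark-hive dependency on the classpath), and a StreamingContext is still constructed from the SparkContext (this needs the spark-streaming dependency). The app name and batch interval below are placeholders.

import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ContextsDemo {

    def main(args: Array[String]): Unit = {

        val spark = SparkSession
          .builder()
          .appName("Contexts demo")  // placeholder app name
          .master("local")
          .enableHiveSupport()       // fills the old HiveContext role
          .getOrCreate()

        val sc = spark.sparkContext     // 1. the SparkContext
        val sqlCtx = spark.sqlContext   // 2. the SQLContext (kept for backward compatibility)

        // 3. a StreamingContext is still built from the SparkContext,
        //    here with a 1-second batch interval.
        val ssc = new StreamingContext(sc, Seconds(1))

        spark.stop()

    }

}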

Let's write our first Scala and Spark program

Now that we've set up Scala Spark on our local machine, it's time to write our very first program, which will create a SparkSession.

import org.apache.spark.sql.SparkSession

object ScalaSparkTutorial {

    def main(args: Array[String]): Unit = {

        // Build (or reuse) a SparkSession: give the application a name,
        // point it at a local master, and call getOrCreate().
        val sparkSession = SparkSession
          .builder()
          .appName("Our First scala spark code")
          .master("local")
          .getOrCreate()

        // Stop the session when the job is done to release its resources.
        sparkSession.stop()

    }

}

Note: This SparkSession looks very small because we have not applied any additional configuration; see the Spark configuration documentation for the full list of options.
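As an illustration of what additional configuration might look like, here is a sketch that passes a couple of settings through the builder's config() method; the specific keys and values are examples only, and any Spark conf key/value pair can be supplied the same way.

import org.apache.spark.sql.SparkSession

object ConfiguredSession {

    def main(args: Array[String]): Unit = {

        val sparkSession = SparkSession
          .builder()
          .appName("Configured scala spark code")  // placeholder app name
          .master("local[*]")                      // use all local cores
          // example settings only
          .config("spark.sql.shuffle.partitions", "8")
          .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
          .getOrCreate()

        sparkSession.stop()

    }

}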