Apache Spark Cluster Internals: How Spark Jobs Are Computed by the Cluster
In this blog we explain how a Spark cluster computes jobs. A Spark job is a collection of stages, and each stage is a collection of tasks. Before the deep dive, let us first look at the Spark cluster architecture. In the cluster diagram above we can see the driver program: it is the main program of our Spark application, and it runs on the master node of the cluster.