Reading Time: < 1 minute
In this slide deck, we will look at the internal architecture of a Spark cluster: what the driver, workers, executors, and cluster manager are; how a Spark program runs on a cluster; and what jobs, stages, and tasks are.
For the video of this session, click here.
Very nice explanation, Sandeep. I have one question: does Spark support multithreading? That is, can one core in Spark process multiple tasks as threads at the same time?
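As a rough illustration of the executor/task model the slides describe: a Spark executor is a single process that runs its tasks as threads, and the number of tasks it runs concurrently is (by default) the number of cores assigned to it, so one core normally works on one task at a time. The sketch below mimics that idea using only the Python standard library, not Spark itself; all names and values (`EXECUTOR_CORES`, `run_task`, the partitions) are purely illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative only: a Spark executor runs tasks as threads, with one
# "task slot" per core assigned to it (e.g. --executor-cores 2).
EXECUTOR_CORES = 2  # hypothetical value for this sketch

def run_task(partition):
    # Stand-in for one Spark task: process one partition of a stage.
    return sum(partition)

# 4 partitions -> 4 tasks, but only 2 can run at once on 2 slots.
partitions = [[1, 2], [3, 4], [5, 6], [7, 8]]

with ThreadPoolExecutor(max_workers=EXECUTOR_CORES) as pool:
    results = list(pool.map(run_task, partitions))

print(results)  # [3, 7, 11, 15]
```

The point of the sketch is the scheduling shape: the thread pool (the executor) holds a fixed number of slots (cores), and the remaining tasks wait until a slot frees up, which is how Spark schedules more tasks than cores within one executor.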