Apache Spark Internals

In this slide deck, we will look at the internal architecture of a Spark cluster: what the driver, worker, executor, and cluster manager are, how a Spark program runs on a cluster, and what jobs, stages, and tasks are.
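To make those terms concrete, here is a minimal, self-contained Scala sketch (the object name, application name, partition count, and data are illustrative, not from the slides) showing where the driver, the lazy transformations, and the action that triggers a job with its stages and tasks fit together:

import org.apache.spark.sql.SparkSession

object SparkInternalsDemo {
  def main(args: Array[String]): Unit = {
    // The driver program starts here: it creates the SparkSession, builds the
    // DAG of transformations, and asks the cluster manager for executors.
    val spark = SparkSession.builder()
      .appName("spark-internals-demo") // hypothetical application name
      .master("local[4]")              // local cluster manager with 4 task slots
      .getOrCreate()
    val sc = spark.sparkContext

    // Transformations are lazy: nothing runs on the executors yet.
    val rdd   = sc.parallelize(1 to 1000000, numSlices = 8) // 8 partitions -> up to 8 tasks per stage
    val pairs = rdd.map(n => (n % 10, n))                    // narrow transformation, stays in the same stage
    val sums  = pairs.reduceByKey(_ + _)                     // wide transformation: shuffle -> new stage

    // The action triggers one job; the job is split into stages at the shuffle
    // boundary, and each stage runs one task per partition on the executors.
    sums.collect().foreach(println)

    spark.stop()
  }
}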

For the video of this session, click here.

1 thought on “Apache Spark Internals”

  1. Very nice explanation, Sandeep. I have one question: does Spark handle multithreading? That is, can one core in Spark process multiple tasks as threads at the same time?
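As background for the question above: each executor is a single JVM, and it runs its tasks as threads, with the number of concurrent task threads bounded by the cores assigned to it. A minimal sketch of the relevant configuration (the application name and the specific numbers are assumptions for illustration):

import org.apache.spark.sql.SparkSession

// Each executor is one JVM; the cores assigned to it determine how many task
// threads it can run at the same time (by default one task per core, governed
// by spark.task.cpus).
val spark = SparkSession.builder()
  .appName("executor-threads-demo")        // hypothetical application name
  .config("spark.executor.instances", "3") // 3 executor JVMs across the workers
  .config("spark.executor.cores", "4")     // up to 4 concurrent task threads per executor
  .config("spark.task.cpus", "1")          // default: each task occupies 1 core
  .getOrCreate()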
