How Cores and Memory Are Used in a Spark Program

Importance of Cores and Memory in a Spark Program

In a Spark program, the allocation and management of cores and memory play crucial roles in determining the performance and scalability of your application. Here's why they are important:

Core
A "core" is a part of the CPU (Central Processing Unit) that reads and executes instructions. Modern CPUs have multiple cores, allowing them to perform several tasks simultaneously. In Spark, each task occupies one core by default, so the number of cores assigned to an executor determines how many tasks that executor can run in parallel.
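As a rough sketch of how core allocation is expressed in practice, the spark-submit flags below (shown with illustrative values and a hypothetical application jar) request a fixed number of executors and cores; the flag names are standard Spark options:

```shell
# Illustrative spark-submit invocation (values and jar name are examples only).
# 3 executors x 4 cores each = up to 12 tasks running concurrently.
spark-submit \
  --master yarn \
  --num-executors 3 \
  --executor-cores 4 \
  --class com.example.MyApp \
  my-app.jar
```

With these settings, a stage with more than 12 tasks will process them in waves of at most 12 at a time.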
Memory

RAM (Random Access Memory) is a type of volatile memory used by computers to store data that is actively being used or processed. In Spark, executor memory holds task working data, shuffle buffers, and cached RDDs or DataFrames; allocating too little of it causes data to spill to disk or, in the worst case, tasks to fail with OutOfMemoryError.
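A minimal sketch of how memory is allocated from application code, assuming a standard Spark setup (the application name is a placeholder; the configuration keys are standard Spark properties):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sketch: setting driver and executor heap sizes
// when building a SparkSession. Values are examples, not recommendations.
val spark = SparkSession.builder()
  .appName("MemoryDemo")                 // hypothetical application name
  .config("spark.driver.memory", "2g")   // heap for the driver JVM
  .config("spark.executor.memory", "4g") // heap for each executor JVM
  .getOrCreate()
```

Note that `spark.driver.memory` must usually be set before the driver JVM starts (e.g. via spark-submit or spark-defaults.conf) to take effect; setting it in code only works in some deployment modes.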

How are memory and cores used in a Scala Spark program?