Spark worker memory doesn't set executor memory to maximum #75697
Labels
spark · tech-issues (The user has a technical issue about an application) · triage (Triage is needed) · stale (15 days without activity)
Name and Version
bitnami/spark:3.5.2
What architecture are you using?
amd64
What steps will reproduce the bug?
Simply start with the defaults and set
SPARK_WORKER_MEMORY
to some value other than 1G.
What is the expected behavior?
It is expected that the executor memory will also be updated to the value set in
SPARK_WORKER_MEMORY
What do you see instead?
In the console, it can be seen that the executor memory is still the default of 1G.
Additional information
Passing
SPARK_EXECUTOR_MEMORY
in the Docker Compose file doesn't do anything either.
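For reference, a minimal Compose sketch of the setup described above might look like the following. The service names are hypothetical, and the SPARK_MODE / SPARK_MASTER_URL variables follow the bitnami/spark image's documented conventions; treat the exact values as assumptions:

```yaml
# Hypothetical minimal reproduction (not the reporter's exact file).
services:
  spark-master:
    image: bitnami/spark:3.5.2
    environment:
      - SPARK_MODE=master
  spark-worker:
    image: bitnami/spark:3.5.2
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
      # Worker advertises 4G, but executors reportedly still start with the 1G default
      - SPARK_WORKER_MEMORY=4G
      # Per the report, setting this explicitly also has no visible effect
      - SPARK_EXECUTOR_MEMORY=4G
```

With a file like this, the worker UI would show 4G of worker memory while the executor line in the console still reports 1G, which is the mismatch being reported.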