Spark Jobs Hung with java.lang.OutOfMemoryError: GC overhead limit exceeded

On systems with a low node count, you may find that Analytics hangs and that the Spark stderr log shows "java.lang.OutOfMemoryError: GC overhead limit exceeded". In this scenario, the heap sizes for the Spark Worker and Executor should be increased.

1) Open /opt/interset/spark/conf/spark-env.sh in a text editor (e.g. vi).
2) Modify the value of "SPARK_WORKER_MEMORY" to a higher value.
3) Uncomment "SPARK_EXECUTOR_MEMORY" and modify it to a higher value, as shown in the example below.
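For example, on a host with 16 GB of RAM available to Spark, the edited lines in spark-env.sh might look like the following. The 8g and 4g figures are illustrative only, not recommended values; choose sizes that fit your hardware.

    # Total memory the Spark Worker may allocate across executors (illustrative value)
    export SPARK_WORKER_MEMORY=8g
    # Heap size for each Spark Executor (illustrative value)
    export SPARK_EXECUTOR_MEMORY=4g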

These values must be realistic for the amount of memory available on the system(s). If you have questions or concerns about these values, please reach out to support@interset.com.
