
How To: Kill all "ACCEPTED" Spark Jobs in YARN

You may encounter a scenario where a large backlog of jobs builds up in YARN; this is generally caused by analytics executions being scheduled too close together.

To remedy this issue, run the attached script (contents below), which finds all YARN applications in the "ACCEPTED" state and kills them:

for app in $(yarn application -list | awk '$6 == "ACCEPTED" { print $1 }'); do yarn application -kill "$app"; done
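Before pointing the loop at a live cluster, it can help to verify that the awk filter extracts the right column. A minimal sketch below runs the same filter against canned text in the column layout that `yarn application -list` produces (Application-Id first, State sixth); the application IDs and names are made up for illustration:

```shell
# Canned output in the column layout of `yarn application -list`:
# header row, then one row per application; state is field 6.
sample_output='Application-Id Application-Name Application-Type User Queue State Final-State Progress Tracking-URL
application_1500000000000_0001 analytics-batch SPARK spark default RUNNING UNDEFINED 10% http://rm:8088
application_1500000000000_0002 analytics-batch SPARK spark default ACCEPTED UNDEFINED 0% N/A
application_1500000000000_0003 analytics-batch SPARK spark default ACCEPTED UNDEFINED 0% N/A'

# Same filter as the one-liner: keep rows whose 6th field is ACCEPTED
# and print the application ID (field 1). The header row is skipped
# automatically because its 6th field is "State", not "ACCEPTED".
accepted=$(printf '%s\n' "$sample_output" | awk '$6 == "ACCEPTED" { print $1 }')
printf '%s\n' "$accepted"
```

Only the two ACCEPTED application IDs are printed; RUNNING jobs are left alone, which is why the loop is safe to run while a job is actively executing.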

Additionally, you will likely want to modify the cron entry (run crontab -e as the Spark user on the host where Analytics is installed) to increase the gap between Analytics executions.
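As an illustration of widening the gap, a crontab entry like the following could be adjusted; the script path and schedule here are placeholders, not the actual Analytics installation paths:

```shell
# Hypothetical cron entry, edited via `crontab -e` as the Spark user.
# Before (every 15 minutes, which can pile up ACCEPTED jobs):
#   */15 * * * * /opt/analytics/bin/run-analytics.sh
# After (hourly, leaving room for each run to finish):
0 * * * * /opt/analytics/bin/run-analytics.sh
```

The right interval depends on how long a single Analytics run takes on your cluster; the goal is simply that one execution finishes before the next is submitted.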
