Spark will run as many parallel tasks as the total number of executor cores you have specified. So, if you have 4 executors with 4 cores each, 4 x 4 = 16 tasks can run in parallel. One way I've found to solve my problem is to limit the number of executor cores: with fewer cores per executor, tasks get scheduled in a more round-robin fashion across executors.
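For example, here's a minimal sketch of capping the concurrency when creating the session. This assumes a deployment where `spark.executor.instances` applies (e.g. YARN); the app name and the specific numbers are illustrative, not from the original setup:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical app name; resource numbers are for illustration only.
val spark = SparkSession.builder()
  .appName("ThrottledJob")
  .config("spark.executor.instances", "4") // 4 executors
  .config("spark.executor.cores", "2")     // 2 cores each => at most 4 * 2 = 8 concurrent tasks
  .getOrCreate()
```

The same cap can be set at launch time with `spark-submit --num-executors 4 --executor-cores 2`, which keeps the job's maximum parallelism at 8 tasks instead of 16.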