How does Spark cluster sharing work?



    Sarath Botlagunta

    By default, every job runs in a new runtime instance, even if another runtime instance created from the same template is already running.

    To share a runtime, turn off the option "Terminate On Completion". This ensures the runtime instance is not terminated when its job completes. Any new job started with the same template will then reuse the running instance instead of creating a new one.

    Runtime instances can also be terminated by two other properties:

    • Idle timeout (amount of idle time after which the cluster auto-terminates)
    • Total Duration (maximum total time the cluster is allowed to stay up)
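    The rules above can be sketched as a small decision function. This is a hypothetical model of the described behavior for illustration only, not the platform's actual API; the class and function names are made up:

    ```python
    from dataclasses import dataclass

    @dataclass
    class RuntimeInstance:
        terminate_on_completion: bool  # if True, instance is torn down when its job finishes
        idle_timeout_min: float        # auto-terminate after this much idle time (minutes)
        total_duration_min: float      # hard cap on how long the instance may stay up (minutes)

    def should_terminate(inst: RuntimeInstance, job_finished: bool,
                         idle_min: float, uptime_min: float) -> bool:
        """Return True if the runtime instance should be torn down."""
        if job_finished and inst.terminate_on_completion:
            return True   # default behavior: one instance per job
        if idle_min >= inst.idle_timeout_min:
            return True   # idle longer than the configured idle timeout
        if uptime_min >= inst.total_duration_min:
            return True   # exceeded the configured total duration
        return False

    # A shared instance (Terminate On Completion off) survives job completion...
    shared = RuntimeInstance(False, idle_timeout_min=30, total_duration_min=240)
    print(should_terminate(shared, job_finished=True, idle_min=0, uptime_min=10))   # False
    # ...but is still reclaimed once it sits idle past the timeout.
    print(should_terminate(shared, job_finished=True, idle_min=45, uptime_min=60))  # True
    ```

    Note that idle timeout and total duration still apply to shared instances, so turning off "Terminate On Completion" does not leave clusters running forever.
    
    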
