How do I increase my Spark cluster boot disk size?

    Mike Z

    To increase the boot disk size for your Spark cluster, add the "Config Setting" below to your runtime and specify a value.

    syntasa.runtime.boot.disk.size

    By default, Syntasa sets the Spark cluster boot disk size to 30 GB; you can typically get by with setting the boot disk size to 100 (GB). Please see the screenshot for how the setting looks in the runtime.

    [Screenshot: the Config Setting as entered in the runtime]
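    As a sketch, the Config Setting pair would look something like the following (the key/value layout and the exact value are illustrative, based on the description above; enter it through the runtime's Config Settings UI):

    ```
    # Config Setting added to the Spark runtime
    # Boot disk size in GB (Syntasa default: 30)
    syntasa.runtime.boot.disk.size = 100
    ```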

