Why am I running out of disk space when installing Python dependencies on Spark cluster?

    Mike Z

    It looks like you are running out of space on your boot disk. Some packages, such as TensorFlow, install a number of dependencies that require much more disk space than usual. If you run a bare pip install tensorflow without specifying a version, pip may download several versions of the package before settling on the most recent compatible one, which also consumes disk space.

    Given the above, there are two things you can try. The first is to pin the versions of the packages you are installing, which is a best practice we recommend whenever possible. The other option is to increase the boot disk size by following these instructions.
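    As a sketch of the first option, a pinned install might look like this (the version number is only illustrative, not a recommendation; check which version is compatible with your cluster's Python):

    ```shell
    # Pin the exact version so pip resolves it directly instead of
    # downloading several candidate releases during resolution.
    # --no-cache-dir also avoids keeping the downloaded wheels in
    # pip's cache on the boot disk.
    pip install --no-cache-dir "tensorflow==2.12.0"
    ```

    Pinning all of your dependencies in a requirements.txt and installing with pip install --no-cache-dir -r requirements.txt works the same way and keeps the versions reproducible across cluster nodes.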

