How do I find out what Python modules are installed on the Spark cluster connected to my Notebook?
Completed · Pre version 7.0
I've tried code that works in the Spark Processor (see below) in my notebook, but I keep getting errors. What is the correct code for checking which dependencies and versions are installed on the Spark cluster connected to my notebook?
Code run (works in the Spark Processor, but not in the Notebook)
%%spark
import os
# os.system() streams the command's output to stdout and returns only the
# exit status, so x here is the return code, not the package list
x = os.system("pip freeze")
print(x)
Example error (the package named in the message changes on each run)
An error was encountered:
Unrecognized token 'cachetools': was expecting ('true', 'false' or 'null')
at [Source: cachetools==5.0.0; line: 1, column: 11]
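The "was expecting ('true', 'false' or 'null')" message is a JSON parser complaint, which suggests the notebook's Spark gateway tries to interpret whatever the command writes to stdout. A minimal sketch of a workaround, assuming the cluster's Python interpreter is allowed to shell out to pip (only standard-library modules are used; nothing here is platform-specific):
%%spark
import subprocess
import sys

# Run pip in the interpreter that executes this cell and capture the output
# as one string, instead of letting os.system() stream it to stdout where
# the gateway may try to parse it.
installed = subprocess.check_output([sys.executable, "-m", "pip", "freeze"])
print(installed.decode("utf-8"))
If shelling out is blocked, the same list can be built in pure Python (3.8+) without invoking pip at all:
%%spark
from importlib.metadata import distributions

# Enumerate every installed distribution and print its name and version.
for dist in distributions():
    print(dist.metadata["Name"], dist.version)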