I've installed OpenJDK 13.0.1, Python 3.8, and Spark 2.4.4. The instructions to test the install are to run .\bin\pyspark from the root of the Spark installation. I'm not
It's a Python and PySpark version mismatch, as John rightly pointed out. For a newer Python version you can try:
pip install --upgrade pyspark
That will update the package if a newer one is available. If that doesn't help, you may have to downgrade to a compatible version of Python.
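As a quick sanity check (a minimal sketch, assuming pyspark is importable by the same interpreter you launch), you can print which Python and PySpark versions are actually being picked up; Spark 2.4.x does not support Python 3.8, so a mismatch here explains the failure:

# Minimal version check: prints the interpreter version and the installed pyspark package version.
import sys
import pyspark

print("Python version: ", sys.version)
print("PySpark version:", pyspark.__version__)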
The pyspark package documentation clearly states:
NOTE: If you are using this with a Spark standalone cluster you must ensure that the version (including minor version) matches or you may experience odd errors.
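As a rough illustration of that note (assuming your environment starts at all), you can compare the installed pyspark package version against the Spark runtime version it reports; the two should match, including the minor version:

# Compare the local pyspark package version with the Spark runtime version.
import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("version-match-check").getOrCreate()
print("pyspark package:", pyspark.__version__)  # e.g. 2.4.4
print("Spark runtime:  ", spark.version)        # should match, including the minor version
spark.stop()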