The problem is quite simple: you have a local Spark instance (either a cluster or one running in local mode) and you want to read from gs://
Considering that it has been a while since the last answer, I thought I would share my recent solution. Note that the following instructions are for Spark 2.4.4.
Make sure that all the environment variables are properly set up for your Spark application to run. These are:
a. SPARK_HOME pointing to the location where your Spark installation is saved.
b. GOOGLE_APPLICATION_CREDENTIALS pointing to the location of your service account JSON key. If you have just downloaded it, it will be in ~/Downloads.
c. JAVA_HOME pointing to your Java 8* "Home" folder.
If you are on Linux/macOS you can use export VAR=DIR, where VAR is the variable name and DIR the location; if you want to set them permanently, add those lines to your ~/.bash_profile or ~/.zshrc file.
For Windows users: in cmd, write set VAR=DIR to set a variable for the current shell session, or setx VAR DIR to store it permanently.
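On Linux/macOS, the three exports might look like the sketch below. The paths are placeholders for illustration only; substitute the locations where your Spark distribution, JSON key, and JDK actually live.

```shell
# Hypothetical locations -- replace each path with your own.
export SPARK_HOME="$HOME/spark-2.4.4-bin-hadoop2.7"                          # unpacked Spark distribution
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/Downloads/my-project-key.json"  # service account JSON key
export JAVA_HOME="/usr/lib/jvm/java-8-openjdk-amd64"                         # Java 8 "Home" folder
```

Putting these same lines in ~/.bash_profile or ~/.zshrc makes them persist across shell sessions.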
That has worked for me, and I hope it helps others too.
* Spark runs on Java 8, so some of its features might not be compatible with the latest Java Development Kit.