Question
I created a model using Keras library and saved the model as .json and its weights with .h5 extension. How can I download this onto my local machine?
To save the model, I followed this link.
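For reference, a minimal sketch of how such a model is typically serialized in Keras (assuming a trained model bound to the name model; the filenames are illustrative):
# serialize the architecture to JSON and the weights to HDF5
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
model.save_weights("model.h5")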
Answer 1:
This worked for me! Use the PyDrive API.
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# 1. Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# 2. Save the Keras model or weights to Google Drive
# save the model in the Colab working directory first
model.save('model.h5')
model_file = drive.CreateFile({'title' : 'model.h5'})
model_file.SetContentFile('model.h5')
model_file.Upload()
# note the id of the uploaded file; you will need it to fetch the file later
print('Uploaded model with id {}'.format(model_file.get('id')))
The same works for the weights:
model.save_weights('model_weights.h5')
weights_file = drive.CreateFile({'title' : 'model_weights.h5'})
weights_file.SetContentFile('model_weights.h5')
weights_file.Upload()
print('Uploaded weights with id {}'.format(weights_file.get('id')))
Now check your Google Drive; the files should be there.
On the next run, reload the weights like this:
# 3. reload weights from google drive into the model
# use 'Get shareable link' in Drive to find the file id
last_weight_file = drive.CreateFile({'id': '1sj...'})
last_weight_file.GetContentFile('last_weights.h5')
model.load_weights('last_weights.h5')
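The full model saved as model.h5 in step 2 can be pulled back the same way; a minimal sketch, assuming you have its file id from the shareable link (the id string below is a placeholder):
from keras.models import load_model
# fetch the saved model file from Drive by its id and rebuild the model
saved_model_file = drive.CreateFile({'id': 'YOUR_MODEL_FILE_ID'})
saved_model_file.GetContentFile('model.h5')
model = load_model('model.h5')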
Answer 2:
Try this:
from google.colab import files
files.download("model.json")
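The same call works for the weights file as well; a minimal sketch, assuming model.json and model.h5 (the filenames from the question) exist in the Colab working directory:
from google.colab import files
# download both the architecture and the weights to the local machine
files.download("model.json")
files.download("model.h5")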
Answer 3:
Here is a solution that worked for me:
Set up authentication between Google Colab and your Drive.
Steps:
- Paste the code below as is.
- The process will generate two URLs for authentication; open each one, copy the verification token, and paste it into the input box provided.
!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
Once this authentication is done, use the following commands to mount your Drive:
!mkdir -p drive
!google-drive-ocamlfuse drive
Now to see the list of files in your Google Drive:
!ls drive
To save the Keras model output to Drive, the process is exactly the same as saving to the local disk:
- Train the Keras model as usual.
Once the model is trained, say you want to store the model outputs (.json and .h5) in the app folder of your Google Drive:
model_json = model.to_json()
with open("drive/app/model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("drive/app/model_weights.h5")
print("Saved model to drive")
You will find the files in the respective folder of your Google Drive, from where you can download them to your local machine.
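To continue working with the model in a later session, the same mounted path can be read back; a minimal sketch, assuming the files were written to drive/app/ as above:
from keras.models import model_from_json
# rebuild the architecture from JSON, then load the trained weights
with open("drive/app/model.json", "r") as json_file:
    model = model_from_json(json_file.read())
model.load_weights("drive/app/model_weights.h5")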
Answer 4:
files.download does not let you directly download large files. A workaround is to save your weights to Google Drive using the PyDrive snippet below; just replace filename.csv with your weights file (e.g. weights.h5).
# Install the PyDrive wrapper & import libraries.
# This only needs to be done once in a notebook.
!pip install -U -q PyDrive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Authenticate and create the PyDrive client.
# This only needs to be done once in a notebook.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# Create & upload a file.
uploaded = drive.CreateFile({'title': 'filename.csv'})
uploaded.SetContentFile('filename.csv')
uploaded.Upload()
print('Uploaded file with ID {}'.format(uploaded.get('id')))
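To get the uploaded file back in a later session, the id printed above can be reused; a minimal sketch (the id string is a placeholder copied from the printout or from the file's shareable link):
# re-create a reference to the file by id and pull its content into the runtime
downloaded = drive.CreateFile({'id': 'YOUR_FILE_ID'})
downloaded.GetContentFile('filename.csv')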
Answer 5:
To download the model to the local system, the following code works. Downloading the JSON file:
from google.colab import files
model_json = model.to_json()
with open("model1.json", "w") as json_file:
    json_file.write(model_json)
files.download("model1.json")
Downloading the weights:
model.save_weights('weights.h5')
files.download('weights.h5')
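Once both files are on the local machine, the model can be reassembled there; a minimal sketch, assuming a local environment with Keras installed:
from keras.models import model_from_json
# rebuild the model from the downloaded JSON and HDF5 files
with open("model1.json", "r") as json_file:
    model = model_from_json(json_file.read())
model.load_weights("weights.h5")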
Answer 6:
You can run the following after training (TF 1.x API; session is the tf.Session used during training).
saver = tf.train.Saver()
save_path = saver.save(session, "data/dm.ckpt")
print('done saving at',save_path)
Then check the location where the ckpt files were saved.
import os
print( os.getcwd() )
print( os.listdir('data') )
Finally, download the checkpoint files. Note that a TF checkpoint is split across several files: the .meta file holds the graph, while the .index and .data-* files hold the variable values, so download them all.
from google.colab import files
files.download("data/dm.ckpt.meta")
files.download("data/dm.ckpt.index")
files.download("data/dm.ckpt.data-00000-of-00001")  # exact name may vary; check os.listdir('data')
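To restore those variables later, the matching restore call can be used; a minimal sketch, assuming the same graph has been rebuilt and session is an active tf.Session (TF 1.x, as in the snippet above):
# restore the variables saved above into the rebuilt graph
saver = tf.train.Saver()
saver.restore(session, "data/dm.ckpt")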
Source: https://stackoverflow.com/questions/48924165/google-colaboratory-weight-download-export-saved-models