google-colaboratory

Running code cells sequentially in Google Colab

Submitted on 2021-01-28 00:40:18
Question: I want to run the code cells in Google Colab in sequence. For example:

    Cell [1]:
    from catboost import CatBoostRegressor
    # do something here

    Cell [2]:
    clf = CatBoostRegressor(task_type='GPU')
    # do some more things here

But when I select "Run all", all the cells seem to run in parallel, so my code does not work. When I do the same thing in a Kaggle kernel it runs perfectly, i.e., first cell [1] is executed, then cell [2], and so on. I have tried searching for this in Google Colab but failed to come up with …
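For reference, a minimal single-cell sketch of the intended ordering: merging both cells into one guarantees top-to-bottom execution regardless of how "Run all" schedules them. The CatBoostRegressor arguments are the ones from the question; the fit call is a hypothetical placeholder, not part of the original post:

    # Single-cell version: statements within one cell always execute in order
    from catboost import CatBoostRegressor   # formerly cell [1]

    clf = CatBoostRegressor(task_type='GPU')  # formerly cell [2]
    # clf.fit(X_train, y_train)               # hypothetical follow-up step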

Google Colaboratory : OSError: [Errno 5] Input/output error

Submitted on 2021-01-21 10:16:32
Question: I am using Google Colaboratory and mounting Google Drive. When I access a CSV file, I get the following error: OSError: [Errno 5] Input/output error. This did not happen before. How can I access the CSV file as I used to? I have tried this, but it did not work: Input/output error while using google colab with google drive. The error appeared after running the following code:

    for segment_id in tqdm(range(segment_num)):
        with h5py.File(os.path.join(INPUT_PATH, "train.h5"), "r") as f:
            train …
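A common workaround (an assumption for this particular case, but a frequent fix for Errno 5 on the Drive FUSE mount) is to copy the large file from Drive to the Colab VM's local disk once, then do all h5py reads against the local copy; the paths below are illustrative:

    import os
    import shutil
    import h5py

    DRIVE_PATH = "/content/drive/MyDrive/data/train.h5"  # hypothetical Drive location
    LOCAL_PATH = "/content/train.h5"                      # VM-local disk

    # One sequential copy over the FUSE mount instead of many small random reads
    if not os.path.exists(LOCAL_PATH):
        shutil.copy(DRIVE_PATH, LOCAL_PATH)

    with h5py.File(LOCAL_PATH, "r") as f:
        print(list(f.keys()))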

How to run nbconvert on a notebook in Google Colaboratory

Submitted on 2021-01-21 05:19:42
Question: I have a simple Jupyter notebook, say foo.ipynb. I simply wish to run nbconvert in the usual way: on my local machine I would execute !jupyter nbconvert foo.ipynb in the notebook itself, or jupyter nbconvert foo.ipynb in a shell. On Google Colaboratory this does not work. Of course, this is because foo.ipynb does not live on the local drive, but the usual methods of connecting Drive and Colab do not help in this case. Question: is running nbconvert on a Colab notebook possible from within …
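One approach that is commonly reported to work is to mount Drive and point nbconvert at the copy of the notebook that Colab keeps there; the folder name below is the default (it may appear as "My Drive" on older mounts), and the output format is just an example:

    from google.colab import drive
    drive.mount('/content/drive')

    !jupyter nbconvert --to html "/content/drive/MyDrive/Colab Notebooks/foo.ipynb"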

Google Colab API

Submitted on 2021-01-20 18:59:33
Question: Is there a Google Colab API? I'm looking to accomplish things like:

- Create users
- Create notebooks
- Share notebooks with users
- Retrieve the contents of a notebook

Answer 1: As pointed out in the other answer, Colab notebooks are files on your Google Drive. Hence, sharing them or retrieving their contents can be done via the Google Drive API (from what I understand, you can use the webContentLink to download one). However, your first question is: where is the Google Colab API? For anyone coming here …
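A rough sketch of the Drive-API route for the "retrieve the contents" item, assuming you already hold authorized credentials (creds) and know the notebook's Drive file ID (file_id); both names are assumptions here, not part of the original answer:

    # Download a Colab notebook (an .ipynb file on Drive) via the Drive v3 API
    import io
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaIoBaseDownload

    service = build('drive', 'v3', credentials=creds)   # creds: pre-authorized
    request = service.files().get_media(fileId=file_id) # file_id: hypothetical

    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

    notebook_json = buf.getvalue().decode('utf-8')      # raw .ipynb JSON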

Is it possible to increase the RAM in Google Colab in another way?

Submitted on 2021-01-05 09:14:34
Question: When I run this code in Google Colab:

    n = 100000000
    i = []
    while True:
        i.append(n * 10**66)

this happens to me every time. My data is huge, and after hitting 12.72 GB of RAM I don't get the crash prompt with the option to increase my RAM; I just get this: "Your session crashed after using all available RAM. View runtime logs". What is the solution? Is there another way?

Answer 1: You either need to upgrade to Colab Pro, or, if your own computer has more RAM than the Colab VM, you …
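Before deciding between those options, it can help to check how much memory the current runtime actually has; psutil is assumed to be available, though it is a standard package in Colab's default image:

    import psutil

    mem = psutil.virtual_memory()
    print(f"Total RAM:     {mem.total / 2**30:.2f} GiB")
    print(f"Available RAM: {mem.available / 2**30:.2f} GiB")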

TypeError('Keyword argument not understood:', 'groups') in keras.models load_model

Submitted on 2021-01-04 02:37:15
Question: After training a model in Google Colab, I downloaded it using the following commands (inside Google Colab):

    model.save('model.h5')
    from google.colab import files
    files.download('model.h5')

My problem is that when I try to load the downloaded model.h5 on my local machine (outside Google Colab), I get the following error:

    [input]
    from keras.models import load_model
    model = load_model('model.h5')

    [output]
    Traceback (most recent call last):
      File "test.py", line 2, in <module>
        model = load …
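This TypeError is typically a version mismatch: the 'groups' argument was added to Keras Conv layers in newer TensorFlow releases, so a model saved by a newer TensorFlow on Colab can fail to load under an older local install. A hedged first step is to compare the two versions and align the local one; the pin below is illustrative, not prescriptive:

    # Run this both in Colab and on the local machine, then make them match
    import tensorflow as tf
    print(tf.__version__)

    # Locally, e.g.:  pip install "tensorflow==<version printed in Colab>"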

File system scheme '[local]' not implemented in Google Colab TPU

Submitted on 2021-01-02 19:13:11
Question: I am using the TPU runtime in Google Colab but am having problems reading files (I am not sure that is the cause). I initialized the TPU with:

    import tensorflow as tf
    import os
    import tensorflow_datasets as tfds

    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
    tf.config.experimental_connect_to_cluster(resolver)
    # This is the TPU initialization code that has to be at the beginning.
    tf.tpu.experimental.initialize_tpu_system(resolver)
    print("All devices: ", tf.config …
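The usual explanation for "File system scheme '[local]' not implemented" is that the TPU workers are separate machines that cannot see the Colab VM's local disk, so input data generally has to live in Google Cloud Storage. A minimal sketch, with a hypothetical bucket name:

    import tensorflow as tf

    # TPU workers read directly from GCS; paths on the Colab VM are invisible to them
    ds = tf.data.TFRecordDataset("gs://my-bucket/train.tfrecords")

    # tensorflow_datasets can likewise be pointed at a GCS data_dir:
    # import tensorflow_datasets as tfds
    # ds = tfds.load('mnist', split='train', data_dir='gs://my-bucket/tfds')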

Can't find kaggle.json file in Google Colab

Submitted on 2020-12-30 05:12:48
Question: I'm trying to download the Kaggle ImageNet Object Localization Challenge data into Google Colab so that I can use it to train my model. Kaggle provides an API for easy and fast access to its datasets (https://github.com/Kaggle/kaggle-api). However, when I run the command kaggle competitions download -c imagenet-object-localization-challenge in Google Colab, it can't find the kaggle.json file, which contains my username and API key. I haven't had this problem on my Mac when running a Jupyter …
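The kaggle CLI looks for credentials in ~/.kaggle/kaggle.json, so the standard setup on a fresh Colab VM is to get the file onto the VM and move it there; the upload widget below is just one way to transfer it:

    from google.colab import files
    files.upload()  # select kaggle.json in the file picker

    !mkdir -p ~/.kaggle
    !cp kaggle.json ~/.kaggle/
    !chmod 600 ~/.kaggle/kaggle.json
    !kaggle competitions download -c imagenet-object-localization-challenge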