google-cloud-datalab

DataLab Cloud Deployment 403 Error

为君一笑 submitted on 2019-12-01 11:23:56
I'm trying to deploy DataLab. I have confirmed my project is in a US zone, and I've tried creating new projects and deploying there, to no avail. Everything appears to work properly up to this point. Important to note: my project ID does not have the preceding s~ (unsure if that matters or if it's simply a notation used in DataLab / Google Cloud). I have tried ~10 times over the course of two days with no success. Nov 7 13:32:06 datalab-deploy-main-20151107-13-29-51 startupscript: Verifying that Managed VMs are enabled and ready. Nov 7 13:32:06 datalab-deploy-main-20151107-13-29-51 startupscript:
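A 403 during deployment often comes down to the active project or its enabled APIs. A few sanity checks with the gcloud CLI, sketched here as assumptions about the setup above (PROJECT_ID is a placeholder; the `services` commands are from the modern gcloud SDK):

```shell
# Run in Cloud Shell or anywhere the gcloud CLI is installed.
gcloud config set project PROJECT_ID   # make sure the intended project is active
gcloud config list                     # confirm the account and project in use
gcloud services list --enabled         # verify the APIs the deployment needs are on
```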

How to install gcp in Python?

家住魔仙堡 submitted on 2019-12-01 09:07:11
Question: Lots of the BigQuery examples begin with: import gcp.bigquery as bq, but I get ImportError: No module named gcp.bigquery whenever I try to run this. How do I install this library? I'm working in a virtualenv with Python 2.7. I've tried pip install gcp, pip install gcloud, and pip install google-api-python-client. None of them help, and I can't find any documentation. Help! UPDATE: the reason I want to use gcp is that I want to get data from BigQuery, preferably in CSV form, from within a
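The `gcp` package is bundled inside the Datalab environment rather than distributed on PyPI, which is why none of those pip installs provide it. Outside Datalab, the maintained client is `google-cloud-bigquery` (`pip install google-cloud-bigquery`). A minimal sketch for the BigQuery-to-CSV goal; the query and file name are placeholders, and actually running the query requires Google Cloud credentials:

```python
def fetch_to_csv(query, destination="results.csv"):
    """Run a BigQuery query and write the result to a local CSV file."""
    from google.cloud import bigquery  # imported lazily: needs cloud credentials
    import csv

    client = bigquery.Client()
    rows = client.query(query).result()  # RowIterator with a .schema attribute
    with open(destination, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([field.name for field in rows.schema])  # header row
        for row in rows:
            writer.writerow(row.values())  # each Row exposes its values as a tuple
    return destination

# Example (requires credentials):
# fetch_to_csv("SELECT 1 AS x")
```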

Adding python libraries to google datalab environment

耗尽温柔 submitted on 2019-12-01 06:07:34
I'm using Google Datalab on Google Cloud Platform. It worked great on the first try, and I love how easy it is to run a Jupyter notebook server in the cloud (faster than starting up a localhost server). It's fantastic. But now I want to install Python libraries not included in the basic Datalab environment (specifically I need the Bokeh plotting library). So I opened a Google Cloud Shell from the Google Cloud Console where I manage this Jupyter notebook instance, installed Miniconda, and then the Bokeh library. Everything ran without error (e.g. bokeh installs several dependencies along the way

Write a Pandas DataFrame to Google Cloud Storage or BigQuery

大兔子大兔子 submitted on 2019-11-27 00:44:30
Question: Hello, and thanks for your time and consideration. I am developing a Jupyter Notebook in the Google Cloud Platform / Datalab. I have created a Pandas DataFrame and would like to write this DataFrame to both Google Cloud Storage (GCS) and/or BigQuery. I have a bucket in GCS and have, via the following code, created the following objects: import gcp import gcp.storage as storage project = gcp.Context.default().project_id bucket_name = 'steve-temp' bucket_path = bucket_name bucket = storage.Bucket
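For the GCS half, one route is to serialize the data to CSV in memory and upload the bytes to the bucket. The CSV step below uses only the standard library and is runnable as-is; the upload lines are commented out because the object path and the exact `gcp.storage` upload methods are assumptions, not verified Datalab API:

```python
# Build CSV bytes from row data, ready to upload to a GCS bucket.
import csv
import io

def rows_to_csv_bytes(rows, fieldnames):
    """Serialize a list of dicts to UTF-8 CSV bytes."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

data = rows_to_csv_bytes([{"name": "a", "value": 1}], ["name", "value"])

# Hypothetical upload via the `bucket` object from the excerpt above:
# item = bucket.item('steve-temp/data.csv')   # object path is a placeholder
# item.write_to(data, 'text/csv')             # method names are assumptions
```

With a Pandas DataFrame, `df.to_csv(index=False).encode("utf-8")` would produce the same kind of byte payload for the upload step.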