google-cloud-platform

ERROR: Create Version failed. Bad model detected with error: "Failed to load model: Could not load the model

倾然丶 夕夏残阳落幕 submitted on 2021-01-29 15:50:05
Question:

clf = svm.SVC()
# Giving test data as input
clf.fit(X_train, y_train)
joblib.dump(clf, 'model.joblib')

GCP_PROJECT = 'career-banao-project'
BUCKET_NAME = "career_banao_bucket"
MODEL_BUCKET = 'gs://career_banao_bucket'
VERSION_NAME = 'v1'
MODEL_NAME = 'career_banao_model'

!gsutil mb $MODEL_BUCKET
!gsutil cp ./model.joblib $MODEL_BUCKET
!gcloud ai-platform models create $MODEL_NAME
!gcloud ai-platform versions create $VERSION_NAME \
    --model=$MODEL_NAME \
    --framework='scikit-learn' \
    --runtime
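For reference, a minimal sketch of the export step, with iris data standing in for the real training set; the two things that most often produce "Could not load the model" are the artifact name (it must be model.joblib or model.pkl for the scikit-learn runtime) and a mismatch between the scikit-learn version used for joblib.dump and the version bundled with the --runtime-version requested at deploy time.

import joblib
import sklearn
from sklearn import datasets, svm

# Placeholder data; substitute the real X_train / y_train.
X_train, y_train = datasets.load_iris(return_X_y=True)

clf = svm.SVC()
clf.fit(X_train, y_train)

# The exported file must be named exactly model.joblib so the AI Platform
# scikit-learn runtime can locate it in the bucket.
joblib.dump(clf, "model.joblib")

# Knowing the local version helps choose matching --runtime-version and
# --python-version values for `gcloud ai-platform versions create`.
print("trained with scikit-learn", sklearn.__version__)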

Is it possible to add my own custom transformation plugin to Cloud Data Fusion, either in the Basic edition or in the Enterprise edition?

北城以北 submitted on 2021-01-29 14:52:04
Question: As I understand it, there are many transformation plug-ins available in the Google Cloud Data Fusion Hub. However, if I want to create my own specific custom plug-in, can I add that plug-in to Google Data Fusion and use it in my pipeline? Please enlighten me. Answer 1: In order to add a custom plugin to Data Fusion (assuming you have already implemented it), you have to follow the steps below: 1) Click on the + button 2) Click on Upload in the Plugin section 3) Drag your plugin JAR to the box, click

Airflow: trigger a DAG any time after a Google Sheet is updated

六月ゝ 毕业季﹏ submitted on 2021-01-29 14:30:58
Question: Is there any way I can schedule a DAG to be triggered right after a Google Sheet is updated? I'm not sure I can get an answer from this doc: https://airflow.readthedocs.io/en/latest/_api/airflow/providers/google/suite/hooks/sheets/index.html Answer 1: @Alejandro's direction is right, but just to expand on his answer: you can use the HttpSensor operator to make a GET request to the sheet file via the Google Drive API:

HttpSensor(
    task_id='http_sensor_check',
    http_conn_id='http_default',
    endpoint='https://www
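To expand on that approach, below is a minimal sketch with a hypothetical Sheet file ID and DAG name; it assumes the http_default connection points at https://www.googleapis.com and that an OAuth token with Drive scope is attached via the connection or extra headers (not shown).

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.providers.http.sensors.http import HttpSensor

SHEET_FILE_ID = "your-sheet-file-id"  # placeholder

with DAG(
    dag_id="sheet_update_trigger",
    start_date=datetime(2021, 1, 1),
    schedule_interval="*/10 * * * *",  # poll on a schedule
    catchup=False,
) as dag:
    wait_for_update = HttpSensor(
        task_id="http_sensor_check",
        http_conn_id="http_default",
        endpoint=f"drive/v3/files/{SHEET_FILE_ID}?fields=modifiedTime",
        # A real check would compare modifiedTime against the last value seen,
        # e.g. one stored in an Airflow Variable, instead of just checking presence.
        response_check=lambda response: "modifiedTime" in response.text,
        poke_interval=60,
    )

    process_sheet = DummyOperator(task_id="process_sheet")

    wait_for_update >> process_sheet

This is still polling rather than a true push trigger; for an event-driven flow, a Google Apps Script onEdit trigger could call the Airflow REST API instead.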

How do I add an SSH key to a Google Cloud Compute Engine Linux VM instance?

孤者浪人 submitted on 2021-01-29 14:24:33
Question: I have a Linux VM instance running on Google Cloud Platform. I tried copying my public key to ~/.ssh/authorized_keys and I can successfully SSH into my VM. But sometimes ~/.ssh/authorized_keys is flushed and I have to copy the public key again. It is a real pain to add the public key every time. How do I add a public key permanently? Answer 1: ~/.ssh/authorized_keys takes the SSH keys from the instance metadata. It is best to keep your SSH public keys in the metadata as mentioned here, and there's also a
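As a sketch of that idea (hypothetical project, zone, instance and key values), the key can be appended to the instance's ssh-keys metadata with the Compute Engine API, so that the guest environment keeps re-applying it to ~/.ssh/authorized_keys:

import googleapiclient.discovery

PROJECT, ZONE, INSTANCE = "my-project", "us-central1-a", "my-vm"   # placeholders
NEW_KEY = "alice:ssh-rsa AAAAB3... alice@example.com"              # placeholder key line

compute = googleapiclient.discovery.build("compute", "v1")
instance = compute.instances().get(project=PROJECT, zone=ZONE, instance=INSTANCE).execute()

metadata = instance["metadata"]
items = metadata.get("items", [])

# Append the new key to the existing ssh-keys entry, or create one.
ssh_item = next((item for item in items if item["key"] == "ssh-keys"), None)
if ssh_item is None:
    items.append({"key": "ssh-keys", "value": NEW_KEY})
else:
    ssh_item["value"] = ssh_item["value"].rstrip("\n") + "\n" + NEW_KEY

# setMetadata requires the current fingerprint to guard against concurrent edits.
compute.instances().setMetadata(
    project=PROJECT,
    zone=ZONE,
    instance=INSTANCE,
    body={"fingerprint": metadata["fingerprint"], "items": items},
).execute()

The same effect can be had from the console or gcloud; the point is simply that keys stored in metadata (instance- or project-level) survive whatever flushes the file.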

Is there a way to directly read the content of a JSON file in a Google Cloud Storage bucket via Node.js without having to download it first?

柔情痞子 submitted on 2021-01-29 14:23:44
Question: I am a Python developer, but the circumstances of a project I am working on now oblige me to find a solution in Node.js. I have checked the documentation. The File class has a createReadStream method, but it forces me to download the file locally before reading it. The solution I'm looking for is to save the content in a variable so that I can read and interpret it as I want. This is the script using the createReadStream() method:

var storage = require('@google-cloud/storage')();
var bucket =

Dialogflow email address from speech

a 夏天 submitted on 2021-01-29 14:20:42
Question: Does anyone have any suggestions for obtaining a user's email address through speech? Written input is quite straightforward, as email addresses follow a pattern to some degree, but speech is quite difficult. Is it best to simply ask the user to read out the characters one by one? Answer 1: Dialogflow provides system entities for most common user inputs. You can use the @sys.email entity for your purpose and then use it in your fulfillment. Getting the above email address in your webhook fulfillment:
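As a minimal sketch of that last step (assuming the intent defines a parameter named "email" mapped to @sys.email, and a Flask webhook), the collected address arrives under queryResult.parameters in the fulfillment request:

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(silent=True) or {}
    # Dialogflow ES sends the matched intent's parameters under queryResult.
    params = body.get("queryResult", {}).get("parameters", {})
    email = params.get("email", "")
    return jsonify({"fulfillmentText": f"Got it, your email address is {email}."})

if __name__ == "__main__":
    app.run(port=8080)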

Google Cloud Pub/Sub with different message types

霸气de小男生 submitted on 2021-01-29 13:46:28
Question: Within the same application I send different message types that have completely different formats and are totally unrelated. What is the best practice for tackling this problem? I see two different approaches here: 1) filter at the application level, which means I receive all messages on the same puller (same subscription); 2) create a new subscription, which means the application will have two pullers running (one for each message type). Answer 1: You answered your own question with point 2. If the message
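A minimal sketch of the second approach (hypothetical project and subscription names), where each message type gets its own subscription and callback via the google-cloud-pubsub client:

from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "my-project"  # placeholder

subscriber = pubsub_v1.SubscriberClient()

def handle_order(message):
    # Messages of the first type, pulled from orders-sub.
    print("order message:", message.data)
    message.ack()

def handle_audit(message):
    # Messages of the second type, pulled from audit-sub.
    print("audit message:", message.data)
    message.ack()

futures = [
    subscriber.subscribe(subscriber.subscription_path(PROJECT_ID, "orders-sub"), handle_order),
    subscriber.subscribe(subscriber.subscription_path(PROJECT_ID, "audit-sub"), handle_audit),
]

with subscriber:
    for future in futures:
        try:
            future.result(timeout=30)  # block briefly; a service would block indefinitely
        except TimeoutError:
            future.cancel()

If both types must share one topic, Pub/Sub subscription filters on message attributes can route each type to its own subscription without application-level filtering.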

Shutdown script not executing on a Google Cloud VM

馋奶兔 submitted on 2021-01-29 13:35:38
Question: I am trying to get a shutdown script to execute on a Google Cloud Compute Engine VM. I see this output when running gcloud compute connect-to-serial-port startup-test-v:

Apr 8 22:01:25 startup-test-v shutdown-script: INFO Starting shutdown scripts.
Apr 8 22:01:25 startup-test-v shutdown-script: INFO Found shutdown-script in metadata.
Apr 8 22:01:26 startup-test-v shutdown-script: INFO shutdown-script: No change requested; skipping update for [startup-test-v].
Apr 8 22:01:27 startup-test-v shutdown

Unable to fetch a block from the channel in Hyperledger Fabric

随声附和 submitted on 2021-01-29 13:23:50
Question: We are trying to achieve a multi-cloud architecture between Azure and GCP. We have the orderer running in a separate VM in Azure. Now we are trying to join a peer that is running in another VM on Google Cloud Platform. Our requirement is to join that peer to the channel in the Azure network. In order to join the peer to the channel, we tried fetching the genesis block from the orderer, but we get the following error:

peer channel fetch newest genesis.block -c composerchannelrest --orderer orderer0:7050 --tls -

Accessing AWS S3 from within GCP

泪湿孤枕 submitted on 2021-01-29 12:59:51
Question: We were doing most of our cloud processing (and still do) using AWS. However, we now also have some credits on GCP and would like to explore interoperability between the cloud providers. In particular, I was wondering whether it is possible to use AWS S3 from within GCP. I am not talking about migrating the data, but whether there is some API which will allow AWS S3 to work seamlessly from within GCP. We have a lot of data and databases that are hosted on AWS S3 and would prefer to
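The entry is cut off above, but a minimal sketch (hypothetical bucket and key names) illustrates the basic interoperability point: S3 is reached over its public API, so the ordinary AWS SDK works from a GCE VM or any other GCP service as long as AWS credentials are available; no GCP-specific bridge is required.

import boto3

s3 = boto3.client(
    "s3",
    # Credentials can also come from environment variables or ~/.aws/credentials.
    aws_access_key_id="AKIA...",          # placeholder
    aws_secret_access_key="...",          # placeholder
    region_name="us-east-1",
)

response = s3.get_object(Bucket="my-aws-bucket", Key="path/to/object.json")
data = response["Body"].read()
print(len(data), "bytes read from S3 while running inside GCP")

Cross-cloud traffic goes over the public internet, so latency and data-transfer costs are the main practical considerations.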