google-cloud-data-fusion

Access CDAP Rest API of a Cloud Data Fusion Instance

Submitted by 孤者浪人 on 2021-02-16 14:20:18
Question: How do you access the CDAP REST API of a Cloud Data Fusion instance? I would like to use Cloud Composer to orchestrate my pipelines. I have an Enterprise Edition instance with private IP enabled, but I'm not able to find any documentation on how to access the REST API. The instance details page only shows a /22 IP address range; it does not specify a specific IP. Do I access it using the IAP-protected URL for the UI? Answer 1: You can get the CDAP API endpoint for your Data Fusion instances using
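The answer is cut off above, but one way to recover the endpoint (sketched here as an assumption, not quoted from the original answer) is the Cloud Data Fusion v1 `instances.get` call, whose response includes an `apiEndpoint` field; CDAP REST paths such as `/v3/namespaces/...` are then appended to that base. The project, location, and instance names below are placeholders:

```python
# Hedged sketch: build the instances.get URL for the Cloud Data Fusion v1 API
# and pull the CDAP endpoint out of the returned instance resource.
# Authentication (an OAuth2 bearer token) is assumed to be handled separately.

def instance_url(project: str, location: str, instance: str) -> str:
    """URL of the Data Fusion instance resource (GET returns its JSON body)."""
    return (
        "https://datafusion.googleapis.com/v1/"
        f"projects/{project}/locations/{location}/instances/{instance}"
    )

def cdap_base(instance_resource: dict) -> str:
    """The instance resource carries the CDAP endpoint in 'apiEndpoint';
    CDAP REST paths (e.g. /v3/namespaces) hang off this base."""
    return instance_resource["apiEndpoint"].rstrip("/")
```

For a private-IP instance, the resulting endpoint is only reachable from a network peered with the instance's /22 range, which is likely why no public IP is listed on the details page.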

Google Cloud Data Fusion — building pipeline from REST API endpoint source

Submitted by 只谈情不闲聊 on 2021-02-11 06:12:23
Question: I am attempting to build a pipeline that reads from a third-party REST API endpoint data source, using the HTTP plugin (version 1.2.0) found in the Hub. The request URL is: https://api.example.io/v2/somedata?return_count=false A sample of the response body: { "paging": { "token": "12456789", "next": "https://api.example.io/v2/somedata?return_count=false&__paging_token=123456789" }, "data": [ { "cID": "aerrfaerrf", "first": true, "_id": "aerfaerrfaerrf", "action": "aerrfaerrf", "time": "1970
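Although the question is truncated, the response shape above (a `paging.next` URL plus a `data` array) suggests cursor-style pagination. As a hedged illustration, independent of the HTTP plugin itself, the walk over pages could look like this; `fetch` is any callable mapping a URL to a parsed JSON body:

```python
# Illustrative pagination helper for the response shape shown above:
# follow paging.next until it is absent, yielding every record in data.

def iter_pages(fetch, first_url):
    """fetch: callable(url) -> dict (parsed JSON body of one page)."""
    url = first_url
    while url:
        body = fetch(url)
        yield from body.get("data", [])           # records on this page
        url = body.get("paging", {}).get("next")  # missing/None ends the loop
    # note: the real API's token semantics may differ; this mirrors the sample only
```

In the HTTP plugin this same walk is typically configured declaratively (pagination type and a JSON path to the next-page URL) rather than coded by hand.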

Cloud Data Fusion Preview environment

Submitted by ◇◆丶佛笑我妖孽 on 2021-02-10 06:24:29
Question: We can configure the compute profile to run a pipeline on a custom cluster that I create; however, for preview I cannot specify the compute profile. There are some custom transformations I need to use which require me to install an external JAR on the Dataproc cluster for the code to work. I would like to test this before I deploy the code, using "preview run". Is there a way I can achieve this? I don't see any property that I can set to choose the compute profile at the time of preview

Is it possible to add my own custom transformation plugin to Cloud Data Fusion, either in Basic edition or in Enterprise edition?

Submitted by 北城以北 on 2021-01-29 14:52:04
Question: As I understand it, there are many transformation plug-ins available in the Google Cloud Data Fusion Hub. However, if I want to create my own specific custom plug-in, can I add that plug-in to Google Data Fusion and use it in my pipeline? Please enlighten me. Answer 1: In order to add a custom plugin to Data Fusion (assuming you have already implemented it), follow the steps below: 1) Click on the + button 2) Click on Upload in the Plugin section 3) Drag your plugin JAR to the box, click
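Besides the UI upload described in the answer, CDAP also exposes an artifact REST endpoint, which can be scripted. The sketch below only assembles the request; the namespace, artifact name, and the parent-artifact range in `Artifact-Extends` are illustrative placeholders you would replace with your own values:

```python
# Hedged sketch: request pieces for uploading a plugin JAR through the CDAP
# artifact endpoint (an HTTP POST with the JAR bytes as the request body).

def plugin_upload_request(cdap_base: str, namespace: str, name: str, version: str):
    url = f"{cdap_base}/v3/namespaces/{namespace}/artifacts/{name}"
    headers = {
        "Artifact-Version": version,
        # Which system artifact the plugin extends; the version range is an example.
        "Artifact-Extends": "system:cdap-data-pipeline[6.0.0,7.0.0)",
    }
    return url, headers
```

After the artifact is deployed (by either route), the plugin appears in the Studio palette for use in pipelines.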

Data Fusion: Pass runtime argument from one pipeline to another

Submitted by ▼魔方 西西 on 2020-12-15 05:18:05
Question: I have a runtime argument set at the namespace level: business_date: ${logicalStartTime(yyyy-MM-dd)}. I am using this argument in my pipeline and want to use the same one in another pipeline. There are many pipelines running back to back, and I want the value to be the same throughout the pipelines once it is calculated in the first pipeline. Suppose the value is calculated as '2020-08-20 20:14:11'; once pipeline one succeeds I pass this argument to pipeline 2, but as these arguments are defined
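The question is cut off, but one common pattern (sketched here as an assumption, not as the accepted answer) is to have pipeline 1 start pipeline 2 through the CDAP program-start endpoint, forwarding the already-computed value in the JSON body of the POST, so the expression is not re-evaluated. The `DataPipelineWorkflow` program name is the usual default for Data Fusion batch pipelines, but verify it for your instance:

```python
import json

# Hedged sketch: start a downstream pipeline with explicit runtime arguments.
# POSTing this body to the URL hands the arguments to that run, overriding
# namespace-level values such as the logicalStartTime expression.

def start_with_args(cdap_base: str, namespace: str, app: str, args: dict):
    url = (
        f"{cdap_base}/v3/namespaces/{namespace}/apps/{app}"
        "/workflows/DataPipelineWorkflow/start"
    )
    return url, json.dumps(args)  # runtime arguments travel as the POST body
```

Because the downstream run receives a literal value rather than the macro, every pipeline in the chain sees the timestamp computed once by the first pipeline.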

How to set runtime arguments in a CDAP/Data Fusion pipeline?

Submitted by 痞子三分冷 on 2020-12-13 18:56:10
Question: In addition to the Argument Setter plugin, is there any other way to set runtime arguments in a pipeline? For example, I calculated the total number of error messages and want to set it as a runtime argument so that the email sender can use it. Answer 1: There are multiple ways you can set the runtime arguments of a pipeline: the Argument Setter plugin, passing runtime arguments when starting a pipeline, and setting preferences. Source: https://stackoverflow.com/questions
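The third option, preferences, can also be scripted. As a hedged sketch (the app name and preference key below are placeholders): CDAP stores preferences per namespace or per app via an HTTP PUT, and merges them into the runtime arguments of subsequent runs:

```python
import json

# Hedged sketch: set app-level preferences, which CDAP folds into the
# runtime arguments of later runs of that app's programs.

def set_app_preferences_request(cdap_base: str, namespace: str, app: str, prefs: dict):
    url = f"{cdap_base}/v3/namespaces/{namespace}/apps/{app}/preferences"
    return url, json.dumps(prefs)  # body of an HTTP PUT
```

Note that preferences persist across runs, so explicit runtime arguments passed at start time take precedence when both are present.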

Dataproc operation failure: INVALID_ARGUMENT: User not authorized to act as service account

Submitted by 拜拜、爱过 on 2020-12-12 05:07:40
Question: I'm trying to run a pipeline from Cloud Data Fusion, but I'm receiving the following error: io.cdap.cdap.runtime.spi.provisioner.dataproc.DataprocRuntimeException: Dataproc operation failure: INVALID_ARGUMENT: User not authorized to act as service account 'XXXXXXXX-compute@developer.gserviceaccount.com'. To act as a service account, user must have one of [Owner, Editor, Service Account Actor] roles. See https://cloud.google.com/iam/docs/understanding-service-accounts for additional details.

Salesforce plug-in error in Google Cloud Data Fusion

Submitted by 与世无争的帅哥 on 2020-06-27 22:56:26
Question: I'm testing Salesforce connectivity from Google Cloud Data Fusion. I get this error when clicking the Get Schema button in the connector: "Error: No discoverable found for request POST /v3/namespaces/system/apps/pipeline/services/studio/methods/v1/contexts/default/validations/stage HTTP/1.1". Authentication details are all correct; I have tested them outside Data Fusion using Postman. Answer 1: Can you go to the System Admin link at the top right-hand corner and check the status of Pipeline Studio? If it's