apache-nifi

NiFi - QueryDatabaseTable processor. How to query rows that are modified?

廉价感情. Submitted on 2019-12-10 12:20:00
Question: I am working on a NiFi data flow where my use case is to fetch MySQL table data and put it into HDFS / the local file system. I have built a data flow pipeline using QueryDatabaseTable -> ConvertRecord -> PutFile. My table schema: id, name, city, Created_date. I receive files in the destination even when I insert new records into the table. But when I update existing rows, the processor does not fetch those records; it looks like it has some limitation. My
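QueryDatabaseTable only emits rows whose Maximum-value Column value exceeds the maximum it recorded on the previous run, so updates to existing rows are invisible unless the tracked column itself changes on update. A minimal sketch of that incremental-fetch logic, using an in-memory SQLite table with a hypothetical `modified_date` column (not in the question's schema) that is bumped on every UPDATE:

```python
import sqlite3

# Sketch of QueryDatabaseTable's incremental fetch, assuming a hypothetical
# modified_date column that the application updates on every change.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT, modified_date TEXT)")
conn.execute(
    "INSERT INTO t VALUES (1, 'alice', '2019-01-01'), (2, 'bob', '2019-01-02')"
)

last_max = "2019-01-02"  # state the processor stores after the first run

# The UPDATE bumps modified_date, so the row exceeds the stored maximum again.
conn.execute(
    "UPDATE t SET name = 'alice2', modified_date = '2019-01-03' WHERE id = 1"
)

# Equivalent of the WHERE clause QueryDatabaseTable appends on the next run.
rows = conn.execute(
    "SELECT id, name FROM t WHERE modified_date > ? ORDER BY id", (last_max,)
).fetchall()
print(rows)  # the updated row is picked up: [(1, 'alice2')]
```

With `id` (or an insert-only `Created_date`) as the max-value column, the same query would return nothing for an update, which matches the behavior described in the question.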

Apache Nifi/Cassandra - how to load CSV into Cassandra table

痞子三分冷 Submitted on 2019-12-10 09:33:46
Question: I have various CSV files incoming several times per day, storing time-series data from sensors that are part of sensor stations. Each CSV is named after the sensor station and sensor id it comes from, for instance "station1_sensor2.csv". At the moment, data is stored like this: > cat station1_sensor2.csv 2016-05-04 03:02:01.001000+0000;0; 2016-05-04 03:02:01.002000+0000;0.1234; 2016-05-04 03:02:01.003000+0000;0.2345; I have created a Cassandra table to store them and to be

NiFi convert json to csv using ConvertRecord

佐手、 Submitted on 2019-12-09 04:49:27
I have a stream of JSON in Apache NiFi that contains dynamic fields (maximum 11 fields) and I want to convert it to a CSV file. Sample JSON: { "field1":"some text", "field2":"some text", "field3":"some text", "field4":"some text", "field5":"some text", "field6":"some text", "field7":"some text" } I don't want to use replace or JSON-evaluate processors; how do I do it with ConvertRecord? Using this processor is odd and hard to work with... To be clear about the dynamic fields: I have 11 fields in total. One record may contain 7 fields, the next 11, and the next 9... The steps
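The usual way to handle this with ConvertRecord is to give the reader a schema declaring all 11 fields as nullable, so records with fewer fields still validate and the writer emits empty cells for the missing ones. What that produces can be sketched in plain Python with `csv.DictWriter` (the `field1`..`field11` names follow the sample JSON; the superset-schema approach is the point, not these names):

```python
import csv
import io
import json

# Declare the full 11-field schema; missing fields default to empty strings,
# mirroring a nullable-fields Avro schema in ConvertRecord.
FIELDS = [f"field{i}" for i in range(1, 12)]

records = [
    json.loads('{"field1": "a", "field2": "b", "field7": "g"}'),  # 3 of 11
    json.loads('{"field1": "x", "field11": "k"}'),                # 2 of 11
]

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=FIELDS, restval="")
writer.writeheader()
writer.writerows(records)
print(out.getvalue())
```

Every row gets the same 11 columns regardless of how many fields the incoming record carried, which is exactly the normalization ConvertRecord needs to emit consistent CSV.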

Name of attribute for “Put Response Body In Attribute” in invokeHTTP

邮差的信 Submitted on 2019-12-09 03:58:27
Question: I have an endpoint that returns a response as follows. { "result": [ {}, .... {}] } I am trying to use InvokeHTTP with “Put Response Body In Attribute” enabled, to keep the original flowfile together with the response from the API, but it seems to add an attribute named $.result, as follows. Is there any way to set a proper name for the result attribute? Thanks. Answer 1: You try to extract results using a JSON path. However, this is not possible from within InvokeHTTP. You may want to use the EvaluateJsonPath processor. Documentation for Put
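The answer's suggestion amounts to: let InvokeHTTP store the raw body, then let EvaluateJsonPath do the `$.result` extraction into an attribute whose name you choose. A hedged sketch of that second step (the `api.result` attribute name and the sample body are assumptions, not from the thread):

```python
import json

# Stand-in for the EvaluateJsonPath step: extract $.result from the response
# body and store it under a chosen attribute name ("api.result" is made up).
response_body = '{"result": [{"id": 1}, {"id": 2}]}'

flowfile_attributes = {}
flowfile_attributes["api.result"] = json.dumps(json.loads(response_body)["result"])

print(flowfile_attributes["api.result"])  # [{"id": 1}, {"id": 2}]
```

In EvaluateJsonPath the dynamic property name (here `api.result`) becomes the attribute name, and its value holds the JSONPath expression, which is what gives you control over the name instead of the literal `$.result` key.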

Transfer data from vertica to Redshift using Apache Nifi

不问归期 Submitted on 2019-12-08 19:30:34
I want to transfer data from Vertica to Redshift using Apache NiFi. Which processors and configuration do I need to set? If Vertica and Redshift have "well-behaved" JDBC drivers, you can set up a DBCPConnectionPool for each, then a SQL processor such as ExecuteSQL, QueryDatabaseTable, or GenerateTableFetch (the latter of which generates SQL for use in ExecuteSQL). These will get your records into Avro format; then (prior to NiFi 1.2.0) you can use ConvertAvroToJSON -> ConvertJSONToSQL -> PutSQL to get your records inserted into Redshift. In NiFi 1.2.0, you can set up an AvroReader

Nifi - Client Certificate Authorization Error

左心房为你撑大大i Submitted on 2019-12-08 13:01:51
Question: I have a secured NiFi installation and wanted to authenticate using a secured client certificate. Authentication went fine, but it failed at authorization: AccessDeniedExceptionMapper identity[CN=nifi-admin, OU=NIFI], groups[] does not have permission to access the requested resource. No applicable policies could be found. Returning Forbidden response. Please note that it is a fresh installation and the idea is to use nipyapi for automating admin tasks (without logging into the UI). I have

In NiFi, does usage of the EvaluateJsonPath processor have a performance impact because of attribute creation?

廉价感情. Submitted on 2019-12-08 09:33:17
Question: I'm trying to integrate the NiFi REST APIs with my application. By mapping input and output from my application, I am trying to call the NiFi REST API for flow creation. In my use case, most of the time I will extract JSON values and apply expression language to them. So, to simplify all the use cases, I am using the EvaluateJsonPath processor to fetch all attributes via JSONPath and apply expression language functions on them in an extract processor. Below is the flow diagram regarding

NiFi processor is not parsing JSON correctly

末鹿安然 Submitted on 2019-12-08 06:19:30
Question: I am using EvaluateJsonPath to extract one particular value from JSON. I am using the following JSONPath expression: $.data[?(@.containerType == 'SOURCE' && @.path == 'SOURCE_KYLO_DATALAKE')].id This is the JSON document I'm calling the JSONPath on: {"data":[{"id":"dc18bf87-c5a6-4600-9584-e79fb988b1d0","path":["@Rakesh.Prasad@diageo.com"],"tag":"0","type":"CONTAINER","containerType":"HOME"},{"id":"42e52055-4deb-4d5d-942f-4e1c4e48c35e","path":["BPM"],"tag":"3","type":"CONTAINER",
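The filter in that expression can be checked outside NiFi by translating it into plain Python against a reduced sample document. Note that in the data shown, `path` is an array (e.g. `["BPM"]`), so a string equality test like `@.path == 'SOURCE_KYLO_DATALAKE'` only matches if some element actually carries `path` as a plain string; the sample below is a hypothetical reduction, not the full document from the question:

```python
# Plain-Python equivalent of the JSONPath filter
# $.data[?(@.containerType == 'SOURCE' && @.path == 'SOURCE_KYLO_DATALAKE')].id
doc = {"data": [
    {"id": "dc18bf87", "path": ["@Rakesh.Prasad@diageo.com"],
     "containerType": "HOME"},
    {"id": "42e52055", "path": "SOURCE_KYLO_DATALAKE",
     "containerType": "SOURCE"},
]}

ids = [d["id"] for d in doc["data"]
       if d.get("containerType") == "SOURCE"
       and d.get("path") == "SOURCE_KYLO_DATALAKE"]
print(ids)  # ['42e52055']
```

Rewriting the predicate this way makes it easy to see whether the path comparison should be against the string or against an array element (e.g. `@.path[0]`) before debugging the processor itself.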

ExecuteSQL doesn't select a table if it has a datetimeoffset value?

笑着哭i Submitted on 2019-12-08 06:01:37
Question: I have created a table with a single column of data type datetimeoffset and inserted some values. create table dto (dto datetimeoffset(7)) insert into dto values (GETDATE()) -- inserts date and time with 0 offset insert into dto values (SYSDATETIMEOFFSET()) -- current date time and offset insert into dto values ('20131114 08:54:00 +10:00') -- manual way In NiFi, I have specified the query "Select * from dto" in ExecuteSQL. It shows the error below: java.lang.IllegalArgumentException:
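A common workaround here (an assumption about the fix, since the excerpt cuts off before any answer): ExecuteSQL's result-to-Avro conversion does not handle SQL Server's vendor-specific datetimeoffset type, so instead of `SELECT *`, cast the column to a type the conversion understands inside the query itself. A trivial sketch of building such a rewritten query:

```python
# Build the rewritten ExecuteSQL query: cast the datetimeoffset column to
# datetime2 (or varchar, if the offset must be preserved) before NiFi sees it.
column, table = "dto", "dto"
query = f"SELECT CAST({column} AS datetime2) AS {column} FROM {table}"
print(query)
```

Casting to varchar keeps the original offset text at the cost of losing the temporal type; casting to datetime2 keeps a temporal type but drops the offset, so the choice depends on what the downstream flow needs.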
