apache-nifi

Jolt transform json - how to add default fields

Submitted by 给你一囗甜甜゛ on 2019-12-24 10:03:54
Question: I have the below input JSON: { "id": "2ef8a2ee-054f-4b43-956a-8aa4f51a41d5", "type": "VOICE", "tags": [ { "id": "some id 1", "description": "some description 1" }, { "id": "some id 2", "description": "some description 2" } ], "transcription": { "key1": "val1", "key2": "val2" } } But the output JSON should look similar, adding only default values: { "id": "2ef8a2ee-054f-4b43-956a-8aa4f51a41d5", "created": "2019-06-18T18:12:37", "firstName": "Khusan", "lastName": "Sharipov", "status": "OPEN"
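The extra fields in the truncated output above look like constants, which is exactly what Jolt's default operation is for. A minimal chainr spec sketch that injects those fields only when they are absent (the values are taken from the desired output shown above):

```json
[
  {
    "operation": "default",
    "spec": {
      "created": "2019-06-18T18:12:37",
      "firstName": "Khusan",
      "lastName": "Sharipov",
      "status": "OPEN"
    }
  }
]
```

Because default never overwrites existing keys, the original id, type, tags, and transcription pass through unchanged.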

NiFi moveHDFS processor appears to do nothing

Submitted by 余生颓废 on 2019-12-24 09:49:44
Question: I am simply trying to automate moving files from one HDFS directory to another, using a MoveHDFS processor in Apache NiFi, but when I start the processor nothing seems to happen. The processor metrics remain at zero after a long amount of time, and the bulletin board shows no errors (logging level set to INFO); the only logging output in the bulletin board is: 14:50:04 HSTINFO1e637d0d-0163-1000-7bde-a7993ae403e8 MoveHDFS[id=1e637d0d-0163-1000-7bde-a7993ae403e8] Initialized a new
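One thing worth checking when MoveHDFS sits at zero with no errors: the processor needs to know what to move, either from an incoming connection or from its input path property. A hedged sketch of the key property values (property names as in the NiFi 1.x MoveHDFS documentation; the paths are examples, so verify against your version's usage docs):

```
Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Input Directory or File        : /data/incoming
Output Directory               : /data/archive
Conflict Resolution Strategy   : fail
```

If the Hadoop Configuration Resources do not point at valid core-site.xml/hdfs-site.xml files, the processor can also fail to see the source directory without surfacing an obvious error at INFO level.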

Secure communication between Ingress Controller (Traefik) and backend service on Kubernetes

Submitted by ℡╲_俬逩灬. on 2019-12-24 08:56:38
Question: I'm trying to secure NiFi in a Kubernetes cluster, behind a Traefik proxy. Both are running as services in K8s. Traefik is secured with a public certificate. I want it to redirect calls to NiFi while securing the communication between Traefik (as an Ingress Controller) and the backend pods: NiFi. It looks like the secure configuration should live in my Ingress YAML descriptor. It looks like I should issue a CA root to generate a NiFi self-signed certificate and load this CA root in Traefik so it can
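If the Ingress descriptor is indeed the right place, the piece that usually goes there for Traefik 1.x is an annotation telling Traefik to speak HTTPS to the backend pods instead of plain HTTP. A sketch under those assumptions (host, service name, and port are examples; the annotation key is different in Traefik 2.x, which uses its own CRDs):

```yaml
apiVersion: networking.k8s.io/v1beta1
kind: Ingress
metadata:
  name: nifi
  annotations:
    kubernetes.io/ingress.class: traefik
    # Traefik 1.x: use HTTPS toward the backend service
    ingress.kubernetes.io/protocol: https
spec:
  rules:
  - host: nifi.example.com
    http:
      paths:
      - path: /
        backend:
          serviceName: nifi
          servicePort: 8443
```

The CA-root approach in the question fits this: Traefik would need to trust the CA that signed the NiFi certificate for the backend TLS handshake to succeed.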

Apache NIFI REST API (jwt) access via Apache Knox gateway

Submitted by 情到浓时终转凉″ on 2019-12-24 08:17:15
Question: I am looking for resources to configure the Apache KNOXTOKEN service to access the Apache NiFi REST API. I already have KNOXSSO configured and am able to access the NiFi UI through it. However, I could not find resources to make the NiFi REST services securely accessible via curl and JWT. Pointers appreciated. Answer 1: Minor tweak to the other suggestion here... When integrating with KnoxSSO, NiFi accepts the Knox JWT token in a cookie. By default, I believe this cookie is named hadoop-jwt . If you're
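Building on the answer's point about the cookie, a curl call would pass the Knox JWT in the hadoop-jwt cookie rather than in an Authorization header. A sketch, where the host, the "default" topology segment, and the token are placeholders; the command is echoed rather than executed so the sketch is self-contained:

```shell
# Placeholders: JWT from the Knox token service, and the gateway URL
TOKEN="eyJhbGciOiJSUzI1NiJ9..."
NIFI_API="https://knox.example.com:8443/gateway/default/nifi-api"

# NiFi reads the Knox JWT from the hadoop-jwt cookie (the default cookie name)
echo curl -k --cookie "hadoop-jwt=${TOKEN}" "${NIFI_API}/flow/about"
```

Against a live gateway, drop the echo (and ideally -k, once certificates are trusted) and run the curl command directly.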

Apache Nifi Expression Language - toDate formatting

Submitted by こ雲淡風輕ζ on 2019-12-24 08:03:48
Question: I am trying to format a date string using the Apache NiFi Expression Language and the ReplaceText processor (regex). Given a date string date_str : "2018-12-05T11:44:39.717+01:00", I wish to convert this to: correct_mod_date_str: "2018-12-05 10:44:39.717" (notice how the date is converted to UTC, and the character 'T' is replaced by a space). To do this, I am currently using: toDate("yyyy-MM-dd'T'HH:mm:ss.SSSSSSXXX"):format("yyyy-MM-dd HH:mm:ss.SSS", '+00:00') and this works perfectly. However,
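Since the snippet shows only the function chain, it may help to see it anchored to an attribute. A sketch of the complete expression, assuming the value lives in a flow file attribute named date_str (as in the question); this form would be usable as an UpdateAttribute value or a ReplaceText replacement value:

```
${date_str:toDate("yyyy-MM-dd'T'HH:mm:ss.SSSSSSXXX"):format("yyyy-MM-dd HH:mm:ss.SSS", '+00:00')}
```

The second argument to format is the output time zone, which is what performs the UTC conversion here.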

How do you connect NIFI to IBM MessageHub through PublishKafka processor?

Submitted by 谁说胖子不能爱 on 2019-12-24 08:00:10
Question: I am trying to connect NiFi to IBM MessageHub but I cannot get any connection working. Does anyone have a working example of how to configure it? I get a timeout exception in the bulletin board. I have configured a PublishKafka_0_11 1.4.0 processor, configured it as SASL_SSL, added the standard SSL context service, and added the jaas.conf: KafkaClient { org.apache.kafka.common.security.plain.PlainLoginModule required serviceName="Message Hub-bq" username="xxxxxx" password="xxxxxx"; }; And in the
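For a setup like this, the pieces that are easy to miss in NiFi 1.4 are registering the JAAS file with the NiFi JVM and selecting the PLAIN SASL mechanism. A hedged sketch (the java.arg index, the file path, and the broker hostname are examples, and the dynamic-property behavior should be verified against the PublishKafka_0_11 documentation):

```
# conf/bootstrap.conf: point the NiFi JVM at the JAAS file
java.arg.16=-Djava.security.auth.login.config=/opt/nifi/conf/jaas.conf

# PublishKafka_0_11 processor settings (values are examples)
Kafka Brokers     : broker-1.messagehub.example.net:9093
Security Protocol : SASL_SSL
sasl.mechanism    : PLAIN        # added as a user-defined (dynamic) property
```

A timeout rather than an authentication error can also simply mean the broker host/port is unreachable from the NiFi host, so checking basic connectivity to port 9093 is worthwhile before debugging SASL.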

Upsert mongo array of object with condition

Submitted by 风格不统一 on 2019-12-24 07:45:18
Question: I have a collection with some documents, and each document has a list of objects representing a temporal interval in epoch time with a keyword. I want to update the final value of the interval of each object in the right document where the ending value is greater than a given value. If nothing can be updated, I want to insert a new interval with both start and end set to the new value, with the keyword used in the query. I'm using NiFi to perform this task, with the update block and upsert enabled. I can
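The conditional array update can be sketched with arrayFilters (MongoDB 3.6+). The field names here (intervals, keyword, start, end) are assumptions, since the document shape is not shown, and note that upsert cannot create a new array element through a filtered positional operator, so the insert has to be a second step guarded by the first update's match count. A mongo-shell sketch:

```
// Step 1: extend any interval for this keyword whose end exceeds the cutoff
var res = db.events.updateOne(
  { "intervals": { "$elemMatch": { "keyword": kw, "end": { "$gt": cutoff } } } },
  { "$set": { "intervals.$[el].end": newEnd } },
  { "arrayFilters": [ { "el.keyword": kw, "el.end": { "$gt": cutoff } } ] }
);

// Step 2: nothing matched, so append a fresh interval instead
if (res.matchedCount === 0) {
  db.events.updateOne(
    { "_id": docId },
    { "$push": { "intervals": { "keyword": kw, "start": newEnd, "end": newEnd } } }
  );
}
```

In NiFi this two-step logic would typically be routed on the result of the first update rather than relying on upsert alone.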

Zero flow files in onTrigger() method of AbstractProcessor of Apache Nifi

Submitted by ぃ、小莉子 on 2019-12-24 07:18:47
Question: I am developing a custom processor for Apache NiFi. I have created the NAR of my processor, put it in the lib folder of NiFi, and started NiFi. I have set up a remote debugger in Eclipse and set a breakpoint on the first line of onTrigger(). While debugging, I am running one processor at a time in my NiFi pipeline. I can see a single flow file in the input queue of my custom processor, but my custom processor is not receiving any flow file. When I start my custom processor, it hits
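A pattern worth checking in this situation: the framework may invoke onTrigger() before a flow file is actually available, so session.get() can return null even when the queue shows one queued flow file a moment later. A non-runnable Java fragment sketching the standard guard in an AbstractProcessor (relationship name is an example):

```java
@Override
public void onTrigger(final ProcessContext context, final ProcessSession session)
        throws ProcessException {
    // session.get() may return null: onTrigger can fire with nothing to pull yet.
    final FlowFile flowFile = session.get();
    if (flowFile == null) {
        return; // yield and let the framework schedule us again
    }
    // ... processor logic on flowFile ...
    session.transfer(flowFile, REL_SUCCESS);
}
```

With a breakpoint on the first line, hitting the null branch on the first invocation is normal; the flow file is usually delivered on a subsequent trigger.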

Defining Apache Avro Schema fullname in Apache NiFi

Submitted by 我的梦境 on 2019-12-24 07:15:04
Question: Using NiFi 1.7.1 (which uses Java Avro 1.8.1) and the AvroSchemaRegistry, I'm trying to define a schema which has the fields name and app.name at the top level. According to the Avro docs[1], I assumed I could just define the fullname as normal, "name": "app.name", but I hit the error Illegal character in: app.name . It's true that the name portion of the fullname does not allow dots, but according to the docs: "If the name specified contains a dot, then it is assumed to be a
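The subtlety here is that the fullname shorthand in the Avro spec applies to named types (records, enums, fixed), not to field names, which must match [A-Za-z_][A-Za-z0-9_]*. A sketch illustrating where a dot is legal and where it is not (the record name app.Event and the renamed field are made-up examples):

```json
{
  "type": "record",
  "name": "app.Event",
  "doc": "Records are named types, so a dotted fullname is legal here (namespace app, name Event).",
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "app_name", "type": "string",
      "doc": "Field names cannot contain dots, so app.name has to be renamed." }
  ]
}
```

If the dotted key must survive end to end, it has to be carried outside the field name itself, for example by mapping it back during serialization in the consuming code.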