Question
I have produced the following data on a topic named SENSOR_STATUS_DETAILS in JSON:
1001
{
"sensorid": 1001,
"status": "CONNECTED",
"lastconnectedtime": "2020-05-31 22:31:54"
}
1002
{
"sensorid": 1002,
"status": "CONNECTED",
"lastconnectedtime": "2020-05-31 22:33:37"
}
I am trying to create a table from it as follows:
CREATE TABLE STATUS_IB_TABLE (ROWKEY INT KEY,
sensorid INTEGER,
status VARCHAR,
lastconnectedtime STRING)
WITH (TIMESTAMP='lastconnectedtime', TIMESTAMP_FORMAT='yyyy-MM-dd HH:mm:ss', KAFKA_TOPIC='SENSOR_STATUS_DETAILS', VALUE_FORMAT='JSON', KEY='sensorid');
The ROWKEY that ksqlDB generates is not what I expect. I want the ROWKEY to be sensorid, but I don't know what's happening.
Please help me with this.
Thanks in advance!
PS:
Confluent Platform version: 5.5
Answer 1:
The issue here is that the data being produced to the Kafka topic SENSOR_STATUS_DETAILS has a STRING key, not an INT key.
If you take the STRING "1001", it just happens to serialize to the same number of bytes (four) as an INT. If you deserialize those same bytes as an INT, you get the number 825241649.
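You can verify this byte-level mismatch yourself. A minimal Python sketch, assuming the STRING key is UTF-8 encoded and the INT key is read as a 4-byte big-endian integer (which is how Kafka's standard string and integer serializers behave):

```python
import struct

# The producer wrote the key as the STRING "1001" (UTF-8 -> 4 bytes).
string_key_bytes = "1001".encode("utf-8")
print(string_key_bytes)  # b'1001', i.e. bytes 0x31 0x30 0x30 0x31

# ksqlDB, told the key is an INT, reads those same 4 bytes as a
# big-endian 32-bit integer.
(as_int,) = struct.unpack(">i", string_key_bytes)
print(as_int)  # 825241649 -- the mystery ROWKEY

# A correctly serialized INT key for 1001 looks quite different:
int_key_bytes = struct.pack(">i", 1001)
print(int_key_bytes)  # b'\x00\x00\x03\xe9'
```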
You have two options:
- Change how you are producing the data so that the Kafka message's key is a serialized 32-bit integer, and continue to import it as ROWKEY INT KEY, or
- Change your CREATE TABLE statement to have a STRING key:

  CREATE TABLE STATUS_IB_TABLE (
      ROWKEY STRING KEY,        -- <-- string key
      sensorid STRING,          -- <-- matching type here
      status VARCHAR,
      lastconnectedtime STRING
  ) WITH (<as above>);
The first option would likely result in slightly better performance.
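For the first option, the producer must emit the key as the 4-byte big-endian encoding that Kafka's IntegerSerializer (and ksqlDB's ROWKEY INT KEY) expects. A stdlib-only sketch of suitable key/value serializer functions, which you would plug into your producer client (function names are illustrative; the producer wiring itself is omitted):

```python
import json
import struct

def serialize_int_key(sensorid: int) -> bytes:
    """Encode the key the way Kafka's IntegerSerializer does:
    4 bytes, big-endian."""
    return struct.pack(">i", sensorid)

def serialize_json_value(record: dict) -> bytes:
    """Encode the value as UTF-8 JSON, matching VALUE_FORMAT='JSON'."""
    return json.dumps(record).encode("utf-8")

key = serialize_int_key(1001)
print(key)  # b'\x00\x00\x03\xe9' -- NOT the string bytes b'1001'

value = serialize_json_value({
    "sensorid": 1001,
    "status": "CONNECTED",
    "lastconnectedtime": "2020-05-31 22:31:54",
})
```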
Source: https://stackoverflow.com/questions/62120707/ksqldb-not-taking-rowkey-properly