Question
We have a data flow pipeline where logs arrive from a websocket endpoint and need to be pushed to Splunk after some simple data enrichment (password masking, etc.).
I was checking whether Kafka can be used for this, because the volumes are really high. So the possible flow is:
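For illustration, "password masking" could be as simple as a regex rewrite over each log line before it is produced to Kafka. This is a hypothetical sketch: the field name `password` and the JSON-style `"key": "value"` format are assumptions about the log shape, not something given in the question.

```java
import java.util.regex.Pattern;

// Hypothetical enrichment step: blank out values of "password"-like JSON fields.
final class LogMasker {
    // Matches "password": "<anything>"; the field name is an assumption --
    // adjust the pattern to the actual log format.
    private static final Pattern PASSWORD_FIELD =
            Pattern.compile("(\"password\"\\s*:\\s*\")[^\"]*(\")", Pattern.CASE_INSENSITIVE);

    static String mask(String logLine) {
        // Keep the field name and quotes (groups 1 and 2), replace only the value.
        return PASSWORD_FIELD.matcher(logLine).replaceAll("$1****$2");
    }
}
```

A transform like this can run in the adapter application before producing to the topic, or as a Kafka Connect Single Message Transform if the ingestion side ends up as a Connect source.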
Websocket Endpoint ---------some-wss-connector--------> Kafka Topic --------splunk-connector----------> Splunk
I found the connector for pushing to Splunk at https://github.com/splunk/kafka-connect-splunk, and it works well.
I need help with the other connector (`some-wss-connector` above), which should read from websocket endpoints and push to a Kafka topic. Has anyone worked on something similar? Please advise.
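For reference, a minimal sink configuration for that Splunk connector looks roughly like the following. The property names are taken from the connector's README as I recall them, and the URI, token, and topic name are placeholders; check the repository's documentation for the authoritative list.

```json
{
  "name": "splunk-sink",
  "config": {
    "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
    "tasks.max": "3",
    "topics": "logs",
    "splunk.hec.uri": "https://splunk-host:8088",
    "splunk.hec.token": "<HEC token>"
  }
}
```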
NOTE: I've looked at the WebSocket to Kafka Topic Scala API, but that uses Akka (with Scala), and I'm afraid Akka/Scala is not part of our enterprise stack at the moment.
Thanks in advance. Vinay
Answer 1:
Well, I guess you'll need some sort of adapter application (whether in the form of Kafka Connect or not). My idea would be to write a small Spring Boot application that implements a websocket client; see this link for an idea of how (scroll down a bit): https://www.sitepoint.com/implementing-spring-websocket-server-and-client/
Then you can push the messages received via websocket to Apache Kafka using Spring Kafka, which works well in my experience: https://spring.io/projects/spring-kafka
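A minimal sketch of such an adapter, using the JDK's built-in `java.net.http.WebSocket` client (Java 11+) rather than Spring's, to keep it self-contained. The endpoint URL and topic name are placeholders, and the Kafka side is abstracted behind a small `TopicSink` interface: a real implementation would delegate to Spring Kafka's `KafkaTemplate.send(topic, value)` or a plain `KafkaProducer`.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

// Abstraction over the Kafka producer so the websocket handler stays unit-testable;
// a real implementation would call KafkaTemplate.send(topic, value) (spring-kafka).
interface TopicSink {
    void send(String topic, String value);
}

// Forwards every complete websocket text message to a Kafka topic.
final class WsToKafkaListener implements WebSocket.Listener {
    private final TopicSink sink;
    private final String topic;
    private final StringBuilder buffer = new StringBuilder();

    WsToKafkaListener(TopicSink sink, String topic) {
        this.sink = sink;
        this.topic = topic;
    }

    @Override
    public CompletionStage<?> onText(WebSocket ws, CharSequence data, boolean last) {
        buffer.append(data);                  // text may arrive in partial frames
        if (last) {                           // message complete: forward and reset
            sink.send(topic, buffer.toString());
            buffer.setLength(0);
        }
        if (ws != null) {                     // null-guard keeps the handler unit-testable
            ws.request(1);                    // ask the socket for the next frame
        }
        return null;
    }
}

final class WsBridge {
    // Opens the websocket and wires in the forwarding listener;
    // the endpoint URL is a placeholder for the real wss:// address.
    static WebSocket connect(String url, TopicSink sink, String topic) {
        return HttpClient.newHttpClient()
                .newWebSocketBuilder()
                .buildAsync(URI.create(url), new WsToKafkaListener(sink, topic))
                .join();
    }
}
```

In a Spring Boot application the `TopicSink` would simply be a bean wrapping an injected `KafkaTemplate`, and `connect(...)` would be called on startup.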
For serialization from the Spring Boot app to Kafka you can use JSON; however, I recommend the full bells and whistles: Avro with a schema registry: https://docs.confluent.io/current/schema-registry/index.html
The main drawback of this idea is that you get another application you need to implement, maintain, deploy, and scale.
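As a rough illustration, the producer side of that choice comes down to a few properties in the Spring Boot configuration. The JSON variant uses spring-kafka's `JsonSerializer`; the Avro variant swaps in Confluent's `KafkaAvroSerializer` and points it at the registry. Broker and registry addresses below are placeholders, and the Avro serializer requires the Confluent dependency on the classpath.

```properties
# JSON serialization (spring-kafka only)
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer

# Avro + schema registry alternative (needs the Confluent serializer dependency)
#spring.kafka.producer.value-serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
#spring.kafka.producer.properties.schema.registry.url=http://localhost:8081
```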
Hope that helps!
Source: https://stackoverflow.com/questions/57960475/kafka-connector-to-consume-data-from-websockets-and-push-to-topic