apache-storm

KafkaSpout throws NoClassDefFoundError for log4j

落爺英雄遲暮 Submitted on 2019-12-01 17:42:47
For some reason I get the following error when I try to run my topology on a Storm cluster:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.log4j.Log4jLoggerFactory
    at org.apache.log4j.Logger.getLogger(Logger.java:39)
    at kafka.utils.Logging$class.logger(Logging.scala:24)
    at kafka.consumer.SimpleConsumer.logger$lzycompute(SimpleConsumer.scala:30)
    at kafka.consumer.SimpleConsumer.logger(SimpleConsumer.scala:30)
    at kafka.utils.Logging$class.info(Logging.scala:67)
    at kafka.consumer.SimpleConsumer.info(SimpleConsumer.scala:30)
    at kafka.consumer.SimpleConsumer.liftedTree1$1
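A common cause of this particular NoClassDefFoundError is two logging implementations colliding on the classpath — for example both log4j and log4j-over-slf4j being pulled in transitively by the kafka and storm artifacts. A sketch of a Maven exclusion that keeps only one binding on the classpath; the coordinates and version here are illustrative, not taken from the poster's build:

```xml
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.10</artifactId>
  <version>0.8.2.2</version>
  <exclusions>
    <!-- keep only one log4j binding on the classpath -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Running `mvn dependency:tree` first shows which artifact actually drags in the conflicting binding, so the exclusion can be placed on the right dependency.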

Twitter storm example running in local mode cannot delete file

馋奶兔 Submitted on 2019-12-01 17:25:59
I am running the storm starter project (https://github.com/nathanmarz/storm-starter) and it throws the following error after running for a little while.

23135 [main] ERROR org.apache.zookeeper.server.NIOServerCnxn - Thread Thread[main,5,main] died
java.io.IOException: Unable to delete file: C:\Users\[user directory]\AppData\Local\Temp\a0894222-6a8a-4f80-8655-3ad6a0c10021\version-2\log.1
    at org.apache.commons.io.FileUtils.forceDelete(FileUtils.java:1390)
    at org.apache.commons.io.FileUtils.cleanDirectory(FileUtils.java:1044)
    at org.apache.commons.io.FileUtils.deleteDirectory(FileUtils.java:977
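The likely mechanism here is that the in-process ZooKeeper still holds its transaction log open (Windows refuses to delete open files) while LocalCluster tears down its temp directory. There is no clean in-topology fix, but the general pattern of a cleanup that tolerates briefly-locked files can be sketched in plain Java — no Storm APIs involved, and the retry count and delay are arbitrary:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

public class RetryDelete {
    // Attempt to delete a directory tree, retrying a few times to tolerate
    // files briefly held open by another thread (common on Windows).
    static boolean deleteWithRetries(Path dir, int attempts) throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            try (Stream<Path> walk = Files.walk(dir)) {
                walk.sorted(Comparator.reverseOrder())   // children before parents
                    .forEach(p -> {
                        try { Files.deleteIfExists(p); }
                        catch (IOException ignored) { /* still locked; retry later */ }
                    });
            } catch (IOException ignored) { /* dir vanished or walk failed; re-check below */ }
            if (!Files.exists(dir)) return true;
            Thread.sleep(200); // give the file's holder a chance to release it
        }
        return !Files.exists(dir);
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempDirectory("retry-delete-demo");
        Files.writeString(tmp.resolve("log.1"), "data");
        System.out.println(deleteWithRetries(tmp, 3));
    }
}
```

This does not stop ZooKeeper from dying mid-run, but it illustrates why a single unconditional forceDelete (as commons-io does in the trace above) fails on Windows where a retry loop would often succeed.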

How to run WordCountTopology from storm-starter in IntelliJ

半城伤御伤魂 Submitted on 2019-12-01 17:08:18
I have worked with Storm for a while already, but want to get started with development. As suggested, I am using IntelliJ (up to now, I was using Eclipse and only wrote topologies against the Java API). I was also looking at https://github.com/apache/storm/tree/master/examples/storm-starter#intellij-idea This documentation is not complete: I was not able to run anything in IntelliJ at first. I could figure out that I need to remove the scope of the storm-core dependency (in storm-starter's pom.xml). (found
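For reference, the change the poster describes — dropping the provided scope so storm-core is on the runtime classpath inside the IDE — looks roughly like this in storm-starter's pom.xml (the version property is illustrative):

```xml
<dependency>
  <groupId>org.apache.storm</groupId>
  <artifactId>storm-core</artifactId>
  <version>${storm.version}</version>
  <!-- <scope>provided</scope> removed (i.e. default compile scope) so the
       IDE run configuration can find Storm's classes when running locally;
       restore provided before building a jar to submit to a real cluster -->
</dependency>
```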

Max number of tuple replays on Storm Kafka Spout

孤者浪人 Submitted on 2019-12-01 16:09:02
We’re using Storm with the Kafka Spout. When we fail messages, we’d like to replay them, but in some cases bad data or code errors will cause messages to always fail a Bolt, so we’ll get into an infinite replay cycle. Obviously we’re fixing errors when we find them, but would like our topology to be generally fault tolerant. How can we ack() a tuple after it’s been replayed more than N times? Looking through the code for the Kafka Spout, I see that it was designed to retry with an exponential backoff timer and the comments on the PR state: "The spout does not terminate the retry cycle (it is
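One pattern for capping replays, sketched below without any Storm APIs, is to count failures per message ID in the spout's fail() handler and ack (drop) the tuple instead of re-emitting once a cap is reached. The class and method names are illustrative, not part of the storm-kafka API:

```java
import java.util.HashMap;
import java.util.Map;

// Bookkeeping for "give up after N replays": track how many times each
// tuple's message ID has failed, and report false (i.e. ack/drop instead
// of replay) once the cap is exceeded.
public class ReplayCap {
    private final int maxReplays;
    private final Map<Object, Integer> failCounts = new HashMap<>();

    public ReplayCap(int maxReplays) { this.maxReplays = maxReplays; }

    // Call from the spout's fail(msgId). Returns true if the tuple should
    // be replayed, false if it has already failed maxReplays times and
    // should be acked away.
    public boolean shouldReplay(Object msgId) {
        int fails = failCounts.merge(msgId, 1, Integer::sum);
        if (fails > maxReplays) {
            failCounts.remove(msgId); // stop tracking a dropped tuple
            return false;
        }
        return true;
    }

    // Call from the spout's ack(msgId) so successful tuples don't leak memory.
    public void onAck(Object msgId) { failCounts.remove(msgId); }
}
```

A production version would also want to bound the map's size or expire stale entries, since a spout that fails many distinct message IDs would otherwise grow the map without limit.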

How to debug Apache Storm in Local Cluster/Mode through Eclipse

痴心易碎 Submitted on 2019-12-01 13:31:27
Using the following Q&A I managed to get debugging enabled through Eclipse on an Apache Storm cluster (running locally): How to debug Apache Storm in Eclipse? My conf/storm.yaml has the following line to enable debugging on the worker nodes: worker.childopts: "-agentlib:jdwp=transport=dt_socket,address=8000,server=y,suspend=y" When I submit a topology to Storm to run (in a cluster), I can set breakpoints and view variables in my editor. But when I try to run it locally (in Local Mode), I can't seem to connect (Connection Refused) through Eclipse. I'm using storm-crawler; I submit a
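A likely explanation (an assumption, since the poster's full setup isn't shown): worker.childopts only applies to the worker JVMs that a Supervisor forks, while LocalCluster runs the whole topology inside the JVM that submitted it — so in local mode no JVM ever starts the debug agent, and the connection is refused. The usual options are to launch the main class from Eclipse in debug mode directly, or to add the agent flag to the run configuration's VM arguments, for example:

```
-agentlib:jdwp=transport=dt_socket,address=8000,server=y,suspend=n
```

Note suspend=n here (rather than the suspend=y used for workers above), so the local JVM starts immediately instead of blocking until a debugger attaches.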

Storm command fails with NoClassDefFoundError after adding jsoup as provided dependency

杀马特。学长 韩版系。学妹 Submitted on 2019-12-01 06:51:27
I'm using jsoup in my project and I've declared the dependency in my POM file. It compiles just fine and runs fine too, but only when I use the jar with all dependencies and set the scope of the dependency to compile. If I change this scope to provided, then I can still compile just fine, but not run it: it gives me a ClassNotFoundException. I have included the necessary JAR file in the classpath and also in the path variables, but I'm still facing this problem. I can get it working with the compile scope, but it's really irking me at the back of my mind why I can't get it running
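This is expected behavior for provided scope — a general Maven fact, not specific to Storm. provided puts the jar on the compile classpath only, on the assumption that the runtime container will supply it; since a Storm worker does not ship jsoup, the class is missing at run time. Keeping the default (compile) scope lets the jar-with-dependencies or shade plugin bundle it into the topology jar; the version here is illustrative:

```xml
<dependency>
  <groupId>org.jsoup</groupId>
  <artifactId>jsoup</artifactId>
  <version>1.8.3</version>
  <!-- default (compile) scope: bundled into the shaded /
       with-dependencies topology jar, so workers can load it -->
</dependency>
```

By contrast, provided is the right scope for storm-core itself, precisely because the Storm worker JVM already has those classes on its classpath.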