Cassandra batch prepared statement size warning

依然范特西╮ submitted on 2019-12-18 09:34:48

Question


I continuously see this warning in Cassandra's debug.log:

WARN  [SharedPool-Worker-2] 2018-05-16 08:33:48,585 BatchStatement.java:287 - Batch of prepared statements for [test, test1] is of size 6419, exceeding specified threshold of 5120 by 1299.

In this message:

                    6419 - input payload size of the batch (bytes)
                    5120 - warning threshold (bytes)
                    1299 - amount by which the batch exceeds the threshold (bytes)
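The numbers in the warning are consistent with each other: the overflow is simply the payload size minus the threshold. A quick sanity check in Python (the values are taken from the log line above):

```python
# Values from the warning in debug.log
batch_size_bytes = 6419     # size of the batch of prepared statements
threshold_bytes = 5 * 1024  # batch_size_warn_threshold_in_kb default: 5 KB

overflow = batch_size_bytes - threshold_bytes
print(threshold_bytes)  # 5120
print(overflow)         # 1299
```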

Following this issue against akka-persistence-cassandra, https://github.com/krasserm/akka-persistence-cassandra/issues/33, I see that it is caused by the increased input payload size, so I increased commitlog_segment_size_in_mb in cassandra.yaml to 60 MB, and we no longer see this warning.

Is this warning harmful? Will increasing commitlog_segment_size_in_mb affect performance in any way?


Answer 1:


This is not directly related to the commit log size, and I wonder why changing it made the warning disappear...

The batch size threshold is controlled by the batch_size_warn_threshold_in_kb parameter, which defaults to 5 KB (5120 bytes).
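If larger batches are intentional, the warning threshold (and the related hard failure limit) can be raised in cassandra.yaml. The parameter names below are the real cassandra.yaml settings; the values shown are only illustrative:

```yaml
# cassandra.yaml - batch size limits (illustrative values)
batch_size_warn_threshold_in_kb: 10   # log a warning above 10 KB (default: 5)
batch_size_fail_threshold_in_kb: 100  # reject batches above 100 KB (default: 50)
```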

You can increase this parameter to a higher value, but you really need a good reason to use batches in the first place - it would be helpful to understand the context in which you use them...
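A common alternative to raising the threshold is to split a large logical batch into several smaller ones on the client side. A minimal sketch of size-based chunking in plain Python (the 5120-byte limit mirrors the warning above; the per-statement sizes are assumed estimates, not something measured by a driver):

```python
def chunk_by_size(statements, max_bytes=5120):
    """Group (statement, size_in_bytes) pairs into batches whose
    total estimated size stays at or under max_bytes."""
    batches, current, current_size = [], [], 0
    for stmt, size in statements:
        # Flush the current batch before it would exceed the limit
        if current and current_size + size > max_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(stmt)
        current_size += size
    if current:
        batches.append(current)
    return batches

# Example: ten ~900-byte statements -> two batches of five (5 * 900 = 4500 < 5120)
stmts = [(f"INSERT-{i}", 900) for i in range(10)]
print([len(b) for b in chunk_by_size(stmts)])  # [5, 5]
```

Each resulting chunk could then be sent as its own batch (or as individual statements), keeping every request under the warning threshold.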




Answer 2:


commitlog_segment_size_in_mb sets the block size of commit log segments, which are also the unit used for commit log archiving or point-in-time backup. Archiving is only active if you have configured archive_command or restore_command in your commitlog_archiving.properties file. The default size is 32 MB.

As per Expert Apache Cassandra Administration book:

you must ensure that the value of commitlog_segment_size_in_mb is at least twice the value of max_mutation_size_in_kb (after converting to the same units).
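The default values illustrate that relationship: with the default commitlog_segment_size_in_mb of 32, max_mutation_size_in_kb defaults to half the segment size, which works out to exactly the 16777216-byte limit quoted in the error message below (plain arithmetic only, no Cassandra API involved):

```python
# Default commit log segment size from cassandra.yaml
commitlog_segment_size_in_mb = 32

# max_mutation_size_in_kb defaults to half the segment size
max_mutation_size_in_kb = commitlog_segment_size_in_mb * 1024 // 2

print(max_mutation_size_in_kb)         # 16384 KB
print(max_mutation_size_in_kb * 1024)  # 16777216 bytes - the limit in the error
```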

For reference, exceeding that limit produces an error like this:

Mutation of 17076203 bytes is too large for the maxiumum size of 16777216



Source: https://stackoverflow.com/questions/50385262/cassandra-batch-prepared-statement-size-warning
