Netty ByteBuf doesn't get the full message of a large input when decoding

Submitted by 无人久伴 on 2020-01-03 05:35:07

Question


There is a legacy application which does JSON over TCP.

I use a Netty server, and I want to convert the JSON into a POJO via Gson.

To do this, I created a decoder which takes a ByteBuf as input and creates my object.

The problem is that the ByteBuf has a size of 1024 bytes, and the JSON is much bigger than that (>20 MB).

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.List;

import com.google.common.base.Charsets; // assumed: Guava's Charsets for Charsets.UTF_8
import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;

import io.netty.buffer.ByteBuf;
import io.netty.buffer.ByteBufInputStream;
import io.netty.channel.ChannelHandler.Sharable;
import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.MessageToMessageDecoder;

@Sharable
public class RequestDecoder extends MessageToMessageDecoder<ByteBuf> {

    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf msg, List<Object> out) throws Exception {
        // Wrap the ByteBuf as a stream and let Gson parse it.
        InputStream json = new ByteBufInputStream(msg);
        InputStreamReader inputStreamReader = new InputStreamReader(json, Charsets.UTF_8);
        JsonReader reader = new JsonReader(inputStreamReader);
        reader.setLenient(true);
        Request object = new Gson().fromJson(reader, Request.class);
        try {
            reader.close();
        } catch (IOException e) {
            throw new RuntimeException("uncloseable reader", e);
        }
        out.add(object);
    }
}

I always get a:

com.google.gson.stream.MalformedJsonException: Unterminated string at line 1 column 1020

and when I debug, I only see the first 1024 bytes in the "ByteBuf msg".

How can I set a ByteBuf's capacity to its max capacity? (When debugging, I see that the ByteBuf is a SimpleLeakAwareByteBuf with a maxCapacity of 2147483647.)

How can I get rid of this limitation?


Answer 1:


Try using a different decoder. For instance, try ReplayingDecoder.

There are a number of decoders available in Netty. You will have to choose the most suitable one and implement it according to your requirements.

When using the ReplayingDecoder, make sure you know exactly how many bytes you want to read. Usually, the number of bytes to read is sent at the beginning of the network packet. This really depends on your server and client protocol implementation.
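
A minimal sketch of that approach, assuming a hypothetical protocol where each JSON payload is preceded by a 4-byte length field (the class name and the length prefix are illustrative assumptions, not part of the question's legacy protocol):

import java.nio.charset.StandardCharsets;
import java.util.List;

import com.google.gson.Gson;

import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.handler.codec.ReplayingDecoder;

// Sketch only: assumes the peer writes a 4-byte length prefix before every JSON payload.
public class LengthPrefixedRequestDecoder extends ReplayingDecoder<Void> {

    private final Gson gson = new Gson();

    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) {
        // ReplayingDecoder replays this method until enough bytes have arrived,
        // so the length field and the body can be read as if fully buffered.
        int length = in.readInt();
        byte[] body = new byte[length];
        in.readBytes(body);

        String json = new String(body, StandardCharsets.UTF_8);
        out.add(gson.fromJson(json, Request.class));
    }
}

With ReplayingDecoder, a read past the currently available bytes makes Netty retry the decode later once more data has arrived, so no manual buffering is needed.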




Answer 2:


I added a DelimiterBasedFrameDecoder at the beginning of the pipeline:

pipeline.addLast("framer", new DelimiterBasedFrameDecoder(Integer.MAX_VALUE, Unpooled.wrappedBuffer(new byte[]{'E', 'O', 'F', '\n'})));

I did this because I discovered that all the JSON messages are terminated with "EOF\n" (due to legacy quirks).
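
For context, a minimal sketch of how that framer can sit in front of the RequestDecoder from the question (the ChannelInitializer subclass name is made up for illustration):

import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.DelimiterBasedFrameDecoder;

// Illustrative initializer name; only the pipeline wiring matters here.
public class RequestChannelInitializer extends ChannelInitializer<SocketChannel> {

    @Override
    protected void initChannel(SocketChannel ch) {
        ChannelPipeline pipeline = ch.pipeline();
        // Buffers incoming bytes until the "EOF\n" delimiter is seen,
        // then emits one complete frame downstream.
        pipeline.addLast("framer", new DelimiterBasedFrameDecoder(
                Integer.MAX_VALUE,
                Unpooled.wrappedBuffer(new byte[]{'E', 'O', 'F', '\n'})));
        // Now receives whole frames, so Gson sees the complete JSON document.
        pipeline.addLast("decoder", new RequestDecoder());
    }
}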

That's ok for me now, but what will happen when this legacy issue is resolved?



Source: https://stackoverflow.com/questions/22563167/netty-bytebuf-dont-get-the-full-message-of-a-large-input-when-decoding
