Reading nested JSON in Google Dataflow / Apache Beam


Question


It is possible to read unnested JSON files on Cloud Storage with Dataflow via:

p.apply("read logfiles", TextIO.Read.from("gs://bucket/*").withCoder(TableRowJsonCoder.of()));

If I just want to write those logs with minimal filtering to BigQuery, I can do so by using a DoFn like this one:

private static class Formatter extends DoFn<TableRow,TableRow> {

        @Override
        public void processElement(ProcessContext c) throws Exception {

            // .clone() since input is immutable
            TableRow output = c.element().clone();

            // remove misleading timestamp field
            output.remove("@timestamp");

            // set timestamp field by using the element's timestamp
            output.set("timestamp", c.timestamp().toString());

            c.output(output);
        }
    }
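For context, a minimal sketch (not from the original question) of how such a Formatter might be wired into the pipeline and written out, using the same Dataflow 1.x style as the snippets above; the table spec and schema here are placeholders, and imports are omitted as in the other snippets:

    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("timestamp").setType("TIMESTAMP"),
        new TableFieldSchema().setName("message").setType("STRING")));

    p.apply("read logfiles",
            TextIO.Read.from("gs://bucket/*").withCoder(TableRowJsonCoder.of()))
     .apply("format", ParDo.of(new Formatter()))
     .apply(BigQueryIO.Write
         .to("my-project:my_dataset.my_table") // placeholder table spec
         .withSchema(schema)
         .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));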

However, I don't know how to access nested fields in the JSON file this way.

  1. If the TableRow contains a RECORD named r, is it possible to access its keys/values without further serialization/deserialization? (A possible approach is sketched just after this list.)
  2. If I need to serialize/deserialize myself with the Jackson library, does it make more sense to use the standard Coder of TextIO.Read instead of TableRowJsonCoder, to gain back some of the performance that I lose this way?
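Regarding the first question: a TableRow is a Map<String, Object>, so when the element was decoded with TableRowJsonCoder the nested record should come back as a nested Map and can be read without another full parse. This is a rough sketch based on that assumption (field names taken from the sample lines in the EDIT below), so verify it against your actual elements:

    @SuppressWarnings("unchecked")
    private static String nestedWhere(TableRow row) {
        // Assumes the nested "r" object was decoded into a Map by TableRowJsonCoder.
        Map<String, Object> r = (Map<String, Object>) row.get("r");
        if (r == null) {
            return null;
        }
        Map<String, Object> query = (Map<String, Object>) r.get("query");
        return query == null ? null : (String) query.get("where");
    }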

EDIT

The files are newline-delimited and look something like this:

{"@timestamp":"2015-x", "message":"bla", "r":{"analyzed":"blub", "query": {"where":"9999"}}}
{"@timestamp":"2015-x", "message":"blub", "r":{"analyzed":"bla", "query": {"where":"1111"}}}

Answer 1:


Your best bet is probably to do what you described in #2 and use Jackson directly. It makes the most sense to let the TextIO read do what it is built for -- reading lines from a file with the string coder -- and then use a DoFn to actually parse the elements. Something like the following:

PCollection<String> lines = pipeline
  .apply(TextIO.Read.from("gs://bucket/..."));
PCollection<TableRow> objects = lines
  .apply(ParDo.of(new DoFn<String, TableRow>() {
    @Override
    public void processElement(ProcessContext c) {
      String json = c.element();
      SomeObject object = /* parse json using Jackson, etc. */;
      TableRow row = /* create a table row from object */;
      c.output(row);
    }
  }));

Note that you could also do this using multiple ParDos.
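To make the placeholder comments concrete, here is one possible way to fill them in with Jackson (field names follow the sample lines from the question; the helper and its names are illustrative, not from the original answer):

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @SuppressWarnings("unchecked")
    static TableRow toTableRow(String json, String timestamp) throws IOException {
        // Parse the raw JSON line into a generic map.
        Map<String, Object> parsed = MAPPER.readValue(json, Map.class);

        Map<String, Object> r = (Map<String, Object>) parsed.get("r");
        Map<String, Object> query =
            r == null ? null : (Map<String, Object>) r.get("query");

        // Rebuild the nested structure as TableRows so BigQuery sees a RECORD.
        return new TableRow()
            .set("timestamp", timestamp)
            .set("message", parsed.get("message"))
            .set("r", new TableRow()
                .set("analyzed", r == null ? null : r.get("analyzed"))
                .set("query", new TableRow()
                    .set("where", query == null ? null : query.get("where"))));
    }

Inside the DoFn above, c.output(toTableRow(c.element(), c.timestamp().toString())); would then replace the three placeholder lines.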



Source: https://stackoverflow.com/questions/41984229/reading-nested-json-in-google-dataflow-apache-beam
