How to filter datastore data before mapping to cloud storage using the MapReduce API?
Regarding the code lab here, how can we filter datastore data within the MapReduce job itself, rather than fetching all entities of a given kind? In the mapper pipeline definition below, the only input reader parameter is the entity kind to process, and I can't see any filter-type parameter in the `DatastoreInputReader` class that could help.

```python
output = yield mapreduce_pipeline.MapperPipeline(
    "Datastore Mapper %s" % entity_type,
    "main.datastore_map",
    "mapreduce.input_readers.DatastoreInputReader",
    output_writer_spec="mapreduce.output_writers.FileOutputWriter",
    params={
        "input_reader": {
            "entity_kind": entity_type,
        },
        # ... (remaining params truncated in the original post)
    })
```
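For now the only workaround I see is filtering inside the map function itself, which still reads every entity of the kind from the datastore. Here is a minimal, self-contained sketch of that idea (plain Python, no App Engine APIs; the `Entity` class and the `status` property are hypothetical stand-ins for my real kind):

```python
class Entity(object):
    """Stand-in for a datastore entity of the kind being mapped."""
    def __init__(self, key, status):
        self.key = key
        self.status = status


def datastore_map(entity):
    """Mapper: emit an output line only for entities matching the filter.

    Non-matching entities are dropped here, but they were still read
    from the datastore, which is what I'd like to avoid.
    """
    if entity.status != "active":
        return  # filtered out in the mapper, nothing emitted
    yield "%s\n" % entity.key


# Simulate the input reader feeding every entity of the kind to the mapper.
entities = [Entity("a", "active"), Entity("b", "deleted"), Entity("c", "active")]
lines = [line for e in entities for line in datastore_map(e)]
# lines == ["a\n", "c\n"]
```

This works, but it wastes datastore reads on entities that are immediately discarded, so I'd prefer a way to push the filter down into the input reader.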