Running periodic Dataflow job
Question: I have to join data from Google Datastore and Google BigTable to produce a report, and I need to execute that operation every minute. Is this possible with Google Cloud Dataflow (assuming the processing itself does not take long and/or can be split into independent parallel jobs)?

Should I have an endless loop inside `main` that creates and executes the same pipeline again and again? If most of the time in such a scenario is spent bringing up the VMs, is it possible to instruct the Dataflow service to reuse them?
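To illustrate the endless-loop idea, here is a minimal sketch of a scheduler that re-runs a pipeline once per interval. The `run_pipeline` function is a hypothetical placeholder: in a real job it would build the Beam pipeline joining the Datastore and BigTable data and call `pipeline.run()`; nothing Dataflow-specific is shown here.

```python
import time

def run_pipeline():
    # Hypothetical placeholder: a real implementation would construct
    # the Beam pipeline (Datastore + BigTable join) and run it.
    return "done"

def run_every(interval_seconds, iterations, run_fn=run_pipeline):
    """Run the pipeline repeatedly, sleeping out the remainder of each interval."""
    results = []
    for _ in range(iterations):
        start = time.monotonic()
        results.append(run_fn())
        elapsed = time.monotonic() - start
        # Sleep only for whatever is left of this interval, if anything.
        time.sleep(max(0.0, interval_seconds - elapsed))
    return results
```

For a once-a-minute job, this would be called as `run_every(60, iterations)`; the sleep accounts for the pipeline's own runtime so the start-to-start cadence stays close to one minute.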