We have an architecture using 2 pubsub topic/subscription pairs: T1 is triggered by a cronjob periodically (every 5
I ran into the same problem; I wanted better control over .ack(). Looking at the Node.js library from Google, it would be an option to refactor ack() to return a promise so the function can wait for ack() to complete.
Subscriber.prototype.ack_ = function(message) {
var breakLease = this.breakLease_.bind(this, message);
this.histogram.add(Date.now() - message.received);
if (this.writeToStreams_ && this.isConnected_()) {
this.acknowledge_(message.ackId, message.connectionId).then(breakLease);
return;
}
this.inventory_.ack.push(message.ackId);
this.setFlushTimeout_().then(breakLease);
};
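To illustrate, here is a minimal sketch of that refactor against a mocked-up Subscriber. The internal names (breakLease_, acknowledge_, setFlushTimeout_, inventory_) mirror the quoted snippet of that old client version and are assumptions about its internals, not the current API; the mocks exist only so the change is demonstrable in isolation.

```javascript
// Minimal mock of the Subscriber internals from the quoted snippet,
// just enough to show the refactor. All names are assumptions based on
// that (old) client version.
function Subscriber() {
  this.histogram = { add: function () {} };
  this.inventory_ = { ack: [] };
  this.writeToStreams_ = false;
}
Subscriber.prototype.isConnected_ = function () { return false; };
Subscriber.prototype.breakLease_ = function (message) { message.leaseBroken = true; };
Subscriber.prototype.setFlushTimeout_ = function () { return Promise.resolve(); };
Subscriber.prototype.acknowledge_ = function () { return Promise.resolve(); };

// The refactor: identical to the library's ack_, except the internal
// promise chain is returned instead of discarded, so a caller can
// `subscriber.ack_(msg).then(...)` to know when the ack has settled.
Subscriber.prototype.ack_ = function (message) {
  var breakLease = this.breakLease_.bind(this, message);
  this.histogram.add(Date.now() - message.received);
  if (this.writeToStreams_ && this.isConnected_()) {
    return this.acknowledge_(message.ackId, message.connectionId).then(breakLease);
  }
  this.inventory_.ack.push(message.ackId);
  return this.setFlushTimeout_().then(breakLease);
};
```

The only behavioral change is the two added `return` statements; existing callers that ignore the return value are unaffected.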
How are you triggering your functions?
According to the docs, if your function is consuming pubsub messages, then you should use the pubsub trigger. When using the pubsub trigger, the library is not needed. Simply call callback() at the end of your function, and the pubsub message will be properly acknowledged.
For what you intend to do, I don't think your current architecture is the proper option.
I would move your first step to Google App Engine with a cron task: have this task simply move messages from T2 to T1, leaving the function triggered by S1 to process the message.
So, your jobs would be published on T2, and you'd have a GAE app with a pull subscription S2, triggered by a cron task, that re-publishes the messages to T1. Then your function would be triggered by a subscription S1 to topic T1 and would run the job in the message, avoiding the overhead of importing the pubsub library and using the product as expected.
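The relay step in that GAE app is essentially "pull, re-publish, ack". As a sketch, the pure helper below extracts re-publish payloads from a pull response (in the REST shape, where receivedMessages wraps each message with its ackId); the surrounding pull/publish/acknowledge calls against S2 and T1 are described in comments only, since they require a live client and credentials.

```javascript
// Hypothetical helper for the GAE cron relay: given a pull response from
// subscription S2 (REST shape), produce the payloads to publish to T1.
// Field names follow the Pub/Sub REST message shape; everything else here
// is an assumption for illustration.
function toRepublishPayloads(pullResponse) {
  return (pullResponse.receivedMessages || []).map(function (received) {
    return {
      data: received.message.data,               // already base64 in REST shape
      attributes: received.message.attributes || {}
    };
  });
}

// In the cron handler (sketch, not runnable as-is):
// 1. pull up to N messages from subscription S2,
// 2. publish each payload from toRepublishPayloads() to topic T1,
// 3. acknowledge the pulled ackIds on S2 only after the publishes succeed.
```

Acknowledging on S2 only after the publish to T1 succeeds means a crash mid-relay results in redelivery rather than a lost job.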
Furthermore, I'm not sure how you are originally publishing your jobs to the topic, but Task Queues are a good GAE option (and product-agnostic in Alpha) for rate-limiting tasks.
A GAE app used only for this (with max instances set to 1) would stay within the always free limit, so costs would not noticeably increase.
A developer from the node.js pubsub client confirmed that using the client to pull messages from a Cloud Function is not a supported use case.
The alternative is to use the service APIs. However, the REST APIs have their own caveats when attempting to pull all messages from a subscription.