> We have an architecture using 2 pubsub topic/subscription pairs:
> T1 is triggered by a cronjob periodically (every 5 …)
How are you triggering your functions?
According to the docs, if your function is consuming Pub/Sub messages, then you should use the Pub/Sub trigger. With that trigger, the Pub/Sub client library is not needed: simply call callback() at the end of your function and the message will be acknowledged automatically.
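For illustration, here is a minimal sketch of such a Pub/Sub-triggered background function (Node.js; the function name and topic are placeholders, and the exact event shape depends on your runtime version):

```js
/**
 * Background Cloud Function triggered by Pub/Sub.
 * Deployed with, e.g.: gcloud functions deploy processJob --trigger-topic T1
 */
exports.processJob = (event, callback) => {
  // Pub/Sub message payloads arrive base64-encoded.
  const payload = Buffer.from(event.data.data, 'base64').toString();

  // ...run the actual job here...
  console.log(`Processing job: ${payload}`);

  // Calling callback() signals success, which acknowledges the message;
  // no Pub/Sub client library import is required.
  callback();
};
```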
For what you intend to do, I don't think your current architecture is the right fit.
I would move your first step to Google App Engine with a cron task, make that task simply move messages from T2 to T1, and leave the function triggered by its subscription to T1, doing nothing but processing the message.
So your jobs would be published to T2. A GAE app, triggered by a cron task, would pull those messages from a subscription S2 and re-publish them to T1. Your function would then be triggered by a subscription S1 on topic T1 and run the job carried in the message, avoiding the extra work of importing the Pub/Sub library inside the function and using the product as intended.
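As a sketch of what that GAE service could look like (Node.js with Express; the endpoint name /drain, the names S2 and T1, and the batch size are assumptions, and the exact client calls depend on your @google-cloud/pubsub version):

```js
// app.js - a minimal App Engine service whose only job is draining S2 into T1.
const express = require('express');
const {PubSub, v1} = require('@google-cloud/pubsub');

const app = express();
const pubsub = new PubSub();
const subscriber = new v1.SubscriberClient();
const PROJECT = process.env.GOOGLE_CLOUD_PROJECT; // set automatically on GAE

app.get('/drain', async (req, res) => {
  // App Engine sets this header on requests issued by cron.yaml jobs.
  if (req.get('X-Appengine-Cron') !== 'true') {
    return res.status(403).send('cron only');
  }

  // Synchronously pull a batch of queued jobs from S2.
  const subscription = subscriber.subscriptionPath(PROJECT, 'S2');
  const [response] = await subscriber.pull({subscription, maxMessages: 100});
  const received = response.receivedMessages || [];

  // Re-publish each payload to T1, where the function's subscription S1 lives.
  for (const {message} of received) {
    await pubsub.topic('T1').publishMessage({data: Buffer.from(message.data)});
  }

  // Ack only after re-publishing, so nothing is lost if a publish fails.
  if (received.length > 0) {
    await subscriber.acknowledge({
      subscription,
      ackIds: received.map((m) => m.ackId),
    });
  }

  res.status(200).send(`moved ${received.length} message(s)`);
});

app.listen(process.env.PORT || 8080);
```

The cron.yaml entry would then just hit that endpoint on whatever schedule you need, for example:

```yaml
cron:
- description: move queued jobs from S2 to T1
  url: /drain
  schedule: every 5 minutes  # example schedule, adjust to your needs
```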
Furthermore, I'm not sure how you are publishing your jobs to the topic in the first place, but Task Queues are a good GAE option for rate-limiting tasks (and a product-agnostic version is in Alpha).
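If you go that route, the rate limiting itself is just queue configuration; a sketch of a queue.yaml (the queue name and limits here are made-up examples):

```yaml
queue:
- name: job-queue              # hypothetical queue name
  rate: 10/s                   # cap on task dispatch rate
  bucket_size: 20
  max_concurrent_requests: 5   # cap on tasks running at once
```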
A GAE app used only for this (capped at a single instance) would stay within the Always Free tier, so costs would not noticeably increase.