Azure Function & Document DB


Question


I'm curious how scaling works on Azure Functions in relation to outputting to Document DB.

Basically, what happens when Document DB returns a 429 because I've exceeded my allocated throughput? I ask because when I combined the lowest tier of Azure Functions with the lowest tier of Document DB and called the function 1000 times in 20 seconds, I only saw 700-800 documents actually inserted into my collection. When I scaled Document DB up to the max while keeping the Function at the lowest tier, I again received only 700-800 documents. However, when I scaled the Function up to the max with Document DB also at the max, I got all 1000. When I dropped Document DB back down to the minimum, I only got around 300, though it does seem I've locked the Document DB account up and that it's still retrying the inserts until they succeed.
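For context, a burst like the one described (1000 calls over about 20 seconds) can be generated with a sketch like the one below. The function URL and JSON payload are placeholder assumptions, since the question doesn't say how the function was invoked.

    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    # Placeholder endpoint for an HTTP-triggered function; substitute your own.
    FUNCTION_URL = "https://<your-function-app>.azurewebsites.net/api/InsertDoc"

    def call_function(i):
        body = ('{"id": "%d"}' % i).encode("utf-8")
        req = urllib.request.Request(FUNCTION_URL, data=body,
                                     headers={"Content-Type": "application/json"})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status
        except Exception as exc:
            return str(exc)  # record failures instead of aborting the whole burst

    # Fire 1000 calls as fast as 50 worker threads allow
    # (roughly 20 seconds in the scenario described above).
    with ThreadPoolExecutor(max_workers=50) as pool:
        statuses = list(pool.map(call_function, range(1000)))

    print("completed calls:", len(statuses))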

So I'm just confused about how this scaling works, and I'd like some insight so I can better tune the various aspects of the function and the app.


Answer 1:


Yes, it does currently retry on 429, waiting the suggested amount of time as per the DocDB response. There's currently no absolute timeout, so the retries will continue until they get through (I'm double-checking right now whether this is the expected behavior).
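As an illustration of that behavior, here is a minimal sketch of a retry-until-success loop that honors the suggested wait time from a 429 response. The ThrottledError type and the create_document callable are hypothetical stand-ins, not the actual SDK types; the real logic lives in the DocumentDBUtility code linked below.

    import time

    class ThrottledError(Exception):
        """Hypothetical stand-in for a 429 ("request rate too large") response from DocumentDB."""
        def __init__(self, retry_after_ms):
            super().__init__("Request rate too large")
            self.retry_after_ms = retry_after_ms

    def insert_with_retry(create_document, doc):
        """Keep retrying an insert while DocumentDB throttles it with 429,
        sleeping for the interval the service suggests before each new attempt.
        There is no absolute timeout here, matching the behavior described above."""
        while True:
            try:
                return create_document(doc)
            except ThrottledError as err:
                # The 429 response carries a suggested back-off (x-ms-retry-after-ms header).
                time.sleep(err.retry_after_ms / 1000.0)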

In your first scenario, if you wait long enough for the throttle to be removed, do all 1000 eventually show up?

I'd like to try replicating this -- are you sticking 1000 items in a queue before enabling your function? Or calling it some other way?

The specific retry code that's running is here if you're curious: https://github.com/Azure/azure-webjobs-sdk-extensions/blob/master/src/WebJobs.Extensions.DocumentDB/DocumentDBUtility.cs#L36



Source: https://stackoverflow.com/questions/36460452/azure-function-document-db
