throttling

'ab' program freezes after lots of requests, why?

我的未来我决定 submitted on 2019-11-26 11:56:49
Question: Whenever I use 'ab' to benchmark a web server, it freezes for a while after sending lots of requests, only to continue after 20 seconds or so. Consider the following HTTP server simulator, written in Ruby:

    require 'socket'

    RESPONSE = "HTTP/1.1 200 OK\r\n" +
               "Connection: close\r\n" +
               "\r\n" +
               "\r\n"

    buffer = ""
    server = TCPServer.new("127.0.0.1", 3000)  # Create TCP server at port 3000.
    server.listen(1024)                        # Set backlog to 1024.
    while true
      client = server.accept
      client.readpartial(4096, buffer)  # Read the request.
      client.write(RESPONSE)            # Reply and close immediately.
      client.close
    end

How to throttle requests in a Web Api?

≡放荡痞女 submitted on 2019-11-26 11:53:31
Question: I'm trying to implement request throttling via the following: Best way to implement request throttling in ASP.NET MVC? I've pulled that code into my solution and decorated an API controller endpoint with the attribute:

    [Route("api/dothis/{id}")]
    [AcceptVerbs("POST")]
    [Throttle(Name = "TestThrottle",
              Message = "You must wait {n} seconds before accessing this url again.",
              Seconds = 5)]
    [Authorize]
    public HttpResponseMessage DoThis(int id) { ... }

This compiles, but the attribute's code

Simulate delayed and dropped packets on Linux

自闭症网瘾萝莉.ら submitted on 2019-11-26 10:58:26
I would like to simulate packet delay and loss for UDP and TCP on Linux to measure the performance of an application. Is there a simple way to do this?

ephemient: netem leverages functionality already built into Linux and userspace utilities to simulate networks. This is actually what Mark's answer refers to, by a different name. The examples on their homepage already show how you can achieve what you've asked for. The simplest example, emulating wide area network delays, just adds a fixed amount of delay to all packets going out of the local Ethernet:

    # tc qdisc add dev eth0 root netem delay 100ms

How do I throttle my site's API users?

半腔热情 submitted on 2019-11-26 10:06:53
Question: The legitimate users of my site occasionally hammer the server with API requests that cause undesirable results. I want to institute a limit of, say, no more than one API call every 5 seconds, or n calls per minute (I haven't figured out the exact limit yet). I could obviously log every API call in a DB and do the calculation on every request to see if they're over the limit, but all this extra overhead on EVERY request would be defeating the purpose. What are other less resource-intensive
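The n-calls-per-window check the question describes doesn't require a DB; a small in-memory structure per API key is enough. A minimal Python sketch (class and method names here are hypothetical, not taken from the question):

```python
import time
from collections import defaultdict, deque


class RateLimiter:
    """Allow at most `max_calls` per `period` seconds per key.

    A sliding-window sketch: each key keeps the timestamps of its
    recent calls, and stale ones are discarded as they age out.
    """

    def __init__(self, max_calls, period):
        self.max_calls = max_calls
        self.period = period
        self.calls = defaultdict(deque)  # key -> timestamps of recent calls

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        window = self.calls[key]
        # Drop timestamps that have aged out of the window.
        while window and now - window[0] >= self.period:
            window.popleft()
        if len(window) < self.max_calls:
            window.append(now)
            return True
        return False
```

Memory stays bounded at `max_calls` timestamps per active key, so the per-request cost is a few deque operations rather than a DB round trip.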

Best way to implement request throttling in ASP.NET MVC?

老子叫甜甜 submitted on 2019-11-26 02:23:09
We're experimenting with various ways to throttle user actions in a given time period:

- Limit question/answer posts
- Limit edits
- Limit feed retrievals

For the time being, we're using the Cache to simply insert a record of user activity; if that record still exists when the user repeats the same activity, we throttle. Using the Cache automatically gives us stale-data cleanup and sliding activity windows per user, but how it will scale could be a problem. What are some other ways of ensuring that requests/user actions can be effectively throttled (emphasis on stability)? Jarrod Dixon: Here's a generic
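The cache-insert idea, drop a short-lived marker per (user, action) and reject repeats while it lives, fits in a few lines in any language. A Python stand-in with hypothetical names (the question itself uses the ASP.NET Cache, not a plain dictionary like this):

```python
import time


class CacheThrottle:
    """Reject a repeated (user, action) within `seconds` of the last
    allowed one. A sketch of the marker-in-cache pattern; a real
    cache would also evict expired entries for you.
    """

    def __init__(self, seconds):
        self.seconds = seconds
        self.entries = {}  # (user, action) -> expiry time

    def try_action(self, user, action, now=None):
        now = time.monotonic() if now is None else now
        key = (user, action)
        expiry = self.entries.get(key)
        if expiry is not None and now < expiry:
            return False  # marker still live: throttled
        self.entries[key] = now + self.seconds  # insert a fresh marker
        return True
```

The scaling concern raised in the question applies here too: a per-process dictionary gives each web server its own window, so a shared store (e.g. a distributed cache) is the usual next step.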

Throttling asynchronous tasks

隐身守侯 submitted on 2019-11-26 00:22:25
Question: I would like to run a bunch of async tasks, with a limit on how many tasks may be pending completion at any given time. Say you have 1000 URLs, and you only want to have 50 requests open at a time; but as soon as one request completes, you open up a connection to the next URL in the list. That way, there are always exactly 50 connections open at a time, until the URL list is exhausted. I also want to utilize a given number of threads if possible. I came up with an extension method,
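The pattern being asked for, at most N tasks in flight with the next one starting as each finishes, is what a counting semaphore provides. A Python asyncio sketch of the idea (the question is about .NET, where the analogue would be SemaphoreSlim; `fetch` here is a hypothetical stand-in, not real network I/O):

```python
import asyncio


async def fetch(url):
    # Stand-in for an actual HTTP request.
    await asyncio.sleep(0)
    return url


async def run_all(urls, limit=50):
    """Run one task per URL, but allow at most `limit` in flight.

    Each task waits for a semaphore slot before starting, so as one
    finishes, the next queued URL begins immediately.
    """
    sem = asyncio.Semaphore(limit)

    async def bounded(url):
        async with sem:
            return await fetch(url)

    # gather preserves input order in its results.
    return await asyncio.gather(*(bounded(u) for u in urls))
```

Because all tasks are created up front and only gated by the semaphore, no manual bookkeeping of "which URL is next" is needed.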