How to configure concurrency in .NET Core Web API?

Submitted by 独自空忆成欢 on 2019-11-26 21:01:09

Question


In the old WCF days, you had control over service concurrency via the MaxConcurrentCalls setting. MaxConcurrentCalls defaulted to 16 concurrent calls, but you could raise or lower that value based on your needs.
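
For context, here is a minimal sketch of the WCF-era throttle the question refers to. The service type, base address, and the value 32 are placeholders, not from the original post:

using System;
using System.ServiceModel;
using System.ServiceModel.Description;

class Program
{
    static void Main()
    {
        // Hypothetical service host; MyService and the base address are placeholders.
        var host = new ServiceHost(typeof(MyService), new Uri("http://localhost:8080/svc"));

        // The WCF knob mentioned above: caps the number of in-flight calls (default 16).
        host.Description.Behaviors.Add(new ServiceThrottlingBehavior
        {
            MaxConcurrentCalls = 32
        });

        host.Open();
        Console.WriteLine("Listening. Press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}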

How do you control server side concurrency in .NET Core Web API? We probably need to limit it in our case as too many concurrent requests can impede overall server performance.


Answer 1:


ASP.NET Core application concurrency is handled by its web server. For example:

Kestrel

var host = new WebHostBuilder()
    .UseKestrel(options => options.ThreadCount = 8) // number of libuv I/O threads
    .UseStartup<Startup>()
    .Build();

It is not recommended to set the Kestrel thread count to a large value such as 1K, because Kestrel's request processing is asynchronous and does not need one thread per request.

More info: Is Kestrel using a single thread for processing requests like Node.js?

A new Limits property was introduced in ASP.NET Core 2.0 Preview 2.

You can now add limits for the following:

  1. Maximum Client Connections
  2. Maximum Request Body Size
  3. Maximum Request Body Data Rate

For example:

.UseKestrel(options =>
{
    options.Limits.MaxConcurrentConnections = 100;
})
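
As a fuller illustration, here is a minimal self-contained sketch (assuming ASP.NET Core 2.0+; the Startup class and the specific limit values are placeholders) that wires the three limits listed above into Kestrel:

using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Server.Kestrel.Core;

public class Program
{
    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel(options =>
            {
                // 1. Maximum client connections.
                options.Limits.MaxConcurrentConnections = 100;

                // 2. Maximum request body size (10 MB here).
                options.Limits.MaxRequestBodySize = 10 * 1024 * 1024;

                // 3. Request body data rate: clients sending slower than
                //    100 bytes/s (after a 10-second grace period) are dropped.
                options.Limits.MinRequestBodyDataRate =
                    new MinDataRate(bytesPerSecond: 100, gracePeriod: TimeSpan.FromSeconds(10));
            })
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }
}

Note that the body data rate is expressed as a minimum acceptable rate (MinRequestBodyDataRate): requests that fall below it are rejected.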

IIS

When Kestrel runs behind a reverse proxy, you can tune the proxy itself. For example, you could configure the IIS application pool in web.config or in aspnet.config:

<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>

Of course Nginx and Apache have their own concurrency settings.



Source: https://stackoverflow.com/questions/44391268/how-to-configure-concurrency-in-net-core-web-api
