Using Parallel Processing in C# to test a site's ability to withstand a DDOS

Submitted by 纵饮孤独 on 2021-02-11 07:14:14

Question


I have a website, and I am also exploring Parallel Processing in C#, so I thought it would be a good idea to see if I could write my own DDOS test script to see how the site would handle a DDOS attack. However, when I run it, there only seem to be 13 threads in use, and they always return 200 status codes; nothing suggests the responses are anything but quick and accurate, and when I visit the site and refresh while the script is running, the site loads quickly.

I know there are tools out there for penetration tests and so on, but I was just wondering why I couldn't use a Parallel loop to make enough concurrent HTTP requests to a site that it would struggle to load quickly and return a response. I seem to get more problems from a Twitter rush, where just tweeting out a link to a new page on the site sends hundreds of bots rushing to it concurrently to rip, scan, and check it, than from anything I can throw at it using a Parallel loop.

Is there something I am doing wrong that limits the number of concurrent threads, or is this something I cannot control? I could just throw numerous long-winded search queries at it that I know would scan the whole DB and return 0 results for each request, as I have seen this in action: depending on the size of the data to be scanned and the complexity of the search query, it can cause CPU spikes and slow page loads.

So, without a lecture on using other tools: is there a way to throw 100+ parallel requests at a page, rather than the maximum of 13 threads, which the site handles perfectly?

Here is the code; the URL and the number of HTTP requests to make are passed in as command-line parameters.

// Requires: using System; using System.Linq; using System.Net;
//           using System.Threading; using System.Threading.Tasks;
static void Attack(string url, int limit)
{
    Console.WriteLine("IN Attack = {0}, requests = {1}", url, limit);
    try
    {
        Parallel.For(0, limit, i =>
        {
            HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
            webRequest.ServicePoint.ConnectionLimit = limit;

            // Dispose the response so its connection is returned to the pool;
            // otherwise later requests queue up behind the connection limit
            using (HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse())
            {
                int statuscode = Convert.ToInt32(webResponse.StatusCode);

                Console.WriteLine("iteration {0} on thread {1} Status {2}", i,
                                    Thread.CurrentThread.ManagedThreadId, statuscode);
            }
        });
    }
    catch (AggregateException exc)
    {
        exc.InnerExceptions.ToList().ForEach(e =>
        {
            Console.WriteLine(e.Message);
        });
    }
    catch (Exception ex)
    {
        Console.WriteLine("In Exception: " + ex.Message);
    }
    finally
    {
        Console.WriteLine("All finished");
    }
}

Answer 1:


You can try something like this. Note that awaiting each request inside the loop sends them one at a time; start all the tasks first and await them together so they actually run concurrently:

        var socketsHandler = new SocketsHttpHandler
        {
            PooledConnectionLifetime = TimeSpan.FromSeconds(1),
            PooledConnectionIdleTimeout = TimeSpan.FromSeconds(1),
            MaxConnectionsPerServer = 10
        };

        var client = new HttpClient(socketsHandler);

        // Start all the requests, then await them together
        var tasks = Enumerable.Range(0, limit).Select(_ => client.GetAsync(url));
        await Task.WhenAll(tasks);
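If you want more requests in flight than the connection pool comfortably allows, one common pattern is to throttle concurrency with a SemaphoreSlim. This is only a sketch, not part of the original answer; the command-line handling and the cap of 100 concurrent requests are assumptions:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class LoadTest
{
    static async Task Main(string[] args)
    {
        string url = args[0];
        int limit = int.Parse(args[1]);
        int maxInFlight = 100; // assumed cap on concurrent requests

        var handler = new SocketsHttpHandler { MaxConnectionsPerServer = maxInFlight };
        var client = new HttpClient(handler);
        var gate = new SemaphoreSlim(maxInFlight);

        var tasks = Enumerable.Range(0, limit).Select(async i =>
        {
            await gate.WaitAsync(); // waits when maxInFlight requests are already in flight
            try
            {
                using var response = await client.GetAsync(url);
                Console.WriteLine("request {0} status {1}", i, (int)response.StatusCode);
            }
            finally
            {
                gate.Release();
            }
        });

        await Task.WhenAll(tasks);
    }
}
```

Unlike Parallel.For, this never blocks a thread while a request is in flight, so the semaphore count, not the ThreadPool, decides how many requests run at once.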



Answer 2:


The Parallel.For method uses threads from the ThreadPool. The initial number of threads in the pool is usually small (comparable to the number of logical processors on the machine). When the pool is starved, new threads are injected at a rate of about one every 500 ms. The easy way to solve your problem is simply to increase the number of threads that are created immediately on demand, using the SetMinThreads method:

ThreadPool.SetMinThreads(1000, 10);

This is not scalable though, because each thread allocates 1 MB of memory for its stack, so you can't have millions of them. The scalable solution is to go async, which makes minimal use of threads.
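For illustration, an async version of the question's Attack method might look something like this. It is a sketch, not part of the original answer: it keeps the same url/limit parameters but swaps HttpWebRequest for HttpClient and uses Task.WhenAll, so no thread is blocked while a request is in flight:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncAttack
{
    // Issues `limit` GET requests concurrently on a handful of
    // ThreadPool threads, instead of one blocked thread per request.
    static async Task AttackAsync(string url, int limit)
    {
        using var client = new HttpClient();

        var tasks = Enumerable.Range(0, limit)
            .Select(_ => client.GetAsync(url))
            .ToList();

        var responses = await Task.WhenAll(tasks);

        foreach (var response in responses)
            Console.WriteLine("Status {0}", (int)response.StatusCode);
    }
}
```

With this shape, the concurrency is bounded by HttpClient's connection limit rather than by how fast the ThreadPool injects threads.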



Source: https://stackoverflow.com/questions/61398458/using-parallel-processing-in-c-sharp-to-test-a-sites-ability-to-withstand-a-ddo
