I'm trying to implement a limited web crawler in C# (for a few hundred sites only) using HttpWebResponse.GetResponse() and StreamReader.ReadToEnd(), also tried using Strea
Thank you all for the answers; they helped me dig in the proper direction. I ran into the same performance issue, but the proposed solution of changing the application config file (as I understood it, that approach is for web applications) didn't fit my needs. My solution is shown below:
HttpWebRequest webRequest = (HttpWebRequest)System.Net.WebRequest.Create(fullUrl);
webRequest.Method = WebRequestMethods.Http.Post;

if (useDefaultProxy)
{
    webRequest.Proxy = System.Net.WebRequest.DefaultWebProxy;
    webRequest.Credentials = CredentialCache.DefaultCredentials;
}
else
{
    // Clearing the default proxy skips automatic proxy detection,
    // which is what was making the requests slow here.
    System.Net.WebRequest.DefaultWebProxy = null;
    webRequest.Proxy = System.Net.WebRequest.DefaultWebProxy;
}
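For context, a minimal sketch of how a request configured this way might be sent and read back (the surrounding code here is illustrative, not from the snippet above):

// Sketch: send the request configured above and read the response body.
// Requires using System.IO and System.Net.
using (HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse())
using (StreamReader reader = new StreamReader(webResponse.GetResponseStream()))
{
    string pageContent = reader.ReadToEnd();
    // ... parse pageContent ...
}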
I found the application config method did not work for me, but the problem was still down to the proxy settings. My simple request used to take up to 30 seconds; now it takes 1 second.
public string GetWebData()
{
    string DestAddr = "http://mydestination.com";
    System.Net.WebClient myWebClient = new System.Net.WebClient();

    // An empty WebProxy means no proxy is used at all, so the slow
    // automatic proxy detection never runs for this request.
    WebProxy myProxy = new WebProxy();
    myWebClient.Proxy = myProxy;

    return myWebClient.DownloadString(DestAddr);
}
WebClient's DownloadString is a simple wrapper around HttpWebRequest. Could you try using that temporarily and see if the speed improves? If it turns out to be much faster, could you share your current code so we can have a look at what might be wrong with it?
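To make the comparison concrete, something like the following would do (a rough sketch; the URL is a placeholder and FetchWithHttpWebRequest stands in for your existing HttpWebRequest code):

// Rough timing sketch: compare WebClient.DownloadString with the existing fetch code.
string url = "http://example.com"; // placeholder URL
var sw = System.Diagnostics.Stopwatch.StartNew();
string viaWebClient;
using (var client = new System.Net.WebClient())
{
    viaWebClient = client.DownloadString(url);
}
Console.WriteLine("WebClient: {0} ms", sw.ElapsedMilliseconds);

sw = System.Diagnostics.Stopwatch.StartNew();
string viaHttpWebRequest = FetchWithHttpWebRequest(url); // placeholder for your current fetch code
Console.WriteLine("HttpWebRequest: {0} ms", sw.ElapsedMilliseconds);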
EDIT:
It seems HttpWebRequest observes IE's 'max concurrent connections' setting. Are these URLs on the same domain? You could try increasing the connection limit to see if that helps. I found this article about the problem:
By default, you can't perform more than 2-3 concurrent async HttpWebRequest calls (depending on the OS). In order to override it (the easiest way, IMHO), add this under the <configuration> section in the application's config file:
<system.net>
  <connectionManagement>
    <add address="*" maxconnection="65000" />
  </connectionManagement>
</system.net>
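If you'd rather not edit the config file, the same limit can be raised in code through ServicePointManager; this is an alternative suggestion, not part of the quoted article:

// Programmatic equivalent of the maxconnection setting above.
// Must run before the first request is created to take effect.
System.Net.ServicePointManager.DefaultConnectionLimit = 65000;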
Why wouldn't multithreading solve this issue? Multithreading would minimize the network wait times, and since you'd be storing the contents of the buffer in system memory (RAM), there would be no IO bottleneck from dealing with a filesystem. Thus, your 82 pages that take 82 seconds to download and parse should take something like 15 seconds (assuming a 4-core processor). Correct me if I'm missing something.
____ DOWNLOAD THREAD ____
  Download Contents
  Form Stream
  Read Contents
_________________________
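As a rough illustration (not code from the question), the download loop could be parallelized along these lines, assuming a list of URLs and .NET 4's Parallel.ForEach:

// Sketch: download pages on several threads so the network waits overlap.
// 'urls' is a hypothetical List<string> holding the few hundred site addresses.
var results = new System.Collections.Concurrent.ConcurrentDictionary<string, string>();
System.Threading.Tasks.Parallel.ForEach(
    urls,
    new System.Threading.Tasks.ParallelOptions { MaxDegreeOfParallelism = 8 },
    url =>
    {
        using (var client = new System.Net.WebClient())
        {
            // Download and read the contents in one call; parsing can happen here too.
            results[url] = client.DownloadString(url);
        }
    });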
HttpWebRequest may be taking a while to detect your proxy settings. Try adding this to your application config:
<system.net>
  <defaultProxy enabled="false">
    <proxy />
    <bypasslist />
    <module />
  </defaultProxy>
</system.net>
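If you can't change the config file, setting the proxy to null in code should have a similar effect (a sketch, not part of the original suggestion; 'request' stands for your HttpWebRequest instance):

// Code equivalent of <defaultProxy enabled="false">: skip proxy auto-detection.
request.Proxy = null;                          // for a single HttpWebRequest
System.Net.WebRequest.DefaultWebProxy = null;  // or globally, before any request is made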
You might also see a slight performance gain from buffering your reads to reduce the number of calls made to the underlying operating system socket:
// 'stream' here is the HTTP response stream (e.g. from GetResponseStream()).
// BufferedStream batches StreamReader's many small reads into fewer, larger
// reads against the underlying socket.
using (BufferedStream buffer = new BufferedStream(stream))
using (StreamReader reader = new StreamReader(buffer))
{
    pageContent = reader.ReadToEnd();
}
Have you tried ServicePointManager.DefaultConnectionLimit? I usually set it to 200 for things similar to this.