Question
I am trying to read a 500 MB text file and send its contents through HttpWebRequest. According to my requirements, I cannot send the data in chunks. The code is as follows:
using (StreamReader reader = new StreamReader(filename))
{
    postData = reader.ReadToEnd();
}
byte[] byteArray = Encoding.UTF8.GetBytes(postData);
request.ContentType = "text/plain";
request.ContentLength = byteArray.Length;
Stream dataStream = request.GetRequestStream();
dataStream.Write(byteArray, 0, byteArray.Length);
dataStream.Close();
WebResponse response = request.GetResponse();
Console.WriteLine(((HttpWebResponse)response).StatusDescription);
dataStream = response.GetResponseStream();
using (StreamReader reader = new StreamReader(dataStream))
{
    responseFromServer = reader.ReadToEnd();
}
Console.WriteLine(responseFromServer);
dataStream.Close();
response.Close();
Reading such a large file gives me an OutOfMemoryException. Is there a way I can do this?
Answer 1:
Sounds like you may be encountering this documented issue with HttpWebRequest. Per the KB article, try setting the HttpWebRequest.AllowWriteStreamBuffering property to false.
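A minimal sketch of how that might look, combined with streaming the file straight from disk instead of loading it into a string (assuming request and filename are the same variables as in the question):

// Sketch only: with write-stream buffering disabled, HttpWebRequest does not
// hold the whole request body in memory before sending it.
request.AllowWriteStreamBuffering = false;
request.ContentType = "text/plain";
// ContentLength (or SendChunked) must be set when buffering is disabled.
request.ContentLength = new FileInfo(filename).Length;

using (Stream dataStream = request.GetRequestStream())
using (FileStream fileStream = File.OpenRead(filename))
{
    // Copies the file to the request stream in small blocks.
    fileStream.CopyTo(dataStream);
}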
Answer 2:
All files are transferred in chunks - that's what an Ethernet packet is: a single chunk of data. I would wager that the requirement really means "this file must be transferred in a single web service call."
Assuming that's the case, you'd read the data from disk into a 64 KB buffer and then write the buffer to the request:
request.ContentType = "text/plain";
request.ContentLength = new FileInfo(filename).Length; // size of the file in bytes

const int BUFFER_SIZE = 65536;
byte[] buffer = new byte[BUFFER_SIZE];

using (Stream dataStream = request.GetRequestStream())
using (FileStream fileStream = File.OpenRead(filename))
{
    int count;
    // Copy the file to the request stream 64 KB at a time.
    while ((count = fileStream.Read(buffer, 0, BUFFER_SIZE)) > 0)
    {
        dataStream.Write(buffer, 0, count);
    }
}
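The file is read through a FileStream rather than a StreamReader because the request stream expects raw bytes, not decoded characters, and ContentLength can be taken straight from the file size on disk; that way the body never has to sit in memory all at once.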
Source: https://stackoverflow.com/questions/11640844/out-of-memory-exception-reading-large-text-file-for-httpwebrequest