I'd like to know how I can split a large file without using too many system resources. I'm currently using this code:
public static void SplitFile(string i
It seems odd to assemble each output file in memory; I suspect you should be using a small inner buffer (maybe 20k or so) and calling Write more frequently.
Ultimately, if you need IO, you need IO. If you want to be courteous to a shared hosting environment you could add deliberate pauses - maybe short pauses within the inner loop, and a longer pause (maybe 1s) in the outer loop. This won't affect your overall timing much, but may help other processes get some IO.
Here's an example using a small buffer in the inner loop:
using System;
using System.IO;
using System.Threading;

public static void SplitFile(string inputFile, int chunkSize, string path)
{
    const int BUFFER_SIZE = 20 * 1024; // 20k working buffer; only this much is held in memory
    byte[] buffer = new byte[BUFFER_SIZE];

    using (Stream input = File.OpenRead(inputFile))
    {
        int index = 0;
        while (input.Position < input.Length)
        {
            // one output file per chunk, named by its index
            using (Stream output = File.Create(Path.Combine(path, index.ToString())))
            {
                int remaining = chunkSize, bytesRead;
                // copy up to chunkSize bytes, at most BUFFER_SIZE at a time
                while (remaining > 0 && (bytesRead = input.Read(buffer, 0,
                        Math.Min(remaining, BUFFER_SIZE))) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                    remaining -= bytesRead;
                }
            }
            index++;
            Thread.Sleep(500); // experimental; perhaps try it
        }
    }
}
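
If you do want the deliberate pauses described above (a short pause in the inner loop plus a longer one between chunks), a rough sketch could look like the following. SplitFileThrottled is just a hypothetical name, and the sleep durations are arbitrary starting points to experiment with:

using System;
using System.IO;
using System.Threading;

// Hypothetical variant of the method above with deliberate pauses:
// a short pause after each buffer write and a longer one between chunks.
public static void SplitFileThrottled(string inputFile, int chunkSize, string path)
{
    const int BUFFER_SIZE = 20 * 1024;
    byte[] buffer = new byte[BUFFER_SIZE];

    using (Stream input = File.OpenRead(inputFile))
    {
        int index = 0;
        while (input.Position < input.Length)
        {
            using (Stream output = File.Create(Path.Combine(path, index.ToString())))
            {
                int remaining = chunkSize, bytesRead;
                while (remaining > 0 && (bytesRead = input.Read(buffer, 0,
                        Math.Min(remaining, BUFFER_SIZE))) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                    remaining -= bytesRead;
                    Thread.Sleep(50);   // short pause inside the inner loop
                }
            }
            index++;
            Thread.Sleep(1000);         // longer pause (~1s) between chunks
        }
    }
}

This will obviously make the split slower overall, but the process never holds anything open across a pause other than the two streams, so other processes get a chance at the disk in between writes.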