Using C#, I am finding the total size of a directory. The logic is this: get the files inside the folder and sum up their sizes; find out whether there are subdirectories, then recurse into each of them and add their sizes to the total. Is there a faster way to do this?
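Here is a minimal sketch of that logic (GetDirectorySize is just my own name for it):

using System.IO;

public static long GetDirectorySize(string path)
{
    long size = 0;
    var dir = new DirectoryInfo(path);

    // Sum the sizes of the files directly inside this folder.
    foreach (FileInfo file in dir.GetFiles())
        size += file.Length;

    // Recurse into each subdirectory and add its size to the total.
    foreach (DirectoryInfo sub in dir.GetDirectories())
        size += GetDirectorySize(sub.FullName);

    return size;
}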
The short answer is no. The way Windows could make directory size computation faster would be to update the size of a directory, and of all its parent directories, on each file write. However, that would make file writes slower. Since file writes are far more common than directory size reads, this is a reasonable tradeoff.

I am not sure what exact problem is being solved, but if it is file system monitoring, it might be worth checking out FileSystemWatcher: http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
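For example, a rough sketch of watching a folder for size-relevant changes (the path and the handlers are only illustrative; a real implementation would maintain a running total instead of printing):

using System;
using System.IO;

var watcher = new FileSystemWatcher(@"C:\SomeFolder")
{
    IncludeSubdirectories = true,
    NotifyFilter = NotifyFilters.FileName | NotifyFilters.Size,
    EnableRaisingEvents = true
};

// React to individual changes instead of rescanning the whole tree each time.
watcher.Created += (s, e) => Console.WriteLine("Created: " + e.FullPath);
watcher.Deleted += (s, e) => Console.WriteLine("Deleted: " + e.FullPath);
watcher.Changed += (s, e) => Console.WriteLine("Changed: " + e.FullPath);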
Based on the answer by spookycoder, I found this variation (using DirectoryInfo) to be at least 2 times faster, and up to 10 times faster on complex folder structures:
using System.IO;
using System.Threading;
using System.Threading.Tasks;

public static long CalcDirSize(string sourceDir, bool recurse = true)
{
    return _CalcDirSize(new DirectoryInfo(sourceDir), recurse);
}

private static long _CalcDirSize(DirectoryInfo di, bool recurse = true)
{
    long size = 0;

    // Files directly in this directory are summed sequentially, so no locking is needed here.
    FileInfo[] fiEntries = di.GetFiles();
    foreach (var fiEntry in fiEntries)
    {
        size += fiEntry.Length;
    }

    if (recurse)
    {
        DirectoryInfo[] diEntries = di.GetDirectories("*", SearchOption.TopDirectoryOnly);

        // Recurse into subdirectories in parallel. Each thread accumulates a local
        // subtotal, which is merged into the shared total once per partition.
        Parallel.For<long>(0, diEntries.Length, () => 0,
            (i, loop, subtotal) =>
            {
                // Skip reparse points (junctions, symlinks) to avoid cycles and
                // double counting; return the subtotal unchanged rather than discarding it.
                if ((diEntries[i].Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
                    return subtotal;
                subtotal += _CalcDirSize(diEntries[i], true);
                return subtotal;
            },
            (x) => Interlocked.Add(ref size, x)
        );
    }
    return size;
}
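Usage is a one-liner (the path is just a placeholder):

long totalBytes = CalcDirSize(@"C:\Some\Folder");

The speedup comes from the parallel fan-out: each level of the recursion walks its subdirectories concurrently with Parallel.For, while the thread-local subtotal keeps Interlocked contention down to a single add per partition.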