Reusing a FileStream

Submitted by 限于喜欢 on 2019-11-30 06:37:52

Your suspicion is correct - if you reset the position of an open file stream and write content that's smaller than what's already in the file, it will leave trailing data and result in a corrupt file (depending on your definition of "corrupt", of course).
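A minimal sketch of that failure mode (the file name and payloads are made up for illustration): rewinding an open stream and writing a shorter payload leaves the tail of the old contents in place.

```csharp
using System;
using System.IO;
using System.Text;

static class TrailingDataDemo
{
    // Writes a long payload, rewinds, writes a shorter one without
    // truncating, and returns what the file actually contains.
    public static string Demo()
    {
        string path = Path.GetTempFileName();
        using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            byte[] first = Encoding.UTF8.GetBytes("a longer first payload");
            stream.Write(first, 0, first.Length);

            stream.Position = 0; // rewind, but do NOT truncate
            byte[] second = Encoding.UTF8.GetBytes("short");
            stream.Write(second, 0, second.Length);
        }

        string result = File.ReadAllText(path);
        File.Delete(path);
        return result;
    }

    static void Main()
    {
        // The tail of the first write survives: "shortger first payload"
        Console.WriteLine(Demo());
    }
}
```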

If you want to overwrite the file, you really should close the stream when you're finished with it and create a new stream when you're ready to re-save.

I notice from your linked question that you are holding the file open in order to prevent other users from writing to it at the same time. This probably wouldn't be my choice, but if you are going to do that, then I think you can "clear" the file by invoking stream.SetLength(0) between successive saves.
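A sketch of that "clear between saves" approach, assuming the stream is held open for the life of the document; the Save helper and the payloads here are illustrative, not from the question:

```csharp
using System;
using System.IO;
using System.Text;

static class HeldOpenWriter
{
    // Illustrative helper: rewrite the whole file through a stream that is
    // kept open (and therefore locked against other writers) between saves.
    public static void Save(FileStream stream, byte[] content)
    {
        stream.SetLength(0);                      // discard the previous contents
        stream.Position = 0;                      // be explicit about starting at the front
        stream.Write(content, 0, content.Length);
        stream.Flush(true);                       // flush through the OS cache to disk
    }

    public static string Demo()
    {
        string path = Path.GetTempFileName();
        using (var stream = new FileStream(path, FileMode.Create,
                                           FileAccess.ReadWrite, FileShare.Read))
        {
            Save(stream, Encoding.UTF8.GetBytes("first, longer version"));
            Save(stream, Encoding.UTF8.GetBytes("second"));
        }

        string result = File.ReadAllText(path);
        File.Delete(path);
        return result;
    }

    static void Main()
    {
        // Only the latest payload remains; no trailing bytes.
        Console.WriteLine(Demo());
    }
}
```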

There are various ways to do this; if you are re-opening the file, perhaps set it to truncate:

using(var file = new FileStream(path, FileMode.Truncate)) {
    // write
}

If you are overwriting the file while already open, then just trim it after writing:

file.SetLength(file.Position); // assumes we're at the new end

I would try to avoid delete/recreate, since this loses any ACLs etc.

Another option might be to use SetLength(0) to truncate the file before you start rewriting it.

I recently ran into the same requirement. Previously, I would create a new FileStream within a using statement and overwrite the previous file, which seemed like the simple and effective thing to do.

using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
   ProtoBuf.Serializer.Serialize(stream, value);
}

However, I ran into locking issues where some other process was locking the target file. In my attempt to work around this, I retried the write several times before pushing the error up the stack.

int attempt = 0;
while (true)
{
   try
   {
      using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
      {
         ProtoBuf.Serializer.Serialize(stream, value);
      }
      break;
   }
   catch (IOException)
   {
      // could be locked by another process
      // make up to X attempts to write the file
      attempt++;
      if (attempt >= X)
      {
         throw;
      }
      Thread.Sleep(100);
   }
}

That seemed to work for almost everyone. Then a problem machine came along and forced me down the path of holding a lock on the file the entire time. So instead of retrying the write when the file is already locked, I now acquire the stream once and hold it open so there are no locking issues with later writes.

int attempt = 0;
while (true)
{
   try
   {
      _stream = new FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.Read);
      break;
   }
   catch (IOException)
   {
      // could be locked by another process
      // make up to X attempts to open the file
      attempt++;
      if (attempt >= X)
      {
         throw;
      }
      Thread.Sleep(100);
   }
}

Now when I write the file the FileStream position must be reset to zero, as Aaronaught said. I opted to "clear" the file by calling _stream.SetLength(0). Seemed like the simplest choice. Then using our serializer of choice, Marc Gravell's protobuf-net, serialize the value to the stream.

_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);

This works just fine most of the time and the file is completely written to the disk. However, on a few occasions I've observed the file not being immediately written to the disk. To ensure the stream is flushed and the file is completely written to disk I also needed to call _stream.Flush(true).

_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);
_stream.Flush(true);

Based on your question, I think you'd be better served closing and re-opening the underlying file. You don't seem to be doing anything other than writing the whole file, so the value you can add by re-implementing Open/Close/Flush/Seek yourself is next to zero. Concentrate on your business problem.
