Quickly create large file on a Windows system

别那么骄傲 2020-12-07 06:35

In the same vein as Quickly create a large file on a Linux system, I'd like to quickly create a large file on a Windows system. By large I'm thinking 5 GB.

23 answers
  • 2020-12-07 07:14

    I found an excellent utility that is configurable at https://github.com/acch/genfiles.

    It fills the target file with random data, so there are no problems with sparse files, and for my purposes (testing compression algorithms) it gives a nice level of white noise.

  • 2020-12-07 07:14

    I found a solution using DEBUG at http://www.scribd.com/doc/445750/Create-a-Huge-File, but I don't know an easy way to script it and it doesn't seem to be able to create files larger than 1 GB.
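
    That said, DEBUG does read its commands from redirected standard input, so it can be scripted after a fashion. A rough sketch, assuming the 16-bit DEBUG.EXE (only present on 32-bit Windows) and a command file named, say, makefile.txt containing:

    rcx
    4240
    rbx
    000f
    w
    q

    Feeding that to DEBUG sets BX:CX to 000F4240h (1,000,000 bytes), and the W command then writes that many bytes of whatever happens to be in memory to the named file:

    debug bigfile.dat < makefile.txt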

  • 2020-12-07 07:17

    I was searching for a way to generate large files with actual data, not just a sparse file, and came across the following technique:

    If you want to create a file with real data, you can use the command-line script below.

    echo "This is just a sample line appended to create a big file.. " > dummy.txt
    for /L %i in (1,1,14) do type dummy.txt >> dummy.txt
    

    (Run the above two commands one after the other, or add them to a batch file; a batch-file version is sketched below.)

    The above commands create a roughly 1 MB file, dummy.txt, within a few seconds.
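
    If you put this in a batch file, remember that the loop variable needs a doubled percent sign. A minimal sketch (dummy.bat is just an example name, and it assumes the doubling behaves as described above):

    @echo off
    rem dummy.bat - the same two commands wrapped in a batch file.
    rem Inside a batch file the loop variable is written %%i, not %i.
    echo "This is just a sample line appended to create a big file.. " > dummy.txt
    rem Each pass doubles the file, so 14 passes turn ~60 bytes into roughly 1 MB.
    for /L %%i in (1,1,14) do type dummy.txt >> dummy.txt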

  • Open up Windows Task Manager, find the biggest process you have running, right-click it, and click Create dump file.

    This creates a file in your temporary folder roughly the size of that process's memory.

    You can easily create a file sized in gigabytes.

  • 2020-12-07 07:19

    I was recently looking for a way to create a large dummy file with its space actually allocated. All of the solutions looked awkward, so I just used the DISKPART utility included with Windows (since Windows Vista):

    DISKPART
    CREATE VDISK FILE="C:\test.vhd" MAXIMUM=20000 TYPE=FIXED
    

    Where MAXIMUM is the resulting file size in megabytes; 20000 here gives a file of about 20 GB.
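
    DISKPART also accepts a script file through its /s switch, so the same thing can be done non-interactively. A minimal sketch sized to the 5 GB asked about in the question (the file names are just examples): save the single line

    CREATE VDISK FILE="C:\test.vhd" MAXIMUM=5000 TYPE=FIXED

    as, say, create_vdisk.txt and run it from an elevated prompt:

    diskpart /s create_vdisk.txt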

  • 2020-12-07 07:20

    Check out RDFC http://www.bertel.de/software/rdfc/index-en.html

    RDFC is probably not the fastest, but it does allocate data blocks. The absolute fastest approach would have to use a lower-level API to allocate cluster chains and put them into the MFT without writing any data.

    Beware that there's no silver bullet here: if "creation" returns instantly, you got a sparse file that just fakes a large file, and you won't get data blocks/chains until you write into it. If you just read it, you'd get very fast zeros, which could make you believe your drive all of a sudden got blazingly fast. :-) (A quick check for the sparse flag is sketched below.)
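
    One way to see whether an instantly "created" file has the NTFS sparse attribute set is fsutil; a minimal sketch (the path is just an example, and fsutil usually needs an elevated prompt):

    REM Query the sparse flag on a file. Note this only reports the sparse
    REM attribute; comparing "Size" with "Size on disk" in the file's
    REM Properties dialog is another quick sanity check.
    fsutil sparse queryflag C:\test\bigfile.dat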
