large-files

Git/rsync mix for projects with large binaries and text files

余生颓废 submitted on 2019-12-05 19:15:36
Is anyone aware of a project that can effectively combine git version control for text-based files with something like rsync for large binary files (like data)? Obviously, this is a little beyond what a DVCS should do, but I am curious whether anyone has written a smart wrapper around git to do such things and sync with a central repository.

You might like git-annex. From its homepage: git-annex allows managing files with git, without checking the file contents into git. While that may seem paradoxical, it is useful when dealing with files larger than git can currently easily handle, whether due to …
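
git-annex is the ready-made answer here, but for the "smart wrapper" idea in the question, a minimal sketch might look like the following: git tracks the text files while an untracked data/ directory is mirrored with rsync to a central host. The remote path, directory layout, and function name are hypothetical, and both git and rsync are assumed to be installed.

    import subprocess

    DATA_DIR = "data/"                              # large binaries, ignored by git
    REMOTE = "user@central:/srv/project/data/"      # hypothetical central location

    def sync(commit_msg):
        # version the text files with git
        subprocess.run(["git", "add", "-A"], check=True)
        subprocess.run(["git", "commit", "-m", commit_msg])   # may be a no-op
        subprocess.run(["git", "push"], check=True)
        # mirror the binaries; rsync only transfers changed blocks
        subprocess.run(["rsync", "-az", "--delete", DATA_DIR, REMOTE], check=True)

In practice git-annex does this bookkeeping for you (and records which remote holds which large file), which is why it is the usual recommendation.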

How do I safely disable/remove the largefiles directory from a Mercurial repository?

廉价感情. submitted on 2019-12-05 14:39:29
In the past, I worked with the largefiles extension in Mercurial to store data together with the code I was working on. I think this was a mistake, and I would like to remove the "largefiles" directory (8 GB). Our network user directories are limited to 10 GB, and I need the space. I have not used any large files for a long time now, and I will not miss them when they are gone forever. So my questions are: Can I remove the largefiles directory under .hg without damaging the repo? If I do, will I still be able to check out old code, even if some large data files are missing? Should I remove those …
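
Before deleting anything, it can help to confirm how much space the per-repo largefiles store actually occupies. A small sketch (not part of Mercurial itself), run from the repository root; the .hg/largefiles path is the local store the question refers to.

    import os

    def dir_size(path=".hg/largefiles"):
        # walk the local largefiles store and sum file sizes
        total = 0
        for root, _dirs, files in os.walk(path):
            for name in files:
                total += os.path.getsize(os.path.join(root, name))
        return total

    print(f"{dir_size() / 2**30:.2f} GiB in .hg/largefiles")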

Content-Length header from PHP is overwritten!

感情迁移 submitted on 2019-12-05 11:40:42
I'm trying to figure out why the Content-Length header set from PHP gets overwritten. This is demo.php:

    <?php header("Content-Length: 21474836470");die; ?>

A request to fetch the headers:

    curl -I http://someserver.com/demo.php
    HTTP/1.1 200 OK
    Date: Tue, 19 Jul 2011 13:44:11 GMT
    Server: Apache/2.2.16 (Debian)
    X-Powered-By: PHP/5.3.3-7+squeeze3
    Content-Length: 2147483647
    Cache-Control: must-revalidate
    Content-Type: text/html; charset=UTF-8

See Content-Length? It maxes out at 2147483647 bytes, that is 2 GB. Now if I modify demo.php like so:

    <?php header("Dummy-header: 21474836470");die; ?>

the header is not …
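
A quick arithmetic check makes the likely cause visible: the observed value is exactly the largest 32-bit signed integer, which suggests the Content-Length value is parsed into a 32-bit int somewhere in that Apache/PHP stack and saturates instead of passing 21474836470 through. A sketch of the check:

    sent = 21474836470
    observed = 2147483647

    print(observed == 2**31 - 1)              # True: INT_MAX on a 32-bit build
    print(min(sent, 2**31 - 1) == observed)   # True: consistent with clamping at INT_MAX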

Serving Large Protected Files in PHP/Apache

我与影子孤独终老i submitted on 2019-12-05 10:30:36
I need to serve large files (> 2 GB) from an Apache web server. The files are protected downloads, so I need some way to authorize the user. The CMS I'm using checks cookies against a MySQL database to verify the user. On the server, I have no control over max_execution_time and only limited control over memory_limit. My technique has been working for small files: after the user has been authorized in PHP (by the CMS), I use readfile() to serve the file, which is stored above the document root to prevent direct access. I've read about techniques to chunk the download or to use …
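
The chunking idea mentioned at the end looks roughly like this. The sketch is in Python rather than the CMS's PHP, purely to illustrate the shape of the approach: after the auth check passes, the file is copied to the client in fixed-size pieces so memory stays flat regardless of file size. The path, chunk size, and output target are placeholders.

    import sys

    CHUNK = 64 * 1024   # 64 KiB per write

    def stream_protected_file(path, out=sys.stdout.buffer):
        # assumes the cookie/MySQL authorization check has already succeeded
        with open(path, "rb") as f:
            while True:
                block = f.read(CHUNK)
                if not block:
                    break
                out.write(block)   # in PHP this would be echo fread(...) plus flush() per chunk

This keeps memory bounded, but it does not help with max_execution_time; offloading the transfer to the web server (e.g. mod_xsendfile) is the usual answer when the script must not run for the whole download.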

Python - Opening and changing large text files

五迷三道 submitted on 2019-12-05 10:21:24
I have a ~600 MB Roblox-type .mesh file, which reads like a text file in any text editor. I have the following code:

    mesh = open("file.mesh", "r").read()
    mesh = mesh.replace("[", "{").replace("]", "}").replace("}{", "},{")
    mesh = "{"+mesh+"}"
    f = open("p2t.txt", "w")
    f.write(mesh)

It returns:

    Traceback (most recent call last):
      File "C:\TheDirectoryToMyFile\p2t2.py", line 2, in <module>
        mesh = mesh.replace("[", "{").replace("]", "}").replace("}{", "},{")
    MemoryError

Here is a sample of my file:

    [-0.00599, 0.001466, 0.006][0.16903, 0.84515, 0.50709][0.00000, 0.00000, 0][-0.00598, 0.001472, …
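
The MemoryError comes from holding the whole 600 MB string (and its replaced copies) in memory at once. A sketch of a streaming version of the same transformation: it reads the file in chunks, applies the same replacements, and carries one character across chunk boundaries so a "][" pair split between two chunks is still rewritten correctly. The 1 MiB chunk size is arbitrary.

    CHUNK = 1 << 20   # 1 MiB per read

    with open("file.mesh", "r") as src, open("p2t.txt", "w") as dst:
        dst.write("{")                 # same outer wrapping as the original script
        prev = ""                      # last char of the previous chunk, held back
        while True:
            chunk = src.read(CHUNK)
            if not chunk:
                break
            buf = (prev + chunk).replace("[", "{").replace("]", "}")
            buf = buf.replace("}{", "},{")
            dst.write(buf[:-1])        # hold back the last char; it may pair with the next chunk
            prev = buf[-1]
        dst.write(prev)
        dst.write("}")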

Upload a large file in the background (service restarts when the app is closed)

微笑、不失礼 submitted on 2019-12-05 08:12:11
I would like to upload large files (~10-100 MB, over Wi-Fi or the mobile network) in the background, because the user may leave the app and the system will later close it (if there is not enough memory). I created a service for this case, but my problem is that when I kill the app, the service restarts and the upload starts again. I found the same problems without a solution: "keeping background service alive after user exit app" and "My service is restarted each time the application is closed". So that won't work, but what is the solution? How does the YouTube app do it?

You should use a foreground service via the …

How to remove largefiles from Mercurial repo

妖精的绣舞 submitted on 2019-12-05 05:10:39
See also this question. Without knowing what I was doing, I enabled the largefiles extension, committed a file, and pushed it to Kiln. Now I know the error of my ways, and I need to permanently revert this change. I followed the guidance from SO on the subject, and I can remove largefiles locally, but this doesn't affect the remote repos in Kiln. I have tried opening the repo in KilnRepositories on the Kiln server and nuking the largefiles folder (as well as deleting 'largefiles' from the requires file), but after a few pushes/pulls the folder and the requires line come back. Is there a way …
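
One route that is commonly suggested for getting rid of largefiles permanently is to rebuild the repository without the extension using hg lfconvert --to-normal, then replace the repo on the server with the converted copy. A sketch that shells out to do the conversion; the paths are hypothetical, and whether Kiln will accept the swapped repository (all hashes change, so every clone must be replaced) has to be verified separately.

    import subprocess

    SRC = "/path/to/repo-with-largefiles"
    DST = "/path/to/repo-converted"

    # convert every largefile back into a normal tracked file in a fresh repo
    subprocess.run(["hg", "lfconvert", "--to-normal", SRC, DST], check=True)
    # afterwards, confirm that 'largefiles' no longer appears in DST/.hg/requires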

Is using istream::seekg too expensive?

余生颓废 submitted on 2019-12-05 03:35:45
In C++, how expensive is the istream::seekg operation? EDIT: How much can I get away with seeking around a file and reading bytes? What about frequency versus magnitude of offset? I have a large file (4 GB) that I am parsing, and I want to know whether it is necessary to try to consolidate some of my seekg calls. I would assume that the magnitude of the difference in file location plays a role: if you seek more than a memory page away it will impact performance, while small seeks are of no consequence. Is this correct?

This question is heavily dependent on your operating system and disk …
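
Because the answer depends on the OS page cache and the disk, measuring on the actual file is more useful than guessing. A rough benchmark sketch (in Python rather than C++, but the seek/read pattern is the same idea): it times batches of fixed-size reads preceded by seeks of increasing distance, so frequency is held constant while magnitude varies. The file path, read size, and iteration count are placeholders.

    import os, time

    PATH = "big.dat"          # the large file being parsed
    READ = 4096               # bytes per read
    N = 1000                  # reads per experiment

    def timed(jump):
        size = os.path.getsize(PATH)
        t0 = time.perf_counter()
        with open(PATH, "rb") as f:
            pos = 0
            for _ in range(N):
                pos = (pos + jump) % (size - READ) if jump else pos + READ
                f.seek(pos)
                f.read(READ)
        return time.perf_counter() - t0

    for jump in (0, 64 * 1024, 1024 * 1024, 64 * 1024 * 1024):   # 0 = sequential
        print(f"jump {jump:>10}: {timed(jump):.3f} s")

Run it twice: the first pass shows cold-cache behaviour, the second shows what happens once the touched pages are cached.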

How to handle loading of LARGE JSON files

徘徊边缘 submitted on 2019-12-05 02:54:37
I've been working on a WebGL application that requires a tremendous amount of point data to draw to the screen. Currently, that point and surface data is stored on the web server I am using, and ALL 120 MB of JSON is downloaded by the browser on page load. This takes well over a minute on my network, which is not optimal. I was wondering if anyone has any experience or tips for loading data this large. I've tried eliminating as much whitespace as possible, but that barely made a dent in …
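
One approach that usually helps far more than whitespace stripping is to preprocess the point data once on the server into a packed binary format (gzipped), and have the browser load it into a typed array instead of parsing 120 MB of JSON. A sketch of the server-side step, assuming the points are a flat JSON list of [x, y, z] triples in a file called points.json (hypothetical name); numpy is assumed to be available.

    import gzip, json
    import numpy as np

    with open("points.json") as f:
        pts = np.asarray(json.load(f), dtype=np.float32)   # shape (N, 3)

    with gzip.open("points.f32.gz", "wb") as out:
        out.write(pts.tobytes())                           # 12 bytes per point before gzip

In the browser, the file can be fetched into an ArrayBuffer, viewed as a Float32Array, and uploaded directly as a WebGL vertex buffer, which removes the JSON parse entirely and typically shrinks the transfer severalfold.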

C# Saving huge images

狂风中的少年 submitted on 2019-12-05 02:44:49
I'm having difficulties trying to save huge images in C# (I'm talking about over one gigabyte). Basically I'm trying to do this in parts: I have around 200 bitmap sources and I need a way to combine them before or after encoding them to a .png file. I know this is going to require lots of RAM unless I somehow stream the data directly from the hard drive, but I have no idea how to do that either. Each bitmap source is 895x895 pixels, so combining the images after encoding doesn't seem easy, because C# doesn't let you create a bitmap of 13425 x 13425. This PngCs library (disclaimer: I …
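
The technique PngCs enables, writing the PNG row by row so the full 13425x13425 bitmap never exists in memory, can be illustrated in Python with Pillow and pypng (a conceptual analog, not the C# solution). The sketch assumes the 13425/895 = 15x15 grid of tiles is stored as tile_<row>_<col>.png; the file names and layout are hypothetical. Only one band of 15 tiles (roughly 36 MB) is held in memory at a time while rows are streamed into the encoder.

    from PIL import Image
    import numpy as np
    import png  # pypng

    TILE, GRID = 895, 15          # 15 * 895 = 13425

    def rows():
        for tr in range(GRID):                               # one band of tiles at a time
            band = [np.asarray(Image.open(f"tile_{tr}_{tc}.png").convert("RGB"))
                    for tc in range(GRID)]
            strip = np.concatenate(band, axis=1)             # shape (895, 13425, 3)
            for r in range(TILE):
                yield strip[r].reshape(-1)                   # flat RGB row for the encoder

    with open("combined.png", "wb") as out:
        png.Writer(width=TILE * GRID, height=TILE * GRID, greyscale=False).write(out, rows())

The same row-streaming pattern is what the C# answer is driving at: feed the encoder one scanline at a time, pulling pixel data from whichever source tile covers that scanline.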