large-files

Large file upload through HTML form (more than 2 GB)

梦想的初衷 submitted on 2019-11-26 08:48:54
Question: Is there any way to upload a file of more than 2 GB using a simple HTML form upload? Previously I have been uploading large files through Silverlight using chunking (dividing a large file into segments, uploading the segments one by one, and then reassembling them at the server). Now we have a requirement that we use only simple HTML (through GWT) form uploads. Please guide me if there is any way to achieve large file uploads this way. If it is impossible using simple HTML, can …
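A bare HTML form gives no control over how the request body is sent, so multi-gigabyte uploads usually run into browser or server limits; the chunking described above has to happen client-side. A minimal sketch using the modern File API (the /upload endpoint and its form fields are assumptions, not part of the question):

```javascript
// Slice the selected file into 5 MB chunks and POST them one by one;
// the server is expected to reassemble them by index (assumed protocol).
async function uploadInChunks(file) {
  const chunkSize = 5 * 1024 * 1024;
  const total = Math.ceil(file.size / chunkSize);
  for (let i = 0; i < total; i++) {
    const form = new FormData();
    form.append("chunk", file.slice(i * chunkSize, (i + 1) * chunkSize));
    form.append("index", String(i));
    form.append("total", String(total));
    form.append("name", file.name);
    await fetch("/upload", { method: "POST", body: form }); // hypothetical endpoint
  }
}

document.querySelector("input[type=file]")
        .addEventListener("change", (e) => uploadInChunks(e.target.files[0]));
```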

How to avoid OutOfMemoryError when uploading a large file using Jersey client

怎甘沉沦 submitted on 2019-11-26 08:13:22
Question: I am using the Jersey client for HTTP-based requests. It works well if the file is small, but it runs into an error when I post a file 700 MB in size:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:2786)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
    at sun.net.www.http.PosterOutputStream.write(PosterOutputStream.java:61)
    at com.sun.jersey.api.client.CommittingOutputStream.write(CommittingOutputStream.java:90)
    at …
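The trace shows the Jersey 1.x client buffering the whole request body in a ByteArrayOutputStream before committing it. A hedged sketch of the usual remedy, assuming the com.sun.jersey 1.x API visible in the trace: set a chunked encoding size so the entity is streamed instead of buffered (URL and file name are placeholders):

```java
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import java.io.File;

public class LargeUpload {
    public static void main(String[] args) {
        Client client = Client.create();
        // With a chunk size set, the client uses chunked transfer encoding
        // and never holds the full 700 MB body in the heap at once.
        client.setChunkedEncodingSize(4096);
        ClientResponse response = client
                .resource("http://example.com/upload") // hypothetical URL
                .type("application/octet-stream")
                .post(ClientResponse.class, new File("bigfile.bin"));
        System.out.println(response.getStatus());
    }
}
```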

Best way to process large XML in PHP [duplicate]

为君一笑 submitted on 2019-11-26 08:12:18
Question: This question already has answers here: Parsing Huge XML Files in PHP (7 answers). Closed 2 years ago. I have to parse large XML files in PHP; one of them is 6.5 MB, and they could be even bigger. From what I've read, the SimpleXML extension loads the entire file into an object, which may not be very efficient. In your experience, what would be the best way?

Answer 1: For a large file, you'll want to use a SAX parser rather than a DOM parser. With a DOM parser it will read in the whole file and load it …
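A minimal sketch of the streaming alternative in PHP using the built-in XMLReader pull parser (file and element names are placeholders); only the element currently being handled is ever expanded in memory:

```php
<?php
$reader = new XMLReader();
$reader->open('big.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // Expand just this element; the rest of the document stays on disk.
        $item = simplexml_load_string($reader->readOuterXml());
        // ... process $item ...
    }
}
$reader->close();
```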

How to read a large XML file without loading it into memory using XElement

人走茶凉 submitted on 2019-11-26 07:47:23
Question: I want to read a large XML file (100+ MB). Due to its size, I do not want to load it into memory using XElement. I am using LINQ-to-XML queries to parse and read it. What's the best way to do it? Any example of combining XPath or XmlReader with LINQ-to-XML/XElement? Please help. Thanks.

Answer 1: Yes, you can combine XmlReader with the method XNode.ReadFrom; see the example in the documentation, which uses C# to selectively process nodes found by the XmlReader as XElements.

Answer 2: The example code in …
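A sketch of that documented combination (file path and element name are placeholders): XmlReader streams the document, and XNode.ReadFrom materialises one element at a time as an XElement:

```csharp
using System.Collections.Generic;
using System.Xml;
using System.Xml.Linq;

static class XmlStreaming
{
    // Yields each matching element without ever building the whole tree.
    public static IEnumerable<XElement> StreamElements(string path, string name)
    {
        using (XmlReader reader = XmlReader.Create(path))
        {
            reader.MoveToContent();
            while (!reader.EOF)
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == name)
                {
                    // ReadFrom consumes the element and leaves the reader on
                    // the following node, so no extra Read() is needed here.
                    yield return (XElement)XNode.ReadFrom(reader);
                }
                else
                {
                    reader.Read();
                }
            }
        }
    }
}
```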

python: read lines from compressed text files

非 Y 不嫁゛ submitted on 2019-11-26 07:38:28
Question: Is it easy to read a line from a gz-compressed text file using Python without extracting the file completely? I have a text.gz file which is around 200 MB; when I extract it, it becomes 7.4 GB. And this is not the only file I have to read: for the full process I have to read 10 of them. Although this will be a sequential job, I think it would be smart to do it without extracting all of the information. I do not even know whether it is possible. How can it be done using Python? I need to read a …
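Yes: gzip.open decompresses on the fly, so the file can be iterated line by line without ever writing the 7.4 GB of plain text to disk or holding it in memory. A minimal sketch (file name is a placeholder):

```python
import gzip

count = 0
with gzip.open("text.gz", "rt", encoding="utf-8") as f:
    for line in f:   # each line is decompressed as it is requested
        count += 1   # ... real per-line processing goes here ...
print(count)
```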

Upload 1GB files using chunking in PHP

狂风中的少年 submitted on 2019-11-26 07:36:31
Question: I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that I increase this limit drastically to allow uploads of video etc., but there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com …
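On the server side, a chunked protocol needs an endpoint that appends the slices in order; this keeps each PHP request well under the 4 MB limit while the assembled file grows to 1 GB. A hypothetical PHP sketch (the fileId/chunkNo request fields are invented for illustration, not part of any library):

```php
<?php
// Receive one slice of a chunked upload and append it to a temp file.
$fileId  = basename($_POST['fileId']);   // sanitised client-chosen id (assumed field)
$chunkNo = (int) $_POST['chunkNo'];      // 0-based slice index (assumed field)
$target  = sys_get_temp_dir() . "/upload_$fileId.part";

$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($target, $chunkNo === 0 ? 'wb' : 'ab'); // truncate first, append after
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
```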

Reading very large files in PHP

泄露秘密 submitted on 2019-11-26 05:51:22
Question: fopen is failing when I try to read in a very moderately sized file in PHP. A 6 MB file makes it choke, though smaller files of around 100 KB are just fine. I've read that it is sometimes necessary to recompile PHP with the -D_FILE_OFFSET_BITS=64 flag in order to read files over 20 GB or something ridiculous, but shouldn't I have no problems with a 6 MB file? Eventually we'll want to read in files that are around 100 MB, and it would be nice to be able to open them and then read through …
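If the failure is memory-related rather than fopen itself (slurping a whole file can hit PHP's memory_limit), the usual fix is to read in fixed-size chunks so memory use stays constant no matter how large the file grows. A minimal sketch (file name is a placeholder):

```php
<?php
$handle = fopen('large.log', 'rb');
if ($handle === false) {
    die('fopen failed');
}
while (!feof($handle)) {
    $chunk = fread($handle, 8192);  // 8 KB at a time; memory use stays flat
    // ... process $chunk ...
}
fclose($handle);
```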

HTML5 - How to stream large .mp4 files?

不羁的心 submitted on 2019-11-26 04:57:53
Question: I'm trying to set up a very basic HTML5 page that loads a .mp4 video that is 20 MB. It appears that the browser needs to download the entire thing rather than just playing the first part of the video and streaming in the rest. This post is the closest thing I've found while searching. I tried both HandBrake and Data Go Round, but neither appeared to make a difference. Any ideas on how to do this, or whether it's even possible? Here is the code I'm using:

<video controls="controls">
  <source src=" …
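Progressive playback needs two things: the MP4's moov atom (its index) at the front of the file, which tools such as qt-faststart or MP4Box can relocate, and a server that honours HTTP Range requests. With those in place a plain video element suffices; a minimal sketch (file name is a placeholder):

```html
<video controls preload="metadata">
  <!-- the type attribute lets the browser pick a playable source
       without downloading it first -->
  <source src="video.mp4" type="video/mp4">
</video>
```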

Seeking and reading large files in a Linux C++ application

余生长醉 submitted on 2019-11-26 04:44:00
Question: I am running into integer overflow using the standard ftell and fseek functions with G++, but I guess I was mistaken, because it seems that ftell64 and fseek64 are not available. I have been searching, and many websites seem to reference using lseek with the off64_t data type, but I have not found any examples referencing something equivalent to fseek. Right now the files that I am reading in are 16 GB+ CSV files, with the expectation of at least double that. Without any external libraries, what is …
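Without external libraries, the usual answer on Linux/glibc is fseeko/ftello compiled with _FILE_OFFSET_BITS=64, which makes off_t 64-bit even on a 32-bit build. A minimal sketch (file name is a placeholder):

```cpp
// Equivalent to building with: g++ -D_FILE_OFFSET_BITS=64 seek.cpp
#define _FILE_OFFSET_BITS 64
#include <stdio.h>
#include <sys/types.h>

int main() {
    FILE *f = fopen("huge.csv", "rb");
    if (!f) { perror("fopen"); return 1; }
    // fseeko/ftello take and return off_t instead of long,
    // so offsets past 2 GB no longer overflow.
    if (fseeko(f, (off_t)1 << 34, SEEK_SET) != 0) perror("fseeko");
    printf("position: %lld\n", (long long)ftello(f));
    fclose(f);
    return 0;
}
```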

Using Python Iterparse For Large XML Files

那年仲夏 submitted on 2019-11-26 02:08:01
Question: I need to write a parser in Python that can process some extremely large files (> 2 GB) on a computer without much memory (only 2 GB). I wanted to use iterparse in lxml to do it. My file is of the format:

<item>
  <title>Item 1</title>
  <desc>Description 1</desc>
</item>
<item>
  <title>Item 2</title>
  <desc>Description 2</desc>
</item>

and so far my solution is:

from lxml import etree
context = etree.iterparse(MYFILE, tag='item')
for event, elem in context:
    print elem.xpath('description …
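The missing piece with iterparse is releasing each element once it has been handled; otherwise lxml still accumulates the whole tree. A sketch of the usual pattern ("MYFILE" stands in for the real path):

```python
from lxml import etree

context = etree.iterparse("MYFILE", tag="item")
for event, elem in context:
    print(elem.findtext("title"), elem.findtext("desc"))
    # Free the element and any already-processed siblings so memory
    # stays bounded no matter how large the file is.
    elem.clear()
    while elem.getprevious() is not None:
        del elem.getparent()[0]
del context
```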