large-files

Large file support in C++

Submitted by 送分小仙女 on 2019-11-29 14:45:12
Question: The 64-bit file API is different on each platform: on Windows it is _fseeki64, on Linux fseeko, on FreeBSD yet another similar call ... How can I most effectively make this more convenient and portable? Are there any useful examples? Answer 1: Most POSIX-based platforms support the _FILE_OFFSET_BITS preprocessor symbol. Setting it to 64 causes the off_t type to be 64 bits instead of 32, and file manipulation functions like lseek() automatically support 64-bit offsets through some preprocessor
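The answer above can be sketched in a few lines. This is a minimal illustration, assuming a POSIX toolchain; the helper name is my own, not from the original post:

```c
/* Define _FILE_OFFSET_BITS before ANY system header (or pass
   -D_FILE_OFFSET_BITS=64 on the compiler command line) so that off_t
   silently widens to 64 bits, and fseeko/ftello/lseek accept 64-bit
   offsets even in a 32-bit POSIX build. Windows does not honor this
   macro; there, _fseeki64 is the rough equivalent. */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>
#include <sys/types.h>

/* Returns nonzero once the macro is in effect on a conforming platform. */
int off_t_is_64bit(void) {
    return sizeof(off_t) == 8;
}
```

With the macro set, the existing lseek()/fseeko() call sites need no changes; only the build configuration moves per platform.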

Are there any memory restrictions on an ASP.Net application?

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-11-29 12:14:42
I have an ASP.Net MVC application that allows users to upload images. When I try to upload a really large file (400MB) I get an error. I assumed that my home-brewed image processing code was very inefficient, so I decided to try a third-party library for the image processing parts. Because I'm using TDD, I wanted to first write a failing test. But when I test the controller action with the same large file, it does all the image processing without any trouble. The error I get is "Out of memory". I'm sure my code is probably using a lot more memory than it needs to

[Android SDK]Can't copy external database (13MB) from Assets

Submitted by 半世苍凉 on 2019-11-29 12:13:56
I need a list of Italian words for a game I'm developing, but I can't manage to copy my database from assets. I tried quite a lot of solutions I found on the web, such as: Using your own SQLite database in Android applications; how to copy large database which occupies much memory from assets folder to my application?; Load files bigger than 1M from assets folder. But I had no luck; it keeps giving me this error on the line os.write(buffer, 0, len); but I can't understand why. Here are the function's code and the constants I'm using. A strange thing is that my database stops copying

How to read a 4GB file on a 32-bit system

Submitted by 六眼飞鱼酱① on 2019-11-29 11:44:38
In my case I have different files; let's assume I have a >4GB file with data. I want to read that file line by line and process each line. One of my restrictions is that the software has to run on 32-bit MS Windows, or on 64-bit with a small amount of RAM (min 4GB). You can also assume that processing these lines isn't the bottleneck. In the current solution I read the file with ifstream and copy each line into a string. Here is a snippet of how it looks: std::ifstream file(filename_xml.c_str()); uintmax_t m_numLines = 0; std::string str; while (std::getline(file, str)) { m_numLines++; } And OK, that's working, but to

Get Large File Size in C

Submitted by 半世苍凉 on 2019-11-29 09:55:53
Before anyone complains of "duplicate": I've checked SO quite thoroughly, but there seems to be no clean answer yet, although the question looks quite simple. I'm looking for portable C code that can provide the size of a file, even if the file is bigger than 4GB. The usual method (fseek, ftell) works fine as long as the file remains < 2GB. It's fairly well supported everywhere, so I'm trying to find something equivalent. Unfortunately, the updated methods (fseeko, ftello) are not supported by all compilers. For example, MinGW misses them (and obviously so does MSVC). Furthermore,
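One portable sketch that sidesteps fseek/ftell entirely is to ask the OS for the size via stat. This is a minimal illustration under stated assumptions (MSVC/MinGW expose _stat64; POSIX widens st_size via _FILE_OFFSET_BITS); the function name is my own:

```c
/* _FILE_OFFSET_BITS must precede every system header so that off_t,
   and therefore struct stat's st_size, widen to 64 bits on 32-bit
   POSIX builds. Windows ignores the macro but provides _stat64. */
#define _FILE_OFFSET_BITS 64
#include <stdint.h>
#include <sys/stat.h>

/* Returns the file size in bytes, or -1 on error. */
int64_t file_size_bytes(const char *path) {
#ifdef _WIN32
    struct __stat64 st;              /* MSVC/MinGW 64-bit stat variant */
    return _stat64(path, &st) == 0 ? (int64_t)st.st_size : -1;
#else
    struct stat st;                  /* st_size is 64-bit with the macro */
    return stat(path, &st) == 0 ? (int64_t)st.st_size : -1;
#endif
}
```

Because neither branch touches fseek/ftell, the 2GB long limit never enters the picture.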

Stream parse 4 GB XML file in PHP

Submitted by 梦想与她 on 2019-11-29 07:38:44
I'm trying to do the following and need some help: I want to stream-parse a large XML file (4 GB) with PHP. I can't use SimpleXML or DOM because they load the entire file into memory, so I need something that can stream the file. How can I do this in PHP? What I'm trying to do is navigate through a series of <doc> elements and write some of their children to a new XML file. The XML file I'm trying to parse looks like this: <feed> <doc> <title>Title of first doc is here</title> <url>URL is here</url> <abstract>Abstract is here...</abstract> <links> <sublink>Link is here</sublink>

Charting massive amounts of data

Submitted by 一世执手 on 2019-11-29 06:23:19
Question: We are currently using ZedGraph to draw a line chart of some data. The input data comes from a file of arbitrary size, so we do not know the maximum number of data points in advance. However, by opening the file and reading the header, we can find out how many data points are in the file. The file format is essentially [time (double), value (double)]. However, the entries are not uniform on the time axis. There may not be any points between, say, t = 0 sec and t = 10 sec, but there

Are there any good workarounds to the GitHub 100MB file size limit for text files?

Submitted by 一笑奈何 on 2019-11-29 05:42:24
Question: I have a 190 MB plain text file that I want to track on GitHub. The text file is a pronunciation lexicon file for our text-to-speech engine. We regularly add and modify lines in the text file, and the diffs are fairly small, so it's perfect for git in that sense. However, GitHub has a strict 100 MB file size limit in place. I have tried the GitHub Large File Storage service, but it uploads a new version of the entire 190 MB file every time it changes - so that would quickly grow to many

php uploading large files

Submitted by 痞子三分冷 on 2019-11-29 05:19:09
I'm stuck here with a file uploading problem. I've searched for answers but found only the "increase post_max_size and upload_max_filesize" suggestion, and that doesn't work for me. I can't get large files uploaded (approx. 150 MB+). The following are my php.ini settings and my environment:
php.ini
- max_input_time 300
- max_execution_time 600
- memory_limit 1024M
- upload_max_filesize 1512M
- post_max_size 2048M
environment
- Web server: XAMPP
- PHP framework: CodeIgniter
I've also tried many other php.ini configurations. The file uploading class that I've built receives posted file data from

32 bit Windows and the 2GB file size limit (C with fseek and ftell)

Submitted by 会有一股神秘感。 on 2019-11-29 04:52:24
I am attempting to port a small data analysis program from 64-bit UNIX to a 32-bit Windows XP system (don't ask :)). But now I am having problems with the 2GB file size limit (long is not 64-bit on this platform). I have searched this website and others for possible solutions but cannot find any that are directly translatable to my problem. The problem is in the use of fseek and ftell. Does anyone know of a modification to the following two functions to make them work on 32-bit Windows XP for files larger than 2GB (actually on the order of 100GB)? It is vital that the return type of nsamples is a
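One common shape for this fix is a pair of drop-in seek/tell macros that route to the platform's 64-bit variants, so the fseek/ftell call sites stay as they are. The sketch below is illustrative, not the questioner's actual code; nsamples() here is a hypothetical reconstruction assuming the file holds fixed-size records:

```c
#define _FILE_OFFSET_BITS 64   /* widens off_t for the POSIX branch */
#include <stdio.h>
#include <stdint.h>

#ifdef _WIN32
#  define fseek64(f, off, wh) _fseeki64((f), (off), (wh))
#  define ftell64(f)          _ftelli64(f)
#else
#  define fseek64(f, off, wh) fseeko((f), (off), (wh))
#  define ftell64(f)          ftello(f)
#endif

/* Hypothetical nsamples(): count fixed-size records via a 64-bit
   seek/tell. Returning int64_t (never long) keeps the result valid
   for files far beyond 2GB even on 32-bit Windows. */
int64_t nsamples(FILE *f, size_t record_size) {
    if (fseek64(f, 0, SEEK_END) != 0) return -1;
    int64_t bytes = (int64_t)ftell64(f);
    if (bytes < 0 || record_size == 0) return -1;
    return bytes / (int64_t)record_size;
}
```

The key point for the question is the return type: as long as every offset flows through int64_t (or the 64-bit off_t), the 32-bit long never truncates the position.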