large-files

Read large file into SQLite table in Objective-C on iPhone

Submitted by 主宰稳场 on 2020-01-31 20:34:31
Question: I have a 2 MB file, not too large, that I'd like to put into an SQLite database so that I can search it. There are about 30K entries in CSV format, with six fields per line. My understanding is that SQLite on the iPhone can handle a database of this size. I have taken a few approaches, but they have all been slow (> 30 s). I've tried: 1) Using C code to read the file and parse the fields into arrays. 2) Using the following Objective-C code to parse the file and put it directly into…
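
The usual culprit for insert times over 30 s is committing each row individually: without an explicit transaction, SQLite performs a disk sync per INSERT. Wrapping all 30K inserts in a single transaction and reusing one prepared statement typically brings the import down to well under a second. A minimal sketch against the sqlite3 C API (the same API Objective-C calls into on the iPhone); the "entries" table and the naive comma split are placeholders for the asker's actual schema and CSV parsing:

```c
#include <sqlite3.h>
#include <stdio.h>
#include <string.h>

/* Bulk-load a six-field CSV into SQLite: one transaction, one prepared
 * statement. The "entries" table and its columns are hypothetical. */
int load_csv(sqlite3 *db, const char *path)
{
    char line[1024];
    FILE *fp = fopen(path, "r");
    if (!fp) return -1;

    sqlite3_exec(db, "BEGIN TRANSACTION", NULL, NULL, NULL);

    sqlite3_stmt *stmt;
    sqlite3_prepare_v2(db,
        "INSERT INTO entries VALUES (?1, ?2, ?3, ?4, ?5, ?6)",
        -1, &stmt, NULL);

    while (fgets(line, sizeof line, fp)) {
        /* Naive CSV split; assumes no quoted commas. */
        int col = 1;
        char *save = NULL;
        for (char *tok = strtok_r(line, ",\r\n", &save);
             tok && col <= 6;
             tok = strtok_r(NULL, ",\r\n", &save), col++) {
            sqlite3_bind_text(stmt, col, tok, -1, SQLITE_TRANSIENT);
        }
        sqlite3_step(stmt);
        sqlite3_reset(stmt);    /* reuse the compiled statement */
    }

    sqlite3_finalize(stmt);
    sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);   /* single disk sync */
    fclose(fp);
    return 0;
}
```

For a one-off import you can go further with PRAGMA synchronous = OFF or PRAGMA journal_mode = MEMORY, trading crash durability during the load for extra speed.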

XML (265 MB) to Excel data import [closed]

Submitted by 时光毁灭记忆、已成空白 on 2020-01-26 04:59:37
Question: (Closed as off-topic; not accepting answers.) I am trying to import data from XML into Excel. Fair warning: my XML file is 265 MB and contains POS transactions. A few duplicate transactions (the same transaction number imported twice) made it into the XML file, so I need to find them by importing the data into Excel. I have tried opening Excel >> Data >…
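
Since the goal is only to spot repeated transaction numbers, streaming over the file once may be simpler than pushing 265 MB through Excel's XML import. A rough C sketch of that idea; "transactions.xml" and the <TransNo> element name are hypothetical stand-ins for the real file and tag, and it assumes at most one transaction number per line:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static int cmp_long(const void *a, const void *b)
{
    long x = *(const long *)a, y = *(const long *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    /* Collect every transaction number, sort, then report any value
     * that repeats. <TransNo> is a hypothetical tag name. */
    FILE *fp = fopen("transactions.xml", "r");
    if (!fp) return 1;

    size_t cap = 1024, n = 0;
    long *ids = malloc(cap * sizeof *ids);
    char line[4096];

    while (fgets(line, sizeof line, fp)) {
        const char *p = strstr(line, "<TransNo>");
        if (!p) continue;
        if (n == cap) ids = realloc(ids, (cap *= 2) * sizeof *ids);
        ids[n++] = strtol(p + strlen("<TransNo>"), NULL, 10);
    }
    fclose(fp);

    qsort(ids, n, sizeof *ids, cmp_long);
    for (size_t i = 1; i < n; i++)          /* duplicates sort adjacent */
        if (ids[i] == ids[i - 1])
            printf("duplicate transaction: %ld\n", ids[i]);

    free(ids);
    return 0;
}
```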

Difficulty reading large file into byte array

Submitted by 邮差的信 on 2020-01-25 12:55:28
Question: I have a very large BMP file that I have to read in all at once, because I need to reverse the bytes when writing it to a temp file. The BMP is 1.28 GB, and I'm getting an "Out of memory" error. I can't read it completely (using ReadAllBytes) or buffer it into a byte array, because I can't initialize an array of that size. I also can't read it through a buffer into a List (which I could then Reverse()), because halfway through it runs out of memory. So basically the question is: how do I…
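
There is no need to hold the whole 1.28 GB in memory to reverse it: seek to the end and walk backwards in fixed-size chunks, reversing each chunk in place before writing it out; the concatenated output is the exact byte reversal of the input. A sketch of the pattern in C (in the asker's C#, FileStream.Seek supports the same approach); the 64 MB chunk size is an arbitrary choice, and fseeko/ftello are the POSIX 64-bit calls (on Windows, _fseeki64/_ftelli64):

```c
#include <stdio.h>
#include <stdlib.h>

#define CHUNK (64 * 1024 * 1024)   /* 64 MB working buffer */

/* Write the byte-reversed content of `src` to `dst` without ever
 * holding more than one chunk in memory. */
int reverse_file(const char *src, const char *dst)
{
    FILE *in = fopen(src, "rb"), *out = fopen(dst, "wb");
    unsigned char *buf = malloc(CHUNK);
    if (!in || !out || !buf) return -1;

    fseeko(in, 0, SEEK_END);
    off_t remaining = ftello(in);

    while (remaining > 0) {
        size_t len = remaining < CHUNK ? (size_t)remaining : CHUNK;
        remaining -= len;
        fseeko(in, remaining, SEEK_SET);   /* last unread chunk */
        fread(buf, 1, len, in);

        /* Reverse this chunk in place before appending it. */
        for (size_t i = 0, j = len - 1; i < j; i++, j--) {
            unsigned char t = buf[i]; buf[i] = buf[j]; buf[j] = t;
        }
        fwrite(buf, 1, len, out);
    }

    free(buf); fclose(in); fclose(out);
    return 0;
}
```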

Why minizip doesn't archive large files (larger than 4 GB)

Submitted by 时光毁灭记忆、已成空白 on 2020-01-24 20:34:37
Question: I am trying to use the static minizip library on Windows 7 64-bit, with Visual Studio 2010. The main goal is to archive files larger than 4 GB. I built zlib using CMake 2.8 and linked it into my project. It works for files under 4 GB, but doesn't work properly for files over 4 GB. Why do I have a problem archiving a 5 GB file with minizip? Did I miss something at the library-build stage? Here are all my steps, library, and project: https://github.com/koponomarenko/file_compression Really need help.
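
The classic ZIP format stores sizes in 32-bit fields, so anything past 4 GB overflows unless the Zip64 extensions are enabled. minizip supports them, but only through its *64 entry points: open the archive with zipOpen64 and, crucially, pass 1 as the final zip64 argument of zipOpenNewFileInZip64. A trimmed sketch with error handling omitted:

```c
#include <stdio.h>
#include "zip.h"   /* minizip */

int add_big_file(const char *archive, const char *name, FILE *src)
{
    zipFile zf = zipOpen64(archive, APPEND_STATUS_CREATE);
    if (!zf) return -1;

    /* The trailing 1 requests Zip64 records, lifting the 4 GB limit. */
    zipOpenNewFileInZip64(zf, name, NULL,
                          NULL, 0, NULL, 0, NULL,
                          Z_DEFLATED, Z_DEFAULT_COMPRESSION, 1);

    char buf[1 << 16];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, src)) > 0)
        zipWriteInFileInZip(zf, buf, (unsigned)n);

    zipCloseFileInZip(zf);
    zipClose(zf, NULL);
    return 0;
}
```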

Fastest way to read very large text file in C#

Submitted by 泪湿孤枕 on 2020-01-14 14:15:14
Question: I have a very basic question. I have several text files with data, each several GB in size. I have a C# WPF application that I'm using to process similar data files, but nowhere near that size (probably around 200-300 MB right now). How can I efficiently read this data and then write it somewhere else after processing, without everything freezing and crashing? Essentially, what's the best way to read from a very large file? For my low-scale application right now, I use System.IO…
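
At these sizes the two things that matter are streaming (never materializing the whole file) and a reasonably large read buffer; in the asker's C# that means a StreamReader over a FileStream opened with a big buffer, read line by line on a worker thread so the UI stays responsive. The same shape in C, with a hypothetical process_line standing in for the real work:

```c
#include <stdio.h>
#include <stdlib.h>

/* Placeholder for whatever per-line processing the application does. */
static void process_line(const char *line) { (void)line; }

int main(void)
{
    FILE *fp = fopen("huge.txt", "r");
    if (!fp) return 1;

    /* A 1 MB stdio buffer cuts syscall overhead on multi-GB files. */
    char *iobuf = malloc(1 << 20);
    setvbuf(fp, iobuf, _IOFBF, 1 << 20);

    char line[8192];
    while (fgets(line, sizeof line, fp))   /* constant memory, any size */
        process_line(line);

    fclose(fp);
    free(iobuf);
    return 0;
}
```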

Git/rsync mix for projects with large binaries and text files

Submitted by 笑着哭i on 2020-01-13 13:12:40
Question: Is anyone aware of a project that can effectively combine git version control for text-based files with something like rsync for large binary files (like data)? Obviously this is a little beyond what a DVCS should do, but I am curious whether anyone has written a smart wrapper around git to do such things and sync with a central repository. Answer 1: You might like git-annex. From its homepage: git-annex allows managing files with git, without checking the file contents into git. While that may seem…

C# Saving huge images

Submitted by ☆樱花仙子☆ on 2020-01-13 08:20:09
Question: I'm having difficulty saving huge images with C# (I'm talking about over one gigabyte). Basically I'm trying to do this in parts: I have around 200 bitmap sources, and I need a way to combine them before or after encoding them to a .png file. I know this is going to require lots of RAM unless I somehow stream the data directly from the hard drive, but I have no idea how to do that either. Each bitmap source is 895x895 pixels, so combining the images after encoding doesn't seem easy…
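
One way to sidestep holding the full mosaic in RAM is to compose the output one scanline at a time: for each output row, read just that row from each tile in the current band and append it to the output file, so memory use stays at a single tile row. The sketch below writes a binary PPM (P6) because its header is trivial; swapping in a streaming PNG encoder such as libpng's row-at-a-time interface follows the same shape. The 20x10 tile layout, the tile_X_Y.rgb naming, and the raw 24-bit RGB tile format are all assumptions:

```c
#include <stdio.h>

#define TILE   895          /* tile edge in pixels (from the question) */
#define ACROSS 20           /* hypothetical 20x10 layout of 200 tiles */
#define DOWN   10

int main(void)
{
    FILE *out = fopen("mosaic.ppm", "wb");
    fprintf(out, "P6\n%d %d\n255\n", TILE * ACROSS, TILE * DOWN);

    unsigned char row[TILE * 3];   /* one scanline of one tile */

    for (int ty = 0; ty < DOWN; ty++) {            /* band of tiles */
        for (int y = 0; y < TILE; y++) {           /* scanline in band */
            for (int tx = 0; tx < ACROSS; tx++) {  /* tile in band */
                char name[64];
                /* Hypothetical naming: raw RGB dumps, one per tile. */
                snprintf(name, sizeof name, "tile_%d_%d.rgb", tx, ty);
                FILE *tile = fopen(name, "rb");
                fseek(tile, (long)y * TILE * 3, SEEK_SET);
                fread(row, 1, sizeof row, tile);
                fclose(tile);
                fwrite(row, 1, sizeof row, out);
            }
        }
    }
    fclose(out);
    return 0;
}
```

Reopening each tile per scanline is slow but simple; keeping the current band's tile handles open would remove that overhead without changing the memory profile.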

Sorting a big file (10 GB)

Submitted by 廉价感情. on 2020-01-12 08:52:10
Question: I'm trying to sort a big table stored in a file. The format of the file is (ID, intValue). The data is sorted by ID, but what I need is to sort the data by intValue, in descending order. For example, from this table:

ID | IntValue
1  | 3
2  | 24
3  | 44
4  | 2

to this table:

ID | IntValue
3  | 44
2  | 24
1  | 3
4  | 2

How can I use the Linux sort command to do the operation? Or do you recommend another way? Answer 1: "How can I use the Linux sort command to do the operation? Or do you recommend another way?" As others…
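
For a file this size GNU sort is the pragmatic answer: it already implements an external merge sort that spills to temporary files, so something like sort -t'|' -k2,2nr input > output (numeric, reversed, keyed on the second pipe-delimited field) handles 10 GB in constant memory. If the rows are instead loaded into a program, the descending comparator is the only interesting part; a C sketch for data that fits in RAM, using the example table above:

```c
#include <stdio.h>
#include <stdlib.h>

struct row { long id, value; };

/* Descending order by value. */
static int by_value_desc(const void *a, const void *b)
{
    long x = ((const struct row *)a)->value;
    long y = ((const struct row *)b)->value;
    return (y > x) - (y < x);
}

int main(void)
{
    struct row rows[4] = { {1, 3}, {2, 24}, {3, 44}, {4, 2} };

    qsort(rows, 4, sizeof rows[0], by_value_desc);

    for (int i = 0; i < 4; i++)           /* prints 3|44, 2|24, 1|3, 4|2 */
        printf("%ld | %ld\n", rows[i].id, rows[i].value);
    return 0;
}
```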
