large-data

Encrypt big char* using std::string with Crypto++

蓝咒 Submitted on 2020-02-08 04:19:28
Question: I am new to Crypto++. I want to use the Crypto++ library to encrypt/decrypt a large byte array in C++. The data can be anything, so assume it is in binary format. First, I tried with a "byte array" (char * or char[]):

byte PlainText[] = { 'H','e','l','l','o',' ', 'W','o','r','l','d', 0x0,0x0,0x0,0x0,0x0 };
byte key[ AES::DEFAULT_KEYLENGTH ];
::memset( key, 0x01, AES::DEFAULT_KEYLENGTH );
// Encrypt
ECB_Mode< AES >::Encryption Encryptor( key, sizeof(key) );
byte cbCipherText[AES::BLOCKSIZE];
Encryptor
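For illustration only, a minimal sketch of the same idea in Python with PyCryptodome rather than Crypto++ (an assumption, not the library the question uses): pad the buffer to the AES block size and encrypt it in ECB mode. In Crypto++ itself, buffers larger than one block are usually pushed through a StreamTransformationFilter rather than encrypted block by block.

from Crypto.Cipher import AES
from Crypto.Util.Padding import pad, unpad

key = b"\x01" * 16                         # mirrors memset(key, 0x01, DEFAULT_KEYLENGTH)
plaintext = b"Hello World" + b"\x00" * 5   # the 16-byte buffer from the question

cipher = AES.new(key, AES.MODE_ECB)
ciphertext = cipher.encrypt(pad(plaintext, AES.block_size))   # padding handles any length
recovered = unpad(AES.new(key, AES.MODE_ECB).decrypt(ciphertext), AES.block_size)
assert recovered == plaintext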

arrayfun with function with inputs of different dimensions

ぃ、小莉子 Submitted on 2020-01-30 06:30:12
Question: I'm trying to create a matrix that contains the averages of the kxk submatrices of a larger nxn matrix, where n is divisible by k. I can accomplish this fairly efficiently with something like this:

mat = mat2cell(mat, k*ones(1,n/k), k*ones(1,n/k))
mat = cellfun(@mean,mat,'UniformOutput',false);
mat = cellfun(@mean,mat,'UniformOutput',false); % repeated to collapse cells to 1x1
mat = cell2mat(mat)

However, since I have a very large amount of data all in very large matrices, repeating this
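Outside MATLAB, the same k-by-k block average can be expressed as a single vectorised reshape. A minimal sketch in NumPy (an alternative tool, not the questioner's setup), assuming n is divisible by k as stated:

import numpy as np

def block_mean(mat, k):
    # Reshape the n-by-n array into (n/k, k, n/k, k) blocks and average
    # over the two within-block axes in one pass.
    n = mat.shape[0]
    return mat.reshape(n // k, k, n // k, k).mean(axis=(1, 3))

mat = np.arange(36, dtype=float).reshape(6, 6)
print(block_mean(mat, 2))   # 3x3 matrix of 2x2 block averages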

Constructing high resolution images in Python

感情迁移 Submitted on 2020-01-23 17:10:06
Question: Say I have a huge amount of data stored in an HDF5 data file (size: 20k x 20k, if not more) and I want to create an image from all of this data using Python. Obviously, this much data cannot be opened and held in memory without an error. Therefore, is there some other library or method that does not require all of the data to be dumped into memory and then processed into an image (which is how libraries such as Image, matplotlib, numpy, etc. handle it)? Thanks. This question comes from
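One possible approach, sketched below in Python: read the HDF5 dataset a strided row at a time with h5py, build a downsampled preview that does fit in memory, and save it with Pillow. The file name "data.h5", the dataset name "data", and the downsampling factor are assumptions for illustration.

import h5py
import numpy as np
from PIL import Image

STEP = 8        # downsampling factor; a 20k x 20k array becomes 2500 x 2500

with h5py.File("data.h5", "r") as f:            # hypothetical file/dataset names
    dset = f["data"]
    rows, cols = dset.shape
    preview = np.empty((rows // STEP, cols // STEP), dtype=np.float32)
    for i in range(0, (rows // STEP) * STEP, STEP):
        # Only one strided row slice is resident in memory at any time.
        preview[i // STEP, :] = dset[i, ::STEP][: cols // STEP]

# Scale to 0-255 and write out as an 8-bit grayscale image.
lo, hi = preview.min(), preview.max()
img = ((preview - lo) / (hi - lo) * 255).astype(np.uint8)
Image.fromarray(img).save("preview.png")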

Finding the transpose of a very, very large matrix

大兔子大兔子 Submitted on 2020-01-23 09:54:10
Question: I have a huge 2-dimensional array of data. It is stored in row order:

A(1,1) A(1,2) A(1,3) ..... A(n-2,n) A(n-1,n) A(n,n)

I want to rearrange it into column order:

A(1,1) A(2,1) A(3,1) ..... A(n,n-2) A(n,n-1) A(n,n)

The data set is rather large - more than will fit in the RAM of a computer. (n is about 10,000, but each data item takes about 1K of space.) Does anyone know a slick or efficient algorithm to do this?

Answer 1: Create n empty files (reserve enough space for n elements, if you can).
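A rough sketch of the idea in Answer 1, in Python: stream the row-major file and append every element to a per-column scratch file, then concatenate the column files to obtain column order. The file names, the batch size, and the 1 KB element size are illustrative assumptions; the batching only exists to stay under the operating system's open-file limit, at the cost of extra passes over the source.

import os

ITEM = 1024      # bytes per element (about 1 KB each, per the question)
N = 10000        # the matrix is N x N, stored row-major on disk
BATCH = 500      # columns handled per pass, to stay under the open-file limit

# One pass over the row-major file per batch of columns, appending each
# element to a per-column scratch file (the "n empty files" of Answer 1).
for start in range(0, N, BATCH):
    cols = [open(f"col_{j}.bin", "wb") for j in range(start, start + BATCH)]
    with open("matrix_rowmajor.bin", "rb") as src:
        for _ in range(N):                       # one row per iteration
            row = src.read(N * ITEM)
            for k, f in enumerate(cols):
                j = start + k
                f.write(row[j * ITEM:(j + 1) * ITEM])
    for f in cols:
        f.close()

# Concatenating the column files yields the matrix in column order.
with open("matrix_colmajor.bin", "wb") as dst:
    for j in range(N):
        with open(f"col_{j}.bin", "rb") as f:
            dst.write(f.read())
        os.remove(f"col_{j}.bin")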

Fastest way to transfer Excel table data to SQL 2008R2

强颜欢笑 Submitted on 2020-01-22 14:09:31
Question: Does anyone know the fastest way to get data from an Excel table (VBA array) to a table on SQL 2008 without using an external utility (i.e. bcp)? Keep in mind my datasets are usually 6500-15000 rows and about 150-250 columns, and I end up transferring about 20-150 of them during an automated VBA batch script. I have tried several methods for getting large amounts of data from an Excel table (VBA) to SQL 2008. I have listed those below:

Method 1. Pass table into VBA Array and send to stored
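For comparison outside VBA (not one of the methods the question benchmarks), a bulk parameterised insert from Python with pyodbc's fast_executemany sends the whole row array to SQL Server in very few round trips. The server, database, and table names below are placeholders.

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=mydb;Trusted_Connection=yes"       # hypothetical connection details
)
cur = conn.cursor()
cur.fast_executemany = True    # batch the parameter array instead of row-by-row inserts

rows = [(i, f"value {i}") for i in range(15000)]   # stand-in for the Excel data
cur.executemany("INSERT INTO dbo.Staging (id, val) VALUES (?, ?)", rows)
conn.commit()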

Sorting gigantic binary files with C#

↘锁芯ラ Submitted on 2020-01-22 12:46:48
Question: I have a large file, roughly 400 GB in size, generated daily by an external closed system. It is a binary file with the following format:

byte[8]byte[4]byte[n]

where n is equal to the int32 value of byte[4]. This file has no delimiters, and to read the whole file you just repeat until EOF, with each "item" represented as byte[8]byte[4]byte[n]. The file looks like:

byte[8]byte[4]byte[n]byte[8]byte[4]byte[n]...EOF

byte[8] is a 64-bit number representing a period of time represented by
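The standard way to sort a file that cannot fit in memory is an external merge sort: split the stream into sorted runs that do fit in RAM, then k-way merge the runs. A sketch in Python for the byte[8]byte[4]byte[n] layout, assuming little-endian fields and sorting on the 8-byte value; the chunk budget and temp-file handling are illustrative choices, not taken from the question.

import heapq, os, struct, tempfile

CHUNK_BYTES = 512 * 1024 * 1024      # in-memory budget per sorted run (assumption)

def read_records(f):
    # Yield (key, raw_record) pairs from the byte[8]byte[4]byte[n] stream.
    while True:
        header = f.read(12)
        if len(header) < 12:
            return
        key, n = struct.unpack("<qi", header)    # assuming little-endian fields
        yield key, header + f.read(n)

def write_run(buf):
    # Sort one in-memory batch of records and spill it to a temp file.
    buf.sort(key=lambda kr: kr[0])
    fd, name = tempfile.mkstemp(suffix=".run")
    with os.fdopen(fd, "wb") as f:
        for _, rec in buf:
            f.write(rec)
    return name

def sort_large_file(path, out_path):
    runs = []
    with open(path, "rb") as f:
        buf, size = [], 0
        for key, rec in read_records(f):
            buf.append((key, rec)); size += len(rec)
            if size >= CHUNK_BYTES:
                runs.append(write_run(buf)); buf, size = [], 0
        if buf:
            runs.append(write_run(buf))
    # k-way merge of the sorted runs back into one output file.
    with open(out_path, "wb") as out:
        streams = [read_records(open(r, "rb")) for r in runs]
        for _, rec in heapq.merge(*streams, key=lambda kr: kr[0]):
            out.write(rec)
    for r in runs:
        os.remove(r)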

QCompleter for large models

ぐ巨炮叔叔 Submitted on 2020-01-21 15:07:38
Question: QCompleter is slightly slow on large data sets (large models): when I start to type characters into the QComboBox it takes a few seconds to show the auto-complete popup with suggestions, and when I type the 2nd character QCompleter does not react to the key press for a few seconds either. Subsequent characters work fine. The model size is about 100K records. Is it possible to improve QCompleter performance, or to show the popup only after the 2nd or 3rd input character? Are there some good examples?

Answer 1: Solution appears similar to this: https:/
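One common workaround (not necessarily the one behind the truncated link in Answer 1) is to keep the completer's model empty and only feed it the subset of entries matching the first few typed characters. A PyQt5 sketch with invented placeholder data:

from PyQt5.QtCore import QStringListModel, Qt
from PyQt5.QtWidgets import QApplication, QCompleter, QLineEdit

app = QApplication([])
items = [f"record {i:06d}" for i in range(100000)]   # stand-in for the 100K-row model

edit = QLineEdit()
model = QStringListModel()
completer = QCompleter(model, edit)
completer.setCaseSensitivity(Qt.CaseInsensitive)
completer.setFilterMode(Qt.MatchContains)
edit.setCompleter(completer)

def refresh(text):
    # Keep the model empty until 3 characters are typed, then hand the
    # completer only the matching subset instead of all 100K strings.
    if len(text) < 3:
        model.setStringList([])
    else:
        model.setStringList([s for s in items if text.lower() in s.lower()])

edit.textEdited.connect(refresh)
edit.show()
app.exec_()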

PHP Connection Reset on Large File Upload Regardless Correct Setting

霸气de小男生 Submitted on 2020-01-21 01:41:10
Question: I am having a very common problem for which none of the available solutions seem to work. We have a LAMP server which receives a high amount of traffic. Using this server, we perform a regular file submission upload. On small file uploads, it works perfectly. On files of around 4-5 MB, the submission upload fails intermittently (sometimes it works, but many times it fails). We have the following configuration in our PHP:

max_input_time: 600
max_execution_time: 600
max_upload

Multi conditional statistics (avg, std dev, z-scores) for large data sets in Excel/VBA

大憨熊 Submitted on 2020-01-16 05:20:22
Question: I'm looking to calculate statistics for a large data set in Excel and am encountering some issues due to the data set's size. It seems VBA may be the way to go, as copying AVERAGEIF and STDDEV array functions across data of this size causes long calculation times. I would appreciate possible solutions or code that could be used here.

Goals:
To calculate statistics (avg, std dev, z-scores) conditional on 2 identifiers (e.g. the average of all heights at 01/01/10)
To be able to handle large data sets (100k+ data points)
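Outside Excel (an alternative, not a VBA solution), pandas computes the same conditional statistics in a single groupby pass, which stays fast at 100k+ rows. The column names and sample values below are made up for illustration:

import pandas as pd

# Hypothetical layout: one row per observation with two identifier columns.
df = pd.DataFrame({
    "date":   ["01/01/10", "01/01/10", "02/01/10", "02/01/10"],
    "metric": ["height",   "height",   "height",   "height"],
    "value":  [170.0,      182.0,      168.0,      175.0],
})

grp = df.groupby(["metric", "date"])["value"]
stats = grp.agg(["mean", "std"])                 # conditional average and std dev
df["zscore"] = (df["value"] - grp.transform("mean")) / grp.transform("std")
print(stats)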

Optimising HDF5 dataset for Read/Write speed

倾然丶 夕夏残阳落幕 Submitted on 2020-01-14 06:04:06
Question: I'm currently running an experiment where I scan a target spatially and grab an oscilloscope trace at each discrete pixel. Generally my trace lengths are 200Kpts. After scanning the entire target I assemble these time-domain signals spatially and essentially play back a movie of what was scanned. My scan area is 330x220 pixels in size, so the entire dataset is larger than the RAM of the computer I have to use. To start with I was just saving each oscilloscope trace as a numpy array and then after
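A sketch of one way to lay this out with h5py: store all traces in a single chunked dataset so that each pixel's 200K-point trace maps to its own chunk and can be written or read back without touching the rest of the file. The file name and chunk shape are assumptions; for time-slice "playback" a chunk that spans the spatial axes may read faster, so the shape is worth tuning to the dominant access pattern.

import h5py
import numpy as np

NX, NY, NT = 330, 220, 200_000     # scan grid and trace length from the question

with h5py.File("scan.h5", "w") as f:                   # hypothetical file name
    # One chunk per pixel trace: writing (or later reading) a single trace
    # touches exactly one chunk instead of the whole 330x220x200K dataset.
    dset = f.create_dataset("traces", shape=(NX, NY, NT),
                            dtype="float32", chunks=(1, 1, NT))
    for x in range(NX):
        for y in range(NY):
            trace = np.random.rand(NT).astype("float32")   # stand-in for a scope trace
            dset[x, y, :] = trace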