file-processing

How to remove all code from multiple VB6 .frm files and leave form design?

Submitted by 为君一笑 on 2019-12-02 22:03:09
Question: I have a large VB6 app with many .frm files. I want to basically 'gut' the code from all the forms and just leave the GUI design. What would be the best way to perform this task quickly? Answer 1: If you really have enough forms that you can't just open each form and press Ctrl + A, Del, Ctrl + S, then you can always write a quick VB program to do it. Visual Basic puts the information needed to display the form at the beginning of the file, followed by the code. Copy each .frm file to a backup, open it
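Along the same lines, here is a rough sketch in Python rather than VB (the helper name and the layout heuristic are assumptions, not part of the answer above): it keeps everything up to the end of the form-description block plus the trailing Attribute lines, and drops the code that follows. It assumes the usual .frm layout of a VERSION line, a Begin...End block, and Attribute lines before the code.

```python
import glob
import shutil

def gut_frm(path):
    """Keep only the form-design portion of a .frm file (hypothetical helper)."""
    shutil.copy(path, path + ".bak")              # keep a backup of the original
    with open(path, "r", errors="replace") as f:
        lines = f.readlines()

    kept, depth = [], 0
    for i, line in enumerate(lines):
        stripped = line.strip()
        kept.append(line)
        if stripped.startswith("Begin "):
            depth += 1
        elif stripped == "End" and depth > 0:
            depth -= 1
            if depth == 0:
                # Form description finished; keep the trailing Attribute lines
                # (VB_Name etc.) and discard everything after them.
                for extra in lines[i + 1:]:
                    if extra.strip().startswith("Attribute "):
                        kept.append(extra)
                    else:
                        break
                break

    with open(path, "w") as f:
        f.writelines(kept)

for frm in glob.glob("*.frm"):
    gut_frm(frm)
```

Run it in a copy of the project folder first; the heuristic only looks at Begin/End nesting and Attribute prefixes, so unusual form files may need manual checking.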

Splitting command line args with GNU parallel

Submitted by ぐ巨炮叔叔 on 2019-12-02 21:44:52
Using GNU parallel: http://www.gnu.org/software/parallel/ I have a program that takes two arguments, e.g. $ ./prog file1 file2 $ ./prog file2 file3 ... $ ./prog file23456 file23457 I'm using a script that generates the file-name pairs; however, this poses a problem because each line the script produces is a single string, not a pair, so it ends up like: $ ./prog "file1 file2" GNU parallel seems to have a slew of tricks up its sleeve, and I wonder if there's one for splitting text around separators: $ generate_file_pairs | parallel ./prog ? # where ? is text under consideration, like "file1 file2" The easy work
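One approach (not shown in the excerpt above, so treat it as a sketch): GNU parallel can split each input line into columns with --colsep and then refer to the columns as {1} and {2}.

```sh
# Split each line from generate_file_pairs on whitespace ("file1 file2")
# and pass the pieces to ./prog as two separate arguments.
generate_file_pairs | parallel --colsep ' ' ./prog {1} {2}
```

If the generator instead emits one file name per line, `parallel -N2 ./prog {1} {2}` groups the input two lines at a time.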

Text file with different data types into structure array [closed]

Submitted by 筅森魡賤 on 2019-12-02 18:34:59
Question (closed as needing details or clarity): I have to parse a text file with 3 different data types. I want it to be saved in a structure array with three members. My text file looks like this: A B 45.78965 A C 35.46731 B C 46.78695 The program that I'm reading it with is the following and it does not work. What am I doing wrong? #include

Parse tab delimited text file

Submitted by 感情迁移 on 2019-12-02 08:53:19
Question: I need to parse a tab-delimited text file by grabbing specific columns, like columns 1 and 5, and output each of these columns into a text file. Please find an example of the data file, and the code:

Data file:
COL1 COL2 COL3 COL4 COL5 COL6
123 345 678 890 012 234
234 456 787 901 123 345
etc

Batch file:
@echo off & setlocal
For /F "tokens=1,5*" %%i in (myFile.dat) do call :doSomething "%%i" "%%j"
goto :eof
:doSomething
Set VAR1=%1
Set VAR2=%2
@echo %VAR1%>>Entity.txt
@echo %VAR2%>>Account.txt

Text file with different data types into structure array [closed]

Submitted by 梦想的初衷 on 2019-12-02 07:52:29
I have to parse a text file with 3 different data types. I want it to be saved in a structure array with three members. My text file looks like this:
A B 45.78965
A C 35.46731
B C 46.78695
The program that I'm reading it with is the following and it does not work. What am I doing wrong?
#include <stdio.h>
struct gra {
    char from;
    char to;
    double w;
};
int main () {
    FILE *fp = fopen("graph.txt", "r");
    int i = 0;
    while (!feof(fp)) {
        fscanf(fp, "%[^\t]", &graph[i].from, &graph[i].to, &graph[i].w);
        i++;
    }
    fclose(fp);
}
Answer 1: One of your problems is that you're reading using %[^\t] , which reads strings,
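The rest of that answer is not included in this excerpt, but the direction it points is the usual one: match the conversions to the fields and check fscanf's return value rather than feof(). A minimal sketch along those lines (the array size and the final print are my additions):

```c
#include <stdio.h>

struct gra { char from; char to; double w; };

int main(void)
{
    struct gra graph[100];
    FILE *fp = fopen("graph.txt", "r");
    if (!fp)
        return 1;

    int i = 0;
    /* " %c" skips leading whitespace (spaces, tabs, newlines) before each char */
    while (i < 100 && fscanf(fp, " %c %c %lf",
                             &graph[i].from, &graph[i].to, &graph[i].w) == 3)
        i++;

    fclose(fp);
    printf("read %d records\n", i);
    return 0;
}
```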

Parse tab delimited text file

Submitted by 这一生的挚爱 on 2019-12-02 04:21:42
I need to parse a tab-delimited text file by grabbing specific columns, like columns 1 and 5, and output each of these columns into a text file. Please find an example of the data file, and the code:

Data file:
COL1 COL2 COL3 COL4 COL5 COL6
123 345 678 890 012 234
234 456 787 901 123 345
etc

Batch file:
@echo off & setlocal
For /F "tokens=1,5*" %%i in (myFile.dat) do call :doSomething "%%i" "%%j"
goto :eof
:doSomething
Set VAR1=%1
Set VAR2=%2
@echo %VAR1%>>Entity.txt
@echo %VAR2%>>Account.txt

This works; however, the For loop stops at the first line. Could you help me find the issue?
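The answer is not included in this excerpt. As an alternative sketch (not the accepted fix), both output files can be written from a single loop body, which avoids the call/label round-trip entirely; the file names are taken from the question:

```bat
@echo off & setlocal
rem Write column 1 to Entity.txt and column 5 to Account.txt in one pass.
rem Putting the redirection first avoids surprises when the value ends in a digit.
for /F "tokens=1,5" %%i in (myFile.dat) do (
    >>Entity.txt  echo %%i
    >>Account.txt echo %%j
)
```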

How do I read a large file from disk to database without running out of memory

Submitted by 不想你离开。 on 2019-12-01 09:34:25
I feel embarrassed to ask this question, as I feel like I should already know. However, given that I don't... I want to know how to read large files from disk to a database without getting an OutOfMemory exception. Specifically, I need to load CSV (or really tab-delimited) files. I am experimenting with CSVReader and specifically this code sample, but I'm sure I'm doing it wrong. Some of their other coding samples show how you can read streaming files of any size, which is pretty much what I want (only I need to read from disk), but I don't know what type of IDataReader I could create to allow this.
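The answers are not included in this excerpt, but the usual streaming pattern for this is to hand a reader that implements IDataReader straight to SqlBulkCopy, so rows flow through without the file ever being held in memory. A sketch, assuming the LumenWorks CsvReader the question links to and SQL Server as the target (connection string, file name, and table name are placeholders):

```csharp
using System.Data.SqlClient;
using System.IO;
using LumenWorks.Framework.IO.Csv;   // the CSVReader library mentioned in the question

class Loader
{
    static void Main()
    {
        using (var connection = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=true"))
        using (var csv = new CsvReader(new StreamReader("bigfile.txt"), true, '\t'))  // true = file has a header row
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.MyTable";
                bulkCopy.BatchSize = 10000;      // send rows to the server in batches
                bulkCopy.WriteToServer(csv);     // streams rows; the file is never fully loaded
            }
        }
    }
}
```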

Python - reading files from directory file not found in subdirectory (which is there)

Submitted by 耗尽温柔 on 2019-12-01 08:15:40
I am convinced it is something simple and syntactic - I however cannot figure out why my code:

import os
from collections import Counter
d = {}
for filename in os.listdir('testfilefolder'):
    f = open(filename,'r')
    d = (f.read()).lower()
    freqs = Counter(d)
print(freqs)

will not work - it can apparently see into the 'testfilefolder' folder and tell me that the file is there, yet the error message says 'file2.txt' is not found. So it can find it, only to tell me that it is not found... I however get this piece of code to work:

from collections import Counter
d = {}
f = open("testfilefolder/file2.txt",'r')
d
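The answer is not shown in this excerpt, but the cause is that os.listdir returns bare file names, not paths, so open() looks for them in the current working directory. A small sketch of the usual fix, joining each name back onto the folder (aggregating counts across all files, unlike the original which overwrites them per file):

```python
import os
from collections import Counter

folder = 'testfilefolder'
freqs = Counter()
for filename in os.listdir(folder):
    path = os.path.join(folder, filename)   # listdir gives names only, not paths
    with open(path, 'r') as f:
        freqs.update(f.read().lower())      # accumulate character counts across files
print(freqs)
```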

reading input from text file into array of structures in c

Submitted by 这一生的挚爱 on 2019-12-01 07:17:45
My structure definition is:

typedef struct {
    int taxid;
    int geneid;
    char goid[20];
    char evidence[4];
    char qualifier[20];
    char goterm[50];
    char pubmed;
    char category[20];
} gene2go;

I have a tab-separated text file called "gene2go.txt". Each line of this file contains taxID, geneID, goID, evidence, qualifier, goterm, pubmed and category information. Each line of the file will be kept in a structure. When the program is run, it will first read the content of the input file into an array of type gene2go; I used a function called readInfo. The program will also take the following input
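The rest of the question and its answers are not included in this excerpt. As a rough sketch of what a readInfo-style loader could look like (the function signature, the max parameter, and the 512-byte line buffer are assumptions; pubmed is kept as a single char only because that is how the struct declares it):

```c
#include <stdio.h>

typedef struct {
    int  taxid;
    int  geneid;
    char goid[20];
    char evidence[4];
    char qualifier[20];
    char goterm[50];
    char pubmed;          /* single char, exactly as declared in the question */
    char category[20];
} gene2go;

/* Read at most max records from a tab-separated file into arr; returns the
 * number of records read, or -1 if the file cannot be opened. */
int readInfo(const char *fname, gene2go *arr, int max)
{
    FILE *fp = fopen(fname, "r");
    if (!fp)
        return -1;

    char line[512];
    int n = 0;
    while (n < max && fgets(line, sizeof line, fp)) {
        /* scansets ([^\t]) keep embedded spaces in qualifier/goterm/category */
        if (sscanf(line, "%d %d %19s %3s %19[^\t] %49[^\t] %c %19[^\n]",
                   &arr[n].taxid, &arr[n].geneid, arr[n].goid,
                   arr[n].evidence, arr[n].qualifier, arr[n].goterm,
                   &arr[n].pubmed, arr[n].category) == 8)
            n++;
    }
    fclose(fp);
    return n;
}
```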

How to prevent file from being overridden when reading and processing it with Java?

Submitted by 筅森魡賤 on 2019-11-30 22:57:48
I need to read and process a somewhat large file with Java, and I'd like to know if there is some sensible way to protect the file so that it isn't overwritten by other processes while I'm reading and processing it. That is, some way to make it read-only, keep it "open", or something... This would be done in a Windows environment. br, Touko

Answer: You want a FileLock:

FileChannel channel = new RandomAccessFile("C:\\foo", "rw").getChannel();
// Try acquiring the lock without blocking. This method returns
// null or throws an exception if the file is already locked.
FileLock lock = channel.tryLock();
//
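Expanding that snippet into something self-contained (a sketch only; the class name and the placeholder path are mine, and note that file locks are advisory on some platforms, so they only protect against other cooperating processes):

```java
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

public class LockedRead {
    public static void main(String[] args) throws Exception {
        // "C:\\foo" is a placeholder path; "rw" access is needed for an exclusive lock.
        try (RandomAccessFile file = new RandomAccessFile("C:\\foo", "rw");
             FileChannel channel = file.getChannel();
             FileLock lock = channel.tryLock()) {     // null if another process holds the lock

            if (lock == null) {
                System.err.println("File is already locked by another process.");
                return;
            }

            // ... read and process the file through 'file' or 'channel' here ...

        } // lock, channel, and file are released automatically in reverse order
    }
}
```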