text-files

Using a SQL Server for application logging. Pros/Cons?

Submitted by 喜你入骨 on 2019-11-30 12:35:37
Question: I have a multi-user application that keeps a centralized logfile for activity. Right now, that logging is going into text files to the tune of about 10MB-50MB / day. The text files are rotated daily by the logger, and we keep the past 4 or 5 days' worth; anything older is of no interest to us. They're read rarely: either during development, to check error and diagnostic messages, or in production, to triage a user-reported problem or bug. (This is …
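
Not part of the question, but as an illustration of the daily-rotation setup it describes, here is a minimal Python sketch using logging.handlers.TimedRotatingFileHandler; the file name and retention count are assumptions, not details from the question:

    import logging
    from logging.handlers import TimedRotatingFileHandler

    # Rotate the log file at midnight and keep the 5 most recent days,
    # mirroring the "rotated daily, keep 4 or 5 days" setup described above.
    handler = TimedRotatingFileHandler(
        "activity.log",      # placeholder path
        when="midnight",
        backupCount=5,
    )
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

    logger = logging.getLogger("app")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    logger.info("user 42 logged in")  # example activity record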

How to configure GNU Emacs to write UNIX or DOS formatted files by default?

Submitted by 巧了我就是萌 on 2019-11-30 12:01:39
Question: I've had these functions in my .emacs.el file for years:

    (defun dos2unix ()
      "Convert a DOS formatted text buffer to UNIX format"
      (interactive)
      (set-buffer-file-coding-system 'undecided-unix nil))

    (defun unix2dos ()
      "Convert a UNIX formatted text buffer to DOS format"
      (interactive)
      (set-buffer-file-coding-system 'undecided-dos nil))

These functions let me switch between formats easily, but I'm not sure how to configure Emacs to write in one particular format by default, regardless of which …

Python Multiple users append to the same file at the same time

Submitted by a 夏天 on 2019-11-30 10:58:26
Question: I'm working on a Python script that will be accessed via the web, so there will be multiple users trying to append to the same file at the same time. My worry is that this might cause a race condition: if multiple users write to the same file at the same time, it might corrupt the file. For example:

    #!/usr/bin/env python
    g = open("/somepath/somefile.txt", "a")
    new_entry = "foobar"
    g.write(new_entry)
    g.close()

Will I have to use a lockfile for this, since this operation looks risky?

Answer 1: …
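
One way to serialize those concurrent appends is an advisory lock around each write. This is only an illustrative sketch, not taken from the answer, and it assumes a POSIX system; the path is the one from the question:

    import fcntl

    def append_entry(path, entry):
        # Take an exclusive advisory lock so concurrent writers queue up
        # instead of interleaving their bytes mid-line.
        with open(path, "a") as f:
            fcntl.flock(f, fcntl.LOCK_EX)
            try:
                f.write(entry + "\n")
                f.flush()
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)

    append_entry("/somepath/somefile.txt", "foobar")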

PowerShell multiple string replacement efficiency

Submitted by 大憨熊 on 2019-11-30 09:12:41
Question: I'm trying to replace 600 different strings in a very large text file (30MB+). I'm currently building a script that does this, following this question. Script:

    $string = gc $filePath
    $string | % {
        $_ -replace 'something0','somethingelse0' `
           -replace 'something1','somethingelse1' `
           -replace 'something2','somethingelse2' `
           -replace 'something3','somethingelse3' `
           -replace 'something4','somethingelse4' `
           -replace 'something5','somethingelse5' `
           ... (600 More Lines...) ...
    }
    $string | ac "C:\log.txt"
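
The usual way to make many literal replacements cheap is a single pass with one combined pattern rather than 600 separate scans of every line. The question is about PowerShell, but here is that idea sketched in Python purely for illustration; the mapping and file names are placeholders:

    import re

    # Placeholder mapping; the real script has ~600 pairs.
    replacements = {
        "something0": "somethingelse0",
        "something1": "somethingelse1",
        "something2": "somethingelse2",
    }

    # One alternation means each line is scanned once, not once per pair.
    pattern = re.compile("|".join(re.escape(k) for k in replacements))

    with open("input.txt") as src, open("log.txt", "w") as dst:
        for line in src:
            dst.write(pattern.sub(lambda m: replacements[m.group(0)], line))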

Splitting gzipped logfiles without storing the ungzipped splits on disk

Submitted by 戏子无情 on 2019-11-30 09:11:23
Question: I have a recurring task of splitting a set of large (about 1-2 GiB each) gzipped Apache logfiles into several parts (say, chunks of 500K lines). The final files should be gzipped again to limit the disk usage. On Linux I would normally do:

    zcat biglogfile.gz | split -l500000

The resulting files will be named xaa, xab, xac, etc. So I do:

    gzip x*

The effect of this method is that, as an intermediate result, these huge files are temporarily stored on disk. Is there a way to avoid this …
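
One way to avoid the uncompressed intermediates is to stream the gzipped input and write each chunk straight into a new gzip file. A minimal Python sketch of that idea (output names are placeholders, not from the question):

    import gzip

    CHUNK_LINES = 500_000

    def split_gz(src_path, prefix):
        part, lines, out = 0, 0, None
        with gzip.open(src_path, "rt") as src:
            for line in src:
                if out is None:
                    # Start the next gzipped chunk lazily.
                    out = gzip.open(f"{prefix}{part:03d}.gz", "wt")
                out.write(line)
                lines += 1
                if lines == CHUNK_LINES:
                    out.close()
                    out, lines, part = None, 0, part + 1
        if out is not None:
            out.close()

    split_gz("biglogfile.gz", "biglogfile.part")

Recent versions of GNU split also accept a --filter option (e.g. split -l500000 --filter='gzip > $FILE.gz'), which compresses each chunk as it is written and avoids the intermediate files without any scripting.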

C# Read Text File Containing Data Delimited By Tabs

Submitted by 让人想犯罪 __ on 2019-11-30 08:19:16
Question: I have some code:

    public static void ReadTextFile()
    {
        string line;
        // Read the file and display it line by line.
        using (StreamReader file = new StreamReader(@"C:\Documents and Settings\Administrator\Desktop\snpprivatesellerlist.txt"))
        {
            while ((line = file.ReadLine()) != null)
            {
                char[] delimiters = new char[] { '\t' };
                string[] parts = line.Split(delimiters, StringSplitOptions.RemoveEmptyEntries);
                for (int i = 0; i < parts.Length; i++)
                {
                    Console.WriteLine(parts[i]);
                    sepList.Add(parts[i]);
                }
            }
    …

Java Text File Encoding

Submitted by 谁都会走 on 2019-11-30 08:14:00
Question: I have a text file and it can be ANSI (with the ISO-8859-2 charset), UTF-8, or UCS-2 big- or little-endian. Is there any way to detect the encoding of the file so it can be read properly? Or is it possible to read a file without giving the encoding, reading it as-is? (There are several programs that can detect and convert the encoding/format of text files.)

Answer 1: UTF-8 and UCS-2/UTF-16 can be distinguished reasonably easily via a byte order mark at the start of the file. If this exists then it's a …
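
For illustration only (the question itself is about Java), here is a minimal Python sketch of the BOM check the answer describes; the fallback charset is an assumption taken from the question:

    import codecs

    def sniff_encoding(path, fallback="iso-8859-2"):
        with open(path, "rb") as f:
            head = f.read(3)
        if head.startswith(codecs.BOM_UTF8):
            return "utf-8-sig"   # UTF-8 with a BOM
        if head.startswith(codecs.BOM_UTF16_BE) or head.startswith(codecs.BOM_UTF16_LE):
            return "utf-16"      # Python's utf-16 codec consumes the BOM itself
        return fallback          # no BOM: assume the single-byte charset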

How do you dynamically identify unknown delimiters in a data file?

Submitted by 我的未来我决定 on 2019-11-30 06:51:24
I have three input data files. Each uses a different delimiter for the data contained therein. Data file one looks like this:

    apples | bananas | oranges | grapes

Data file two looks like this:

    quarter, dime, nickel, penny

Data file three looks like this:

    horse cow pig chicken goat

(The change in the number of columns is also intentional.) The thought I had was to count the number of non-alpha characters and presume that the highest count indicated the separator character. However, the files with non-space separators also have spaces before and after the separators, so the spaces win on all three …
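
One way around the "spaces always win" problem is to restrict the count to an explicit candidate set of delimiters, which is roughly what Python's csv.Sniffer does. A small sketch under that assumption; the candidate set and sample size are arbitrary choices, not from the question:

    import csv

    def detect_delimiter(path, candidates=",|\t;"):
        # Only the explicit candidates are considered, so padding spaces
        # around '|' or ',' cannot outvote the real separator.
        with open(path, newline="") as f:
            sample = f.read(4096)
        try:
            return csv.Sniffer().sniff(sample, delimiters=candidates).delimiter
        except csv.Error:
            return None  # e.g. the whitespace-separated third file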

What is the best way to read in a text file from the server in asp.net-mvc

Submitted by ≡放荡痞女 on 2019-11-30 06:01:41
In one of my controller actions I need to read in a text file that has a bunch of reference data in it. Right now I simply put it in the "/Content" directory. My questions are: Is this the "right" place to put this file, or should I put it in another directory? What is the best way to read in a text file in asp.net-mvc that is sitting on the server?

Answer (mathieu): If the file should not be directly available via URL, you should put it in App_Data. For reading it, just use:

    var fileContents = System.IO.File.ReadAllText(Server.MapPath(@"~/App_Data/file.txt"));

Another answer: OK, this way it works for me (VS2017): Set …

What's the best way of doing dos2unix on a 500k line file, in Windows? [closed]

Submitted by 荒凉一梦 on 2019-11-30 04:45:12
Question: [Closed 7 years ago as not a good fit for the Q&A format.] Question says it all: I've got a 500,000-line file that gets generated as part of an automated build process on a Windows box, and it's …
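
Not from the question itself, but a minimal sketch of a streaming CRLF-to-LF conversion in Python, chosen so a 500,000-line file never has to be held in memory at once; the file names are placeholders:

    # Reading in binary mode splits on b"\n", so each b"\r\n" pair stays
    # within one line and the replacement never straddles a chunk boundary.
    with open("build_output.txt", "rb") as src, open("build_output.unix.txt", "wb") as dst:
        for line in src:
            dst.write(line.replace(b"\r\n", b"\n"))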