logfile-analysis

Logfile analysis in R?

廉价感情. Submitted on 2019-12-03 04:55:44
Question: I know there are other tools around like awstats or splunk, but I wonder whether there is some serious (web)server logfile analysis going on in R. I might not be the first to think of doing it in R, but R does have nice visualization capabilities and also nice spatial packages. Do you know of any? Or is there an R package / code that handles the most common log file formats that one could build on? Or is it simply a very bad idea? Answer 1: In connection with a project to build an analytics toolbox for our Network Ops guys, I built one of these about two months ago. My employer has no problem if I open …
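For anyone wanting to try the base-R route the question asks about, the core is just a regular expression over the raw lines. The sketch below is an illustration, not the answerer's actual toolbox: it assumes Apache Common Log Format, a placeholder filename access.log, and uses only base R:

    # Minimal sketch: parse an Apache Common Log Format file with base R.
    # "access.log" is a placeholder; a typical line looks like:
    # 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
    lines <- readLines("access.log")
    pat <- '^(\\S+) \\S+ \\S+ \\[([^]]+)\\] "([^"]*)" (\\d{3}) (\\S+)'
    m <- regmatches(lines, regexec(pat, lines))
    ok <- lengths(m) == 6                         # full match plus 5 capture groups
    logs <- as.data.frame(do.call(rbind, lapply(m[ok], `[`, -1)),
                          stringsAsFactors = FALSE)
    names(logs) <- c("ip", "ts", "request", "status", "bytes")
    logs$status <- as.integer(logs$status)
    logs$time <- as.POSIXct(strptime(logs$ts, "%d/%b/%Y:%H:%M:%S %z"))
    hist(logs$time, breaks = "hours", main = "Requests per hour", xlab = "")

From a data frame like this, the visualization and spatial packages the question mentions (for example, geolocating the ip column) can take over.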

Software for Webserver Log Analysis? [closed]

你说的曾经没有我的故事. Submitted on 2019-12-03 03:45:25
Question (closed as off-topic; no longer accepting answers): Can I get some recommendations (preferably with some reasons) for good log analysis software for Apache 2.2 access log files? I have heard of Webalizer and AWStats, but have never really used either of them, and would like to know: what they can do, why they are useful, and interesting uses for them. Any and all comments …

How can I tell when my dataset in R is going to be too large?

微笑、不失礼. Submitted on 2019-11-28 03:02:22
I am going to be undertaking some logfile analyses in R (unless I can't do it in R), and I understand that my data needs to fit in RAM (unless I use some kind of fix like an interface to a keyval store, maybe?). So I am wondering how to tell ahead of time how much room my data is going to take up in RAM, and whether I will have enough. I know how much RAM I have (not a huge amount - 3GB under XP), and I know how many rows and columns my logfile will end up as and what data types the column entries ought to be (which presumably I need to check as it reads). How do I put this together into a go/nogo …
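The usual way to answer this is a back-of-envelope calculation checked against a measured sample. The sketch below is illustrative: biglog.csv, the row count of 9 million, and the column mix are placeholder assumptions, while the per-cell sizes (8 bytes per numeric, 4 per integer) are standard for R:

    # 1) Pure arithmetic: rows x columns x bytes per cell.
    rows <- 9e6                      # placeholder: estimated line count of the file
    num_cols <- 5; int_cols <- 3     # placeholder column mix
    est_bytes <- rows * (num_cols * 8 + int_cols * 4)
    est_bytes / 2^30                 # estimated size in GB

    # 2) Empirical: read a sample, measure it, scale up.
    s <- read.csv("biglog.csv", nrows = 10000)
    scaled <- as.numeric(object.size(s)) * rows / 10000
    scaled / 2^30                    # projected in-memory size in GB

A common rule of thumb for the go/nogo decision: leave roughly two to three times the projected object size free, since R frequently copies objects during manipulation; on a 3GB XP machine, that suggests keeping the raw object well under 1GB.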

Parsing huge logfiles in Node.js - read in line-by-line

大兔子大兔子. Submitted on 2019-11-26 11:02:47
I need to do some parsing of large (5-10 GB) logfiles in Javascript/Node.js (I'm using Cube). The logline looks something like: 10:00:43.343423 I'm a friendly log message. There are 5 cats, and 7 dogs. We are in state "SUCCESS". We need to read each line, do some parsing (e.g. strip out 5, 7, and SUCCESS), then pump this data into Cube ( https://github.com/square/cube ) using their JS client. Firstly, what is the canonical way in Node to read in a file, line by line? It seems to be a fairly common question online: http://www.quora.com/What-is-the-best-way-to-read-a-file-line-by-line-in-node-js …
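One common answer to the "canonical way" part is Node's built-in readline module over a read stream, which keeps memory usage flat regardless of file size. The sketch below is an illustration under assumptions: big.log is a placeholder filename, the regex is guessed from the example line above, and the hand-off to the Cube client is stubbed with console.log:

    // Minimal sketch: stream a huge logfile line by line with Node's
    // built-in readline module (no external dependencies).
    const fs = require('fs');
    const readline = require('readline');

    const rl = readline.createInterface({
      input: fs.createReadStream('big.log'),   // 'big.log' is a placeholder
      crlfDelay: Infinity                      // treat \r\n as one line break
    });

    // Regex guessed from the example logline in the question.
    const pat = /There are (\d+) cats, and (\d+) dogs\. We are in state "(\w+)"/;

    rl.on('line', (line) => {
      const m = line.match(pat);
      if (m) {
        const event = { cats: Number(m[1]), dogs: Number(m[2]), state: m[3] };
        console.log(event); // in the real pipeline, send `event` to the Cube client here
      }
    });

    rl.on('close', () => console.error('done'));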