logfiles

How to parse JSON where the key is variable in Python?

Submitted by 二次信任 on 2019-12-12 02:52:21
Question: I am parsing a log file which is in JSON format and contains data as key : value pairs. I am stuck where the key itself is variable. Please look at the attached code: I am able to access keys like username, event_type, ip, etc. The problem is accessing the values inside the "submission" key, where i4x-IITB-CS101-problem-33e4aac93dc84f368c93b1d08fa984fc_2_1 is a variable key that changes for different users. How can I access it as a variable? { "username":
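The usual way around a variable key is not to name it at all but to iterate over the dict. A minimal sketch: only "username" and the long i4x-... key come from the question; the nesting under "event"/"submission" and the "answer" field are illustrative assumptions.

```python
import json

# Simplified event record; the structure below "username" is assumed.
record = json.loads('''
{
  "username": "student1",
  "event": {
    "submission": {
      "i4x-IITB-CS101-problem-33e4aac93dc84f368c93b1d08fa984fc_2_1": {
        "answer": "42"
      }
    }
  }
}
''')

submission = record["event"]["submission"]

# When the key itself varies per user, don't name it: iterate over the dict.
for variable_key, value in submission.items():
    print(variable_key, value["answer"])
```

The same pattern works with `next(iter(submission))` when you know there is exactly one variable key per record.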

Split access.log file by dates using command line tools

Submitted by 自作多情 on 2019-12-08 22:51:09
Question: I have an Apache access.log file, around 35 GB in size. Grepping through it is no longer practical without a long wait. I want to split it into many small files, using the date as the splitting criterion. The date is in the format [15/Oct/2011:12:02:02 +0000]. Any idea how I could do this using only bash scripting, standard text-manipulation programs (grep, awk, sed, and the like), piping, and redirection? The input file name is access.log. I'd like the output files to have a format such as access
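For comparison, the split can be sketched in Python by grouping lines on the bracketed date. The sample lines and the access.&lt;date&gt;.log naming are assumptions, not taken from the question:

```python
import re
from collections import defaultdict

lines = [
    '1.2.3.4 - - [15/Oct/2011:12:02:02 +0000] "GET / HTTP/1.1" 200 123',
    '1.2.3.4 - - [15/Oct/2011:23:59:59 +0000] "GET /a HTTP/1.1" 200 45',
    '5.6.7.8 - - [16/Oct/2011:00:00:01 +0000] "GET /b HTTP/1.1" 404 0',
]

# Capture the 15/Oct/2011 part of the timestamp as the split key.
date_re = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')

buckets = defaultdict(list)
for line in lines:
    m = date_re.search(line)
    if m:
        buckets[m.group(1)].append(line)

# One output chunk per day, e.g. access.15-Oct-2011.log
outputs = {
    'access.%s.log' % date.replace('/', '-'): chunk
    for date, chunk in buckets.items()
}
```

Writing each chunk to its file name is then a plain loop over `outputs.items()`.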

Extract data from log file in specified range of time with awk getline bash

Submitted by 社会主义新天地 on 2019-12-07 12:46:19
Question: I was searching for a way to parse a log file and found what I need in this link: extract data from log file in specified range of time. The most useful answer (posted by @Kent):

```bash
# this variable you could customize; the important part is converting to seconds
# e.g. 5days=$((5*24*3600))
x=$((5*60))  # here we take 5 mins as an example

# this line gets the timestamp in seconds of the last line of your logfile
last=$(tail -n1 logFile | awk -F'[][]' '{ gsub(/\//," ",$2); sub(/:/," ",$2); "date +%s -d \""$2"\""|getline d; print d;}')

# this awk will give you the lines you need:
awk -F'[][]' -v last=$last -v x=$x '{ gsub(/\//," "
```
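A rough Python equivalent of Kent's approach, for readers outside awk: take the last line's timestamp and keep every line within x seconds of it. The sample lines are assumptions:

```python
from datetime import datetime, timedelta

lines = [
    '1.2.3.4 - - [15/Oct/2011:11:55:00 +0000] "GET / HTTP/1.1" 200 1',
    '1.2.3.4 - - [15/Oct/2011:12:03:30 +0000] "GET / HTTP/1.1" 200 1',
    '1.2.3.4 - - [15/Oct/2011:12:05:00 +0000] "GET / HTTP/1.1" 200 1',
]

def stamp(line):
    # "[15/Oct/2011:12:02:02 +0000]" -> datetime (zone offset ignored for brevity)
    raw = line.split('[', 1)[1].split(' ', 1)[0]
    return datetime.strptime(raw, '%d/%b/%Y:%H:%M:%S')

x = timedelta(minutes=5)        # the window, like x=$((5*60)) in the answer
last = stamp(lines[-1])         # timestamp of the last logfile line
recent = [l for l in lines if last - stamp(l) <= x]
```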

How can I view log files in Linux and apply custom filters while viewing?

Submitted by 不问归期 on 2019-12-03 02:12:45
Question: I need to read through some gigantic log files on a Linux system. There's a lot of clutter in the logs. At the moment I'm doing something like this:

```bash
cat logfile.txt | grep -v "IgnoreThis\|IgnoreThat" | less
```

But it's cumbersome: every time I want to add another filter, I need to quit less and edit the command line. Some of the filters are relatively complicated and may be multi-line. I'd like some way to apply filters as I read through the log, and a way to save these filters somewhere. Is there a tool that can do this for me? I can't install new software, so hopefully it's something
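One way to make the filter set editable without retyping the pipeline is to keep the exclude patterns in a list. A minimal Python sketch of the same grep -v behaviour; the sample lines are assumptions:

```python
import re

# Each pattern excludes matching lines, mirroring grep -v "IgnoreThis\|IgnoreThat".
exclude = [re.compile(p) for p in ('IgnoreThis', 'IgnoreThat')]

def keep(line):
    return not any(p.search(line) for p in exclude)

log = ['boot ok', 'IgnoreThis: noise', 'disk warning', 'IgnoreThat noise']
visible = [line for line in log if keep(line)]
```

Note also that less itself can filter interactively: press & and type a pattern to show only matching lines (an ! prefix inverts the match), which fits the no-new-software constraint.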

Changing Location of Velocity.Log File

Submitted by 六月ゝ 毕业季﹏ on 2019-12-01 05:51:48
Seems pretty straightforward. The documentation at http://velocity.apache.org/engine/devel/developer-guide.html#Configuring_Logging says to set the runtime.log property. Here's what I have for all my properties:

```java
velocityEngine.setProperty(RuntimeConstants.FILE_RESOURCE_LOADER_PATH, templatesPath);
velocityEngine.setProperty("runtime.log", "/path/to/my/file/velocity.log");
velocityEngine.setProperty("resource.loader", "string");
velocityEngine.setProperty("string.resource.loader.class", "org.apache.velocity.runtime.resource.loader.StringResourceLoader");
velocityEngine.setProperty("string.resource
```

log4j vs. System.out.println - logger advantages?

Submitted by ぐ巨炮叔叔 on 2019-11-30 08:13:37
I'm using log4j for the first time in a project. A fellow programmer told me that using System.out.println is considered bad style and that log4j is more or less the standard for logging nowadays. We do lots of JUnit testing, and System.out output turns out to be harder to test. Therefore I began using log4j for a Console controller class that just handles command-line parameters.

```java
// log4j logger config
org.apache.log4j.BasicConfigurator.configure();
Logger logger = LoggerFactory.getLogger(Console.class);
Category cat = Category.getRoot();
```

Seems to work: logger.debug("String");

Extract last 10 minutes from logfile [duplicate]

Submitted by 故事扮演 on 2019-11-28 07:45:48
This question already has an answer here: Filter log file entries based on date range (3 answers). Trying to find a simple way to watch for recent events (from less than 10 minutes ago), I've tried this:

```bash
awk "/^$(date --date="-10 min" "+%b %_d %H:%M")/{p++} p" /root/test.txt
```

but it doesn't work as expected. The log files are in this form:

Dec 18 09:48:54 Blah
Dec 18 09:54:47 blah bla
Dec 18 09:55:33 sds
Dec 18 09:55:38 sds
Dec 18 09:57:58 sa
Dec 18 09:58:10

And so on... You can match the date range using simple string comparison, for example:

```bash
d1=$(date --date="-10 min" "+%b %_d %H:%M")
d2=$(date "+%b %
```
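The same idea can be sketched in Python, parsing the syslog-style prefix rather than comparing strings. The "current time" is pinned to a fixed value here so the example is reproducible:

```python
from datetime import datetime, timedelta

now = datetime(2019, 12, 18, 10, 0, 0)   # fixed "now" for the example
cutoff = now - timedelta(minutes=10)

log = [
    'Dec 18 09:48:54 Blah',
    'Dec 18 09:54:47 blah bla',
    'Dec 18 09:57:58 sa',
]

def stamp(line):
    # Syslog timestamps carry no year, so borrow it from "now".
    return datetime.strptime(line[:15], '%b %d %H:%M:%S').replace(year=now.year)

recent = [line for line in log if stamp(line) >= cutoff]
```

With a real log you would replace `now` with `datetime.now()` and read lines from the file.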