stdout

How to write to StdOut in Windows and FASM?

China☆狼群 posted on 2020-01-17 01:11:07

Question: The question is pretty simple, yet I can't seem to find how to do it: how do I write to stdout in Windows/FASM? There does not seem to be any documentation online. Ideas?

Answer 1: There are a few options:
1) Use the WinAPI: either WriteConsole, or CreateFile with the filename CON followed by WriteFile.
2) Use msvcrt and printf, as you would in a C program.

Source: https://stackoverflow.com/questions/7263097/how-to-write-to-stdout-in-windows-and-fasm
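For option 1, a minimal FASM sketch might look like the following. This is an untested illustration, not code from the answer: it assumes FASM's bundled win32a.inc headers, and the WriteConsole import is bound to the ANSI entry point WriteConsoleA.

```
format PE console
entry start

include 'win32a.inc'

section '.data' data readable writeable
        msg     db 'Hello, stdout!', 13, 10
        msglen  = $ - msg
        written dd ?

section '.code' code readable executable
start:
        ; STD_OUTPUT_HANDLE (-11) names the process's stdout handle
        invoke  GetStdHandle, STD_OUTPUT_HANDLE
        invoke  WriteConsole, eax, msg, msglen, written, 0
        invoke  ExitProcess, 0

section '.idata' import data readable
        library kernel32, 'KERNEL32.DLL'
        import  kernel32, \
                GetStdHandle, 'GetStdHandle', \
                WriteConsole, 'WriteConsoleA', \
                ExitProcess,  'ExitProcess'
```

Note that WriteConsole only works when stdout really is a console; if output may be redirected to a file or pipe, WriteFile on the same handle is the more general choice.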

is there a way to pipe STDERR and STDOUT to different programs?

╄→尐↘猪︶ㄣ posted on 2020-01-16 19:01:09

Question: For example, I need to set up two tee commands so that one reads from STDOUT and the other from STDERR, each redirecting the console output to a different file. Is such a thing possible in Windows batch files? I know how to redirect output to a file (but then it is not displayed on screen) and how to combine both streams, but what about piping each stream independently?

Answer 1: You may process STDOUT and STDERR with separate programs via the next trick: (test | findstr /N /A:2A "
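For comparison, on POSIX shells the same split is straightforward with bash process substitution. A sketch (gen is a stand-in for the real program):

```shell
# A command that writes to both streams (stand-in for a real program):
gen() { echo "to stdout"; echo "to stderr" >&2; }

# Each stream goes through its own tee: everything still reaches the
# console, while stdout and stderr land in different files.
gen > >(tee out.log) 2> >(tee err.log >&2)
sleep 1   # give the background tee processes time to flush
```

This relies on bash (process substitution is not in plain POSIX sh), which is why the Windows batch equivalent needs the findstr trick from the answer.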

No output from awk only when redirected to pipe or file [duplicate]

半腔热情 posted on 2020-01-11 11:50:30

Question: This question already has an answer here: awk not printing to file (1 answer). Closed 12 months ago. I have a rather simple script (it prints content from a tty after adding a timestamp to every row). It outputs nicely on the command line, but redirecting the output with > does not work. Why not? Here is the script:

    #!/bin/bash
    awk '{ print strftime("%Y-%m-%d %H:%M:%S |"), $0; }' "$1"

Running it as is, like "timecat /dev/ttyACM0", works fine; I see the content in my terminal. But if I run timecat
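The usual culprit here is stdio buffering: awk's output is line-buffered on a terminal but block-buffered when redirected to a file or pipe, so nothing appears until the buffer fills. Calling fflush() after each print forces the data out. A minimal sketch (a fixed "ts |" prefix stands in for strftime, which is a gawk extension not available in every awk):

```shell
# Without fflush(), awk's output would sit in a stdio buffer when
# redirected; flushing after each line writes it to the file immediately.
printf 'one\ntwo\n' | awk '{ print "ts |", $0; fflush() }' > out.txt
cat out.txt
```

Wrapping the whole command in stdbuf -oL (GNU coreutils) is an alternative when the script cannot be changed.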

grep with continuous pipe does not work

吃可爱长大的小学妹 posted on 2020-01-11 09:23:09

Question: (Maybe it is a tcpflow problem.) I wrote a script to monitor HTTP traffic. I installed tcpflow, and then grep works (you should make an HTTP request, for example curl www.163.com):

    sudo tcpflow -p -c -i eth0 port 80 2>/dev/null | grep '^Host: '

It outputs like this (continuously):

    Host: config.getsync.com
    Host: i.stack.imgur.com
    Host: www.gravatar.com
    Host: www.gravatar.com

But if I extend the pipe with another command, it does not work (nothing is output):

    sudo tcpflow -p -c -i eth0 port 80 2>/dev/null
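The likely cause is grep's buffering, not tcpflow: when grep's stdout is a pipe rather than a terminal, it block-buffers its output, so the next stage in the pipeline sees nothing for a long time. GNU grep's --line-buffered flag flushes after every matching line:

```shell
# grep block-buffers when writing to a pipe; --line-buffered (GNU grep)
# flushes each match so the next command in the pipeline sees it at once.
printf 'Host: a.example\nnoise\nHost: b.example\n' \
  | grep --line-buffered '^Host: ' \
  | cut -d' ' -f2
```

With a continuous source such as tcpflow, every grep (and sed/awk) in the middle of the pipeline needs its own line-buffering flag, since each stage buffers independently.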

How to change how frequently SLURM updates the output file (stdout)?

天大地大妈咪最大 posted on 2020-01-11 03:09:08

Question: I am using SLURM to dispatch jobs on a supercomputer. I have set the --output=log.out option to place a job's stdout into a file (log.out). I'm finding that the file is updated only every 30-60 minutes, which makes it difficult to check on the status of my jobs. Any idea why it takes so long to update this file? Is there a way to change settings so that it is updated more frequently? Using SLURM 14.03.4-2.

Answer 1: This may be related to buffering. Have you tried disabling
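If buffering is the cause, forcing line-buffered output in the job script usually helps. One option is to wrap the payload in GNU coreutils' stdbuf (srun also accepts a -u/--unbuffered flag). A sketch, where the bash -c command stands in for the real batch payload:

```shell
# stdbuf -oL line-buffers the wrapped command's stdout, so the redirected
# file is updated as each line is produced instead of in large bursts.
stdbuf -oL bash -c 'echo "step 1 done"; echo "step 2 done"' > log.out
cat log.out
```

stdbuf only affects programs that use C stdio and do not set their own buffering; interpreters often have their own switch (e.g. python -u).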

Python: How to peek into a pty object to avoid blocking?

爷,独闯天下 posted on 2020-01-10 15:33:01

Question: I am using pty to read the stdout of a process without blocking, like this:

    import os
    import pty
    import subprocess

    master, slave = pty.openpty()
    p = subprocess.Popen(cmd, stdout = slave)
    stdout = os.fdopen(master)
    while True:
        if p.poll() != None:
            break
        print stdout.readline()
    stdout.close()

Everything works fine except that the while loop occasionally blocks. This is because the line "print stdout.readline()" waits for something to be read from stdout. But if the program already
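One common way to "peek" before reading (a sketch, not the asker's code) is to ask select.select() with a timeout whether the master side is readable; only then is os.read() guaranteed not to block. Here echo stands in for the real cmd:

```python
import os
import pty
import select
import subprocess

master, slave = pty.openpty()
# A short-lived child whose output comes back through the pty:
p = subprocess.Popen(["echo", "hello"], stdout=slave)
os.close(slave)  # close the parent's copy so end-of-output is detectable

output = b""
while True:
    # select() reports whether master has data, so os.read() cannot block
    readable, _, _ = select.select([master], [], [], 0.1)
    if readable:
        try:
            output += os.read(master, 1024)
        except OSError:  # Linux raises EIO on the master once the child side closes
            break
    elif p.poll() is not None:
        break
os.close(master)
print(output.decode())  # note: the pty cooks "\n" into "\r\n"
```

Closing the parent's slave descriptor is the easy-to-miss step: while it stays open, the master never reports end-of-file and the loop can hang.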

How to inherit stdin and stdout in python by using os.execv()

前提是你 posted on 2020-01-09 11:45:35

Question: First, I wrote C++ code as follows:

    #include <cstdio>
    int main() {
        int a, b;
        while (scanf("%d %d", &a, &b) == 2)
            printf("%d\n", a + b);
        return 0;
    }

I compiled it with g++ -o a a.cpp. Afterwards, I wrote Python code as follows:

    import os, sys
    sys.stdin = open("./data.in", "r")
    sys.stdout = open("./data.out", "w")
    pid = os.fork()
    if pid == 0:
        cmd = ["./a", "./a"]
        os.execv(cmd[0], cmd)

However, the data.out file contains nothing. That is to say, the child process did not inherit stdin and stdout from its
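The problem is that assigning to sys.stdin/sys.stdout only rebinds Python's file objects; the process exec'd by os.execv inherits the raw file descriptors 0 and 1, which still point at the original terminal. Redirecting has to happen at the descriptor level with os.dup2. A sketch (/bin/cat stands in for the compiled ./a from the question):

```python
import os

# Prepare an input file for the demonstration.
with open("data.in", "w") as f:
    f.write("1 2\n")

fin = os.open("data.in", os.O_RDONLY)
fout = os.open("data.out", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)

pid = os.fork()
if pid == 0:
    os.dup2(fin, 0)   # fd 0 (stdin) now reads from data.in
    os.dup2(fout, 1)  # fd 1 (stdout) now writes to data.out
    os.execv("/bin/cat", ["/bin/cat"])  # the exec'd image keeps fds 0/1
os.waitpid(pid, 0)
os.close(fin)
os.close(fout)
print(open("data.out").read())  # the child's stdout landed in data.out
```

The same dup2-before-exec pattern is what a shell performs for every redirection.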

How can I reinitialize Perl's STDIN/STDOUT/STDERR?

╄→尐↘猪︶ㄣ posted on 2020-01-09 09:37:32

Question: I have a Perl script that forks and daemonizes itself. It's run by cron, so in order not to leave a zombie around, I shut down STDIN, STDOUT, and STDERR:

    open STDIN, '/dev/null' or die "Can't read /dev/null: $!";
    open STDOUT, '>>/dev/null' or die "Can't write to /dev/null: $!";
    open STDERR, '>>/dev/null' or die "Can't write to /dev/null: $!";
    if (!fork()) {
        do_some_fork_stuff();
    }

The question I have is: I'd like to restore at least STDOUT after this point (it would be nice to restore the
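The standard trick is to duplicate the handle before redirecting it, then dup it back. In Perl this is the documented dup form of open: save with open(my $oldout, '>&', \*STDOUT) before the redirection, restore with open(STDOUT, '>&', $oldout). The mechanism is language-agnostic; as an illustration, the same file-descriptor dance in Python:

```python
import os

# Save a duplicate of the real stdout (fd 1) before silencing it.
saved = os.dup(1)
devnull = os.open(os.devnull, os.O_WRONLY)
os.dup2(devnull, 1)          # fd 1 now points at /dev/null
os.write(1, b"hidden\n")     # discarded
os.dup2(saved, 1)            # fd 1 points at the original stdout again
os.write(1, b"restored\n")   # visible
os.close(saved)
os.close(devnull)
```

The duplicate keeps the original terminal's descriptor alive, which is the only reference to it once fd 1 has been repointed at /dev/null.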