io-buffering

How do I flush a file in Perl?

主宰稳场 submitted on 2020-01-09 03:43:08
Question: I have a Perl script which appends a new line to an existing file every 3 seconds. There is also a C++ application which reads from that file. The problem is that the application only begins to read the file after the script is done and the file handle is closed. To avoid this I want to flush after each appended line, but I'm new to Perl and don't know how to do that. Answer 1: Try: use IO::Handle; $fh->autoflush; This was actually posted as a way of auto-flushing in an early question of mine, which asked
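The answer's Perl fix turns on per-handle autoflush (the per-handle form of `$|`). As a hedged illustration of the same idea in another language, here is a minimal Python sketch (the helper name and file contents are invented for the example): flushing after each write pushes the line out of the user-space buffer so a concurrent reader sees it immediately.

```python
import os
import tempfile
import time

# Minimal sketch (not from the thread): the Python analogue of Perl's
# autoflush is calling flush() after each write, so a concurrent reader
# sees every appended line right away instead of at close().
def append_lines(path, lines, interval=0.0):
    with open(path, "a") as fh:
        for line in lines:
            fh.write(line + "\n")
            fh.flush()            # push the line out of the user-space buffer
            time.sleep(interval)

demo = os.path.join(tempfile.mkdtemp(), "log.txt")
append_lines(demo, ["first", "second"])
```

On the Perl side, `$fh->autoflush(1);` right after opening the handle has the same effect.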

Real time read from subprocess.stdout on Windows

不羁岁月 submitted on 2019-12-24 15:39:35
Question: To emphasize, the problem is real-time read, not non-blocking read. It has been asked before, e.g. subprocess.Popen.stdout - reading stdout in real-time (again), but no satisfactory solution has been proposed. As an example, the following code tries to simulate the Python shell. import subprocess p = subprocess.Popen(['python'], stdin=subprocess.PIPE, stdout=subprocess.PIPE) while True: line = input('>>> ') p.stdin.write(line.encode()) print('>>> ', p.stdout.read().decode()) However,
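One pattern that does give real-time reads on Windows and elsewhere is to run the child unbuffered and consume its stdout line by line, instead of calling read(), which blocks until EOF. Below is a hedged sketch of that pattern with a non-interactive child invented for the demo (the interactive `python` shell from the question is harder, since its prompts go to stderr and it buffers differently):

```python
import subprocess
import sys

# Child script invented for the demo: prints three lines with a delay.
# -u makes a Python child unbuffered, so each line reaches the pipe as
# soon as it is printed rather than when the stdio buffer fills.
child = "import time\nfor i in range(3):\n    print('line', i)\n    time.sleep(0.1)\n"

p = subprocess.Popen(
    [sys.executable, "-u", "-c", child],
    stdout=subprocess.PIPE,
    text=True,
)
lines = [line.rstrip() for line in p.stdout]  # yields lines as they arrive
p.wait()
print(lines)
```

The two key points are the `-u` flag (or the child flushing its own output) and reading line-wise; without the former, a child writing to a pipe is typically fully buffered no matter how the parent reads.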

Is there any way to find the buffer size of a file object

走远了吗. submitted on 2019-12-19 03:38:17
Question: I'm trying to "map" a very large ASCII file. Basically I read lines until I find a certain tag, and then I want to know the position of that tag so that I can seek to it again later to pull out the associated data. from itertools import dropwhile with open(datafile) as fin: ifin = dropwhile(lambda x: not x.startswith('Foo'), fin) header = next(ifin) position = fin.tell() But this tell() doesn't give me the right position. This question has been asked in various forms before. The reason is
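The reason (cut off above) is that iterating a file object reads ahead in large chunks, so tell() reports the end of the read-ahead buffer, not the end of the last line yielded. A hedged sketch of the usual workaround, with file contents invented for the demo: record tell() before each readline() call, which bypasses the iterator's read-ahead entirely.

```python
import tempfile

# Invented sample data for the demo.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("header\nFoo begins here\ndata 1\ndata 2\n")
    name = f.name

with open(name) as fin:
    while True:
        pos = fin.tell()          # accurate: readline() does no read-ahead
        line = fin.readline()
        if not line or line.startswith("Foo"):
            break

# pos now marks the start of the 'Foo' line and can be seek()ed to later.
with open(name) as fin:
    fin.seek(pos)
    print(fin.readline().rstrip())  # prints the 'Foo' line again
```

This trades the iterator's speed for correct offsets; in Python 3, mixing `next()` with `tell()` on a text file raises an error for exactly this reason.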

Java with NetBeans 7.2.1 - Execution order issue

北城以北 submitted on 2019-12-13 22:05:36
Question: Consider the following two classes in a NetBeans Java application. The main class: public class ExcecutionOrder { public static void main(String[] args) { Worker worker = new Worker(); worker.init(); worker.doProcessing(); worker.stop(); } } And a worker class like this: public class Worker { public void init() { System.out.println("\n-------------------------------------------------"); System.out.println("Worker initialized."); System.out.println("--------------------------------------------

Cygwin terminal buffers STDOUT

六月ゝ 毕业季﹏ submitted on 2019-12-11 14:26:40
Question: I use the Altera Quartus software, which comes with its own Cygwin distribution and a dumb terminal which, according to the shortcut placed in my Start Menu by Altera, is run using cmd.exe /c "c:\altera\15.1\nios2eds\NiosII Command Shell.bat", where this batch file configures the environment for Quartus and launches bash. When I use this window to run Altera tools, their output comes out immediately (not buffered) and in color. I also have my own Cygwin installation with an X server and terminals (i
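A common cause of this symptom is that native Windows programs see a Cygwin pty as a pipe, so their C runtime switches stdout from line-buffered to fully buffered. When you control the program's source, a hedged sketch of the usual defence (shown here in Python; the function name is invented) is to detect the non-tty case and flush explicitly:

```python
import sys

def log(msg):
    """Print a progress line, flushing when stdout is not a terminal."""
    print(msg)
    if not sys.stdout.isatty():   # piped, or a pty the program sees as a pipe
        sys.stdout.flush()

log("progress: step 1 done")
```

When you cannot modify the program, the usual workarounds are wrappers that force line buffering (e.g. `stdbuf -oL` from GNU coreutils, which works for dynamically linked programs using glibc stdio) or running the tool under a real console rather than a pty.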

Streaming web uploads to socket with Rack

五迷三道 submitted on 2019-12-10 22:32:25
Question: I currently have a Sinatra app running in an FCGI handler. I want to write a handler that will sit in the rackup file (probably in front of the Sinatra app) and stream big file uploads to another server via sockets (without buffering them on disk first), in interlock with the request. So what I would like to do is some kind of stream-decode-send workflow without parameter pre-parsing. I've read somewhere that there is a problem with this, specifically due to the way the

What is the difference between the buffering argument to open() and the hardcoded readahead buffer size used when iterating through a file?

北战南征 submitted on 2019-12-01 10:51:15
Inspired by this question, I'm wondering exactly what the optional buffering argument to Python's open() function does. From looking at the source, I see that buffering is passed into setvbuf to set the buffer size for the stream (and that it does nothing on a system without setvbuf, which the docs confirm). However, when you iterate over a file, there is a constant called READAHEAD_BUFSIZE that appears to define how much data is read at a time (this constant is defined here). My question is exactly how the buffering argument relates to READAHEAD_BUFSIZE. When I iterate through a file,
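The constant in question belongs to Python 2's file iterator; in Python 3 the layering is explicit, which makes the relationship easier to see. A hedged Python 3 sketch (file path invented for the demo): buffering=0 is only legal in binary mode and yields a raw, unbuffered FileIO, while the default wraps it in a BufferedReader whose chunk size is io.DEFAULT_BUFFER_SIZE, the modern counterpart of the old read-ahead constant.

```python
import io
import os
import tempfile

# Create a small invented file to open both ways.
path = tempfile.mkstemp()[1]
with open(path, "wb") as f:
    f.write(b"x" * 10)

raw = open(path, "rb", buffering=0)       # unbuffered raw FileIO
buffered = open(path, "rb")               # default: BufferedReader on top
raw_type = type(raw).__name__
buffered_type = type(buffered).__name__
print(raw_type, buffered_type, io.DEFAULT_BUFFER_SIZE)

raw.close()
buffered.close()
os.remove(path)
```

So in the Python 3 model, the buffering argument sizes (or removes) the BufferedReader layer, while line iteration simply pulls from whatever that layer has read ahead.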

Can I stop std::cout flushing on "\n"?

懵懂的女人 submitted on 2019-11-28 00:29:02
According to this post, std::cout will automatically flush on \n when it is attached to an interactive device (e.g. a terminal window). Otherwise (e.g. when being piped to a file) it is fully buffered and will only flush on .flush() or std::endl. Is there a way to override this behaviour in Microsoft Visual C++ so that I can select whether I want fully buffered or line-buffered mode? Contrary to anon's (Apr 28 '09) answer, this behavior has nothing to do with the operating system or "console software." C++'s <iostream> streams are designed to be interoperable with C's <stdio.h>
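For comparison, here is a hedged sketch of the same distinction in Python, choosing line-buffered behaviour explicitly rather than letting it depend on whether the destination is interactive. (In C, the analogous knob is setvbuf; note that Microsoft's CRT documents _IOLBF as behaving like full buffering, which is part of why this question arises on Visual C++.)

```python
import io

# Wrap a binary sink in a text layer with explicit line buffering:
# a write containing "\n" flushes through to `raw`, independent of
# whether the destination is a terminal or a pipe.
raw = io.BytesIO()
out = io.TextIOWrapper(raw, line_buffering=True)
out.write("partial")      # no newline: may sit in the text-layer buffer
out.write(" done\n")      # "\n" triggers a flush down to `raw`
result = raw.getvalue().decode()
print(result)
```

The design point carries over: buffering policy is a property of the stream object, so making it explicit at construction time removes the interactive/non-interactive guesswork.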

Lisp format and force-output

夙愿已清 submitted on 2019-11-27 23:40:13
I don't understand why this code behaves differently in different implementations: (format t "asdf") (setq var (read)) In CLISP it behaves as expected, with the prompt printed before the read, but in SBCL it reads first, then outputs. I read a bit on the internet and changed it: (format t "asdf") (force-output t) (setq var (read)) This, again, works fine in CLISP, but in SBCL it still reads first, then outputs. I even tried separating it into another function: (defun output (string) (format t string) (force-output t)) (output "asdf") (setq var (read)) And it still reads first, then outputs. Am I