So I am an inexperienced Python coder, with what I have gathered might be a rather complicated need. I am a cognitive scientist and I need precise stimulus display and button-press timing.
When people talk about real-time computing, what they mean is that the latency from an interrupt (most commonly set off by a timer) to the application code handling that interrupt being run is both small and predictable. This then means that a control process can be run repeatedly at very precise time intervals or, as in your case, that external events can be timed very precisely. The variation in latency is usually called "jitter" - 1ms maximum jitter means that an interrupt arriving repeatedly will have a response latency that varies by at most 1ms.
"Small" and "predictable" are both relative terms and when people talk about real-time performance they might mean 1μs maximum jitter (people building inverters for power transmission care about this sort of performance, for instance) or they might mean a couple of milliseconds maximum jitter. It all depends on the requirements of the application.
At any rate, Python is not likely to be the right tool for this job, for a few reasons:

- Real-time code generally avoids dynamic memory allocation (C's `malloc` or C++'s `new`) because the amount of time they take is not predictable. Python neatly hides this from you (nearly every operation allocates, and the garbage collector runs whenever it pleases), making it very difficult to control latency. Again, using lots of those nice off-the-shelf libraries only makes the situation worse.

- On a general-purpose operating system, your process's memory can be paged out to disk at unpredictable moments (you could wedge a call to `mlockall` in somewhere, as sketched below, but any new allocation will upset things).
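Neither of those problems is fully fixable from Python, but if you do prototype in it anyway, you can at least pre-allocate, pin memory, and silence the garbage collector around the timing-critical section. This is a minimal sketch, assuming Linux (the `mlockall` call and `MCL_*` flag values come from glibc's `<sys/mman.h>`); the stimulus/button code is a placeholder:

```python
import ctypes
import ctypes.util
import gc
import os
import time

# Linux-only: ask the kernel to lock all current and future pages into
# RAM so this process cannot be paged out mid-experiment. The flag
# values are the ones defined in <sys/mman.h>.
MCL_CURRENT = 1
MCL_FUTURE = 2

libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
if libc.mlockall(MCL_CURRENT | MCL_FUTURE) != 0:
    # Commonly fails with EPERM or ENOMEM unless RLIMIT_MEMLOCK is raised
    # or the process has CAP_IPC_LOCK; the experiment still runs, just
    # without the paging guarantee.
    print("mlockall failed:", os.strerror(ctypes.get_errno()))

# Pre-allocate everything the critical loop will touch, then switch off
# the cyclic garbage collector so it cannot pause us at a random moment.
# (Reference counting still runs; this only removes collector pauses.)
timestamps = [0] * 10_000
gc.disable()
try:
    for i in range(len(timestamps)):
        # ... present stimulus / poll button here (placeholder) ...
        timestamps[i] = time.perf_counter_ns()
finally:
    gc.enable()
```

Even with this, CPython allocates behind your back constantly, so treat it as harm reduction rather than a real-time guarantee.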
I have a more basic question, though. You don't say whether your button is a physical button or one on the screen. If it's one on the screen, the operating system will impose an unpredictable amount of latency between the physical mouse button press and the event arriving at your Python application. How will you account for this? Without a more accurate way of measuring it, how will you even know whether it is there?
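If the button is (or becomes) a physical one on Linux, one way to at least observe the delivery side of that latency is to read the device through the evdev interface: the kernel timestamps each event when the driver sees it, so the gap between that stamp and the time your process actually reads the event becomes visible. A rough sketch, with the device path as a placeholder (look yours up under `/dev/input/by-id/`), assuming the 64-bit Linux `struct input_event` layout:

```python
import struct
import time

EVENT_FORMAT = "llHHi"            # struct input_event on 64-bit Linux
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)
EV_KEY = 0x01                     # key/button event type

# Placeholder device node; needs read permission (often the `input` group).
with open("/dev/input/event0", "rb") as device:
    while True:
        data = device.read(EVENT_SIZE)
        tv_sec, tv_usec, ev_type, code, value = struct.unpack(EVENT_FORMAT, data)
        if ev_type == EV_KEY and value == 1:          # value 1 == press
            kernel_stamp = tv_sec + tv_usec / 1e6     # CLOCK_REALTIME by default
            now = time.clock_gettime(time.CLOCK_REALTIME)
            print(f"kernel-to-userspace delay: {(now - kernel_stamp) * 1e3:.3f} ms")
```

This still says nothing about the latency between your finger and the kernel seeing the event, which is why dedicated response boxes with hardware timestamping exist.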