Accurate timing in iOS

Backend · Unresolved · 7 answers · 1359 views
生来不讨喜 asked 2020-12-01 02:32

I am looking at the 'Metronome' sample code from the iOS SDK (http://developer.apple.com/library/ios/#samplecode/Metronome/Introduction/Intro.html). I am running the metro…

7 Answers
  • 2020-12-01 02:52

    Did you consider using an NSTimer instead of this "strange" while-loop solution? It seems to me that you are overcomplicating, and not just a little, some very simple things here...

  • 2020-12-01 02:53

    Thanks! It took me a little while to figure out to use CFAbsoluteTimeGetCurrent() for currentTime0 and currentTime1, and I had to cast duration (a CGFloat) separately (double myDuration = (double)duration), as I was having a nightmare trying to use doubleValue.

    Your suggestion works perfectly for my needs. Have you managed to increase the accuracy?

  • 2020-12-01 03:00

    Using a while loop and sleep to time a metronome is not a robust way to solve this problem as it is likely to produce timing that is jittery and that drifts, as you have seen. I believe the standard way to solve this problem is to use Core Audio (in one way or another) to feed a continuous audio stream containing your metronome ticks separated by the correct amount of silence between them, depending on the tempo. Because you know the exact sample rate, you can time your ticks very accurately. Unfortunately, generating the audio yourself is quite a bit more difficult than what you're attempting to do, but this Stackoverflow question might get you started.

  • 2020-12-01 03:01

    The problem is more fundamental. iOS is not a real-time OS, so the time it takes to do various operations is not deterministic; it varies. For hard timing guarantees you would need a real-time OS such as RIM's QNX.

  • 2020-12-01 03:02

    I'm very interested in this too, and I also noticed the problems with NSTimer and the Metronome example.

    I'm currently looking into the CAMediaTiming protocol. I have no idea how to use it yet, but it could prove to be a more precise solution, since it's used for animations. I could also be completely wrong, since games slow down when too much is going on. I'm basing my theory on my favorite game, where precise timing is required when "fighting opponents on the streets". The game's frame rate has been reduced to 30 fps, I think, compared to its console counterparts, which run at 60. Yet the timing of the game's combo system is very strict, e.g. a combo consists of hitting a button and then another one exactly 3 frames later. So either:

    • the game on iOS is more forgiving on timing,
    • the developers implemented their own timing system,
    • or they're using the Metronome's timer loop or the CAMediaTiming protocol.

    Info on timing in animations is found here: http://developer.apple.com/library/ios/#documentation/Cocoa/Conceptual/Animation_Types_Timing/Articles/Timing.html#//apple_ref/doc/uid/TP40006670-SW1

  • 2020-12-01 03:06

    Ok, I have some answers after doing some more tests, so I am sharing them with anyone who is interested.

    I've placed a variable to measure the time intervals between ticks inside the play method (the method that actually sends the play message to the AVAudioPlayer object), and as my simple compare-to-an-external-watch experiment showed, 60 BPM was too slow. I got these time intervals (in seconds):

    1.004915
    1.009982
    1.010014
    1.010013
    1.010028
    1.010105
    1.010095
    1.010105
    

    My conclusion was that some overhead time elapses after each 1-second interval is counted, and that extra time (about 10 ms) accumulates to a noticeable amount after a few tens of seconds, which is quite bad for a metronome. So instead of measuring the interval between calls, I decided to measure the total interval from the first call, so that the error wouldn't accumulate. In other words, I replaced this condition:

    while (continuePlaying && ((currentTime0 + [duration doubleValue]) >= currentTime1))
    

    with this condition:

    while (continuePlaying && ((_currentTime0 + _cnt * [duration doubleValue]) >= currentTime1 ))
    

    where _currentTime0 and _cnt are now class members (sorry if that's C++ jargon, I am quite new to Obj-C): the former holds the timestamp of the first call to the method, and the latter is an int counting the number of ticks (== function calls). This resulted in the following measured time intervals:

    1.003942
    0.999754
    0.999959
    1.000213
    0.999974
    0.999451
    1.000581
    0.999470
    1.000370
    0.999723
    1.000244
    1.000222
    0.999869
    

    and it is evident, even without calculating the average, that these values fluctuate around 1.0 second (and the average is close to 1.0 second, to at least a millisecond of accuracy).

    I will be happy to hear more insights into what causes the extra time to elapse; 10 ms sounds like an eternity for a modern CPU. I am not familiar with the specs of this iPod's processor, though (it's an iPod touch 4G, and Wikipedia says the GPU is a PowerVR SGX535 @ 200 MHz).
