You cannot guarantee timer accuracy, especially at that resolution, across all browsers and devices. The timer is not always tied to frame rate.
Thanks. So, what factors would cause the Timer to be tied to frame rate? Perhaps I can work around those factors.
If this kind of accuracy isn't possible, how do audio applications achieve theirs? Or do they not require this level of accuracy?
Similar issue here: http://forums.adobe.com/message/972593#972593, but no answer
I think on iOS the timer is tied to frame rate. On desktop platforms the maximum timer rate is not tied to frame rate, but it is throttled, and each platform/browser throttles to a different value; I've never seen 5ms. I think 10ms was the best we saw. Also, timers drain batteries. Recent versions of the player might throttle timers even more when the app is not in the foreground.
I don’t know how audio keeps up, or if it has to. Or are you talking about someone trying to mix channels?
Our animation code generally uses absolute time and slower intervals.
I've never had to work with a timer concept before ActionScript, so the whole timer-accuracy thing was a bit confusing for me at first too. Assuming an exact interval is the natural thing to do, but it makes sense why something like that isn't possible.
What worked well for me was to start thinking of time passage as an average rather than a fixed cycle.
Let's say you wanted to bind a value to the timeline of an audio track (like a slider bar). Instead of incrementing that value on a timed cycle, figure out what time it is now compared to the time the audio started playing, and set the value of the slider to where it should be, give or take your accuracy.
Another example would be to tween that value over the duration of the audio track. The time syncing is then averaged out, and your accuracy is determined by the hardware limit.
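To illustrate the absolute-time idea above, here's a minimal AS3 sketch (slider and trackLengthMs are made-up names for your own slider control and track duration; this isn't a complete player):

```actionscript
import flash.events.Event;
import flash.utils.getTimer;

var trackLengthMs:Number = 180000; // assumed 3-minute track
var startTimeMs:int;

function startPlayback():void {
    startTimeMs = getTimer(); // ms since the player started
    addEventListener(Event.ENTER_FRAME, onFrame);
}

function onFrame(e:Event):void {
    // Derive the position from elapsed time instead of counting ticks,
    // so timer jitter never accumulates.
    var elapsed:Number = getTimer() - startTimeMs;
    slider.value = Math.min(elapsed / trackLengthMs, 1);
}
```

Even if a frame arrives late, the next one computes the correct position from scratch, so the error never grows.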
Thanks... The application is an AIR application for desktop, but the Timer accuracy seems to be tightly coupled with the frame rate. I get accurate results at 300fps (within 1-5ms), which is fine on my machine, but of course the CPU consumption is really high. I'm not sure what Flash does between frames, but it seems to be a lot.
It runs a for loop over all of your display objects. A really high frame rate is not recommended, and behavior may vary between Mac and Windows.
Why do you need such high resolution? Why can’t you work with absolute time?
The program displays graphics that emulate eye saccades. Apparently the eye moves from one point to the next in 8ms, so the timing has to be accurate within 8ms. However, most screen refresh rates aren't any faster than 16ms (60Hz), so that may not make sense. That said, the accuracy needs to be within 16ms, and I'm only able to achieve that by setting stage.frameRate = 90;
Flash is a deferred renderer. When you use the drawing commands, all you're actually doing is setting up the display list; nothing is drawn until your code stops running. This is not true of other native apps, where drawing is immediate. And if your code takes more than 16ms to run, it will delay the update even if you've set the frame rate to 90, and your actual frame rate will be less than that. That's why most folks use absolute time: on ENTER_FRAME, they determine the actual time, figure out what should be drawn at that point on the timeline, and quickly draw it.
Lots of things, including other processes on the computer, can delay when the player will draw again.
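For what it's worth, the ENTER_FRAME pattern described above looks roughly like this in AS3 (the circular motion is just a placeholder for whatever your app actually draws):

```actionscript
import flash.display.Sprite;
import flash.events.Event;
import flash.utils.getTimer;

var dot:Sprite = new Sprite();
dot.graphics.beginFill(0x000000);
dot.graphics.drawCircle(0, 0, 4);
dot.graphics.endFill();
addChild(dot);

var animStart:int = getTimer();

addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    // Position comes from absolute elapsed time, so a delayed frame
    // doesn't throw the animation off -- the next frame catches up.
    var t:Number = (getTimer() - animStart) / 1000; // seconds elapsed
    dot.x = 100 + 80 * Math.cos(t);
    dot.y = 100 + 80 * Math.sin(t);
});
```

You never assume the frames arrive on time; you just ask "what time is it now?" and draw the state for that moment.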