How much influence do stage dimensions have on frame rate?
I'm running some tests with a simple Flash app to help design my new website.
The test SWF is set to 100% x 100% to fill the browser window, with the frame rate set to 20 fps. It also contains a pre-made frame-rate counter class to help gauge performance. The SWF is set up with a resize handler that moves the contents to the center of the stage without scaling them, so when I shrink the browser window the contents stay centered in the viewport and never scale down.
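For reference, the setup is roughly the following (a minimal sketch; the class and member names are my own invention, not from an actual project):

```actionscript
package {
    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;
    import flash.text.TextField;
    import flash.utils.getTimer;

    // Document-class sketch: keep contents centered but never scaled,
    // with a crude frame-rate counter in the corner.
    public class CenterTest extends Sprite {
        private var contents:Sprite = new Sprite(); // holds the test content
        private var fpsDisplay:TextField = new TextField();
        private var frames:int = 0;
        private var lastTime:int = 0;

        public function CenterTest() {
            // NO_SCALE keeps content at 100% size regardless of window size;
            // TOP_LEFT puts (0,0) at the viewport's top-left corner.
            stage.scaleMode = StageScaleMode.NO_SCALE;
            stage.align = StageAlign.TOP_LEFT;

            addChild(contents);
            addChild(fpsDisplay);

            stage.addEventListener(Event.RESIZE, onResize);
            addEventListener(Event.ENTER_FRAME, onFrame);
            onResize(null);
        }

        // Re-center (but never scale) the contents on every window resize.
        private function onResize(e:Event):void {
            contents.x = stage.stageWidth / 2;
            contents.y = stage.stageHeight / 2;
        }

        // Count frames and report once per second.
        private function onFrame(e:Event):void {
            frames++;
            var now:int = getTimer();
            if (now - lastTime >= 1000) {
                fpsDisplay.text = frames + " fps";
                frames = 0;
                lastTime = now;
            }
        }
    }
}
```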
Test 1:
-A huge 2500x1500 bitmap in the background.
-A 500x500 bitmap inside an MC, whose position is tweened to various spots.
When viewed in a browser window of roughly 1500x1100, I got average frame rates of 12-15 fps.
When the browser was sized down to roughly 600x600, I got average frame rates of 18-20 fps.
Test 2:
-A 2500x1500 vector box replacing the background bitmap.
-A 500x500 vector box replacing the bitmap in the animated MC.
When viewed in a browser window of roughly 1500x1100, I got average frame rates of 13-19 fps.
When the browser was sized down to roughly 600x600, the frame rate held almost fixed at 20 fps (good!).
All this tells me that the number of final rendered pixels has a huge impact on Flash frame rate, more so than what the content actually is, especially in near-fullscreen situations.
How true is this? I'm sure it's common knowledge, but are there any comments or articles about it?
Is there a consensus rule of thumb for a maximum stage size in Flash nowadays? For example, are we talking no larger than 1024x768 for smooth playback on the current generation of platforms, assuming typical Flash content (nothing too crazy)?
Update: I ran Test 2 directly in the standalone Flash Player at near-fullscreen dimensions. The first pass through the movie loop averaged 15 fps (like in the browser), but every loop after that held steady at 20 fps! Why? Is there some kind of caching in the standalone player that browsers don't use? I changed no settings.