I’m currently writing my Bachelor’s dissertation.
For this dissertation I compare Flash and Unity using a performance test. I used Away3D for the Flash application and plotted the results as graphs.
Now I have some very strange results.
Please take a look at the embedded picture.
There you can see that Unity produces a smooth graph, while the Flash graph looks like a staircase.
Does anyone know why this happens? Is it because Flash only clears temporary RAM once a limit is reached?
I need this for my analysis, so if anyone has an idea, please tell me.
It doesn’t matter that Unity performs better in this graph; after producing it I found some settings I still have to equalize. This question is only about the "staircase" shape.
My application works as follows:
Every 0.5 seconds I add an object (992 or 5288 polygons, depending on the object I choose for the test) to the stage.
The objects are placed automatically in a grid of 17 per horizontal row and 7 rows vertically.
Once a layer is full, I increase the z-coordinate so I can keep adding objects in depth.
Every 25 objects I write the FPS and the object count to the database to create these graphs.
I also apply a diffuse shader to the objects and rotate them around the y-axis to increase the load on my processor and graphics card.
The only other things on my stage are the camera object and a point light.
The point light and the far clipping plane have a depth of 10,000 so that objects are still rendered when there are more than 3,000 on the stage.
I’ve activated mipmapping and 2x antialiasing.
Batch rendering is deactivated in both Flash and Unity.
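To make the setup concrete, here is a minimal sketch of the placement logic described above, written as language-neutral Python pseudocode. The constant names, the `SPACING` value, and the `object_position` helper are illustrative assumptions, not the actual Away3D or Unity code:

```python
GRID_COLS = 17   # objects per horizontal row (as described above)
GRID_ROWS = 7    # rows per depth layer
SPACING = 100.0  # assumed world-unit spacing between objects
LOG_EVERY = 25   # FPS + object count go to the database every 25 objects

def object_position(index):
    """Map a running object index to a grid position: fill a 17 x 7
    layer first, then step further along the z-axis for the next layer."""
    per_layer = GRID_COLS * GRID_ROWS  # 119 objects per depth layer
    layer = index // per_layer
    within = index % per_layer
    col = within % GRID_COLS
    row = within // GRID_COLS
    return (col * SPACING, row * SPACING, layer * SPACING)

# The first object of the second depth layer sits one SPACING further in z:
assert object_position(119) == (0.0, 0.0, 100.0)
```

In the real application, an object would be added at `object_position(i)` every 0.5 seconds, and a sample written to the database whenever `i % LOG_EVERY == 0`.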
If anyone can help me: THANKS!!!
Video card performance is often limited by bottlenecks. When you exceed the available resources at any particular stage of the pipeline, you suddenly go from what was one pass to two, and consequently you see performance drop off in cliffs rather than declining gradually.
That said, mismanagement of resource allocation is probably the cause. If you send me the SWF, I can give you detailed performance information and identify the culprit. (I'd like to know, too =)
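One concrete mechanism that produces exactly this staircase shape, offered purely as an illustrative model (it may or may not be what is happening in Flash here): with vsync enabled, a frame that misses a refresh waits for the next one, so the displayed FPS can only take discrete values (60, 30, 20, 15, ...) and drops in steps even when the underlying cost grows smoothly. The function below is a hypothetical sketch of that quantization:

```python
import math

REFRESH_HZ = 60.0               # assumed display refresh rate
VSYNC_INTERVAL = 1.0 / REFRESH_HZ  # seconds per refresh

def displayed_fps(raw_frame_time):
    """With vsync on, the effective frame time is rounded UP to a whole
    number of refresh intervals, so FPS falls in steps, not smoothly."""
    intervals = max(1, math.ceil(raw_frame_time / VSYNC_INTERVAL))
    return REFRESH_HZ / intervals

# A 10 ms frame and a 16 ms frame both display at 60 FPS,
# but 17 ms snaps straight down to 30 FPS:
assert displayed_fps(0.010) == 60.0
assert displayed_fps(0.017) == 30.0
```

Under this model, slowly increasing per-frame work produces a graph of flat plateaus separated by sudden drops, which matches the "staircase" you describe.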