I could use another pair of eyes, or maybe another brain,
here: for a miniature of a big area, I want to calculate an
object's position from its position on the big screen:

    original pos / big area size * small area size

I think this should be right for getting the relative
position on the small screen.
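Just to sanity-check the formula with the numbers from the trace, here's a minimal sketch in JavaScript (the names toMiniature, bigPos etc. are mine, not from the original code); Flash's arithmetic behaves the same way as long as the inputs really are numbers:

```javascript
// Map a coordinate from the big area onto the miniature.
// bigPos runs from -MAX to +MAX, so shift by MAX before scaling.
function toMiniature(bigPos, MAX, smallWidth) {
  return (bigPos + MAX) / (2 * MAX) * smallWidth;
}

console.log(toMiniature(20, 5000, 119)); // ≈ 59.738
```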
However, Flash comes up with strange results:
(20+5000) / (2*5000) * 119 = 2439.5
it should result in 59.738. The code for this looks
like this:

    (omc.x + MAX) / (2 * MAX) * display.width

where omc.x is the x position on the big screen (here 20),
MAX is 5000, and 2*MAX is the complete size (the point 0,0 is
in the middle, so the area is 10000 wide); the small screen
width is display.width (here 119).
The values seem to be right; the calculation above
(20+5000...) was traced from the code. However, the result is
totally wrong, and I have no idea why. Any ideas on this?
"blemmo" <email@example.com> wrote in
> d'oh... just realized the numbers came in as strings...
so it all looked
> in the trace, but calculating strings with numbers isn't
> darn... I should remember to use the debugger more
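For anyone who hits this later: that is exactly what produces 2439.5. A small JavaScript demo (ActionScript's + coerces the same way here): when the left operand is the string "20", + concatenates to "205000", and only the later / and * force the value back to a number.

```javascript
// omc.x arrived as the string "20" instead of the number 20,
// so + concatenates instead of adding:
const asString = ("20" + 5000) / (2 * 5000) * 119; // "205000" / 10000 * 119
console.log(asString); // 2439.5

// Casting with Number() restores the intended arithmetic:
const asNumber = (Number("20") + 5000) / (2 * 5000) * 119;
console.log(asNumber); // ≈ 59.738
```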