Hi. I'm using AEGP_GetLayerDuration to get my layer's duration. On my Windows computer it returns the result in frames, but on Mac it returns it in its original form, where I must divide by in_data->time_step to convert it to frames. Has anyone else experienced this? Currently I'm using a potentially silly workaround which seems to work:
#ifdef AE_OS_WIN
    layerLastFrame = layerFirstFrame + duration.value;
#elif defined(AE_OS_MAC)
    layerLastFrame = layerFirstFrame + (duration.value / in_data->time_step);
#endif
Are you sure you've not got your projects set up slightly differently between OSs? I doubt the function would return different units based on OS.
I'm sure of it. On Windows, if I open the project it displays the duration and scale in frames. Then if I modify the layer's out point by one frame, the duration snaps back to being measured correctly in steps. It seems like a very situational bug, because if I then reapply the effect there's no problem; I must reopen the project without saving to reproduce the error.
OK, so it seems not to be OS-dependent after all. If I have a duration of 5,000 frames, that apparently can't be represented as an A_long in steps because the number gets too big, so AE measures it in frames instead. It's confusing because I don't know the exact conditions under which AE chooses to measure an A_Time in frames versus steps.