Hi!
I've tried every combination I can think of, but I just keep getting strange pixel values from the PrPixelFormat_ARGB_4444_16u format in Premiere when my plugin reads the incoming data. If I copy the buffer straight to the output it's fine, but I can't address a specific pixel correctly; it looks like the image is doubled up on the X axis or something.
I'm trying to read pixels from within my plugin running in Premiere Pro, and I need them in 16-bit ARGB format. The data looks like it's laid out the same as AE 16-bit ARGB, but the values are really strange, apart from the alpha.
I think I just need a function to read the incoming Premiere pixels, equivalent to the usual AE 16-bit method below:
AE 16bit ARGB Version (works in After Effects)
PF_Pixel16* getPixel16 (PF_EffectWorld* inputP, const A_long x, const A_long y) {
return (PF_Pixel16*)((char*)inputP->data + (y * inputP->rowbytes) + x * sizeof (PF_Pixel16));
}
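For reference, the kind of thing I've been trying is roughly the sketch below. It just mirrors the AE version and assumes the world really is 8 bytes per pixel in A, R, G, B order, with top-down rows padded to rowbytes; if any of those assumptions is wrong (say the buffer is actually 8-bit or BGRA), the stride would be off, which might be exactly where the doubled-up look comes from:
// Sketch only: assumes PrPixelFormat_ARGB_4444_16u means 16-bit unsigned
// channels in A, R, G, B order, rows padded to inputP->rowbytes, top-down.
typedef struct {
    A_u_short alpha, red, green, blue;  // assumed channel order
} Premiere_ARGB_16u;

Premiere_ARGB_16u* getPremierePixel16 (PF_EffectWorld* inputP, const A_long x, const A_long y) {
    return (Premiere_ARGB_16u*)((char*)inputP->data + (y * inputP->rowbytes) + x * sizeof(Premiere_ARGB_16u));
}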
Does anyone have any ideas on an equivalent Premiere Pro version for getting/setting pixels in PrPixelFormat_ARGB_4444_16u or similar, please? I'm stumped!
cheers guys!
No actual solution, but I got sick of supporting all the various Premiere Pro pixel formats, so I only added
pfS->AddSupportedPixelFormat(in_data->effect_ref, PrPixelFormat_BGRA_4444_32f);
to my plugin code and no others. That forces Premiere Pro to do the conversion to 32-bit float for me, which makes life easier as a developer (at the expense of a bit of performance lost to the conversion).
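For context, I do this in my GLOBAL_SETUP handling, only when the host is Premiere. Roughly like the sketch below; the suite acquisition follows the AE SDK's Premiere-support helpers, so adapt it to however you acquire suites in your plugin:
// Sketch: when hosted by Premiere Pro, register only BGRA_4444_32f so the
// host converts whatever the sequence uses to 32-bit float for the plugin.
if (in_data->appl_id == 'PrMr') {
    PF_PixelFormatSuite1 *pfS = NULL;
    AEFX_AcquireSuite(in_data, out_data,
                      kPFPixelFormatSuite, kPFPixelFormatSuiteVersion1,
                      "Couldn't acquire pixel format suite", (void**)&pfS);
    if (pfS) {
        pfS->ClearSupportedPixelFormats(in_data->effect_ref);
        pfS->AddSupportedPixelFormat(in_data->effect_ref, PrPixelFormat_BGRA_4444_32f);
        AEFX_ReleaseSuite(in_data, out_data,
                          kPFPixelFormatSuite, kPFPixelFormatSuiteVersion1,
                          "Couldn't release pixel format suite");
    }
}
Then at render time there's only one layout to deal with: 32-bit float channels, nominally 0.0 to 1.0, in B, G, R, A memory order (as far as I can tell).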
How did you manage to get Premiere Pro to send you ARGB data, by the way? For me, it preferred BGRA in most cases.