I have an ATI Radeon HD 2400 Pro card, and it supports H.264 hardware decoding. How come only ATI Radeon HD 4xxx cards and higher get support for H.264 hardware decoding in Flash Player 10.1?
Adobe won't answer you because they don't care. Unless they can line their pockets by doing so, they won't help you. It's terrible.
Yeah, thanks for the info, man, but I'd kind of figured that already after a couple of months with no response. In any case, a little research of my own led me to upgrade to an ATI Radeon HD 4xxx series card. Basically, the ATI Radeon HD 2xxx/3xxx cards with the UVD decoder don't have complete enough MPEG-4/H.264 hardware decoding support, so you need cards with fuller MPEG-4/H.264 hardware support, which means cards with the UVD2 decoder: the ATI Radeon HD 4xxx and higher. The funny thing is that when I asked ATI tech support about this a while back, they didn't know anything about Adobe Flash Player support, even though one of their most recent video card driver updates was said to include support for the Adobe Flash Player 10.1 beta. ATI tech support also told me to contact Adobe. If Adobe doesn't want to answer questions about the system requirements for their software, it's their loss, because that kind of info would help users understand the video card requirements. I think it's going to become a lot tougher to play high-quality Flash video on older video cards as HD Flash video becomes more and more common, and I think the Flash Player 10.1 release may suffer when a lot of users can't use hardware acceleration because they don't meet the minimum requirements of the newer video cards.
Did a bit of research, man, looking into that whole UVD argument, and it doesn't hold up. The desktop 3xxx series cards have UVD+ whereas the 2xxx series only has UVD, which would mean that running 10.1 on a 2xxx card would be a big ask. Which is fine, since the 2xxx series is really old.
Now what I also found out is that the Mobility Radeon 3xxx series is also UVD+, so the desktop 3xxx series should be supported too, since Adobe themselves state that the integrated 3xxx series is supported.
I also found out that the HD 3200 chips integrated into motherboards, which are also apparently supported, only have plain UVD (not UVD+ or UVD2).
All the info i found was on the following wiki link:
http://en.wikipedia.org/wiki/Unified_Video_Decoder
Seems like either Adobe or AMD is holding out on everyone. It may even be both.
OK, maybe I forgot to include UVD+ there, but you get the idea.
I don't really get the idea, man. The desktop version of a 3870 should have a lot more grunt than an integrated HD 3200, so it seems to me that Adobe has no intention of trying to support the 3xxx desktop cards. Maybe ATI said they would only cooperate if Adobe set the bar high, so that more people would buy the newer hardware. Purely speculation, but there is a distinct possibility.
It would make sense for them to make it possible for people. Unless, of course, it's not compatible with the technology they use?
I don't know what to say.
All I know is that ATI's UVD2 hardware decoder has fuller H.264 bitstream hardware decoding support than UVD or UVD+, which seems to me to be why Adobe Flash 10.1 and ATI are going with a minimum card that supports UVD2: the Radeon HD 4350 and newer.
This wiki site has summarized a whole listing of ATI video cards/chipsets:
http://en.wikipedia.org/wiki/Comparison_of_ATI_Graphics_Processing_Units
Here's another wiki page summarizing the UVD technology and its different versions:
http://en.wikipedia.org/wiki/UVD
I already got a Radeon HD 4xxx card because I don't know if Adobe Flash 10.1 will ever actually support a Radeon HD 2xxx or 3xxx card, and now I'm just OK with it.
You see, a lot of people can't afford to just go out and get a 4xxx series card, especially the ones with AGP 3xxx series cards, since the 4xxx cards available for AGP are heavily overpriced and not as powerful, performance-wise, as the 3xxx series.
I haven't actually seen anywhere that the embedded HD 3xxx series has UVD2; on that Wikipedia page it's listed as having just plain UVD on its own.
Flash Labs is getting 10.1 running on much worse hardware. I can't see why they can't get it running on the 3xxx desktop series. Perhaps they are lazy, or simply don't have the time, incentive ($money$), or talent (sounds harsh, but I couldn't think of a better word) to get it working. Adobe Labs, please prove me wrong; please get this working on the older hardware (not the 2xxx series, because that would be silly).
I'm also curious why I can't use my HD 3xxx card's hardware acceleration in Flash, when I use it without problem to decode HD movies.
I can't find the list of supported video cards on the 10.2 release notes page.
Hi, it would be better if you started a new thread, since this one contains a lot of older info.
I don't know if this has what you need or not, but take a look. I think there was another, more current thread on this if you want to look for it.
http://www.adobe.com/products/flashplayer/systemreqs/index.html
eidnolb
They don't have full H.264 hardware decoding support in the single component Flash needs, i.e. the UVD2 hardware video decoder found on ATI (or AMD now) Radeon HD 4xxx and higher graphics cards. The ATI Radeon HD 2xxx/3xxx graphics cards have the UVD or UVD+ hardware video decoder component; UVD/UVD+ supports hardware decoding of H.264, but not fully, because the video post-processing for H.264 isn't handled by the UVD/UVD+ hardware component. Instead, on cards with UVD/UVD+, the H.264 video post-processing is done by the card's 2D/3D graphics shader hardware. On the Radeon HD 4xxx and higher, the UVD2 hardware video decoder component fully supports H.264 decoding, including video post-processing, separate from the 2D/3D graphics components.
http://en.wikipedia.org/wiki/UVD
You'll notice that in Adobe Flash Player 10.2, the flash video decoding and the 2D/3D graphics rendering are done completely separately, which allows for lower CPU usage (although how much depends on what kind of graphics hardware support you have, of course). So, because Flash Player (starting with 10.2) does video decoding and 2D/3D graphics separately, and because ATI Radeon HD 2xxx/3xxx cards with UVD/UVD+ can't do H.264 decoding in the UVD/UVD+ hardware decoder separately from the 2D/3D hardware (since the H.264 video post-processing is done in the 2D/3D graphics shaders), I believe this is why they require Radeon HD 4xxx cards and higher.
Basically, Radeon HD 4xxx cards and higher have full H.264 decoding support in the UVD2 hardware component, which is completely separate from the card's 2D/3D graphics components, and Flash Player (10.2 and higher) is meant to require graphics cards whose H.264 hardware video decoding is completely separate from the 2D/3D graphics hardware. As I said, cards with UVD/UVD+, like the Radeon HD 2xxx/3xxx, don't do H.264 video decoding entirely within the UVD/UVD+ hardware video decoder, separate from the 2D/3D graphics hardware components of the cards.
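To summarize the argument above as a sketch: here is a hypothetical capability check. The card names come from this thread, but the fields and the rule are invented for illustration; this is not Adobe's or AMD's actual logic, just the criterion the post describes (decode and post-processing must both live in the dedicated decoder block, off the shaders).

```python
# Hypothetical capability model of the argument above (illustrative only).
CARDS = {
    "Radeon HD 2400": {"decoder": "UVD",  "pp_on_shaders": True},
    "Radeon HD 3850": {"decoder": "UVD+", "pp_on_shaders": True},
    "Radeon HD 4350": {"decoder": "UVD2", "pp_on_shaders": False},
}

def qualifies_for_flash_accel(card: str) -> bool:
    """Per the reasoning above: the video decoder block must handle both
    H.264 decode and post-processing without borrowing the 2D/3D shaders."""
    return not CARDS[card]["pp_on_shaders"]

print(qualifies_for_flash_accel("Radeon HD 3850"))  # prints False
print(qualifies_for_flash_accel("Radeon HD 4350"))  # prints True
```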
OMG, I googled my way to this thread and saw the last post was from February 24th. I thought it was 10 days ago, LoL.
Anyway, just to check one thing before I join a more recent thread. If I got you right, you're saying that:
- Flash does decoding and postprocessing separately
- UVD/UVD+ does exclusively decoding (while GPU is doing PP)
- UVD2 is doing both decoding and postprocessing
I can see it would be a problem if Flash were doing decoding+PP in the same component; then UVD+ wouldn't physically support that.
But since Flash is already doing those tasks separately, I can't see a reason why DXVA couldn't be implemented on UVD+.
No, it's not Flash that does decoding and post-processing separately. It's the ATI cards with the UVD/UVD+ decoder, like the Radeon HD 2xxx/3xxx, that do the video decoding and video post-processing separately: most of the H.264 video decoding for flash video happens in the UVD/UVD+ decoder hardware component, and then the H.264 video post-processing happens in the card's 3D/2D graphics shader hardware component.
Flash Player 10.2 and higher handles flash video (video decoding and video post-processing together) in one hardware component (like UVD2), separate from the 3D/2D graphics rendering (which is not video post-processing) done by another hardware component on the card. So basically, ATI UVD2 cards are required because Flash Player is now designed so that flash video (video decoding + video post-processing together) is done in one hardware component, completely separate from the flash 3D/2D graphics rendering done in the card's other hardware components.
Keep in mind, Flash Player 10.2 renders the flash video layer and the flash 3D/2D graphics layer as two separate graphics layers.
Oh, thank you very much for being patient enough to explain that again.
I hadn't read your sentence carefully; I just saw:
"If you guys notice that in Adobe Flash Player 10.2, the flash video decoding and the 2D/3D graphics rendering are done completely separately which allows for lesser CPU usage"
DivX HiQ works fine on my HD 3850. Obviously Adobe is willing to let a competitor step in and take away some of their market share.
I think it's a bit unfair to compare DivX to Adobe Flash, because DivX is just video and Adobe Flash is not just video but also 3D/2D graphics. In Adobe Flash Player 10.2 and higher, the player is designed to render the video and the 3D/2D graphics completely separately, in two layers, using two completely independent hardware components on the graphics card. Graphics cards with the UVD/UVD+ decoder component, like the HD 3850, render flash video using both the UVD/UVD+ decoder component and the 3D/2D graphics component, so flash video cannot be rendered in hardware components separate from the flash 2D/3D graphics. The UVD2 component on Radeon HD 4xxx and higher cards allows flash video to be rendered on the UVD2 component completely independently of the other hardware component rendering the flash 3D/2D graphics on the same card.
Well, the original poster was asking about H.264 video acceleration, not 2D/3D acceleration for flash games. With regard to video acceleration, DivX HiQ works quite nicely. I doubt there will ever be a solution for 2D/3D flash games, because of the latency that would be introduced.
Well, Flash Player 10.2 completely separating the flash video rendering process from the flash 3D/2D graphics process is their solution for improving that latency. With flash video and flash 3D/2D graphics being rendered separately and completely independently of each other in Flash Player 10.2, I've noticed lower CPU usage. The original poster was talking about H.264 video acceleration, but Flash Player 10.2 is designed for separate handling of both flash video and flash 3D/2D graphics, which is why the required graphics card must handle flash video and flash 3D/2D graphics independently of each other, in separate hardware components on the card.
Well, when I run DXVA Checker, I get that my card is capable of: ModeH264_VLD_NoFGT.
The acceleration modes work like this: the higher the letter, the more acceleration there is:
Mode A: post-processing
Mode B: motion compensation
Mode C: inverse discrete cosine transform
Mode D: variable-length decoder (full bitstream decode acceleration, except for MPEG-2 where it means something else)
FGT is film grain technology, which allows film grain to be reintroduced into video that has had it removed.
I was told that:
"You have H.264 full acceleration upto 1080p ( Mode D ), WMV9 Mode A (post-proc) upto 720p, VC1 in Mode D,VC-1 full bitstream decode ( Variable Length Decoder, Mode D ) and MPEG-2 Modes C & D ( iDCT for unencrypted and encrypted video )...."
I still can't see why YouTube wouldn't be able to run accelerated on the Radeon 2xxx and 3xxx series.
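As an aside, the profile-naming convention described above can be sketched as a small parser. This is just an illustration of the string format as explained in the list; the helper function and its mapping are invented for this example and are not a real DXVA API.

```python
# Illustrative sketch: interpret a DXVA profile string such as
# "ModeH264_VLD_NoFGT" using the mode descriptions given above.
# (Invented helper, not part of any real DXVA library.)

ACCEL_LEVELS = {
    "PostProc": "post-processing only",               # Mode A
    "MoComp":   "motion compensation",                # Mode B
    "IDCT":     "inverse discrete cosine transform",  # Mode C
    "VLD":      "variable-length decoding (full bitstream)",  # Mode D
}

def describe_profile(name: str) -> str:
    """Turn a profile string like 'ModeH264_VLD_NoFGT' into a readable summary."""
    parts = name.removeprefix("Mode").split("_")  # e.g. ["H264", "VLD", "NoFGT"]
    codec = parts[0]
    level = ACCEL_LEVELS.get(parts[1], parts[1])
    fgt = "without film-grain technology" if "NoFGT" in parts else "with film-grain technology"
    return f"{codec}: {level}, {fgt}"

print(describe_profile("ModeH264_VLD_NoFGT"))
# prints: H264: variable-length decoding (full bitstream), without film-grain technology
```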
They can't run accelerated because Flash Player is now designed to do flash video and flash 3D/2D graphics completely separately from each other. The Radeon HD 2xxx/3xxx series can't do flash video and flash 2D/3D graphics separately from each other, because the H.264 post-processing needed for flash video can't be done separately from the 2D/3D graphics hardware components and is dependent on them. As I said, in Radeon HD 4xxx and higher cards, all the H.264 decoding + post-processing for flash video is done in the UVD2 decoder, completely separate from and not dependent on the 3D/2D graphics hardware on those same cards.
So if the H.264 video post-processing on Radeon HD 2xxx/3xxx cards weren't done in the 3D/2D graphics hardware components, and were done only in the UVD/UVD+ video decoder component instead, then the Radeon HD 2xxx/3xxx cards would have supported acceleration.
My video card has the capability to perform FULL hardware acceleration of H.264 by itself.
I don't care which part of it is done by the UVD chip and which part by the shaders.
It's all done by the video card's hardware, and that is the point of DXVA: to decrease the CPU load.
MPC-HC, on the other hand, plays 1080p smoothly. They have done their job well.
So, my card has the capability, and Adobe's software developers don't know how to use it.
It's as simple as that.
Post-processing is anyway all about noise reduction, image format, sharpness, and deinterlacing.
It doesn't need to be accelerated; it can be tweaked in CCC.
Yes, DXVA does decrease the CPU load, and it is all done by the card (no doubt your card has the HD video capability). But Flash Player 10.2 and higher is now designed to render the flash video and the flash 3D/2D graphics as two separate graphics layers, parallel and independent of each other, which helps decrease the CPU load even more. It's all on the same graphics card, but even with DXVA, rendering the flash video layer and the flash 3D/2D graphics layer as two separate, parallel, independent layers has to be done in two separate hardware components on the card, as the latest Flash Player is designed.
In other words, for the flash video and flash 3D/2D graphics to be done separately, the UVD hardware component that renders flash video can't share the flash video decoding + post-processing with the hardware components responsible for the flash 2D/3D graphics; otherwise, Flash Player can't use the card's hardware to render the flash video layer and the flash 3D/2D graphics layer as two separate layers and get you lower CPU usage. Even though H.264 post-processing can be tweaked in CCC, it's still part of the hardware, and on Radeon HD 2xxx/3xxx cards it's still not independent of the 3D/2D graphics shader hardware that renders the flash 2D/3D graphics.
Bottom line: Radeon HD 2xxx/3xxx cards don't get hardware acceleration in Flash Player 10.2 because they don't have a hardware component that does flash video decoding + post-processing independently, separately, and in parallel with the hardware that renders the flash 3D/2D graphics, and that doesn't work with the design of Flash Player 10.2, which renders flash video and flash 3D/2D graphics as two completely separate, parallel, independent layer rendering processes.
I understand that it does NOT support it.
I am curious WHY it doesn't support it, when the card has all the hardware needed. Data is being offloaded from the CPU to the video card; the GPU says, "OK, I can offload all decoding to the UVD and do only the PP." But Adobe says, "No, we forbid the GPU to pee-pee."
Why does Flash want to know where the PP is taken care of? If it's one IC (UVD2), it says yes, and if it's the other IC (the GPU), it says no.
Just one year ago we were happy when MCP was enabled to use GPU shaders for PP.
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=407&Itemid=38&limit=1&limitstart=9
One day, some guy will come up with some goofy PP add-on that needs UVD4 to run, and no matter whether you want it or not, you won't be able to use DXVA on your 5xxx card anymore. That's not OK from my point of view.
"AMD has also stated that the UVD component being incorporated into the GPU core only occupies 4.7 mm²"
So a bunch of people are suffering because that unit is lacking an additional 2 mm².
Can one reserve, like, 40 stream processors just for PP purposes, so that one can run DXVA in Flash, PowerDirector, and other HD video..... LoL
A new version is out, but still no support for all ATI HD cards.