I just got a Zotac ZBOX AD02 (AMD E-350 APU w/ Radeon HD 6310) and in general it's working great. However, I have a problem with Flash video on HBO GO. For the first minute or so of streaming video, everything seems perfect. Video is high quality and smooth, CPU is around 50%. Then after a minute or two there will always be some visible video glitch and CPU utilization will jump up to 90-100% and playback becomes extremely choppy. So I'm guessing Flash hardware acceleration is working initially, then something happens that kills the hardware acceleration, and of course the E-350 CPU is not up to the task by itself.
The really strange thing is that it seems to be specific to HBO GO. I can stream HD all day long from Amazon Instant Video and YouTube (at least when YouTube is able to dish it out fast enough). And it's definitely not a bandwidth problem. I have Comcast cable (20Mb down), and a bigger computer I have hooked up to another TV has no problem with HBO GO. I think the same problem happens on the bigger computer, but its faster CPU can overcome the lack of hardware acceleration.
I've got the latest drivers from AMD and the latest Flash player (even tried the 11 beta) and have tried all the main browsers (FF, IE, Chrome), but it's always the same: 1-2 minutes of perfect playback and moderate CPU utilization, then jacked up CPU and choppy playback.
Unfortunately, there doesn't seem to be any way to contact HBO GO directly about this problem (support links just refer you to your cable provider), so I thought I'd give it a shot here. Is anyone else out there having this problem with HBO GO or any other Flash video sites? Any thoughts on what else I could do to track down the root cause?
Sorry to resurrect this thread, but it's the only place I found where people were already talking about this issue.
I just started using HBO GO (I usually watch HBO shows on satellite, but I've been looking through some of their back catalog), and I'm seeing this exact same behavior. I'm running a Core 2 Duo E6600 and a GTX 460, but the effect is the same: Video is smooth and CPU utilization is low for a couple minutes, then CPU usage spikes to 90-100% and the video playback gets choppy. Did anyone ever figure out what was causing this issue? It essentially makes HBO GO useless on my computer. I'm running the latest version of Flash (11.8) and recent video drivers, so I don't think either of those are the issue. Also, hardware acceleration works on all the other video sites with flash-based players that I've tried; HBO GO is the only site where I experience this issue.
EDIT: I feel quite silly now. I looked at the site code and found that HBO is using wmode transparent in their flash object, preventing any hardware acceleration. The bump in CPU utilization appears to simply be the player switching from SD to HD bitrates (which triggers about two minutes in, apparently, if you have a fast enough connection).
Sucks that others are having problems, but at least I know it's not just me now.
It's interesting that Eavesdown is having the same problem on Nvidia. I figured it might have been some kind of problem between Flash and the Radeon 6310 drivers/hardware, but if the exact same thing is happening on Nvidia, then it really must be a general problem with Flash and/or the way HBO uses it.
I never was able to resolve the issue, but I think I found a somewhat tolerable workaround. Just a few days ago I tried reverting to some old Flash versions as suggested in the link Bill posted. 10.3 had a weird image clarity problem (hard to describe without seeing it - almost like a blur effect), but 10.2 basically worked. I say "basically" because it's not perfect. 10.2 was able to stream HBO GO in HD with a good frame rate, but every 10-15 seconds there's a little jerk/chop in the video. It's annoying, but it's at least watchable, unlike the 11.x versions, which are unwatchable once you lose HW acceleration. I was using Firefox for this test.
I think as a workaround I'm going to see which 10.x version works best in IE and use that exclusively for HBO GO, since I don't use IE otherwise. Then I can keep the latest Flash version installed in Firefox as my main browser. I wouldn't recommend using the older Flash versions for general browsing due to all the security holes that have been fixed since then, but I figure HBO GO should be safe enough with the old version. I'll let you know how it goes.
Hopefully HBO is working on an HTML5 version of the site, but I'm not holding my breath.
To save others a few clicks, here are the links for the Flash uninstaller and archived versions:
We clearly think alike in our troubleshooting methods. I actually tried installing 10.2 for IE as part of a test earlier today, but the HBO GO site rejected me outright, saying it needed Flash Player 10.2 or better to run. I tried 10.3, as well, but received the same error, so it appears the Flash version detection script is broken when running against newer versions of IE. The site loads fine in IE if I'm running an 11.x version of Flash.
With regards to the root issue: The problem shouldn't be specific to any video cards or drivers, since the site isn't actually utilizing them. I have no clue why they're using wmode 'transparent' (as opposed to 'direct' or 'gpu', which allow hardware acceleration), since they don't need to composite the player with anything else on the site. Heck, the entire site is essentially just one large flash object, so they don't really need to worry about interaction with any standard HTML objects at all.
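To illustrate what wmode actually controls (hypothetical markup and helper, not HBO's actual embed code): it's just a param on the Flash object, and only two of its values leave the hardware-accelerated paths open:

// Build the <param> list for a Flash <object> embed with a given wmode.
// wmode values: "window" (default), "transparent"/"opaque" (software
// compositing with the page - no hardware acceleration), and
// "direct"/"gpu" (hardware-accelerated paths, Flash 10+).
function buildFlashParams(swfUrl, wmode) {
  return [
    '<param name="movie" value="' + swfUrl + '">',
    '<param name="wmode" value="' + wmode + '">'
  ].join('\n');
}

// Whether a given wmode leaves the hardware-acceleration paths open
function allowsHardwareAcceleration(wmode) {
  return wmode === 'direct' || wmode === 'gpu';
}

console.log(buildFlashParams('player.swf', 'direct'));
console.log(allowsHardwareAcceleration('transparent')); // false

So by picking 'transparent', the site opts the whole player out of acceleration for everyone, regardless of GPU or driver.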
I imagine the site works fine for people with fast processors (although my 2.4GHz dual core should be plenty powerful for HD video if it were encoded/rendered properly), and lots of people access it through apps (iOS/Android/Roku/X360) that don't even use the site's code, so they probably figure that it's not worth their time to fix the site for people with older computers.
As far as an HTML5 or Silverlight version of the site goes: Even the main (non-streaming) HBO site is written in pure Flash, so the HBO web designers are clearly very enamored of taking the Flash approach to media display. I wouldn't expect that to change anytime soon without an incredibly compelling financial reason.
Well, my idea with 10.x on IE didn't really work. Results were not nearly as good as in Firefox, and I'm not willing to use 10.x in my main browser.
However, based on your insight on the wmode=transparent issue, I started playing around with a Greasemonkey script to force wmode=direct, and I think it actually works. I haven't been able to try it at home yet on the problem PC, but here on my work PC (don't tell anyone I'm not actually working) it makes a massive difference in CPU utilization. Without the script CPU is around 80-90% during video playback. When I enable the script it drops to around 20%. The only issue is that I usually have to refresh the page once video playback starts to get wmode=direct to take effect (start video playback, then once the video playback page loads, hit refresh in the browser).
My script is just a modified version of this one I found:
// ==UserScript==
// @name Force flash wmode direct on HBO GO
// @namespace http://userscripts.org/topics/3090#posts-11620
// @description Force flash video playback on HBO GO to use wmode direct to allow hardware acceleration
// @include http://www.hbogo.com/*/video*
// @grant none
// ==/UserScript==
function nodeInserted() {
  for (var objs = document.getElementsByTagName("object"), i = 0, obj; obj = objs[i]; i++) {
    // Only touch Flash objects we haven't already fixed (re-inserting fires DOMNodeInserted again)
    if (obj.type != 'application/x-shockwave-flash' || obj.getAttribute("data-wmode-fixed")) continue;
    obj.setAttribute("data-wmode-fixed", "1");
    var skip = false;
    for (var params = obj.getElementsByTagName("param"), j = 0, param; param = params[j]; j++)
      if (param.getAttribute("name") == "wmode") { param.setAttribute("value", "direct"); skip = true; }
    if (!skip) {
      var param = document.createElement("param");
      param.setAttribute("name", "wmode");
      param.setAttribute("value", "direct");
      obj.appendChild(param);
    }
    // Re-insert the object so the plugin reloads with the new wmode
    obj.parentNode.replaceChild(obj.cloneNode(true), obj);
  }
}
document.addEventListener("DOMNodeInserted", nodeInserted, false);
Wow... we really think alike. I first tried a GreaseMonkey approach to fixing the wmode yesterday, but found that it didn't work consistently. Instead, I opted for using Fiddler (a web proxy debugger tool) to substitute my own customized version of the 'go.js' file that the site uses to generate the flash player object. That way, I didn't have to modify and regenerate the player object after the page loaded to engage the new wmode.
Unfortunately, the player would still seem to drop out of hardware acceleration mode from time to time. Perhaps there is something in the player's code that is causing the object to refresh itself into a transparent wmode when certain bitrate changes occur? I'd be curious to hear if your GreaseMonkey script works on your home PC that was having all the issues, since I couldn't seem to get the wmode to stick properly using a userscript technique.
If the GreaseMonkey trick isn't working at home, and you'd like to try my Fiddler approach, just send me a PM, and I can give you the updated code that I'm running through Fiddler (as well as a quick Fiddler tutorial, if you haven't used it before).
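Roughly, the substitution is just a string rewrite along these lines (a sketch, not the real file: the actual go.js is more involved, and matching a quoted 'transparent' literal is an assumption about how it sets the wmode):

// Rewrite the served go.js so the generated player object uses wmode=direct.
// Assumes go.js contains the wmode as a quoted "transparent" literal;
// the real file may spell it differently.
function rewriteGoJs(body) {
  return body.replace(/(["'])transparent\1/g, '$1direct$1');
}

// Illustrative input only - not HBO's actual embed call
var original = 'swfobject.embedSWF(url, id, w, h, "10.2", null, vars, { wmode: "transparent" });';
console.log(rewriteGoJs(original));

Fiddler then just serves the rewritten file in place of the site's copy whenever the browser requests go.js.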
EDIT: If you don't want to have to refresh your GreaseMonkey script when you go to a video page, just change the 'include' line to:
// @include http://www.hbogo.com/*
That way, it'll trigger no matter where you are on the site, and make the site render more smoothly, in general, since even the navigation is Flash-based.
No luck on the home PC. Like you said, the hardware acceleration won't stick. It's weird because I left the video running for a long time on my work PC (at least 20 minutes) and it maintained the low CPU utilization the whole time, so I know acceleration was working. Maybe Intel graphics on the work PC make a difference. But anyway, no such luck at home. BTW, I tried wmode=gpu as well with the same results: once you get it into accelerated mode it will run OK for a while, but it always loses acceleration at some point. Sounds like the Fiddler approach is behaving the same way. Bummer...
Not that it matters now, but I limited the greasemonkey script to the video page intentionally figuring there must be a reason they used transparent on the site. It would be even more infuriating to find out there was no reason to use transparent in the first place, which seems like it might be the case.
What browser have you been using? I've been using Firefox, but thought I might give Chrome a try with the Greasemonkey script. Maybe Chrome's embedded Flash player will behave differently. Probably not, but I'm running out of ideas at this point.
I've been using Firefox for most of my testing, as well, but I was going to try both Chrome and IE with my Fiddler approach later today (since it's a browser-independent solution). The wmode direct setting runs fine for the rest of the site with no issues (moving through "pages" on their site is really just manipulating layers of the flash object, so the video "pages" are essentially identical to the browse "pages"); it appears there's no real need for the wmode transparent anywhere on the site.
At this point, I'm guessing that the site may be straight up ignoring the wmode parameter set in the object (or resetting it to transparent internally the second any action is taken on the site), and that the low CPU utilization is only a result of low-bitrate streaming when the streams start up. I'm not sure if you see the same effect, but the player always runs fine when it's streaming the ugly, low-res picture for the first couple minutes, and immediately starts to choke when it switches to high res (with noticeable visual improvements). It would really be nice if HBO gave us a bitrate selector like Netflix does, so we could test that hypothesis, but I don't really expect them to improve the player any time soon, since it doesn't seem to be a significant part of their revenue model.
I know exactly what you're talking about with the low-def initial playback and then the switch to HD, but at work I'm definitely seeing long periods of HD playback and low CPU with wmode=direct. In fact, you can usually tell when direct is working because it seems to kick into HD right away; with transparent it often takes a couple minutes.
Basically, at work it seems to be functioning as you would expect in direct mode. Unfortunately, I can't sit here and watch an hour of TV at work, so I can't say for sure that it never loses HW acceleration, but it definitely works for much longer periods than at home. You'd think the crappy old Intel GPU I have at work would be about as far away from "accelerated" as you can get, but it seems to work.
It's very curious to me that the site is acting differently on the Intel GPU machine. I was running some more tests (no better luck with Chrome or IE, alas), and noticed that, despite wmode direct being set, StageVideo (the hardware acceleration pipeline for OSMF) never gets used on my machine. Can you check whether that's true on the Intel GPU machine? If you right-click on the video while not in fullscreen mode and select "Show Video Debug", it should show you the StageVideo status.
On my comp, the debug status shows StageVideo as being enabled in OSMF and supported by my version of Flash, but the HBO GO player always says "StageVideo is not being used, regular video 'probably' is." This suggests to me that HBO simply isn't using the StageVideo class in their streaming code, meaning there shouldn't be hardware acceleration available regardless of wmode settings. Based on the dev documentation for StageVideo, acceleration should also be engaging in fullscreen (if available) regardless of the wmode setting. Since I don't see any improvements in fullscreen mode, it seems highly unlikely hardware acceleration is available for the player in any shape or form.
I wonder if the improvements on the Intel machine may simply be a result of wmode direct being generally less resource intensive, making it easier for the player to be rendered on the CPU, instead of a benefit from hardware acceleration kicking in.
EDIT: I'm not sure why I didn't think of it earlier, but I just checked my GPU's video engine load during HBO GO playback, and it's not engaging at all, so I'm clearly not getting hardware acceleration at any point with the player, regardless of wmode. I think I'm going to throw in the towel here and give up on watching HBO via my computer. Roku boxes are pretty cheap these days, so I may look into grabbing one of those to hook up to my TV for HBO GO playback. It's still annoying that the HBO GO techs couldn't be bothered to configure the player in such a way that it allows for hardware acceleration.
Yeah, I'm with ya. I think I'm going to throw in the towel too.
When in direct mode on the work PC, I can't seem to get the video debug dialog to display. The option is there in the context menu, but nothing happens when I click it. Or maybe it does and there's some kind of layer/z-order problem that keeps it from showing on top of the video. In transparent mode, it's exactly as you describe. That's also what I see at home.
The Intel GPU is a real mystery. Unfortunately, I don't think there's any way to monitor the load on old Intel graphics to confirm for sure whether it's accelerating or not, is there? But the effect of wmode=direct is dramatic on the CPU. Like I said, around 20% utilization in direct mode and 2-4 times that in transparent. The image quality is high and seems the same in both modes. And it's 100% consistent: enable Greasemonkey, refresh the page, and CPU drops; disable it, refresh, and it jumps. Maybe you're right that it's just the overhead of transparent mode, but I wouldn't have guessed the difference would be that big. Which makes me think it's accelerating, but who knows? I actually wish it didn't work so well, because it leaves me with this nagging sliver of hope that I'm close to solving it, even though the more rational part of my brain is telling me it's not possible.
Oh well, thanks for trying to help solve this. At least it was kind of fun trying to figure out the Greasemonkey script. But man, it's frustrating to get nowhere after all the effort and know that some dev at HBO GO could probably fix it in 5 minutes.
The fact that you can't get the debug info to show (combined with the massive shift in CPU usage) on the Intel machine when using wmode direct makes it sound like something is happening to the display pipeline with wmode direct. I don't see any of those effects on my nVidia machines. I wish I had an onboard Intel GPU machine around here to test on.
I don't know if GPU-Z will monitor engine loads from onboard graphics or not, but you might try it on the Intel box, just for kicks. If it does happen to work, it could tell us definitively whether the player is somehow being hardware accelerated on that machine or not.
Can you test the site in Chrome (with no userscripts or fiddle proxy scripts) on your AMD machine to see if you get GPU acceleration on your Radeon (using the standard GPU-Z test), as well?
I suffer this problem too.
80-100% CPU. Chrome, plays ok till HD, Firefox plays ok till HD, IE does a little bit better using 64-bit flash, but eventually chops too.
This wouldn't particularly be a problem if HBO would allow the user to force standard definition... Silverlight does not have this issue.
OS is 64-bit Win 7 Pro, ATI on-board graphics, AMD Athlon x2 CPU.