Recently had to install a new video card. My last couple have been NVidia, but I've had quite a few problems, so I'm trying an AMD card this time. It's a Radeon RX 580. Not top of the line, but certainly no slouch either.
When I used Lr for the first time after installing the card (GPU use is turned on in Lr), I got some very bizarre things happening.
Browsing in the Library module was fine. However, when I went into the Develop module that's when things got weird. On some images, as soon as I entered the Develop module, the screen would just go black. On others, when I tried to use the Crop tool, the screen would go black. A screen grab is below. On some images, I could edit without any issue. The black screen is not camera specific, it's not date specific (i.e., images taken within a certain period), it doesn't seem to have any specificity at all.
If I go into Preferences and turn off GPU use, the problem goes away.
This has, I realise, been discussed in other fora (Pr very much). Adobe "recommends" the R9 series of cards from AMD. The R9 is an old line of cards; the current crop of AMD cards is the RX line. The idea that an alleged top software company like Adobe doesn't build its software to make use of current technology, and further that it builds its software to favour NVidia products, is absolutely ridiculous.
What the bloody hell is Adobe doing?
Is anyone else experiencing similar problems with AMD cards?
Could you please have a look at this article to learn more about unsupported GPUs in Lightroom: Adobe Lightroom GPU Troubleshooting and FAQ
Also, I believe Nvidia GPUs are more reliable in terms of application support; however, there are some AMD cards that work really well with Adobe applications.
Thank you for that cut and paste answer. Thank you for not addressing the actual issue of the post.
Yes, I've seen that article. That's how I'm aware that Adobe "recommends" the R9 line from AMD. That's how I'm able to make the comment I did about Adobe's software not being up to date with current technology.