Adobe's leveraging of the GPU just seems to be so flaky. I upgraded from an HD 6000-series card to an R9 280 (at reasonable expense!) to try to get some advantage out of the GPU, but still no luck.
Lightroom reports: "GPU acceleration is disabled due to error."
My setup is:
Lightroom version: 6.1.1 [ 1032027 ]
Operating system: Windows 7 Home Premium Edition
Version: 6.1 
Application architecture: x64
System architecture: x64
Logical processor count: 4
Processor speed: 2.4 GHz
Built-in memory: 8190.5 MB
Real memory available to Lightroom: 8190.5 MB
Real memory used by Lightroom: 1356.1 MB (16.5%)
Virtual memory used by Lightroom: 1318.5 MB
Memory cache size: 208.2 MB
Maximum thread count used by Camera Raw: 4
Camera Raw SIMD optimization: SSE2
System DPI setting: 96 DPI
Desktop composition enabled: Yes
Displays: 1) 1920x1200
Input types: Multitouch: No, Integrated touch: No, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
Graphics Processor Info:
Check OpenGL support: Failed
Vendor: ATI Technologies Inc.
Version: 3.3.13399 Core Profile Context 15.201.1151.0
Renderer: AMD Radeon R9 200 Series
Following the Adobe help pages: they have identified an 'issue' with the AMD Catalyst 15.7 package (which contains driver version 15.200.1062.1004) and recommend updating to the 15.8 Beta package (containing driver 15.201.1151.0). I've tried both, and GPU acceleration is still disabled due to errors.
My original install of LR6 did get GPU acceleration with AMD driver 14.501.1003.0, but I can't roll back to that now (and it caused some display errors in IE11).
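In case it helps narrow things down, here's a minimal standalone check I put together to see what the installed driver actually exposes, independent of Lightroom's own test. It's plain Win32/WGL in C with no extra libraries; the file name and build command are just examples (build with something like: cl glcheck.c opengl32.lib gdi32.lib user32.lib).

/* Minimal sketch: create a basic OpenGL context on a hidden window
 * and print the vendor/renderer/version strings the driver reports. */
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

int main(void)
{
    /* A hidden 1x1 window is enough to obtain a device context. */
    WNDCLASSA wc = {0};
    wc.lpfnWndProc   = DefWindowProcA;
    wc.hInstance     = GetModuleHandleA(NULL);
    wc.lpszClassName = "glcheck";
    RegisterClassA(&wc);

    HWND hwnd = CreateWindowA("glcheck", "", 0, 0, 0, 1, 1,
                              NULL, NULL, wc.hInstance, NULL);
    HDC hdc = GetDC(hwnd);

    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;

    int pf = ChoosePixelFormat(hdc, &pfd);
    if (!pf || !SetPixelFormat(hdc, pf, &pfd)) {
        fprintf(stderr, "No usable pixel format\n");
        return 1;
    }

    /* Legacy context is fine here; the reported version string still
     * shows what the installed driver exposes. */
    HGLRC ctx = wglCreateContext(hdc);
    if (!ctx || !wglMakeCurrent(hdc, ctx)) {
        fprintf(stderr, "Could not create an OpenGL context\n");
        return 1;
    }

    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(ctx);
    return 0;
}

If this prints the same 3.3-capable driver string shown above, the OpenGL side of the driver is presumably working and the failure is happening inside Lightroom's own check.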
Any help gratefully received.
Are you actually seeing slow performance in Lightroom's Develop module? If so, which actions in the Develop module are slow?
With a 1920x1200 monitor, you will see little to no benefit from GPU acceleration even with the best graphics card. I would turn off GPU acceleration (Edit > Preferences > Performance > uncheck "Use Graphics Processor") and not worry about it in Lightroom unless you get a larger monitor.