I wonder if there is something wrong here. I have a decent machine (i5-2500K overclocked to almost 5 GHz, GTX 570, 8 GB RAM, Lightroom installed on an SSD).
LR5 served me well for a long time, and despite being slow at some tasks it was fine. LRcc, on the other hand, offers very few new features and a big drop in performance.
I'm editing some 20 MP Canon raw files, and the simple task of cropping/rotating/panning has become a nightmare. It is slow, and something seems very wrong when I try to recrop a virtual copy of an image: on top of moving very slowly, the crop rectangle sometimes keeps resetting its position and "refuses" to stay where I put it.
When I zoom and pan in the Library module everything is fine; it only gets bad in the Develop module. If I zoom to 1:1 on an image with some spot removal adjustments, it feels like I'm panning a gigantic file on a 20-year-old computer. Even on a clean image with no adjustments at all, the Develop module struggles badly to pan when zoomed.
My catalog has around 40,000 files, but that shouldn't matter when developing a single photo. Again, the same catalog ran on LR5 without any of these issues.
I checked the computer's resource usage, and nothing is actually being pushed to the limit. Total memory usage never goes above 5 GB (out of 8 GB) and CPU usage doesn't rise above 40%. It feels like LRcc is not making proper use of the computing power available.
Just replying to myself here: I tried turning off the "Use Graphics Processor" option under Preferences > Performance, and performance improved immediately. It's now on par with LR5.
I am using Nvidia driver version 350.12 (the latest as of today) with a GTX 570. Apparently there is an issue with the way LRcc uses my video card. I hope this gets fixed, but at least my LRcc is usable now.