Try searching the site for the error - you'll find lots of material.
But the short answer is - the card doesn't meet Lr's minimum spec.
So are you using a Mac or a Windows PC?
If a Mac, they are notorious for not using the discrete, add-on graphics card until you actually open a program that can and does want to use it.
But in the end, the GPU feature currently used in Lr isn't really ready for prime time, and in the vast majority of installs it is better to just turn it off completely.
One thing to know about graphics-card acceleration is that it actually slows down the brush tool by quite a bit when it is working, so be careful what you wish for.

That said, you should probably go into your graphics setup in Windows (this sounds like a Windows machine) and turn off the automatic switching between the built-in graphics and the discrete graphics card. That might let you use GPU acceleration in Lightroom. It speeds up many tasks in Develop, but not all, especially on high-resolution displays.

If you can't get it to work even with only the discrete card enabled, check whether you need driver updates for your card.
I found the driver update for Nvidia on their website, and I can use GPU acceleration now with the 960M ... but it's still not exactly smooth, and the brush tool still lags about 1 sec. Also, I checked my CPU usage and it often peaks at 100% when I'm using Lightroom actively. Is that normal?
If I don't use any GPU acceleration it's even slower...
I don't understand why it's so slow. Is it simply that driving the 4K display demands too much processing power?
This is my system info, if that helps:
Lightroom version: CC 2015.1 [ 1025654 ]
Operating system: Windows 8.1 Home Premium Edition
Version: 6.3 
Application architecture: x64
System architecture: x64
Logical processor count: 8
Processor speed: 2.5 GHz
Built-in memory: 16272.9 MB
Real memory available to Lightroom: 16272.9 MB
Real memory used by Lightroom: 990.1 MB (6.0%)
Virtual memory used by Lightroom: 972.9 MB
Memory cache size: 459.1 MB
Maximum thread count used by Camera Raw: 4
Camera Raw SIMD optimization: SSE2,AVX,AVX2
System DPI setting: 240 DPI (high DPI mode)
Desktop composition enabled: Yes
Displays: 1) 3840x2160
Input types: Multitouch: Yes, Integrated touch: Yes, Integrated pen: No, External touch: No, External pen: No, Keyboard: No
Graphics Processor Info:
GeForce GTX 960M/PCIe/SSE2
Check OpenGL support: Passed
Vendor: NVIDIA Corporation
Version: 3.3.0 NVIDIA 353.30
Renderer: GeForce GTX 960M/PCIe/SSE2
LanguageVersion: 3.30 NVIDIA via Cg compiler
2) Canon Tether Plugin
5) Leica Tether Plugin
6) Nikon Tether Plugin
Config.lua flags: None
Adapter #1: Vendor : 10de
That's unfortunately pretty normal. It takes a lot of horsepower to render images at the resolution you are working at, and this is not a very high-powered GPU. If you run at 1/2 resolution you'll see it is much faster, since it only needs to calculate 1/4 of the data. This sort of raw processing of high-megapixel cameras on high-resolution displays is at the edge of what you can do.
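The half-resolution point is just pixel arithmetic; here is a quick sketch (the 3840x2160 figure is taken from the system info posted above):

```python
# Halving each display dimension quarters the number of pixels
# Lightroom has to render on every preview update.
full_w, full_h = 3840, 2160          # the 4K display from the system info
half_w, half_h = full_w // 2, full_h // 2

full_pixels = full_w * full_h        # pixels rendered at full resolution
half_pixels = half_w * half_h        # pixels rendered at 1/2 resolution

print(full_pixels)                   # 8294400
print(half_pixels)                   # 2073600
print(full_pixels / half_pixels)     # 4.0 -- a quarter of the data
```

So a 4K screen asks the GPU for roughly four times the work of a 1920x1080 one for the same image view.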