gigabyte is one of the top manufacturers for parts, along with asus, but didn't seem to have good features with their x99 boards. looks like that's changed and this could be a great board, especially at its price. to find an asus board that could do 4-way sli would cost almost twice as much as this one. i also like the reinforced pcie slots for big/heavy gpus. i looked at the manual, and it doesn't clearly say whether any pcie slots share cpu pcie lanes with the m.2 slot like the asus boards do. it does say that raid is disabled when using m.2, which might turn off some people.
I own a Gigabyte X99P-SLI (I even have a review on Newegg).
Not sure if my insight will be useful to people, but in short I must say that I am impressed with this board. I have built over 30 computers and work part time in the film/media industry doing photo editing, video editing, and After Effects work. With that background, I can say this motherboard has all the features you could need (and more) to accomplish your media/creative tasks.
Pros:
1) Expandable to 128GB of memory (8 slots)
2) Support for the latest M.2 standard (PCI-E 3.0 x4) - only a few X99 boards have this; there are one or two from MSI and one or two from EVGA.
3) Supports upcoming LGA 2011-v3 processors with the latest BIOS update
4) Thunderbolt 3.0 and USB 3.1 (one of the first based on the Intel controller)
5) Metal-reinforced PCI-E slots
6) 4 full-length PCI-E x16 slots
7) Stable and reliable
8) Isolated audio channels on motherboard
9) USB 3.1 type C and type A
10) Configurable LED lighting along the audio trace path (just something cool)
11) Worth the $250 or less price tag
12) Good CPU heatsink clearance with the motherboard heatsinks.
Cons:
1) The BIOS was initially an issue. It was slow and clunky, but with the latest F20 BIOS I noticed a substantial improvement. They also fixed the XMP profile issue with AUTO CPU mode.
Overall this is a great motherboard with the latest-gen technology on an X99 platform. Working with video editing and After Effects, I have noticed no hiccups or slowdowns, no blue screens or restarts. If you are looking for an expandable motherboard with the newest features, then this is a no-brainer.
I like some of the claims for this board and may buy one, but have to correct one wrong impression above:
"2) Support for latest M.2 standard (PCI-E 3.0x4) - Only few X99 boards have this. There is one or two from MSI and one or two from EVGA".
All the X99 motherboards from ASUS fully support these new super-fast SSD devices and provide this kind of performance. With our PPBM disk I/O benchmark we can achieve a write rate of 1500 MB/s when using one of these devices.
gigabyte's website has this notice for the motherboard, but i do not see any info about it in the manual:
* Theoretical Bandwidth, M.2 Bandwidth may vary by CPU model.
one reviewer on newegg said gigabyte sent him this message:
Please check the CPU as well as the 5820K bandwidth is limited as compared to the 5930K CPU. On our testing for this board for this Samsung M.2:
If using 5930K, the speed is around 2591MB/s.
If using 5820K, the speed is around 1671MB/s.
this suggests poor cpu pcie lane assignment for the 28-lane cpu. the manual doesn't specifically say x8/x8/x8 for the 28-lane cpu, but it's suggested by the expansion slot notes and the support for 3-way sli with the 28-lane cpu. the last x4 allocation is unclear to me, as i couldn't find any info in the manual. if it's assigned to the last pcie slot, then a x4 m.2 pcie adapter could be used to let the 28-lane cpu get around the m.2 slot issue. the message from gigabyte sounds like they had to test it just to figure out what it's supposed to do. the poor documentation and design worries me about this board and company.
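For anyone wanting to sanity-check those numbers, here is a rough back-of-the-envelope sketch (my own illustration, not anything from Gigabyte's docs) of theoretical PCIe 3.0 bandwidth per link width:

```python
# Theoretical PCIe 3.0 bandwidth: 8 GT/s per lane, 128b/130b encoding.
def pcie3_bandwidth_mb_s(lanes: int) -> float:
    """Theoretical one-way PCIe 3.0 bandwidth in MB/s for a link width."""
    per_lane = 8e9 * (128 / 130) / 8 / 1e6  # ~984.6 MB/s per lane
    return lanes * per_lane

x4_ceiling = pcie3_bandwidth_mb_s(4)  # ~3938 MB/s
x2_ceiling = pcie3_bandwidth_mb_s(2)  # ~1969 MB/s

# Gigabyte's reported speeds fit under these ceilings: 2591 MB/s (5930K)
# is below the x4 limit, and 1671 MB/s (5820K) is below the x2 limit.
# That is consistent with (but not proof of) the m.2 link dropping to
# x2 on the 28-lane cpu.
print(f"x4 ceiling: {x4_ceiling:.0f} MB/s, x2 ceiling: {x2_ceiling:.0f} MB/s")
```

If that x2 guess is right, it would explain the gap without any drive-side difference, but only Gigabyte's block diagram could confirm it.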
I just bought an m.2 ssd and 10 hard disks for the Gigabyte x99p-sli. Can I raid 8 hard disks as a media drive and run the rest individually as operating system, cache, and output/preview/auto-save drives? Hopefully you experts can answer my question.
Thanks in advance.
the 10 sata ports are split 6 + 4, and the last 4 do not allow raid. so i believe you can only use up to 6 drives in raid. if 6 drives is too little capacity, you may have to buy a raid card or return the 8 hdds for 6 larger ones.
there is also this note on newegg for the motherboard:
Support for RAID 0, RAID 1, RAID 5, and RAID 10
* Only AHCI mode is supported when an M.2 PCIe SSD or a SATA Express device is installed.
i couldn't find this on gigabyte's website or in the manual. if it's true, you might get around it by using a pcie m.2 adapter card and avoiding the m.2 slot on the motherboard, or by using a raid card for the hdds.
So if I buy a pcie raid adapter card, I can use an additional 2 hdds with those 6 hdds in raid mode, right? The raid adapter would only use a pcie x1 slot. I won't use an m.2 ssd adapter because I would lose a pcie x16 slot, and the board has 4 other sata connectors on board, if the following message is true:
Support for RAID 0, RAID 1, RAID 5, and RAID 10
* Only AHCI mode is supported when an M.2 PCIe SSD or a SATA Express device is installed.
I saw this message listed under the internal storage interface section of the motherboard manual, but I'm not sure it's still valid.
hardware-based raid cards will have one or more sas ports that support 4 sata drives per sas port. newer ones usually have mini-sas ports such as the SFF-8643. these raid cards support drives in multiples of 4, such as 4 hdds, 8 hdds, 12 hdds, etc. so you will be able to buy one that will work with up to 8 hdds. areca is a good brand, often recommended and used by people in this forum.
these hardware raid cards use x8 pcie lanes, so they will have to go in one of the x16-size slots, not the x1 slots. sata/raid cards that use those x1 slots are cheap/junk raid cards. it may also depend on which cpu you are using, as the i7-5820k and i7-6800k only have 28 lanes vs the rest of the more expensive i7s or xeons that have 40 pcie lanes. if you have the 28-lane cpu, that motherboard can still support 3 cards in the top 3 x16-size pcie slots, splitting the 28 lanes to x8/x8/x8. the motherboard manual shows the different pcie lane configurations for the different cpus.
if you haven't used a raid card before, you might want to do some research. if you are using 8 large hdds, you may want to use raid 6 or better to lower the risk of a complete raid failure. you will also want to have a UPS powering your computer, one that is also connected by usb and can shut the computer down if the power is out and the UPS battery is low.
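To make the raid 6 suggestion concrete, here is a quick sketch of usable capacity per raid level; the 4 TB drive size is just a hypothetical example:

```python
def usable_tb(drives: int, drive_tb: float, level: str) -> float:
    """Usable capacity in TB for common parity raid levels."""
    parity_drives = {"raid0": 0, "raid5": 1, "raid6": 2}[level]
    return (drives - parity_drives) * drive_tb

# Hypothetical 8 x 4 TB array:
# raid0 gives 32 TB but one failed drive loses the whole array;
# raid5 gives 28 TB and survives one drive failure;
# raid6 gives 24 TB and survives any two failures, which matters
# during the long rebuild window of a large array.
for level in ("raid0", "raid5", "raid6"):
    print(level, usable_tb(8, 4, level), "TB")
```

The point is that raid 6 only costs one extra drive of capacity over raid 5, and with 8 large hdds the rebuild time is long enough that a second failure mid-rebuild is a real risk.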
I would prefer to use the 6 onboard raid connectors in conjunction with two ports from a raid adapter in a pcie x1 slot. Is that possible? I don't want to lose a pcie x16 slot, because I may need to install 4 GPUs in the end to speed up the rendering process.
BTW, I use an i7-6850K, so it won't have the lane restriction.
you must be using some other software besides adobe to require 4 gpus; premiere is usually fine with just one.
which gpu's are you planning on using? and have you already purchased them?
the only way you could mix and match different sata ports to get to 8 hdds and still raid would be software raid. last time i looked, there weren't any good options for raid 6 or better on windows. Microsoft has been trying to copy ZFS for over a decade, but last i heard it's still not reliable. so short answer, no.
if you don't want to run 3 gpus plus a raid card, and 6 drives on motherboard raid isn't enough, your options may be limited to an external setup using thunderbolt or the network. a 10gbe nas would be fast, but your motherboard only lists a slower 1gb ethernet port. there are thunderbolt-to-10gbe adapters, but they are also very expensive. promise technology makes some thunderbolt storage devices that have built-in raid cards and hold 8 hdds, but they are also very expensive.
Premiere only works with one GPU when editing, but works with multiple GPUs when rendering. That's good when you have lots of effects on the timeline.
Ok, I have to sacrifice 2 hdds, so I only have 6 left. Do you know how much speed difference there is between 6 hdds in raid 0 and raid 5? 50 MB/s or so, in terms of reading?
it might be well over 100 MB/s; it depends on the speeds of the drives, which also depends on how full and fragmented they are. motherboard raid will not be as fast as a real raid card, so you might run into limitations with the motherboard raid that cap performance even lower.
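As a rough illustration of why the gap can exceed 100 MB/s, here is a sketch of sequential-read ceilings; the 150 MB/s per-drive figure is a hypothetical assumption, and motherboard raid overhead will push real numbers lower:

```python
def seq_read_ceiling(drives: int, per_drive_mb_s: float, level: str) -> float:
    """Very rough sequential-read ceiling in MB/s; real throughput
    depends on the controller, stripe size, and drive fullness."""
    if level == "raid0":
        return drives * per_drive_mb_s
    if level == "raid5":
        # parity blocks are skipped on reads, so roughly one drive's
        # worth of throughput is lost versus raid0
        return (drives - 1) * per_drive_mb_s
    raise ValueError(f"unhandled level: {level}")

# 6 drives at a hypothetical 150 MB/s each:
raid0 = seq_read_ceiling(6, 150, "raid0")  # 900 MB/s ceiling
raid5 = seq_read_ceiling(6, 150, "raid5")  # 750 MB/s ceiling
print(f"raid0 {raid0:.0f} vs raid5 {raid5:.0f}: gap {raid0 - raid5:.0f} MB/s")
```

Under those assumptions the read gap is roughly one drive's worth of throughput, so with modern 150+ MB/s drives it lands well over 100 MB/s.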
were you still planning on using the m.2 ssd? and have you checked to see if using it disables the motherboard raid?
premiere is a cpu-based program, with gpu support second. so you really have to be doing something special to need more than one or two gpus for premiere, especially with only a 6-core cpu, which is going to limit what premiere can do. people using two gpus usually need the second for red raw debayering.
If it's over 100 MB/s, then I will use raid 0. I have the m.2 in front of me, but the motherboard manual and the online features on the Gigabyte site clearly say only AHCI mode is available when an M.2 or sata express device is populated, so I haven't tried. Should I give it a try?
True, when the 6-core cpu reaches its limit, the number of GPUs is limited too, but I have two old cards already and plan to purchase the upcoming next-gen geforce card, so I will have 3 minimum for sure; the fourth one may or may not be plugged in.
I am thinking that if I use one pcie x16 slot (running at x8 speed) for 8 hdds in raid, the speed will be limited to x8 pcie speed, which is half of x16. I guess a raid card and onboard raid will perform similarly if the drives are all ssds, maybe slightly differently for spinning hdds.
i looked at the newegg reviews for your motherboard, searched for raid, and there were some reviewers that had an m.2 drive and raid in their comments. so maybe it does work? if you already have the m.2 drive opened and/or are planning on keeping it, it might be worth testing. if you were using the m.2 drive as the os/apps drive, it would be overkill, and you might want to return it. a sata ssd like the samsung 850 will be fine for os/apps.
if you mix and match different-speed gpus, say two older gtx 670s and a faster new gtx 1070, the older cards will hold back the new card(s). so you might as well get another used card of the same speed as the ones you already own, or sell the old ones and go with 1 or 2 new fast gpus.
pcie x8 would be rated at around 7.8 GB/s of bandwidth; you wouldn't get close to that even with 8 ssds. also, if you don't have nas or enterprise drives, you might as well use motherboard raid 0 with 6 drives. if they are nas or enterprise drives, using 8 on a hardware raid card should perform better, and it also gives you more parity raid configurations to choose from.
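A quick sketch of that headroom claim (the per-drive speeds are hypothetical assumptions):

```python
PCIE3_PER_LANE_MB_S = 984.6  # 8 GT/s with 128b/130b encoding

def saturates_slot(drive_count: int, per_drive_mb_s: float,
                   lanes: int = 8) -> bool:
    """Whether summed drive throughput would exceed the slot's ceiling."""
    return drive_count * per_drive_mb_s > lanes * PCIE3_PER_LANE_MB_S

# Hypothetical drive speeds: ~550 MB/s per sata ssd, ~180 MB/s per hdd.
# Neither 8 ssds (4400 MB/s) nor 8 hdds (1440 MB/s) come close to the
# ~7877 MB/s ceiling of a pcie 3.0 x8 slot.
print(saturates_slot(8, 550), saturates_slot(8, 180))
```

So an x8 slot for the raid card leaves plenty of headroom; the drives, not the slot, are the bottleneck.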
My original plan for the m.2 had two purposes: one was to free up a sata port, and the other was to use it as a cache drive, not for os/apps. But I will give it a try since I already opened the package. Also, apart from upgrading the BIOS after installing the motherboard, what other software do I need to update from the Gigabyte site?
I don't use SLI for editing, so mixing different cards will be fine.
Premiere doesn't support SLI anyway.
whichever drivers and utilities you may need. if you plan on using the bios to set up things like the fans and overclocking, you may not need to bother with those utilities.
i was never suggesting or talking about sli.
we had posts from people using two different video cards; premiere was trying to send both cards 50/50 of the workload, so the slow card was at 99% and the fast card was around 50%. so unless something has changed in premiere, you may end up with the same problem.
What make and model is your M.2 drive? There are good and bad ones on the market.
I thought it would work like this: Pr sends one frame's rendering job to one card and the second frame to the second card, and then whenever a card finishes its job, Pr gives it the third frame to render regardless of which one is slow or fast. Otherwise there is no point in keeping old cards, as mentioned by an Adobe product manager in a video last year.
I bought an M.2 samsung 850 evo.
i thought you had the samsung 950 pro or similar. the 850 evo would be ok, but i don't see it on the supported m.2 list, so i don't know if the motherboard will work with it. some boards only accept pcie m.2 drives like the 950 pro.
You are right, Ron. I put it in, and there is nothing in the bios to make it appear. So I definitely have to buy an adapter or resell it.
I would not bother getting an adapter. I am not sure it would work, as the adapters I know of are designed for PCIe gen3 x4 M.2 devices and not for SATA III M.2. While I have a stack of adapters, I do not have any SATA III M.2 device to test that theory.
if you were to use a pcie x4 adapter, it's going to take a x16 slot that could be used for a raid card or your gpus. if you skip the m.2 drive, a pcie x1 sata controller card could connect 2-4 more sata ssds. that would require one of your four gpus to be single-slot width and not double, to avoid covering the pcie x1 slot. if you go with 3 double-slot gpus, the third in the very bottom pcie x16 slot, there should be room for a pcie x8 raid card and a pcie x1 sata controller card.
I was told by a local computer seller that it's better to wait for an external HDD raid box with usb 3.2. The internal pcie x16 port for a raid adapter may be a good temporary solution. But what is the point of using an internal pcie x1 raid adapter when we have 4 non-raidable sata ports onboard already?
I have another question. I have two hdds in raid 0 already; since I will add another 4 disks to the array, will it be better to start from new, or just expand on top?
my last post was just giving some examples of how much you would be giving up by using a pcie m.2 adapter for a single m.2 sata drive, not necessarily recommending those setups.
usb 3.2? perhaps they meant usb 3.1 gen 2? (no one is calling it usb 3.2)
if you are talking about an external raid box for your 6-8 drive raid setup, i don't recall seeing any usb devices that raid that many drives. external raid would be handy if you need to transfer the raid between different computers quickly; other than that, an internal raid card will be better. usb would have lower bandwidth and possibly higher latency than a pcie x8 or x16 raid card, and almost all external raid enclosures use junk raid controllers, so raid performance will suffer from that as well. any quality usb or thunderbolt external raid enclosure with a real raid card inside will cost 2-4 times as much as just the pcie raid card.
i don't remember if intel raid will let you add more drives; it may require you to rebuild the raid from scratch.
I returned the m.2 ssd already. Intel raid won't allow me to add more hdds to the existing array, so I rebuilt a new raid. By external usb 3.2 I meant similar speed to thunderbolt 3, using the same port on the mobo.