PCIe Space Problems... eATX/ATX

Associate
Joined
7 Sep 2020
Posts
7
I'm buying the parts for a system build and I need some advice on motherboard choice. Ideally I want a TRX40 chipset board to take advantage of the quad-channel memory support, because my computer is used for far more than just gaming.

My issue is one of spacing. I've got 3 other important cards besides my graphics card: a Sound Blaster AE-5 (PCIe x1), an Intel X520-SR2 (PCIe x8) and an Adaptec 8805 RAID card (PCIe x8).

The problem is that most TRX40 motherboards come with only four x16 PCIe slots, and if I put in a 3-slot graphics card, I think it will block my use of the second x16 slot. The Intel NIC and the RAID card both put out a lot of heat, and I do mean a lot, so ideally you want some separation between those two cards, and probably any other card too.

Is there any motherboard I can buy that will give me at least a slot of clearance between all these cards? Obviously it will probably have to be eATX. I just don't want a hot graphics card coupled with a roasting NIC and RAID card all bunched together.
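
To put numbers on it, here's a rough slot-budget sketch (illustrative Python; the card widths are my own assumptions, and a standard ATX case exposes 7 expansion positions):

# Rough slot-budget check: does a card stack, with one clear slot
# between neighbours, fit in a standard ATX case?
# Illustrative only -- the card widths below are my assumptions.
CASE_SLOTS = 7  # expansion positions in a standard ATX case

def slots_needed(card_widths, gap=1):
    # Each card's width in slots, plus an empty slot between
    # neighbouring cards (no gap after the last card).
    return sum(card_widths) + gap * (len(card_widths) - 1)

# 3-slot GPU, AE-5 (1 slot), X520-SR2 (1 slot), Adaptec 8805 (1 slot)
print(slots_needed([3, 1, 1, 1]), "of", CASE_SLOTS)  # 9 of 7: no fit
# Even a dual-slot GPU with a gap everywhere needs 8 positions:
print(slots_needed([2, 1, 1, 1]), "of", CASE_SLOTS)  # 8 of 7: no fit

So with a clear slot between every card it simply doesn't fit in 7 positions; the question is what compromise gets closest.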
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
I'm afraid finding such a board is unlikely:
ATX boards top out at 7 expansion card positions.
So you'll have some cards bunched up together no matter what.

If you only have headphones or stereo speakers, one solution would be replacing the internal sound card with an external one.
Apart from lacking analogue surround outputs, the Sound BlasterX G6 is basically the AE-5's external twin, and Creative is even selling B-stock at a nice £70 price.
 
Associate
OP
Joined
7 Sep 2020
Posts
7
If you only have headphones or stereo speakers, one solution would be replacing the internal sound card with an external one.
Apart from lacking analogue surround outputs, the Sound BlasterX G6 is basically the AE-5's external twin, and Creative is even selling B-stock at a nice £70 price.

I pretty much thought as much. The problem here is that I already have the Sound Blaster AE-5 and didn't really want to waste it and go external. Also, and this is just a coder's viewpoint, but I'm a bit of a hater of USB in general and try to avoid it at all costs. Obviously I can't avoid it completely with things like mouse and keyboard, but I prefer PCIe over the shocking way USB is implemented on Windows, with the way it handles drivers.

So let's take this back a bit... Is there a mobo where I can get a slot of space between the graphics card and, say, the sound card, then have the sound card next to the NIC with a slot of space before the RAID card? As long as I can separate the NIC and RAID card, even if only with the sound card, which produces essentially no heat, the better. I reckon I could do this if the graphics card wasn't a 3-slot thing.
 
Soldato
Joined
6 Jun 2008
Posts
11,618
Location
Finland
With memory slots on both sides of the CPU socket, Threadripper boards seem to have space for only six expansion slots.
Hence with a three-slot graphics card everything will be bunched up, with no free slots:
https://www.overclockers.co.uk/giga...i-socket-strx4-atx-motherboard-mb-585-gi.html

Though if you had a case with extra expansion card positions below the standard 7, that could allow dropping the graphics card to the lowest slot of the mobo.


prefer PCIe over the shocking way USB is implemented on Windows, with the way it handles drivers.
More shocking than the toyphone (un)user interface fad pushed into PCs? :p
(or using buyers as unpaid alpha testers of known buggy patches)
 
Associate
OP
Joined
7 Sep 2020
Posts
7
More shocking than the toyphone (un)user interface fad pushed into PCs? :p
(or using buyers as unpaid alpha testers of known buggy patches)

Well I can't comment on that, but to try and defend my position a bit: there's a reason why legacy BIOSes didn't work with a USB mouse even though they did with a PS/2 mouse port. Anyway, we digress...

So if I didn't go with Threadripper and entertained other architectures and chipsets, what do my choices become then? The same? In other words, what have I got down the Intel road with a 32-core CPU (apart from it being vastly more expensive)?

All this being said, if I can find a partner 3080 that isn't 3-slot, that's the one I'll go for. The only one I've seen that's actually dual-slot is the Founders Edition. That way I can put the small sound card below the graphics card, the NIC directly beneath it, and have a slot of space before the RAID card. I hate real-estate problems, but nobody ever seems to cater for people who have a few proper add-in PCIe cards.
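
Running the same slot-budget arithmetic as my earlier sketch (card widths still my assumptions): 2 (GPU) + 1 (sound card) + 1 (NIC) + 1 (empty slot) + 1 (RAID) = 6 positions, which would just squeeze into the six slot positions of a Threadripper board.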
 
Soldato
Joined
1 Apr 2014
Posts
18,634
Location
Aberdeen
Intel X520-SR2 (PCIe x8)

There are TRX40 motherboards with onboard 10 Gb Ethernet, so you might not need the NIC. You might also investigate Epyc workstation motherboards.

Another solution would be to use a case which supports an upright card and put one of the cards on a riser cable.
 
Associate
OP
Joined
7 Sep 2020
Posts
7
There are TRX40 motherboards with onboard 10 Gb Ethernet, so you might not need the NIC. You might also investigate Epyc workstation motherboards.

Another solution would be to use a case which supports an upright card and put one of the cards on a riser cable.

There are quite a few boards that support 10GbE, but none of them will give you a fibre output, which is what I need. They're all copper Ethernet, and some of the stuff I do just wouldn't work well with that kind of latency. The upright card on a riser might be a good thing for me to look at. It wasn't what I was planning, but I could entertain it.

Not sure about EPYC mobos; I've not really looked too much into those.
 

Kei
Soldato
Joined
24 Oct 2008
Posts
2,750
Location
South Wales
You need to watercool your GPU so it takes up just one slot.

My X399 system has:
Vega 56
RME HDSPE audio card
AJA Kona LSe video capture card
GTX 780 (just used for CUDA)
Intel X710 10GbE SFP+ NIC
and two NVMe SSDs too
 
Associate
OP
Joined
7 Sep 2020
Posts
7
You need to watercool your GPU so it takes up just one slot.

As cool as that is, and I do mean it's seriously cool, and I have thought about watercooling, it's a little more work than I had planned. First, there's a bit of work in dismantling the graphics card, and with a 3080 it might be quite tricky getting waterblocks at this early stage. Then, since this computer is usually on 24/7 with very little downtime, it would require much more maintenance for fluid changes and the like. Finally, I doubt you could get waterblocks for the NIC and RAID card, so you'd need to customise something if you were going to watercool those, as they're a big problem in terms of heat generation.

You know, I really have thought about soft-tube watercooling like you've done; it just seems like such a ball-ache to do. Then there's the worry of leaks, or a gasket going on one of the fittings, etc.
 

Kei
Soldato
Joined
24 Oct 2008
Posts
2,750
Location
South Wales
As cool as that is, and I do mean it's seriously cool, and I have thought about watercooling, it's a little more work than I had planned. First, there's a bit of work in dismantling the graphics card, and with a 3080 it might be quite tricky getting waterblocks at this early stage. Then, since this computer is usually on 24/7 with very little downtime, it would require much more maintenance for fluid changes and the like. Finally, I doubt you could get waterblocks for the NIC and RAID card, so you'd need to customise something if you were going to watercool those, as they're a big problem in terms of heat generation.

You know, I really have thought about soft-tube watercooling like you've done; it just seems like such a ball-ache to do. Then there's the worry of leaks, or a gasket going on one of the fittings, etc.
I was worried about leaks but have never had one from a fitting, as I took all the precautions to minimise the chances. I started watercooling 7 years ago and have never looked back. Downtime is without doubt an issue though, as it's taken me several hours to drain my system, make any changes, then refill and bleed it again.

I would say that case manufacturers have a lot to answer for. Everything these days is form over function, as looks come first. My new server case is a Corsair 750D, which I bought as it's one of very few cases that can support a lot of disks without resorting to backplanes. I'm looking at buying a spare metal side panel in order to drill some holes so that I can mount some fans in the side to blow over the 10GbE NIC, RAID card and SAS expander, as they all get very hot without direct airflow. The old Cooler Master case it was in had a normal side panel with two fan positions, which was great. I'd do the same for my workstation in the Enthoo Primo pictured above, except I can't buy that side panel to drill.

In any case, if the motherboard doesn't have the right slots in the right places, it's never going to work. As far as I know, E-ATX won't make any difference as it's simply wider and offers the same number of expansion slots.
 
Associate
OP
Joined
7 Sep 2020
Posts
7
I would say that case manufacturers have a lot to answer for. Everything these days is form over function, as looks come first. My new server case is a Corsair 750D.

I wholeheartedly agree. Another problem I encounter is 5.25" bays, as I have 3 optical drives used for media reading. It's all about looking cool, and although I don't mind the odd bit of cable management, performance and stability are what's paramount to me. No overclocking, just pure stability, and to achieve that you need good airflow, which is virtually impossible, or at least a pain in the arse, if you have any proper PCIe hardware in your machine. It's not like you can sacrifice the RAID card when all you've got are those pieces of **** onboard RAID controllers that take 4 years to rebuild arrays.

I just want good airflow without having to jump through all these hoops just because people want their RGB nonsense. Surely it's not too much to ask? It's not like I'm asking for 600 PCIe cards, just 3.

Okay, thanks for all the help, people. I'm either going to have to go watercooling or find some riser contraption to make all this work.
 