External GPU enclosures - case modding

Associate
Joined
2 Dec 2019
Posts
9
Hi,

Might be the wrong place for this, but even if it is, perhaps you can point me in the right direction.

I'd like to put 4 GPUs in an external enclosure (server-style 4U). Does anything like that exist?

The 3995Ws have the free PCIe x16 slots, but there's no way you can get them inside the case.

Does it make sense?

thx
 
Soldato
Joined
11 Oct 2007
Posts
3,114
Location
London, UK
What? Why do you want them in an external box if you're concerned about the PCIe lanes / already have a 3995W system?
There are tons of 4U cases for a full server, often specific to the mobo, but there are plenty of standard ones.
If you have an ATX board with a 3995W, you could water-cool the cards, meaning they'd be single slot and fit in any standard case.

If you're hoping there's a box like a Thunderbolt 3 GPU enclosure but for four GPUs, there's nothing on the consumer side I know of. Server-side GPU compute boxes exist, but they tend to be add-ons for other servers rather than an "off the shelf" solution to anything.

I could make you something custom, but if it's just for something basic it wouldn't be worth the cost.

If it's for mining, I doubt they exist, as it's pretty simple to fit 3 or 4 GPUs in an off-the-shelf case, and if you build a frame using aluminium extrusions, it'll hold more than 4 GPUs.
 
Associate
OP
Joined
2 Dec 2019
Posts
9
What? Why do you want them in an external box if you're concerned about the PCIe lanes / already have a 3995W system?
There are tons of 4U cases for a full server, often specific to the mobo, but there are plenty of standard ones.
If you have an ATX board with a 3995W, you could water-cool the cards, meaning they'd be single slot and fit in any standard case.

Hmmmm, but it has to sit in a DC for the connection. Doesn't water cooling leak sometimes? If there were a leak and the water dripped out of my rack onto the rack below and destroyed all their servers, would they blame me? Things like this bother me.

I guess I could get a 1 Gbps connection to my house and do the water-cooling thing, but what does that involve? Openreach digging up my road, probably.

If you're hoping there's a box like a Thunderbolt 3 GPU enclosure but for four GPUs, there's nothing on the consumer side I know of. Server-side GPU compute boxes exist, but they tend to be add-ons for other servers rather than an "off the shelf" solution to anything.

I've noticed. And Thunderbolt is 40 Gbps I think, which is far too slow... PCIe 4.0 x16 is 256 Gbps I think.
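As a rough sanity check on those figures (a sketch in Python; the 16 GT/s per-lane rate and 128b/130b encoding are PCIe 4.0 facts, the helper name is just made up):

```python
# Rough usable-bandwidth comparison: PCIe 4.0 x16 vs Thunderbolt 3.
# PCIe 4.0 runs at 16 GT/s per lane with 128b/130b encoding, so the
# usable rate comes out a bit under the raw 256 Gbps figure for x16.

def pcie_bandwidth_gbps(gt_per_s: float, lanes: int) -> float:
    """Usable bandwidth of a PCIe 3.0+ link (128b/130b encoding)."""
    return gt_per_s * (128 / 130) * lanes

thunderbolt3 = 40.0                          # raw link rate, Gbps
x16 = pcie_bandwidth_gbps(16.0, 16)          # PCIe 4.0 x16, ~252 Gbps
print(f"PCIe 4.0 x16 ≈ {x16:.0f} Gbps, ~{x16 / thunderbolt3:.1f}x Thunderbolt 3")
```

So a Thunderbolt-style enclosure would give each card well under a fifth of the bandwidth of a real x16 slot, even before any overhead.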

I could make you something custom, but if it's just for something basic it wouldn't be worth the cost.

Wow, you can build these things; that's a skill. Maybe you can share some pics of stuff you've made? I know I couldn't do it.

If it's for mining, I doubt they exist, as it's pretty simple to fit 3 or 4 GPUs in an off-the-shelf case, and if you build a frame using aluminium extrusions, it'll hold more than 4 GPUs.

Well, yeah, I've seen exactly what I want with "mining rigs", but they seem to connect over USB using PCIe x1. I'm guessing it's some peculiarity of "mining" (whatever that is exactly) that you don't need much bandwidth between the CPU and the GPU. That isn't the case for me; I need a full PCIe 4.0 x16 for each GPU.
 
Soldato
Joined
11 Oct 2007
Posts
3,114
Location
London, UK
Hmmmm, but it has to sit in a DC for the connection. Doesn't water cooling leak sometimes? If there were a leak and the water dripped out of my rack onto the rack below and destroyed all their servers, would they blame me? Things like this bother me.

I guess I could get a 1 Gbps connection to my house and do the water-cooling thing, but what does that involve? Openreach digging up my road, probably.



I've noticed. And Thunderbolt is 40 Gbps I think, which is far too slow... PCIe 4.0 x16 is 256 Gbps I think.



Wow, you can build these things; that's a skill. Maybe you can share some pics of stuff you've made? I know I couldn't do it.



Well, yeah, I've seen exactly what I want with "mining rigs", but they seem to connect over USB using PCIe x1. I'm guessing it's some peculiarity of "mining" (whatever that is exactly) that you don't need much bandwidth between the CPU and the GPU. That isn't the case for me; I need a full PCIe 4.0 x16 for each GPU.

If you can't already get 1 Gbps, don't bother. Digging up the road is expensive, etc.

What I build isn't really data-centre oriented; it's one-off custom things. I aim at things like integrating silently into furniture, or art-piece builds, hence it not being worth it if you basically just need a box.

Yeah, mining only needs x1 because it just loads a file into VRAM and then the workload is all local to the GPU. But it's only limited to x1 by the motherboard PCIe layout and the risers used. You could get a frame and use x16 risers to get the bandwidth you need. Technically any rack case with 4x GPU support could do this, with the correct risers.
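To put numbers on why x1 is fine for that load-once pattern but not for bandwidth-heavy work, here's a back-of-envelope sketch (the 4 GB payload and helper name are illustrative, not from anyone's actual workload):

```python
# Time to push a payload over a PCIe 4.0 link of a given width.
# ~15.75 Gbps usable per lane (16 GT/s with 128b/130b encoding).

def transfer_seconds(gigabytes: float, lanes: int,
                     gbps_per_lane: float = 15.75) -> float:
    """Seconds to move `gigabytes` over a PCIe 4.0 link with `lanes` lanes."""
    return gigabytes * 8 / (gbps_per_lane * lanes)

payload_gb = 4.0  # e.g. a dataset loaded into VRAM once
print(f"x1:  {transfer_seconds(payload_gb, 1):.2f} s per transfer")
print(f"x16: {transfer_seconds(payload_gb, 16):.2f} s per transfer")
```

A one-off two-second load is irrelevant for mining, but if you're streaming data between CPU and GPU constantly, that 16x difference is the whole game.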

Yes, water cooling can leak, though it should only be during initial setup or after a big move, but in a rack it's advisable to put it at the bottom etc., just in case. I was imagining more a case at a desk rather than in a rack.

So are you looking to have a box of your GPUs hosted in a DC?
Do you already have a server located with them?
What is the server currently?
Do you work for this company, and is it for personal or business use?
If you haven't bought anything yet, aren't the new Nvidia NVLink-based servers that were recently announced exactly this, or are they way beyond the spec required?

I'm still leaning towards one case being simpler than having a separate one for the GPUs, since you'd need a half-depth 4U case for the GPUs, then risers. But if you went full depth you could fit them all in the same box.
The interconnects for a separate GPU box just aren't that fast AFAIK, without going into individual/specifically designed solutions at least, and if your server isn't designed for external GPUs, it's not going to have any "nice" way to connect up.

So it really depends on what you currently own, what box it's in, and what GPUs you need.
 
Associate
OP
Joined
2 Dec 2019
Posts
9
If you can't already get 1 Gbps, don't bother. Digging up the road is expensive, etc.

What I build isn't really data-centre oriented; it's one-off custom things. I aim at things like integrating silently into furniture, or art-piece builds, hence it not being worth it if you basically just need a box.

Yeah, mining only needs x1 because it just loads a file into VRAM and then the workload is all local to the GPU. But it's only limited to x1 by the motherboard PCIe layout and the risers used. You could get a frame and use x16 risers to get the bandwidth you need. Technically any rack case with 4x GPU support could do this, with the correct risers.

Yes, water cooling can leak, though it should only be during initial setup or after a big move, but in a rack it's advisable to put it at the bottom etc., just in case. I was imagining more a case at a desk rather than in a rack.

So are you looking to have a box of your GPUs hosted in a DC?
Do you already have a server located with them?
What is the server currently?
Do you work for this company, and is it for personal or business use?
If you haven't bought anything yet, aren't the new Nvidia NVLink-based servers that were recently announced exactly this, or are they way beyond the spec required?

I'm still leaning towards one case being simpler than having a separate one for the GPUs, since you'd need a half-depth 4U case for the GPUs, then risers. But if you went full depth you could fit them all in the same box.
The interconnects for a separate GPU box just aren't that fast AFAIK, without going into individual/specifically designed solutions at least, and if your server isn't designed for external GPUs, it's not going to have any "nice" way to connect up.

So it really depends on what you currently own, what box it's in, and what GPUs you need.

Yeah, I was thinking that if I could stick them into a 4U chassis, I could put the unit in the same rack, right underneath the server... I currently have a 3960X desktop (awkward for a rack) with 1 GPU in the DC, but it's not good enough. The machine needs 1TB of RAM, so it will need to be a 3995W, 3795W or an EPYC. You can get EPYC 2U servers that take 4 GPUs, not 3090s though, special server ones, A5000 or A6000 I think. Probably I should just buy that. It's for my one-man company.
 
Soldato
Joined
11 Oct 2007
Posts
3,114
Location
London, UK
Yeah, I was thinking that if I could stick them into a 4U chassis, I could put the unit in the same rack, right underneath the server... I currently have a 3960X desktop (awkward for a rack) with 1 GPU in the DC, but it's not good enough. The machine needs 1TB of RAM, so it will need to be a 3995W, 3795W or an EPYC. You can get EPYC 2U servers that take 4 GPUs, not 3090s though, special server ones, A5000 or A6000 I think. Probably I should just buy that. It's for my one-man company.

There are probably passive versions of the 3090 with standard heatsinks where the fins are aligned with the airflow in a rack. 4 desktop 3090s on air aren't going to fit in many places, with their stupid HSF arrangement lol.
Since you haven't bought the server yet, I would go for something that holds it and the GPUs in one.
Are you self-building it or buying it prebuilt? Phoning sales at a server company like Supermicro or ASRock Rack etc. might be a good way to find out what hardware options are available, then source it however you like.
 