
Nvidia GeForce GTX 980 Ti Coming This Summer Featuring 3072 CUDA Cores and 6GB of GDDR5

Status
Not open for further replies.
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
That Windforce triple-fan design is utter rubbish (the fan blades are too small); as soon as it goes above 60-65% RPM it's way louder than stock :rolleyes: I RMA'd 2 x GTX 780 Ti with it on last year and wasted £40 on P&P :(

Did you miss the part where I mentioned the Windforce on the 970 was great, but that's because the TDP is so low? Again, yes, when you slap it on more power-hungry cards the fans will ramp up more, particularly when you go SLI with said powerful cards. We've obviously had different experiences: my single 970 Windforce would hardly go over 59 degrees while being inaudible to me. Doubling up with almost 200W more heat to dissipate is a completely different situation.
 
Soldato
Joined
10 Oct 2012
Posts
4,420
Location
Denmark
Come on already, release the new cards from both camps! DO IT.. DO IT.. DO IT!!
 
Soldato
Joined
13 Mar 2008
Posts
9,638
Location
Ireland
Do you not feel/see the microstutter some people report with SLI and G-Sync, Andy? I must admit I'd be tempted to go with 2 reference 980 Tis in SLI if it worked as well with G-Sync as it did single-card on the TX.

That's what I want to know. I have over £1,500 for a GPU (or two) burning a hole in my wallet for an upgrade, but I want the best possible experience on a 1440p G-Sync 144Hz monitor.

The last time I used SLI was with the 7950 GX2, which was nothing but hell, and CrossFire with the 4870 X2 and dual FirePro D700s was messy to say the least.

If two 980 Tis aren't priced too high and offer what I want, I'll go for that over a single powerful Titan X.
 
Soldato
Joined
7 Aug 2012
Posts
2,643
That's what I want to know. I have over £1,500 for a GPU (or two) burning a hole in my wallet for an upgrade, but I want the best possible experience on a 1440p G-Sync 144Hz monitor.

The last time I used SLI was with the 7950 GX2, which was nothing but hell, and CrossFire with the 4870 X2 and dual FirePro D700s was messy to say the least.

If two 980 Tis aren't priced too high and offer what I want, I'll go for that over a single powerful Titan X.

Two Tis will demolish a single Titan X, and at 1440p 6GB is more than adequate. I say that as a very happy Titan X owner with a ROG Swift. Unfortunately your budget of £1,500 isn't going to quite stretch to two Tis and a 1440p G-Sync monitor (I assume your budget was inclusive of the monitor?)

It will probably be closer to £1,700 combined, whereas it would be ~£1,300ish if you go the single Titan route, or ~£1,100ish if you go the single Ti route.

What I will say is a single TX is more than capable at 1440p. You won't hit 144fps in many games, but with G-Sync that isn't really all that important.

EDIT: Apologies, I see in your sig you already have a Swift lol.
 
Soldato
Joined
13 Mar 2008
Posts
9,638
Location
Ireland
Two Tis will demolish a single Titan X, and at 1440p 6GB is more than adequate. I say that as a very happy Titan X owner with a ROG Swift. Unfortunately your budget of £1,500 isn't going to quite stretch to two Tis and a 1440p G-Sync monitor (I assume your budget was inclusive of the monitor?)

It will probably be closer to £1,700 combined, whereas it would be ~£1,300ish if you go the single Titan route, or ~£1,100ish if you go the single Ti route.

What I will say is a single TX is more than capable at 1440p. You won't hit 144fps in many games, but with G-Sync that isn't really all that important.

I already own the ROG Swift; it's sitting here idle at the moment as the GTX 580 can't power it and there's no DVI-to-DP adapter in the house.

So it all depends on Ti launch pricing, performance (if those new leaks are true), and how bad microstutter will be on the Swift. Although new cards mean new drivers, so maybe they've improved a bit more, especially SLI support for The Witcher 3.
 
Soldato
Joined
7 Aug 2012
Posts
2,643
I already own the ROG Swift; it's sitting here idle at the moment as the GTX 580 can't power it and there's no DVI-to-DP adapter in the house.

So it all depends on Ti launch pricing, performance (if those new leaks are true), and how bad microstutter will be on the Swift. Although new cards mean new drivers, so maybe they've improved a bit more, especially SLI support for The Witcher 3.

As far as SLI on the Swift goes: I used it for a few months with my 780 Tis, and the only game I encountered microstutter with was Metro: Last Light, which was apparently down to a bug with PhysX. It worked perfectly for the most part, then I loaded up a save one time and, although the fps was consistently high (100+ fps at all times), it genuinely felt like 30fps. Changing PhysX to the CPU fixed it.

Other than that, I had zero complaints about microstutter or any other issues. It's a shame that G-Sync, DSR and SLI can't all be used in conjunction (Nvidia claim they are working on it). The only reasons I changed to a single card were a) I felt 3GB of VRAM wasn't sufficient going forward and b) I'm a sucker for shiny new things.

I would have no hesitation about going SLI again, but a single TX has thus far proven sufficient for driving the Swift.
 
Soldato
Joined
13 Mar 2008
Posts
9,638
Location
Ireland
As far as SLI on the Swift goes: I used it for a few months with my 780 Tis, and the only game I encountered microstutter with was Metro: Last Light, which was apparently down to a bug with PhysX. It worked perfectly for the most part, then I loaded up a save one time and, although the fps was consistently high (100+ fps at all times), it genuinely felt like 30fps. Changing PhysX to the CPU fixed it.

Other than that, I had zero complaints about microstutter or any other issues. It's a shame that G-Sync, DSR and SLI can't all be used in conjunction (Nvidia claim they are working on it). The only reasons I changed to a single card were a) I felt 3GB of VRAM wasn't sufficient going forward and b) I'm a sucker for shiny new things.

I would have no hesitation about going SLI again, but a single TX has thus far proven sufficient for driving the Swift.

I mainly want to play The Witcher 3 and its later expansions at over 60fps at 1440p with G-Sync. Sadly a single Titan can't manage that at Ultra unless it's overclocked by more than 300MHz, and that's with GameWorks still off.

Just 1440p with G-Sync at over 60fps would suit me fine; any better performance in that game is a plus. The Nvidia forums have a lot of people complaining about microstutter, which is why I'm still not too sure about it.
 
Associate
Joined
25 Jan 2015
Posts
171
I already own the ROG Swift; it's sitting here idle at the moment as the GTX 580 can't power it and there's no DVI-to-DP adapter in the house.

So it all depends on Ti launch pricing, performance (if those new leaks are true), and how bad microstutter will be on the Swift. Although new cards mean new drivers, so maybe they've improved a bit more, especially SLI support for The Witcher 3.
I'm in the same boat at the moment with my Swift, and I'm hesitating over whether to go for 2x 980 Ti in SLI. I suppose one should actually suffice, but I'm just such a high frame/refresh-rate junkie. I want 2560x1440 with 4xAA, or at least 2xAA, at 80 FPS minimum. Actually, it would also be nice if I could backlight-strobe modern AAA shooters with full-on ULMB. But either way, even G-Sync at 144Hz and 1440p is just a sight to behold.

If I go this route, it will be my first ever dual-card SLI setup.
My main worry is excessive input lag, which I am very sensitive to, although from benchmarks on the net, SLI brings frametimes down to around 7-16ms. I'm not sure, to be honest; I haven't seen anyone provide extensive enough detail about real user experience with SLI.
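For anyone wondering where figures like 7-16ms come from, it's just the fps-to-frametime conversion, and microstutter is basically a large frame-to-frame spread even when the average fps looks fine. A quick back-of-envelope sketch (nothing measured in this thread, and real tools like FCAT capture actual per-frame timings):

```python
# Rough fps <-> frametime arithmetic and a simple microstutter indicator.
# Illustrative only; proper analysis uses captured per-frame timing data.

def frametime_ms(fps):
    """Average frametime in milliseconds for a given frame rate."""
    return 1000.0 / fps

def stutter_spread(frametimes):
    """Max minus min frametime in a sample; a large spread at a high
    average fps is what people perceive as microstutter."""
    return max(frametimes) - min(frametimes)

# 144 Hz corresponds to ~6.9 ms per frame; 60 fps to ~16.7 ms.
print(round(frametime_ms(144), 1))  # 6.9
print(round(frametime_ms(60), 1))   # 16.7

# An SLI rig averaging 100 fps but alternating 5 ms / 15 ms frames
# can still "feel" uneven despite the high average.
print(stutter_spread([5.0, 15.0, 5.0, 15.0]))  # 10.0
```

So a steady 144fps means roughly 6.9ms per frame, and the quoted 7-16ms range maps to about 62-144fps; the "feel" depends as much on how consistent those frametimes are as on the average.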
 