G71 / 7900 GTX news?

Caporegime
Joined
18 Oct 2002
Posts
32,618
Goksly said:
Well even that image isn't painting the picture that D.P put together.
First of all you can only pay attention to the X1800XT / X1900XT / X1900XTX / 7800GT 512. Why, I hear you cry? Because they have similar amounts of memory. I would assume that this would draw more power... anywho, it eliminates the memory from the equation.
So if we look at the situation with that we have (using ** image)
X1800XT: 298
X1900XT: 317
7800 512: 318
X1900XTX: 341

So, with those figures in mind, how can the statement be true? Yes, I am a bit ATI biased, but then again I don't say stuff which is untrue about Nvidia just to give them a bad vibe :p


It is true whether or not your ATI glasses allow you to see it!


http://www.digit-life.com/articles2/video/r520-part2.html
And the card gets very hot! I repeat, our readings demonstrate that the X1800XT has higher power consumption than the 7800GTX (about 100-110 W versus 80 W).

That is 25%-38% more power! But I guess that's a small number to you.
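A quick back-of-the-envelope check of that percentage, using only the digit-life figures quoted above (roughly 100-110 W versus 80 W); this is just illustrative arithmetic, not a measurement:

```python
# Sanity check of the "25%-38% more power" claim, using the
# digit-life figures quoted above (~100-110 W vs ~80 W).
gtx = 80.0                      # 7800GTX draw, watts
xt_low, xt_high = 100.0, 110.0  # X1800XT range, watts

low_pct = (xt_low - gtx) / gtx * 100    # 25.0
high_pct = (xt_high - gtx) / gtx * 100  # 37.5

print(f"{low_pct:.0f}%-{high_pct:.0f}% more power")
```

Rounded to whole percent, that lands on the 25%-38% quoted.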
 
Associate
Joined
23 May 2005
Posts
812
Location
Hove
Goksly said:
Well even that image isn't painting the picture that D.P put together.
First of all you can only pay attention to the X1800XT / X1900XT / X1900XTX / 7800GT 512. Why, I hear you cry? Because they have similar amounts of memory. I would assume that this would draw more power... anywho, it eliminates the memory from the equation.
So if we look at the situation with that we have (using ** image)
X1800XT: 298
X1900XT: 317
7800 512: 318
X1900XTX: 341

So, with those figures in mind, how can the statement be true? Yes, I am a bit ATI biased, but then again I don't say stuff which is untrue about Nvidia just to give them a bad vibe :p

I see your point about the RAM, but remember that the 512 is a heavily overclocked G70 core running at a higher voltage than the original parts. The 150MHz memory clock speed increase could also account for a few watts?

D.P's comment about transitioning down to 90nm still stands though, unless they do a Prescott. 90nm with low-k is an actual node rather than just a dumb shrink, which should help with power consumption.
 
Soldato
Joined
12 Mar 2003
Posts
8,157
Location
Arlington, VA
Goksly said:
First of all you can only pay attention to the X1800XT / X1900XT /X1900XTX / 7800GT 512. Why I hear you cry? Because they have similar amounts of memory.

As FX-Overlord said above, the GTX-512 is using a higher VCore (and I imagine VMem) than the regular GTX, which would account for a lot of its higher draw.

So what you are left with is the X1800XT drawing 20W MORE than the GTX-256, the X1900XT and GTX-512 drawing the same, and the X1900XTX more power hungry than everything.

So afaic D.P. is slightly more correct than you ;) although you are both raving fanbois and should be beaten with a pair of large trouts :p

Suman

P.S. Sorry if you think I'm having a go at you or being an nV fanboi, but neither of those is true. My last cards have been a Ti4200, 9800Pro and X800XL, and I'm just trying to present unbiased facts, that's all.
 
Last edited:
Associate
Joined
23 May 2005
Posts
812
Location
Hove
sumang007 said:
So what you are left with is the X1800XT drawing 20W MORE than the GTX-256, the X1900XT and GTX-512 drawing the same, and the X1900XTX more power hungry than everything.

So afaic D.P. is slightly more correct than you ;) although you are both raving fanbois and should be beaten with a pair of large trouts :p

The X1800XT uses 2.0V RAM and the GTX uses 1.8V RAM IIRC. Surely this, together with the fact that the XT runs at a 300MHz higher clock, should account for a few watts.

Cybermav: I think that the GTX 512 was faster than Xenos, perhaps even the normal X1800XT. G71 should end up faster than RSX. It is fairly difficult to judge how fast Xenos is because the architecture is quite a bit different.
 
Last edited:
Soldato
Joined
12 Mar 2003
Posts
8,157
Location
Arlington, VA
Sorry FX-Overlord, I was getting well confused, many edits to that post :p lol

Totally OT but how come the new BGA RAM chips are a funny oblong shape as opposed to the traditional square?

Suman
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
Fx-Overlord said:
The X1800XT uses 2.0V ram and the GTX uses 1.8V ram IIRC. Surely this together with the fact the XT runs at 300MHz higher clock should account for a few watts.


That probably does, but it's beside the point.

Performance-wise they are both almost equal, yet the X1800/900 has a big die-shrink gain and the low-k process gain, uses a lot more transistors, has a bigger die area despite the smaller process, draws more power, and runs at a higher clock speed to provide equal performance.

Now Nvidia will join the 90nm low-k club and the real comparisons can be drawn.
 
Associate
Joined
23 May 2005
Posts
812
Location
Hove
sumang007 said:
Sorry FX-Overlord I was getting well confused, many edits to that post :p lol

Totally OT but how come the new BGA RAM chips are a funny oblong shape as opposed to the traditional square?

Suman

Hehe, no problem. Fanboy tinting of comments aside, it is quite a good discussion about which core actually consumes the most power, considering the variation in memory sizes and vdimm. Another interesting point:

http://www.xbitlabs.com/news/video/display/20060203091202.html

xbitlabs said:
When X-bit labs originally measured power consumption of the high-end Radeon X1800 XT graphics card back in September, 2005, it was about 112W under maximum load, the absolute maximum for that time. However, when the measurements were carried out later, the power consumption dropped to slightly below 103W on the same graphics card with the same BIOS version, but on a newer driver.

And, NFI about the different sizes :p
 
Soldato
Joined
5 Mar 2003
Posts
10,760
Location
Nottingham
I still fail to see how Anand's results are any less valid. If all the components are the same, the test is the same and there is only one variable, the graphics card, then any fluctuation is most likely down to that.
And you slate Anand's review for maybe using a dodgy testing method, but the digit-life one just has a small paragraph quoting results... I couldn't see anything about a test method, so how can that be more valid? Oh, because it shows Nvidia in a good light, right? :p

Also I think it's a slightly bad comment to say "well the G70 uses a higher vcore". I'm sure if ATI slowed the clocks down they could pump just 0.8v through the card, making it very, very good power-wise, but obviously the performance would stink a bit. You need to look at the cards in power / performance terms, and I think saying the Nvidia solution "uses a lot less power" than the R520 doesn't hold water.

@sumang: we are having a discussion. I'm saying D.P is wrong, ** saying I'm wrong. As long as the comments are backed up with something other than "omg no ** wrong fanboy", there is nothing to apologise about. Although, if I see you in rl, ** dead. :p
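The power / performance point can be sketched with a tiny comparison. The fps figures below are invented purely for illustration; only the rough wattages come from the numbers being argued over in this thread:

```python
# Illustrative perf-per-watt comparison. The fps numbers are made up;
# the wattages are the rough figures quoted earlier in the thread.
cards = {
    "7800GTX": {"fps": 60, "watts": 80},
    "X1800XT": {"fps": 60, "watts": 110},
}

for name, c in cards.items():
    eff = c["fps"] / c["watts"]  # frames per second per watt
    print(f"{name}: {eff:.3f} fps/W")
```

Equal performance at unequal draw shows up directly as a worse fps/W figure, which is the comparison being asked for here.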
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
But the components between the systems are not the same, and the exact software running on the CPU and residing in system memory is not the same.
This is especially apparent if you compare CrossFire to SLI power consumption: very different motherboard setups and drivers.

The total system draw may be different, but that does not relate to the amount of power the GPUs draw. It may be an interesting and useful number, but for this debate it is meaningless. Various reviews used special power-monitoring hardware connected to the graphics cards; this equipment is somewhat more specialist and expensive, so only a few reviewers did this. The results were clear, with the X1800 pulling about 120-130 watts and the 7800GTX pulling 80-100 watts.

The numbers in the digit-life document refer to the manufacturers' specifications, i.e. what ATI and Nvidia claim their cards use. I don't have time to search, but somewhere on the internet there will be documents with the official power ratings of each chip at the reference speeds. There is a difference of 20-30%, which explains why the X1800 runs a lot hotter than the 7800GTX.

A possible reason why some of the Nvidia cards use more total system power in a dual-core CPU system is that if their drivers make more use of the second core, putting more load on it, the CPU will draw more power. Or maybe the Nvidia drivers use system RAM differently and use a lot more of it; this will also have an effect, especially across multiple DIMMs.
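The point about wall readings versus GPU-only draw can be sketched like this. Every number here, including the PSU efficiency figure, is an assumption chosen for illustration, not a measurement from any review:

```python
# Why total-system ("at the wall") deltas don't isolate the GPU:
# the wall reading folds in PSU efficiency losses plus any extra
# CPU/RAM load the driver creates. All figures are illustrative.
def gpu_estimate(wall_load_w, wall_idle_w, psu_efficiency=0.75,
                 driver_cpu_overhead_w=0.0):
    """Rough GPU draw from wall measurements, under assumed losses."""
    dc_delta = (wall_load_w - wall_idle_w) * psu_efficiency
    return dc_delta - driver_cpu_overhead_w

# Two cards showing the SAME 140 W wall delta, but whose drivers load
# the CPU differently, end up with different actual GPU draw:
print(gpu_estimate(300, 160))                            # 105.0
print(gpu_estimate(300, 160, driver_cpu_overhead_w=15))  # 90.0
```

That is why the reviews that clamped monitoring hardware directly onto the card are the only ones that settle the GPU-versus-GPU question.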
 
Soldato
Joined
4 Sep 2005
Posts
11,453
Location
Bristol
Cyber-Mav said:
at this rate of gfx card development the PC should be able to get more power than the new Xbox. I think G71 will give the PC an advantage over the Xbox 360

oh christ! don't bring the 360 into here as well! :p
 
Soldato
Joined
18 Oct 2002
Posts
11,038
Location
Romford/Hornchurch, Essex
BubbySoup said:
I'm really hoping these rumours are true; maybe ATI will realise they are just no good at developing cutting-edge 3D hardware and give up.

Then we can have nVidia reign supreme, which will surely benefit us consumers ...


HOW the hell can NO competition be a good thing?

Nvidia will start charging 1k for a gfx card, and they won't need to progress either.

Competition pushes these companies forward and keeps prices within range for us.

Having one company supplying a huge market isn't good for anyone, except the company. They charge what they want for average products... I mean, LOOK at Microsoft....
 
Associate
Joined
24 May 2004
Posts
1,878
Location
Manchester
Overlag said:
HOW the hell can NO competition be a good thing?

Nvidia will start charging 1k for a gfx card, and they won't need to progress either.

Competition pushes these companies forward and keeps prices within range for us.

Having one company supplying a huge market isn't good for anyone, except the company. They charge what they want for average products... I mean, LOOK at Microsoft....

You seriously didn't get his point, did you? Read it again! I think there is some sarcasm in there somewhere.
 
Soldato
Joined
23 Nov 2004
Posts
7,891
Location
UK
Cyber-Mav said:
hahaha, problem is both Nvidia and ATI are putting out new cards at record-breaking prices. When can we get an X1900XT or 7800GTX for 100 quid inc VAT? ;)

In about 18 months off the MM at a guess :D
 
Soldato
Joined
5 Mar 2003
Posts
10,760
Location
Nottingham
bAz^uk said:
You seriously didn't get his point, did you? Read it again! I think there is some sarcasm in there somewhere.
In his defence, the internet isn't exactly the best medium for sarcasm. On another forum it would be easier to pick up, but believe it or not, I think there are more than a few people who actually think that way.
Obviously Bubble was being sarcastic... but... well, you get the picture :p
 
Soldato
Joined
22 Nov 2003
Posts
2,933
Location
Cardiff
Overlag said:
HOW the hell can NO competition be a good thing?

Nvidia will start charging 1k for a gfx card, and they won't need to progress either.

Competition pushes these companies forward and keeps prices within range for us.

Having one company supplying a huge market isn't good for anyone, except the company. They charge what they want for average products... I mean, LOOK at Microsoft....
Jeeze, my post is still being dragged up!?

Obviously my sarcasm was lost on a lot of people. Maybe it was my fault for not breaking out plenty of :rolleyes: :rolleyes: :rolleyes: at the end of the post...
 
Soldato
Joined
30 Jul 2005
Posts
19,435
Location
Midlands
No, I agree, competition is lame. There should only be one company available. Then the PC will become more like the Xbox 360, and games programmers will see that new hardware is too expensive to buy, so they'd better get their butts in gear and program games properly to make use of the hardware we currently have.
 