
nVidia GT300 - GeForce GTX 380 yields are sub-30%

Soldato
Joined
7 May 2006
Posts
12,192
Location
London, Ealing
nVidia GT300 - GeForce GTX 380 yields are sub-30%, now 2%

When we ran the TSMC yield story, we left owing you an explanation. Time to clear that up too… why does TSMC have "****** up" yields? Because the chip "some say doesn't exist" has disastrous yields. We refuse to be drawn into speculation over whether the GT300 exists or not, since we have enough data that would send nVidia Legal our way - but that was in the past, the green guys have learned better ;)

According to our sources close to the [silicon] heart of the matter, the problem nVidia has is yields in the 20 percent range. You've guessed it, that is waaay [insert several "a"] too low for launching volume production. Even when TSMC improves the leakage issues [the company claims that the leakage issues are now a thing of the past], nVidia will probably send out a new revision of silicon - the yields have to get high enough to earn at least a little bit of money.

The current situation - three faulty chips for every working one - is much too much, since those faulty chips aren't exactly "GTX 360" or "slower Quadro FX" grade material. Some faulty parts might work under forced cooling, but the high leakage is an issue with the current graphics card layout. We won't go into the whole instability, does-not-work etc. situation. As we all know, graphics chips sit in the worst possible position, facing down [unless you put them in a testbed/desktop case]. With high-leakage parts, a thermal shockwave is sent through the organic packaging to the PCB [Printed Circuit Board] and can cause extensive failure, as nVidia learned with their $200 million mistake called "bad bumps" or simply "bumpgate".

We have the exact number, but in order to protect the parties involved, we are going to refrain from posting the exact yield figure for the first batch of chips. All we can say is - not yet ready for production.
bsn*

Update:
September 15, 2009

THE SAGA of Nvidia's GT300 chip is a sad one that just took a turn for the painful when we heard about the first silicon yields. Nvidia's execution has gone from bad to absent, with low single-digit yields.

A few weeks ago, we said that Nvidia was expecting first silicon back at the end of the week, the exact date was supposed to be Friday the 4th plus or minus a bit. The first bit of external evidence we saw that it happened was on the Northwood blog (translated here) and it was a day early, so props to NV for that. That lined up exactly with what we are told, but the number of good parts was off.

The translation, as we read it, says there were nine good samples that came back from TSMC from the first hot lot. That is below what several experts told us to expect, but in the ballpark. When we dug further, we got similar numbers, but they were so abysmal that we didn't believe it. Further digging confirmed the numbers again and again.

Before we go there though, let's talk about what a good die is in this case. When you get first silicon back, it almost always has bugs and problems. First silicon is meant to find those bugs and problems, so they can be fixed in succeeding steppings.

By 'good', we mean chips that have no process-induced errors, and function as the engineers hoped they would. In other words, not bug-free, but with no more errors than there were in the design. 'Good' in this sense might never power on; it just means that what came out of the oven was what was expected, no more, no less.

Several experts in semiconductor engineering, some who have overseen similar chips, were asked a couple of loaded questions: What is good yield for first silicon? What is good yield for a complex chip on a relatively new process? The answers ranged from a high of 50% to a low of 20% with a bunch of others clustered in the 30% range. Let's just call it one-third, plus or minus some.
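The experts' "one-third, plus or minus" ballpark is consistent with a first-order defect-density yield model. As a sketch only - the Poisson model is a standard textbook approximation, and the die area and defect density below are illustrative assumptions, since the article gives neither figure for the GT300:

```python
import math

# First-order Poisson yield model: Y = exp(-A * D0), where A is the die
# area and D0 is the random defect density. Neither value for the GT300
# is public, so the inputs below are purely illustrative.
def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-die_area_cm2 * defects_per_cm2)

# A hypothetical large ~5 cm^2 die on a young process with
# 0.25 defects/cm^2 lands right in the experts' one-third ballpark:
print(f"{poisson_yield(5.0, 0.25):.1%}")  # about 28.7%
```

The point of the model is that yield falls off exponentially with die area, which is why a huge chip on a relatively new process is expected to yield so much worse than a small one.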

The first hot lot of GT300s has 104 die candidates per wafer, with four wafers in the pod Nvidia got back a week and a half ago. There is another pod of four due back any day now, and that's it for the hot lots.

How many worked out of the (4 x 104) 416 candidates? Try 7. Yes, Northwood was hopelessly optimistic - Nvidia got only 7 chips back. Let me repeat that: out of 416 tries, it got 7 'good' chips back from the fab. Oh how it must yearn for the low estimate of 20% - talk about botched execution. To save you from having to find a calculator, that is 7 / 416 = 0.0168, or, rounded up, a 1.7% yield.
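The arithmetic checks out; as a quick sanity check using only figures quoted in the article:

```python
# Figures from the article: 104 die candidates per wafer, 4 wafers in
# the first hot-lot pod, 7 'good' chips returned from the fab.
die_per_wafer = 104
wafers_in_pod = 4
good_chips = 7

candidates = die_per_wafer * wafers_in_pod
yield_pct = good_chips / candidates * 100

print(f"{candidates} candidates -> {yield_pct:.2f}% yield")  # 416 candidates -> 1.68% yield
```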

Nvidia couldn't even hit 2%, an order of magnitude worse than the most pessimistic estimate. Ouch. No, just sad. So sad that Nvidia doesn't deserve mocking, things have gone from funny to pathetic.

At this point, unless it has a massive gain in yields on the second hot lot, there may not be enough chips to do a proper bring-up and debug. This stunningly bad yield may delay the introduction of the chip, adding to the current pain and bleak roadmap. If there aren't enough 'good' parts from the second hot lot, it may mean running another set, adding weeks to the total. Q1? Maybe not.

It is going to be very interesting to see what Nvidia shows off at 'Not Nvision' in a couple of weeks. Will it give the parts to the engineers to work on, or show them off as a PR stunt? We will know soon enough. In any case, the yields as they stand are sub-2%, and the status of the GT300 is far worse than we had ever imagined. S|A
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/
 
Man of Honour
Joined
13 Oct 2006
Posts
90,805
Oh dear. New nvidia cards = Q2 next year?

I've heard "May 2010" earliest bandied about... which isn't encouraging.

TBH I'm not too disappointed by it - looking at the list of games to be released in the next 6 months, there's nothing that looks like it'll really tax existing hardware or move things forward API-wise... so it's not like we're in a tearing hurry from the consumer's point of view... and not many developers seem too bothered about moving to DX10+ any time soon.
 
Associate
Joined
21 May 2007
Posts
1,464
Not sure anyone's test runs go much better with new, higher component density designs. IF it was a production run, then even at 30% yield, 3 runs and they're good to go (albeit they'll mark the cards up to £700 to recoup the "higher manufacturing costs" (ie chips cost 12p to make, not 9p)).
Still, as someone said, if it does delay things, it'll at least give us all a chance to USE some DX10 stuff; maybe developers will bother to use it before herding us into 10.1 and 11. At this rate we'll be on DX18, with nothing but 3DMark 2013 to run on it, everything else still on DX9/10.
 
Soldato
Joined
8 Feb 2009
Posts
3,462
Location
Sheffield
Not sure anyone's test runs go much better with new, higher component density designs. IF it was a production run, then even at 30% yield, 3 runs and they're good to go (albeit they'll mark the cards up to £700 to recoup the "higher manufacturing costs" (ie chips cost 12p to make, not 9p)).

Well normally, they'd have a much greater yield of chips. For example, the ATI 4800 chips are all the same, just the better ones go into 4890's, and the worse ones go into 4830's, very few are binned. Nvidia's chip production is inefficient in that if a chip doesn't match up to specifications, it's scrapped, rather than included in a lower end product. This is why ATI do better price-wise.

I can understand low yields for test runs - I'm not gonna shout "FAIL" before it's even released purely based on some prototype problems, I'm sure everyone has some problems before a product releases. Still, if it carries on this way until final production, it's gonna make them a lot more expensive. I don't know what chip yield normally is, but assuming it's 60%, a 30% yield is going to double the price of the chip (not necessarily the entire GPU).

I doubt chips cost 9p or whatever each, because if they did, this low yield problem would be inconsequential (a 3p price difference doesn't mean anything), and the only possible problem would be stepping up production to meet demand, not prices. And I don't think demand will be a problem for them, seeing as how it's likely to cost a lot at release, and they aren't going to be featured in many prebuilt PCs for a long time.
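The cost reasoning in the post above - that halving the yield doubles the per-chip cost - can be sketched directly. The wafer cost and die count below are placeholders (neither is stated anywhere in the thread); only the ratio matters:

```python
# Cost per *good* die is wafer cost divided by working dies per wafer,
# so it scales inversely with yield. All inputs are illustrative.
def cost_per_good_die(wafer_cost: float, dies_per_wafer: int, yield_frac: float) -> float:
    return wafer_cost / (dies_per_wafer * yield_frac)

at_60 = cost_per_good_die(5000.0, 104, 0.60)  # assumed 'normal' 60% yield
at_30 = cost_per_good_die(5000.0, 104, 0.30)  # the feared 30% yield

print(f"cost ratio: {at_30 / at_60:.1f}x")  # 2.0x - the chip cost doubles
```

Note this is the cost of the chip only, not the whole card, which is why the poster distinguishes the chip price from the price of the entire GPU.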
 
Soldato
Joined
11 Sep 2007
Posts
5,740
Location
from the internet
Well normally, they'd have a much greater yield of chips. For example, the ATI 4800 chips are all the same, just the better ones go into 4890's, and the worse ones go into 4830's, very few are binned. Nvidia's chip production is inefficient in that if a chip doesn't match up to specifications, it's scrapped, rather than included in a lower end product. This is why ATI do better price-wise.

Poor example because the 4890 has a different chip, would be more correct to use the 4870 in that example.
 
Soldato
Joined
24 Jul 2004
Posts
22,594
Location
Devon, UK
I'm surprised they didn't learn from the GT200 with regards to costs. They got seen off there.

I would have had the engineers working on a new architecture from scratch from that very moment.

Looks like i'm going to be staying with the 4870 for a while longer, it's going to be worth peanuts when I come to sell at this rate.
 
Soldato
Joined
7 Nov 2007
Posts
6,814
Location
Required
I'm surprised they didn't learn from the GT200 with regards to costs. They got seen off there.

I would have had the engineers working on a new architecture from scratch from that very moment.

Looks like i'm going to be staying with the 4870 for a while longer, it's going to be worth peanuts when I come to sell at this rate.

This arch was being worked on for like 3 years, it takes a lot longer than you think to design a GPU.
 
Soldato
Joined
8 Feb 2009
Posts
3,462
Location
Sheffield
Poor example because the 4890 has a different chip, would be more correct to use the 4870 in that example.

Does it? Didn't know that, however looking it up, it seems to be the same chip just redone physically for better clocks. Point still stands though, nvidia's chip manufacture is less efficient than ATI's.
 
Soldato
Joined
22 Aug 2008
Posts
8,338
ATI have R800 ready and waiting, to launch alongside W7 at the latest according to FUD.

Unless some DX11 games come along that drop jaws like Crysis did, it's mostly a propaganda victory though.
 