nVidia GT300's Fermi architecture unveiled: 512 cores, up to 6GB GDDR5


BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
BenSkywalker, I wonder if your comment there in the last sentence goes any distance toward explaining that Jon Peddie Research data from back in August showing an utter collapse of the discrete graphics card total market value around 2 yrs ago. (~$6B -> ~$3B)

This one is a bit tricky, and a factor of three different forces combining to make a bit of a perfect storm for that segment. First off you have the general economic conditions impacting the global economy. High end add-in graphics cards, despite what we on this forum may think, are a luxury item. In terms of discretionary spending, they are going to see a sharper decline during an economic downturn than the broader economy.

The next factor is the fairly incredible rise of mainstream graphics relative to demands. The typical consumer is still using a 22" monitor, and even sub-$100 boards today can run almost anything at maximum details at that resolution (although perhaps not with extreme AA). You can see that while the overall revenue has seen a sharp decline, it is far more a factor of ASP than of volume (although both are clearly down). Part of this is what I think of as the 'Crysis factor'. If you look at the ASP of boards, it is actually fairly steady in terms of what it will take to play Crysis at decent settings. Obviously this isn't directly the cause, as Crysis sold a pittance in relative terms, but it plots out a performance target that the typical consumer has, and they spent as much as they needed to get it. Mainstream parts are up from the start of the tracking period they use in that article (although off from their peak, they have been on a steady rise for a couple of quarters while high end parts are still in steady decline).

I would say the decline in PC gaming in general is both a contributing factor to this decline and is compounded by the other two factors. Consoles are overall cheaper than putting together a new PC that is capable of playing games. Most PCs from even five years ago will still do everything else the typical consumer wants without issue; gaming would be the only limiting factor. Most of us think of the cost of converting a PC to a gaming machine simply as a function of the graphics card, but the typical consumer doesn't upgrade remotely as often as we do. So when faced with dropping ~$800 for a system that can game fairly decently or $300 for a console, particularly when we are close to title parity, a lot of consumers are going to make the jump (that isn't a huge factor in the overall sales numbers, but it is certainly a factor).
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Originally posted by: dguy6789
When is the last time an exclusive and good game came out for the PC?

Here is just one article: http://www.industrygamers.com/...-who-killed-pc-gaming/

In the past year?

Demigod
Civilization 4: Colonization
Zeno Clash
Cryostasis
Dawn of War 2
Empire Total War
Crysis Warhead
Stalker Clear Sky
King's Bounty
Trine (at the moment exclusive to PC)
Red Alert 3: Uprising (was exclusive to PC for several months)
Sims 3
Aion
Wasteland
Dawn of Discovery

All off the top of my head.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: MarcVenice
First though: Considering this puppy will most likely be 500mm2 or bigger (i've heard 576mm2)

I believe Anand said it was 467mm2, smaller than the GT200 (he explicitly stated that).

Anand extrapolated this number by assuming 40% more transistors than Cypress will inexplicably result in a 40% bigger die size. Cypress = 334mm2, times 1.4 = 467.7mm2. Now Nvidia says it's OVER 3 billion transistors. It's as good a guess as any, but 576mm2 is what I heard from my own sources. Only time will tell the exact number.

That's right.
467mm2 assumes that the density is the same as the 5870 and that NV is counting the transistors of the cache also.
I would be very surprised if Fermi is below 500mm2.
I wonder if the gaming part will be the same (regarding ECC/caches and ratios).
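For what it's worth, the extrapolation being discussed works out like this (an editor's back-of-envelope sketch using the thread's numbers, not official figures):

```python
# Back-of-envelope die-size estimate: scale Cypress's die area by the
# transistor-count ratio, assuming equal transistor density on 40nm.
cypress_area_mm2 = 334.0      # Cypress (5870) die size, as quoted above
cypress_transistors = 2.15e9  # ~2.15B transistors
fermi_transistors = 3.0e9     # "over 3 billion" per NVIDIA

density = cypress_transistors / cypress_area_mm2  # transistors per mm^2
fermi_area_mm2 = fermi_transistors / density

print(round(fermi_area_mm2, 1))  # ~466 mm^2, close to the 467 figure above
```

Since "over 3 billion" is a floor rather than an exact count, the real die could plausibly land well above this estimate.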

--------------------------------------------------------------

About Fermi now.
We look probably at a GPU with:
48ROPs / 128TUs (or 256TUs) / 512SPs / 384bit mem con / GDDR5

There is a possibility the design is 256TUs.

The load/store units are 16 per SM. 16x16=256.
The load/store units in GT200 are 8 per TPC. 10x8=80.
(Loads and stores are issued to the SM controller and handled in the texture pipeline.)
We will see...

Depending on the core/shader and memory clocks, it definitely will be faster than a 5870.
I suspect 1.5X faster per MHz will be a good average indication of performance relative to a 5870.

I see no problem at all regarding DX11 gaming performance.
The design is very good (at least on paper).
I just want the price to be competitive.
(Also, we don't know if there is going to be a gaming part with a different philosophy regarding ECC/caches/ratios.)

What is surprising is how good (to what degree) the design is for FP64.
The GTS250 has an 1836MHz shader clock.
If the GTX380 has a 1700MHz shader clock, it will be 1.5X faster than a 5870 for FP64.
(The 4890 was more than 3X faster in FP64 relative to a GTX285.)
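That FP64 claim can be sanity-checked with quick arithmetic (an editor's sketch: the 1700MHz clock is the post's speculation; the per-clock rates assume the whitepaper's half-rate FP64 for Fermi and Cypress's 1/5-rate FP64):

```python
# Rough peak FP64 throughput comparison using the thread's assumed clocks.
# Fermi: 512 cores, FMA counts as 2 flops/cycle, FP64 at half the FP32 rate.
fermi_fp64_gflops = 512 * 2 * 1.70 / 2   # 512 x 2 flops x 1.70 GHz / 2

# Cypress/5870: 2720 GFLOPS peak FP32, FP64 at 1/5 rate.
cypress_fp64_gflops = 2720 / 5

ratio = fermi_fp64_gflops / cypress_fp64_gflops
print(round(ratio, 2))  # 1.6 -- roughly the "1.5X" ballpark above
```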

---------------------------------------------------------

If NV can scale this design one level below (a future GTS350) then it would be sweet. (I guess for the value/mainstream parts this design with these ratios doesn't make sense.)

The Fermi design is much more ambitious than the 5870 design (at least on paper), which is a logical thing given what Nvidia is pursuing...

Also, looking at the possible TAM, Nvidia's design makes sense.
The timing is right for NV to pursue this market; things are much more mature now.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
tesla only accounted for 1% of nvidia's income (not profits, just income). now is way too early for the flagship GTX, Quadro, and Tesla mask-sets to diverge simply because of ECC. they can't have two different chips going on a relatively new manufacturing process. a process which, at this size (576? really? that was GT200. what are the odds?) can be considered experimental until we see products. the Tesla cards haven't yet exhibited any exclusive features that were crippled or absent from geforce silicon so I have no idea why they would start now. Tesla may be feature rich, but they definitely aren't sold out yet. nvidia is still begging for growth in that segment.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: alyarb
tesla only accounted for 1% of nvidia's income (not profits, just income). now is way too early for the flagship GTX, Quadro, and Tesla mask-sets to diverge simply because of ECC. they can't have two different chips going on a relatively new manufacturing process. a process which, at this size (576? really? that was GT200. what are the odds?) can be considered experimental until we see products. the Tesla cards haven't yet exhibited any exclusive features that were crippled or absent from geforce silicon so I have no idea why they would start now. Tesla may be feature rich, but they definitely aren't sold out yet. nvidia is still begging for growth in that segment.

Since G80, and more so with G92 and GT200, Nvidia has been listening to developers' requests and what they "needed" for GPGPU-based computing to be a viable resource. G80 through GT200 were the "foot in the door" products to gain some traction in these market segments. Nvidia listened, and acted. Result: Fermi. ECC was one of the most widely requested features for a GPGPU. For all those who read the whitepaper on Fermi's compute functions, you know this.

Alyarb, I'm not sure why you think there can't be two different chips going on a new manufacturing process because of ECC.

 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
they have no need to differentiate unprecedentedly expensive silicon and it's a complete departure from previous trends. History has shown that nvidia can double their transistor count as well as refine GPGPU capability without differentiating mask sets. ECC is not a huge waste of die area so they have no reason to agonize over it.

Why don't I think they'll differentiate? Because it saves money, keys. Do they need a better reason? No. We all read anand's article and the whitepaper. Did you read my post? tesla income was 1%.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: alyarb
they have no need to differentiate unprecedentedly expensive silicon and it's a complete departure from previous trends. History has shown that nvidia can double their transistor count as well as refine GPGPU capability without differentiating mask sets. ECC is not a huge waste of die area so they have no reason to agonize over it.

Why don't I think they'll differentiate? Because it saves money, keys. Do they need a better reason? No. We all read anand's article and the whitepaper. Did you read my post? tesla income was 1%.

Am I just misunderstanding you? Probably. Do you mean they won't have ECC versions and non-ECC versions, but that all chips will be ECC? In that case, I would agree.
It would be cheaper to keep things consistent.

Initially, I thought you meant different chips in the context of, say, a top of the line 512-core part down to, say, a 256-core part. My bad.

 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
oh. well my bad too then.

but yeah as an aside, it raises an interesting question as to whether or not the low-end to mid-range GT300 derivatives, which will have different mask sets, will also include ECC. They've never cannibalized GPGPU from the top-down like this, but it's a possibility. I doubt that a lack of ECC would sabotage single precision performance, which is all you really need if you're asking an IGP to help out with something.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
For the enthusiast part the most probable thing is to have only one design.

The potential GPGPU TAM according to NV over the next 18 months will be more than $1.1 billion.
NV's revenue over 18 months is close to $5 billion now (in the recent past, $6 billion).

If Tesla accounted for less than 1.3% of NVIDIA's total revenue last quarter, and this is indicative of all quarters, then Tesla revenue will be something like $65 million over those 18 months.

It is not very probable, but also not impossible, for NV to have 2 designs if NV thinks Tesla revenue will increase fivefold, for example ($325 million, less than 30% of the potential GPGPU TAM) (but I doubt it).
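The revenue arithmetic above sketches out as follows (all figures are the post's estimates, not reported financials):

```python
# Projecting Tesla revenue over 18 months from its share of quarterly revenue.
nv_revenue_18mo = 5_000_000_000   # ~$5B NVIDIA revenue over 18 months
tesla_share = 0.013               # Tesla < 1.3% of revenue last quarter

tesla_revenue_18mo = nv_revenue_18mo * tesla_share
print(f"${tesla_revenue_18mo / 1e6:.0f}M")  # $65M

# Even a fivefold jump stays under 30% of the claimed $1.1B GPGPU TAM:
print(5 * tesla_revenue_18mo / 1.1e9 < 0.30)  # True
```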

Also, does anyone know if, and by what percentage, there is going to be a performance hit from the ECC implementation? (ECC helps the scientific sectors, but I don't think it matters in gaming; even if GDDR5 leads to errors at a higher rate than before, I suspect this isn't an issue for gaming applications.)

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Originally posted by: BenSkywalker

Outside of MMOs PC gaming is in a torrid state of decline. You don't need to rely on retail sales, you can easily check the financial results of the publishers. Excluding MMOs PC gaming is going to be lucky to clear $1Billion in revenue this year. By way of comparison, Wii Fit, a singular game, has made a bit over $2Billion to date. It is true if you include MMO subs the situation looks much better, with those included PC gaming should hit about $3Billion in total revenue this year, but the majority of that is WoW. Outside of WoW, PC gaming has been in free fall for several years now.
I'm not sure where your PC numbers are coming from, but they're simply incorrect:

http://www.pcgamingalliance.or...ad.asp?ContentID=15911

SAN RAMON, Calif. - May 28, 2009 - The PC Gaming Alliance (PCGA), a nonprofit corporation dedicated to driving the worldwide growth of PC gaming, today unveiled its 2008 Horizons Report, an exclusive research study encompassing all aspects of the PC gaming industry worldwide. Among the key findings is that PC gaming software revenue was a $12.7 billion industry in 2008, up $1.9 billion or nearly 18% from 2007.
PCGA tracks digital sales which is why their figures are actually accurate:

Growth on the software side was driven by on-line sales which continued to accelerate worldwide, particularly in Asia. On-line revenue trends closely follow broadband penetration in country-by-country reporting.

"In 2008, China became the leading market for PC games and almost all the revenue in China was from online business models that involved no physical retail component," stated David Cole, an analyst with DFC Intelligence. "Going forward, we expect China's business model will be implemented on a global basis."
EA's CEO has also stated PC gaming revenue is increasing overall, but most sales figures are derived from traditional boxed sales, so their decline isn't a reflection of reality: http://www.shacknews.com/onearticle.x/52689

Digital distribution is growing: http://www.gamedaily.com/artic...-worrying/19192/?biz=1
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I'm not sure where your PC numbers are coming from, but they're simply incorrect:

You realize you just linked Fudzilla who quoted "Magical fairies that pull numbers out of their rectum"? There is zero credibility in your link, well, the date may be accurate, but I wouldn't bet money on it.

While the console video game market has been soaring (as demonstrated by the just released NPD 2008 totals), PC game sales aren't so hot ? from a retail perspective, that is. The NPD Group has revealed to GameDaily BIZ that PC games totaled just $701 million in 2008, which is down 14 percent from 2007.

It's important to keep in mind, however, that this NPD data concerns retail data only and does not include sales of digitally downloaded games, micro-transactions, online subscriptions, etc. The NPD Group recently started paying more attention to online revenue with its quarterly subscription tracker, but this data does not include that. NPD is expected to release more in-depth information on the PC games market next week.

Link.

Don't worry though, I'm just getting warmed up :)

EA's financial reports. Total PC revenue, including MMO subs, $712Million.

Activision's Financials. Non MMO PC games- $99Million. MMORPG- $1.15Billion.

Here is Ubisoft. 138Million Euros for the PC, be generous and give it 2:1 and that would be $276Million total.

Take Two's. $46.1Million

THQ $88Million

Squeenix(which now owns Eidos). $135Million including MMO subs.

There are the top six PC publishers, accounting for what should be 90% of the PC game market in terms of sales. $509Million total last year that we know didn't come from MMOs, $847Million that came from a mix of non MMO and MMO, and $1.15Billion that came from WoW.

These numbers include all revenue sources, and it is criminal to misreport them. Almost $13Billion is an utterly absurd joke to put it mildly.
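Tallying the publisher figures above (an editor's sketch; all values are the post's reported numbers, in $ millions):

```python
# Splitting the top publishers' PC revenue into the three buckets the post uses.
non_mmo_known = {"Activision": 99, "Ubisoft": 276, "Take-Two": 46.1, "THQ": 88}
mmo_and_non_mmo_mix = {"EA": 712, "Square Enix": 135}  # totals include MMO subs
wow = 1150  # Activision's MMORPG line, i.e. WoW

known_non_mmo = sum(non_mmo_known.values())   # ~$509M
mixed = sum(mmo_and_non_mmo_mix.values())     # $847M
total = known_non_mmo + mixed + wow           # ~$2.5B all-in
print(known_non_mmo, mixed, total)
```

Even summed generously, these land around $2.5 billion, which is the basis of the objection to the $12.7 billion figure (the gap being digital, Asian online, and micro-transaction revenue the publishers' retail-heavy reporting may not capture).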
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: alyarb
an interesting question as to whether or not the low-end to mid-range GT300 derivatives, which will have different mask sets, will also include ECC. They've never cannibalized GPGPU from the top-down like this, but it's a possibility. I doubt that a lack of ECC would sabotage single precision performance, which is all you really need if you're asking an IGP to help out with something.

There are many cost and risk-to-timeline concerns to be weighed of course, but one thing I foresee happening if you strip ECC from the architecture while simultaneously paring down the core counts to reduce die size is that you lose the ability to leverage the considerable design validation and verification efforts that Fermi incurred.

If ECC is not a significant die-size hit, and if it can go unused without causing unacceptable performance hits or some such, then I can't really see how they would justify pitching the ECC in the lower-cost parts and taking on the risk of introducing timeline-impacting bugs/etc.

Look at Intel's Bloomfield versus Nehalem-EP: same chip, same masks, but the Nehalem-EP chips go through a bit more validation to have additional memory controller features enabled, whereas those features are simply disabled (but the circuitry is still present) in the Bloomfield variants.

Given the precedent of how these things are generally handled, I would argue the assumption to be made here is that ECC circuitry will be present but disabled/unused in the mainstream GPU parts.
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
I know it's overkill, but I would be kind of tempted by either Hemlock or the dual Fermi. I'd pretty much be set until 2012, assuming they still make PC games by then ;)
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I am actually looking forward to this release. I may go the big route and get an X2 or a 395 if they make one and sit on it for a couple of years.

I hope somebody does a review of how the newest-gen GPUs fare on the newest i-series CPUs vs the Core 2 Duo. The i5 did not impress much compared to my E8400 with the G200 and 4800 series cards. I'd like to see if the new chips are able to show a difference with the next-gen GPUs.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: dguy6789
I thought it was common knowledge that their $100 and lower cards sold orders of magnitude higher numbers than their flagship cards. It's not even close. Their flagship cards are more marketing tools than anything else. Just like Intel's Extreme Edition processors. It has always been this way for as long as I've been an enthusiast.

Indeed, I'm not saying I know for certain that PC gaming is dying. It's just what seems to be happening based on my observations.

I don't disagree that the PC is largely a superior platform for multiple genres, but every genre is still possible on consoles.

I think about 5 years ago Nvidia released some numbers that showed cards in the $200-300 range outsold the top end cards about 30:1, the $100-200 cards were about 100:1, and sub-$100 was something like 300:1.

 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
PC gaming may grow again if they ever find an adequate piracy-prevention tool. I wonder, if you added lost sales from piracy to actual sales, whether PC gaming has shrunk at all. And you have to count MMOs with the rest of the revenue if you're comparing consoles and PCs, since an MMO is a PC game. And with Star Wars coming out, be prepared to add another billion to that number.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
that's what i've been saying. declining demand for retail games, without consideration of piracy, has no relation to the [increasing] demand for performance GPUs.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: scooterlibby
I know it's overkill, but I would be kind of tempted by either Hemlock or the dual Fermi. I'd pretty much be set until 2012, assuming they still make PC games by then ;)

I'm not a "supercharged" gamer like many of you guys, but I am curious what people think about the noise (see graph at bottom) aspects of the high-end graphics products.

Do you guys water-cool them to get around the noise issue, or is the noise really a non-issue? (just because the chart goes to 67dB doesn't mean it is an issue, I have no reference point to judge how loud 67dB is really)
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Idontcare
Originally posted by: scooterlibby
I know it's overkill, but I would be kind of tempted by either Hemlock or the dual Fermi. I'd pretty much be set until 2012, assuming they still make PC games by then ;)

I'm not a "supercharged" gamer like many of you guys, but I am curious what people think about the noise (see graph at bottom) aspects of the high-end graphics products.

Do you guys water-cool them to get around the noise issue, or is the noise really a non-issue? (just because the chart goes to 67dB doesn't mean it is an issue, I have no reference point to judge how loud 67dB is really)

Noise is definitely a non-issue. Even the loudest of today's cards don't really sound loud at all. Yes, you can hear them, but when you're watching a movie or playing a game or listening to music, the noise goes virtually unnoticed. There are some extremely picky OCD people who insist on having a computer that they can't hear at all even in a silent room, but aside from that, a non-issue. The Geforce FX 5800 Ultra was the first and last card I know of that was so loud that it deterred people from buying it. The quieter the card the better, of course, but even the loudest ones out now aren't really that loud.

The problem with the noise measurement is every site that measures noise measures using a different method or a different distance away. You can really only compare noise using db within the same article.
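For anyone wanting a rule of thumb on why distance matters so much: SPL is logarithmic, falling about 6 dB per doubling of distance for a point source in free field. A minimal sketch of the standard distance correction:

```python
import math

def spl_at(spl_ref_db: float, ref_dist_m: float, new_dist_m: float) -> float:
    """Free-field SPL at a new distance, given a reference measurement.

    Uses the inverse-square law: level changes by -20*log10(d2/d1) dB.
    Real cases (reflective rooms, large heatsinks up close) will deviate.
    """
    return spl_ref_db - 20 * math.log10(new_dist_m / ref_dist_m)

# A card measured at 67 dB from 0.5 m reads only ~61 dB at 1 m:
print(round(spl_at(67, 0.5, 1.0), 1))  # 61.0
```

So a site testing at 0.5 m and a site testing at 1 m can report figures 6 dB apart for the same card, which is exactly why cross-article comparisons mislead.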
 

alcoholbob

Diamond Member
May 24, 2005
6,390
469
126
Originally posted by: Idontcare
Originally posted by: scooterlibby
I know it's overkill, but I would be kind of tempted by either Hemlock or the dual Fermi. I'd pretty much be set until 2012, assuming they still make PC games by then ;)

I'm not a "supercharged" gamer like many of you guys, but I am curious what people think about the noise (see graph at bottom) aspects of the high-end graphics products.

Do you guys water-cool them to get around the noise issue, or is the noise really a non-issue? (just because the chart goes to 67dB doesn't mean it is an issue, I have no reference point to judge how loud 67dB is really)

Were you around during the Athlon XP days? 65db should be the noise level of a 60mm CPU Delta "Screamer" fan, the one that could be heard from 30 feet away behind 3 closed doors.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Astrallite
Originally posted by: Idontcare
Originally posted by: scooterlibby
I know it's overkill, but I would be kind of tempted by either Hemlock or the dual Fermi. I'd pretty much be set until 2012, assuming they still make PC games by then ;)

I'm not a "supercharged" gamer like many of you guys, but I am curious what people think about the noise (see graph at bottom) aspects of the high-end graphics products.

Do you guys water-cool them to get around the noise issue, or is the noise really a non-issue? (just because the chart goes to 67dB doesn't mean it is an issue, I have no reference point to judge how loud 67dB is really)

Were you around during the Athlon XP days? 65db should be the noise level of a 60mm CPU Delta "Screamer" fan, the one that could be heard from 30 feet away behind 3 closed doors.

I was, but I have no recollection of the noise you speak of, so to whatever extent that noise is to be considered "loud," it apparently wasn't loud enough to make an impression on me, which in a roundabout way I suppose is the answer to my question. Thanks :thumbsup: (thanks also to dguy6789 for the post as well)
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
I would never step in and try to speak for the entire forum by saying that 67dB is a "non-issue." For most people, it absolutely is an issue, which is why there were so many 2D 4850/4870 fan tweaks that came out when the cards were introduced.


This is why minimizing 2D power and 2D clocks is so important. The GPU should not make any noise unless it's put to work, and i'd imagine that the fans on the new cards are very tame at idle given the supposed 27-watt consumption in 2D.

edit: damn, that's 15 idle watts and 26 peak 2D watts.

http://www.xbitlabs.com/articl...on-hd5870_7.html#sect0


scrolling down to the bottom you see that the 5870 is the quietest idle card in the objective 1-meter test.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: alyarb
scrolling down to the bottom you see that the 5870 is the quietest idle card in the objective 1-meter test.

I know this isn't the thread to talk about it, but just briefly since we have wandered here anyway: does this mean the batmobile fan shroud is actually function over form? Meaning, despite all the tongue-in-cheek ridicule, the design itself is actually a superior one? That would be some nice vindication for the AMD reference design engineers.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
People tweak settings because it's fun getting the most out of your hardware and getting it to run optimally. I don't know of anyone who tweaks the stock fan speed on a video card out of sheer necessity because it is unbearably loud. (Neither myself nor any of my friends who have been into PC gaming for ~10 years.) The typical Xbox 360 runs as loud as the loudest video cards do. People complaining about noise are most typically complaining because they can hear it, not because it is a problem. My $0.02

Edit: I know the 5870's idle clock speeds are ridiculously low, far lower than any 4800 series card's. I'd be more inclined to say that's the cause of the low noise than just the fan shroud.