Isn't Nvidia going the wrong way? (No, I'm not bashing Nvidia)

ndee

Lifer
Jul 18, 2000
12,680
1
0
First of all, I don't want to bash anyone, but as far as I know, Nvidia is heading toward the concept of "faster memory, more transistors = better video cards" in the near future. I'm really no engineer, so I can't discuss the details of tile-based rendering, but I do know what it is. So isn't 3dfx going the better way with their concept of "better chip design = faster video cards"? They're more concerned with chip design and more invested in tile-based rendering, since they bought GigaPixel, or whatever that company is called.

Just a thought.
 

MGMorden

Diamond Member
Jul 4, 2000
3,348
0
76
This is basically the old argument of AMD vs. Intel. It's always argued that Intel just uses brute force to get higher speeds, while AMD innovates with better features and such. Of course, if one company does it and it works, the other will follow (i.e., Pentium 4). If 3dfx starts making really fast video cards because of features, then Nvidia will just (eventually) pick up those features and add them to their own cards.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91
Well, Microsoft would not have picked Nvidia to supply the Xbox video chipset if it didn't show technology that was better than the rest.


<< Nvidia is heading toward the concept of "faster memory, more transistors = better video cards". I'm really no engineer, so I can't discuss the details of tile-based rendering, but I do know what it is. So isn't 3dfx going the better way with their concept of "better chip design = faster video cards" >>


I don't see how you came up with that assumption.
 

Bartman39

Elite Member | For Sale/Trade
Jul 4, 2000
8,867
51
91
I guess ATI is Cyrix then??? I dunno, but my Radeon 64 meg VIVO sure filled a big void I never even knew I had with my old GeForce2 GTS 64 meg... I used to be nothing but a speed freak but have, as they say, "seen the light"... The only thing I hope is that ATI follows suit with Nvidia & 3dfx on rapid driver updates and support... UT does look damn cool at 1600x1200 32-bit color!
 

EvilDonnyboy

Banned
Jul 28, 2000
1,103
0
0
Just because Microsoft decided to partner with Nvidia doesn't mean Nvidia will make the best products. Remember when Intel partnered with Rambus?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Nvidia is heading toward the concept of "faster memory, more transistors = better video cards".

If you'd seen the NV20 specs, you would know it employs HSR and a special 256K memory cache to combat memory bandwidth stress. So it seems nVidia is going the "smarter" route as well as continuing its old design style.

Is 100 FPS at 1600 x 1200 x 32 in Quake 3 good enough for you?
 

RoboTECH

Platinum Member
Jun 16, 2000
2,034
0
0
BFG, you do understand those Quake3 "numbers" are based purely on statistical analysis, and not on any type of actual test, right?
 

Pocatello

Diamond Member
Oct 11, 1999
9,754
2
76
I think Nvidia should be applauded for its part in pushing the popularity and improvement of DDR RAM, which means lower costs for us.
 

fodd3r

Member
Sep 15, 2000
79
0
0
I'd have to agree Nvidia is rather transistor-happy, just like AMD.

P.S.: Last I checked, AMD is the brute force, not Intel. Don't quote 3DNow! to me; it was an excuse for a weak FPU, and it's largely a rehash of MMX.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
BFG, you do understand those Quake3 "numbers" are based purely on statistical analysis, and not on any type of actual test, right?

I thought he ran actual benchmarks?
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
Well, it isn't publicly known whether those benchmarks are true or reflect anything.. so I'll leave it at that. :)

Yes, NVIDIA is going toward what is known as an advanced traditional architecture. However, not only is this a power-hungry and transistor-hungry design, but it is also one that is very flawed. The requirements for achieving something close to deferred rendering include EXTREMELY high transistor counts and a very high cost. Yes, they can use a cache (a la NV20, for early Z compares/rejects). However, this approach is very flawed and I don't trust their implementation at all (there isn't enough room for the Z data.. I think they are using Z compression (well, I'm pretty well certain of that) to get a low-resolution version of the Z-buffer into the cache to do a rough compare). That is flawed in itself and lacks the MANY benefits of deferred rendering.
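
To make the difference between the two approaches concrete, here is a minimal toy sketch (C++, a 1-D framebuffer, no tiling or binning, purely illustrative and not anyone's actual hardware design). The point is only that an immediate-mode pipeline with early-Z can still shade occluded fragments when geometry arrives back-to-front, while a deferred pass resolves visibility first and shades each covered pixel once.

```cpp
#include <array>
#include <cstdint>
#include <iostream>
#include <vector>

// Toy 1-D "framebuffer": each fragment has a pixel index, a depth and a colour.
struct Fragment { int x; float z; uint32_t colour; };

constexpr int WIDTH = 8;

// Immediate-mode with early-Z: a fragment is shaded only if it passes the depth
// test, but fragments arrive in submission order, so overdraw still happens
// when nearer geometry is submitted last.
int immediateMode(const std::vector<Fragment>& frags,
                  std::array<float, WIDTH>& zbuf,
                  std::array<uint32_t, WIDTH>& fb) {
    int shaded = 0;
    for (const auto& f : frags) {
        if (f.z < zbuf[f.x]) {        // early-Z reject of already-occluded fragments
            zbuf[f.x] = f.z;
            fb[f.x] = f.colour;       // "shading" = writing the colour
            ++shaded;
        }
    }
    return shaded;
}

// Deferred style: resolve visibility for every fragment first, then shade only
// the single visible fragment per pixel (per-tile binning omitted for brevity).
int deferredMode(const std::vector<Fragment>& frags,
                 std::array<float, WIDTH>& zbuf,
                 std::array<uint32_t, WIDTH>& fb) {
    std::array<const Fragment*, WIDTH> visible{};
    for (const auto& f : frags) {     // visibility pass: depth only, no shading
        if (f.z < zbuf[f.x]) {
            zbuf[f.x] = f.z;
            visible[f.x] = &f;
        }
    }
    int shaded = 0;
    for (int x = 0; x < WIDTH; ++x) { // shading pass: at most one fragment per pixel
        if (visible[x]) { fb[x] = visible[x]->colour; ++shaded; }
    }
    return shaded;
}

int main() {
    // Back-to-front submission: worst case for immediate-mode overdraw.
    std::vector<Fragment> frags = {
        {0, 0.9f, 0xFF0000}, {0, 0.5f, 0x00FF00}, {0, 0.1f, 0x0000FF},
        {1, 0.8f, 0xFF0000}, {1, 0.2f, 0x0000FF},
    };
    std::array<float, WIDTH> z1; z1.fill(1.0f);
    std::array<float, WIDTH> z2; z2.fill(1.0f);
    std::array<uint32_t, WIDTH> fb1{}, fb2{};
    std::cout << "immediate-mode shades " << immediateMode(frags, z1, fb1)
              << " fragments, deferred shades " << deferredMode(frags, z2, fb2)
              << " for the same final image\n";
}
```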

NVIDIA's methodology is flawed, and much of it sadly seems to be coming from arrogance (and I'm not the one saying this.. this is from what I've been told.. and not told by 3dfx :)).
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< However, not only is this a power-hungry and transistor-hungry design >>


Aren't multi-chips normally power hungry? Don't OEMs despise multi-chip solutions? Isn't Rampage multi-chip/multi-T&L?

What about V5???

V5 5500 - two .25-micron chips and the need for an onboard power connector... yet nVidia gets blamed for being power hungry :Q


<< NVIDIA's methodology is flawed, and much of it sadly seems to be coming from arrogance (and I'm not the one saying this.. this is from what I've been told.. and not told by 3dfx) >>


Who saw their profits rise by 104% over last year? I'll give you a hint: it wasn't 3dfx.

Intel is extremely arrogant, but they are also #1.
 

Blackhawk2

Senior member
May 1, 2000
455
0
0
<<...Intel is extremely arrogant, but they are also #1...>>

NFS4, look at how many people bought AMD processors and all because of...cost.

The #1 concern is performance, followed closely by the #2 concern of cost. Nvidia will never be able to satisfy #2 when using a traditional renderer to compete against a deferred renderer.

All that's left for Nvidia is #1, and based on the GP-1 benchmarks and GP-3 being in development with T&L, it looks like Nvidia may even lose that battle. :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
However, not only is this a power-hungry and transistor-hungry design

Sorry, for a minute there I thought you were talking about the V5 5500 and V5 6000.

followed closely by the #2 concern of cost. Nvidia will never be able to satisfy #2 when using a traditional renderer to compete against a deferred renderer.

Why not? Look at the GF2 MX. It arguably has the highest price/performance ratio of any video card currently on the market.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
Sorry NFS, your arguments don't hold weight.. Why? Because we are talking about chips, not boards.. Volts, in a sense, = heat.. The future is what we are talking about; we are talking long term.. You don't seem to be able to grasp that.. You continually think in the present, when this industry is continually moving forward...

 

Blackhawk2

Senior member
May 1, 2000
455
0
0
BFG10K,

GP-1 -> 3 million transistors

Voodoo5, Geforce2 -> about 20 - 30 million transistors

That's roughly a factor of 10 difference in transistor count, leading to a smaller die size for the deferred renderer and roughly a factor of 10 less in the department of cost.
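
As a rough back-of-the-envelope illustration of why that transistor gap matters for cost, here is a small C++ sketch. It assumes die area scales with transistor count at a fixed process and that yield falls off exponentially with area; both the scaling and the defect rate are made-up illustrative numbers, not fab data, so only the direction of the result matters.

```cpp
#include <cmath>
#include <iostream>

// Very rough cost model: die area scales with transistor count at a fixed
// process geometry, and the number of good dies falls as area grows.
// Both the area units and the defect rate are illustrative guesses.
double relativeChipCost(double transistorsMillions, double defectsPerArea = 0.02) {
    double area  = transistorsMillions;                // area in arbitrary units
    double yield = std::exp(-defectsPerArea * area);   // simple exponential yield model
    return area / yield;                               // relative cost ~ area / yield
}

int main() {
    double gp1 = relativeChipCost(3.0);    // ~3 million transistor deferred renderer
    double gf2 = relativeChipCost(25.0);   // ~25 million transistor traditional part
    std::cout << "relative cost ratio (GF2-class / GP-1-class): "
              << gf2 / gp1 << "\n";        // ends up above the bare ~8x transistor ratio
}
```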
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
GP-1

I'm not claiming GigaPixel is going to suck but I'm not jumping to any conclusions before I see benchmarks and prices. Also, there could well be an NV20 MX from nVidia which is very consumer oriented.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
They aren't planning an MX, last I heard (3 months ago.. I should check into any changed plans).

They are a bit stuck on going the MX route because they are limited in what they can trim. It's a case where they'll either kill their performance really badly, or they'll lose DX8 functionality.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Sorry NFS, your arguments don't hold weight.. Why? Because we are talking about chips, not boards.. Volts, in a sense, = heat >>


The last time I checked, we consumers buy CARDS, not chips. So it's not fair to talk about heat and complexity (between products) if you don't take into consideration the final product that a person buys.


<< The future is what we are talking about; we are talking long term.. You don't seem to be able to grasp that.. You continually think in the present, when this industry is continually moving forward... >>


Unless you learn about the past and discuss current issues and trends, your outlook for the future will be cloudy.

Besides, considering that we know nothing about Rampage and NV20 (well, all except Dave ;)), how do you all know WHO is going in the right direction?

It's nothing but a smoke screen until we see specs from nVidia and 3dfx...
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91


<< deferred rendering include EXTREMELY high transistor counts and a very high cost >>


I find it hard to believe that Nvidia would produce an expensive chip with the threat of losing their share of the OEM market. I would think that Nvidia would value the OEM market over the retail market. If they produced a $500 card, they would confine themselves to a small niche in the retail market.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
That is entirely accurate.. However, by sticking with a traditional architecture for the long term, they limit themselves. NVIDIA is actually quickly approaching the point of being stuck. Come next summer they will be re-evaluating fabrication capabilities, because they are limited by what the fabs can do for them. Not a safe place to be.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< That is entirely accurate.. However, by sticking with a traditional architecture for the long term, they limit themselves. NVIDIA is actually quickly approaching the point of being stuck. Come next summer they will be re-evaluating fabrication capabilities, because they are limited by what the fabs can do for them. Not a safe place to be. >>


I understand what you mean, but let's see what NV20 can do before you go writing them off already. Who knows what they have in store after NV20?

Geez, you're taking this "future" thing to the extreme.

Besides, nVidia isn't dumb enough to jeopardize their current status...
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
Umm, I'm not coming up with that myself.. I've been told this.. that is what they are doing... It makes sense that they are, really. I mean, NV20 is like 57 million transistors, and NV25/NV2a are both over 60 million. I've heard some rumors on the subject, but I'm not one for rumors, so I'll keep them to myself until I get something more solid.
 

lsd

Golden Member
Sep 26, 2000
1,184
70
91
So you're saying the new Nvidia chipsets will be insanely expensive? That would mean the Xbox will end up with a price over $1000..
I say let's wait until the actual cards, not the specs, come out to make an accurate judgement.
 

DaveB3D

Senior member
Sep 21, 2000
927
0
0
You clearly don't understand how the inside of the market works.. Say the chip costs $40 to make (if it is that much, that REALLY sucks for them.. but just say). They probably do a 3x markup on the chip. However, I expect it will be about $100 a chip for board makers.. That plus memory costs = a very expensive board.. $100 for a chip is just a crazy amount..

However, NVIDIA is making relatively little off of each Xbox chip.. so figure each chip costs MS in actuality $30 or so (rough guess).
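
To spell out the markup arithmetic being described, here is a tiny C++ snippet. The $40 cost, the 3x markup and the memory/board figure are rough guesses (the first two from the post above, the last a pure placeholder), so the output illustrates the reasoning rather than any real pricing.

```cpp
#include <iostream>

int main() {
    // All figures are rough guesses from the post, or placeholders; this only
    // spells out the markup arithmetic, not actual pricing.
    double chipCostToMake    = 40.0;                      // "say the chip costs $40 to make"
    double markup            = 3.0;                       // "probably a 3x markup on the chip"
    double priceToBoardMaker = chipCostToMake * markup;   // $120, in the same ballpark as the ~$100 quoted
    double memoryAndBoard    = 80.0;                      // placeholder guess for RAM + PCB + cooling
    std::cout << "chip price to board maker: ~$" << priceToBoardMaker << "\n";
    std::cout << "rough board cost before the maker's own margin: ~$"
              << (priceToBoardMaker + memoryAndBoard) << "\n";
}
```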