
Geforce 4

Not to discount latency altogether. I guess RDRAM was the best option when the PS2 was being developed...

How is the XBox graphics chip related to the GF3? Is it like an nForce with an integrated GF3 core instead of GF2?
 
I can't understand why so many people are saying nVidia is all about speed and that they should focus on quality as well.

What do you think they did with the GF3?
They actually lowered the paper specs and introduced their shading engine, which was, at the time, revolutionary.

The Radeon comes out over half a year later and has a similar but improved featureset, and all of a sudden the GF3 is "all about speed".
Much like the GTS was all about speed, even though it was a direct evolution of the part that introduced T&L to the consumer market.

I'm not saying ATi isn't doing anything right, I think TruForm is a great idea, but people seem to enjoy bashing nVidia for no reason.
Much the same way a lot of people seem to enjoy bashing the P4, I guess.
 
Nvidia has said repeatedly that they have no plans whatsoever to use RDRAM. More likely is DDR at some insanely high clock frequency, or perhaps even QDR. As I understand it, nVidia was the first company to bring DDR mainstream 🙂

Just thoughts from an ignernt party.
 


<< I can't understand why so many people are saying nVidia is all about speed and that they should focus on quality as well.

What do you think they did with the GF3?
They actually lowered the paper specs and introduced their shading engine, which was, at the time, revolutionary.

The Radeon comes out over half a year later and has a similar but improved featureset, and all of a sudden the GF3 is "all about speed".
Much like the GTS was all about speed, even though it was a direct evolution of the part that introduced T&L to the consumer market.

I'm not saying ATi isn't doing anything right, I think TruForm is a great idea, but people seem to enjoy bashing nVidia for no reason.
Much the same way a lot of people seem to enjoy bashing the P4, I guess.
>>



OK... if it sounded like bashing nVidia... I am sorry 😉

I still think that nVidia is a good company that developed/develops a bunch of very cool graphics chips... and their developer section is good, too.

The main reason for complaining may be that (imho) the 'new' Ti 200 and Ti 500 are no new graphics cards at all... and honestly, don't you ask yourself what the point is of releasing the GF3 Ti 200 and Ti 500 AFTER the normal GF3... and kinda introducing them as 'new graphics chips'... with prices to match?

The GF3 is by no means a bad card... I'd be happy with a GF3 Ti 500 AND/OR a Radeon... but e.g. Madrat posted the specs of the GF4... and (OF COURSE) I expect them to have a smaller die and faster MHz... my god, that's how it usually is in the chip business... these things don't get slower and bigger 😉

But I'd rather read something concrete about features and rendering mechanisms/support for DirectX versions... optimized rendering... OpenGL 1.3... Pixel Shader 1.4... DirectX 8.1 (up to DX 9.0)... instead of the plain list of MHz and how many instructions per second it can do. That just doesn't impress me a bit anymore, since I can ASSUME (hey, it's almost the year 2002 😉) that ANY high-end card is 'awfully' fast and pushes that many billion instructions/triangles/etc. per second. I just assume that! (I also tend to almost ignore all those boring benchmark tests showing card X vs. card Y, with people raving because card X gets 5 fps more. It's just ridiculous, because there are MUCH MORE important things, like e.g. image quality... it's what you SEE on the screen in the end that counts.)


 
"How is the XBox graphics chip related to the GF3? Is it like an nForce with an integrated GF3 core instead of GF2?"

It's kind of like that. The XBox graphics chip (XGPU) is more powerful than the GeForce3 core. The XGPU has two vertex shaders instead of one; nVidia doesn't offer a desktop or mobile part with two vertex shaders, and all GeForce3 and Titanium cards have only one. The XGPU also integrates the northbridge of the XBox, so in that regard, yes. The XGPU contains the TwinBank memory controller to the PC3200 (200MHz DDR) memory and a HyperTransport link to the southbridge (MCPX). It also links to the 733MHz Intel processor, which nVidia is able to use in this case because Microsoft has a license for the bus.
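
If anyone wants a rough sense of what that memory setup delivers, here's a quick back-of-the-envelope bandwidth estimate. The 128-bit total width across TwinBank's two channels is my assumption, not something stated above:

```python
# Back-of-the-envelope peak bandwidth for the XBox's unified memory.
# Assumptions (mine): 200 MHz base clock, DDR (2 transfers per clock),
# 128 bits total across TwinBank's two 64-bit channels.
clock_hz = 200e6
transfers_per_clock = 2       # DDR
bus_width_bytes = 128 // 8    # 16 bytes per transfer

bandwidth_gb_s = clock_hz * transfers_per_clock * bus_width_bytes / 1e9
print(f"Peak bandwidth: {bandwidth_gb_s:.1f} GB/s")  # -> 6.4 GB/s
```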
 
"still, these guys are going to be more than a year ahead of anyone else? I don't think so. .1 isn't really feasible right now."


I don't think you understand who "these guys" really are. Taiwan Semiconductor Manufacturing Company (TSMC) has a $52 billion market cap.

To put it into perspective:

AMD ($4.5 billion) + Apple ($6.9 billion) + Nvidia ($7.4 billion) + Rambus ($1 billion) + Transmeta ($339 million) = less than half of TSMC's market value.
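
A quick sanity check of that arithmetic, using the figures quoted above (in billions of USD):

```python
# Market caps in billions of USD, as quoted in the post.
caps = {"AMD": 4.5, "Apple": 6.9, "Nvidia": 7.4,
        "Rambus": 1.0, "Transmeta": 0.339}
tsmc = 52.0

combined = sum(caps.values())
print(f"Combined:     ${combined:.3f}B")         # $20.139B
print(f"Half of TSMC: ${tsmc / 2:.1f}B")         # $26.0B
print(f"Less than half? {combined < tsmc / 2}")  # True
```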


You also might want to read this!
 
The other thing you must remember about TSMC is that when they jump to a new process, not every single product has to come off that line. I'm sure they are still running .25 micron fabs for some ICs.
 
NVIDIA executes their plan and delivers new products on time every other quarter. Whether or not the new products are based on a new architecture is a moot point. What really matters is whether improved image quality and/or speed was introduced with the new product. For the last three years, NVIDIA has passed this test with flying colors, and that is why the last four video cards in MY box have been stamped NVIDIA.

Look at NVIDIA's success and advances in the last three years. Compare that record to ANY corporation's track record over the same time and explain to me why people pick on NVIDIA? Hardware, drivers, XBox chipsets, NFORCE... All they do is release quality product after quality product and consistently make GREAT hardware affordable to the masses through their rapid release schedule.
 
Other companies can do that too; AMD is far more impressive if you ask me.
You seem to bring up a lot of good points, but you forget the bad... typical bias. nForce was seriously delayed due to problems, and Personal Cinema is awful. The only new things they've released are the GF256, GF2, and GF3; all the rest just increase the speed.
 
e.g. how effectively the bus/memory is used, with almost NO performance loss and at the same time a HUGE improvement in image quality...

Again, this is exaggerating. It might look good to you, but not to other people. A majority of people still buy Nvidia-based cards, and if the image quality difference between Nvidia cards and ATi cards were that great, then don't you think that maybe, just maybe, a web site like Anandtech, as an example, would start emphasizing this vital piece of information? It's a myth if you ask me, especially the "2D quality" myth. It depends on your monitor, and it depends on your resolution.

I think I've read that AMD is considering outsourcing some of their fab work to TSMC. TSMC is usually the leader in putting products out with a lower micron spec, but they aren't putting out the amount, complexity, and yield that someone like Intel does.

Actually, the rumors are that AMD is considering outsourcing to UMC, TSMC's competition. Also, TSMC IS "putting out the amount, complexity, and yields" that Intel is, except times 3. Go to tsmc.com for more information. They are by far the largest semiconductor foundry in the world.
 


<< Other companies can do that too; AMD is far more impressive if you ask me.
You seem to bring up a lot of good points, but you forget the bad... typical bias. nForce was seriously delayed due to problems, and Personal Cinema is awful. The only new things they've released are the GF256, GF2, and GF3; all the rest just increase the speed.
>>



Can we have an adult conversation without the name-calling, please? I am NOT biased. I merely described my respect for NVIDIA as a company and outlined my reasons for buying their products. NVIDIA is a kick-ass company that deserves respect, even if you consider producing three video card models in two years, each with multiple speed configurations and driver performance increases, mediocre output.

Did I say ATI was a bad company? As you alluded, AMD is VERY impressive, even more so than NVIDIA. In most respects, the Athlon has passed the Pentium in price and performance, which is why I have an Athlon in my machine. Following this train of thought, I will gladly buy an ATI card once they leapfrog NVIDIA with their pricing, hardware AND drivers. All of these variables must be working in tandem for me to even consider ATI. Right now, I found a GF3 Ti 200 which will end up costing me $99. That's what I consider working in tandem! Winner? NVIDIA and me.

Seriously though, if you think about it, ATI's progress is admirable considering how far ahead NVIDIA was just a year and a half ago. ATI is getting closer to leapfrogging NVIDIA with each and every product release. They're just not there yet in my book. And if the GeForce4 is half of what it is cracked up to be, a higher-clocked 8500 had better be in the works or I might never own an ATI card.
 
The way I look at it, both ATi and nVidia are on top because of the continuity of successful cores into faster and enhanced versions of similar cores.

* NVidia took the RIVA128 core and rolled it through to the TNT to the TNT2 to the GeForce to the GeForce-2 to the GeForce-3 to the GeForce-Titanium.

* ATi took the RAGE core to the RAGE-II to the RAGE-IIc to the RAGE PRO to the RAGE-128 to the RAGE-128PRO to the RADEON to the 8500.

Where did either company do anything different?
 
The main reason for complaining may be that (imho) the 'new' Ti 200 and Ti 500 are no new graphics cards at all...

I find it amusing that this is unexpected to people... nVidia even explained their development process.

They target a 6-month product cycle: Spring and Fall. Fall should be a *new* core, spring is a refresh. A refresh is a minor die change, usually a die shrink to allow faster core speeds, lower power, cost reductions, etc. The only place they tricked us is with the GTS, which they actually released "early". The GF3 put them back on the original schedule.

TnT Fall 98
TnT2/Ultra Spring 99
Geforce256 Fall 99
Geforce2 GTS/MX Spring 00
GF2 Pro/Ultra Fall 00
Geforce3 Spring 01
Geforce3 Ti's Fall 01

I'm a bit liberal with the cutoffs between fall/winter and winter/spring, but it's probably all within a month (depending on whether you're tracking announcements or products on shelves...)

Their development cycle for new cores is approximately 1 year. Geforce3 was released last spring, so expect a new core this coming spring if they remain on schedule.
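
Just to show how mechanical that cadence is, here's a toy sketch that tabulates the cycle described above and projects it one step forward. The alternation labels and the Spring 02 projection come straight from the post; nothing here is insider information:

```python
# Toy model of the stated 6-month cadence: one new core roughly every
# year, with a refresh in between (the GTS being the odd one out).
releases = [
    ("Fall 98",   "TNT",             "new core"),
    ("Spring 99", "TNT2/Ultra",      "refresh"),
    ("Fall 99",   "GeForce 256",     "new core"),
    ("Spring 00", "GeForce2 GTS/MX", "new core, released 'early'"),
    ("Fall 00",   "GF2 Pro/Ultra",   "refresh"),
    ("Spring 01", "GeForce3",        "new core"),
    ("Fall 01",   "GeForce3 Ti",     "refresh"),
]
for season, product, kind in releases:
    print(f"{season:<10} {product:<16} {kind}")

# One year after the GeForce3's Spring 01 debut:
print("Projected next new core: Spring 02")
```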
 


<< The main reason for complaining may be that (imho) the 'new' Ti 200 and Ti 500 are no new graphics cards at all...

I find it amusing that this is unexpected to people... nVidia even explained their development process.

They target a 6-month product cycle: Spring and Fall. Fall should be a *new* core, spring is a refresh. A refresh is a minor die change, usually a die shrink to allow faster core speeds, lower power, cost reductions, etc. The only place they tricked us is with the GTS, which they actually released "early". The GF3 put them back on the original schedule.

TnT Fall 98
TnT2/Ultra Spring 99
Geforce256 Fall 99
Geforce2 GTS/MX Spring 00
GF2 Pro/Ultra Fall 00
Geforce3 Spring 01
Geforce3 Ti's Fall 01

I'm a bit liberal with the cutoffs between fall/winter and winter/spring, but it's probably all within a month (depending on whether you're tracking announcements or products on shelves...)

Their development cycle for new cores is approximately 1 year. Geforce3 was released last spring, so expect a new core this coming spring if they remain on schedule.
>>



Development for a new core is more like 2 years 🙂
 
Development for a new core is more like 2 years

Well, nVidia states that they have 2 design teams. I'm not positive 2 years is accurate, since this would mean the Geforce3 core was being developed in Spring 99, the same time they released the TnT2.

I guess it depends how you define "developed" 🙂 Someone was probably sitting at a table saying "We need 4 dual-textured pipelines with programmable T&L, shaders, 250MHz DDR RAM... etc.", but I would be willing to believe that it only took them a year to get the chip laid out and ramped to production.
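
One way to square "2 years per core" with a new core every year is two design teams working staggered schedules. A little sketch of how that overlap could line up, where the team assignments and start years are purely my illustration:

```python
# Two hypothetical design teams, each taking ~2 years per core,
# staggered by one year, together produce one new core per year.
DESIGN_YEARS = 2
team_starts = {"Team A": 1999, "Team B": 2000}  # illustrative only

for year in range(1999, 2003):
    for team, first in team_starts.items():
        if year >= first and (year - first) % DESIGN_YEARS == 0:
            print(f"Spring {year}: {team} starts a core, "
                  f"ships Spring {year + DESIGN_YEARS}")
# Team A's Spring 99 start shipping in Spring 01 lines up with the GF3.
```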



 


Someone mentioned that the GF4 might be released simultaneously with DirectX 9... sounds reasonable to me...


Btw, no question that today's games don't even utilize the GF3's/Radeon's features... I always get a good laugh out of looking at games like Half-Life... which imho is only one of the hundreds of absolutely similar-looking D3D games (Tomb Raider-like engine... edgy 3D models with some textures glued over them 🙂... amazing that such games are being played on $300 cards... because they look exactly the same on a TNT... the same as they looked in, e.g., 1998.



 


<< NV25 Anti-Aliasing

While looking at the readme file for Comanche 4, I came across something very, very interesting.

Comanche 4 supports anti-aliasing mode only on next-generation video cards featuring Nvidia's NV25 chipset. Comanche 4 will automatically detect whether you are running on a NV25 chipset video card and enable the anti-aliasing selector in the options. This feature has been disabled for all other video cards due to a severe performance penalty when enabled.
>>



From nVnews.

 


<< They target a 6-month product cycle: Spring and Fall. Fall should be a *new* core, spring is a refresh >>



You got that reversed. Spring is the new core, Fall is the update.
 
You got that reversed. Spring is the new core, Fall is the update.

I think nVidia reversed it on me when the GTS was released 🙂
 