5800 and 8500: Ironic??

Reliant

Diamond Member
I noticed that too, but I just figured it was 5800 because their last card was the 4600 and that was the next step. :)
 

merlocka

Platinum Member
Originally posted by: Adul
5 = 5th generation core in the GPU line

800 = 8 rendering pipelines.

Well, it sure as heck isn't their 5th generation core, no matter what marketing sells ya ;)

And the "800" explains the GeForce4 4200, 4400, 4600, and 4800 nicely :)

It's marketing; there's no meaning behind it! Look at the GeForce4 MX. Look at the Radeon 9000, and the 8500 which becomes the 9100. Just numbers (and someone who gets paid a lot of money to put the numbers on the box).

 

EdipisReks

Platinum Member
Originally posted by: merlocka

Well, it sure as heck isn't their 5th generation core, no matter what marketing sells ya ;)

It is certainly their fifth-generation GPU. Whether you want to call the GeForce4 an entire generation over the GeForce3 is your problem, not nVidia's.
 

bunnyfubbles

Lifer
The GF4 was just as much of a change over the GF3 as the GF2 was over the GF 256.

They used 5800 because the 4800/4600 was their top GF4. The Ti 4800 was just coined for marketing (mostly a European thing, from what I understand); it isn't any faster than a 4600. The numbering thing works: it works for AMD, it works for ATI. nVidia decided it was a good idea starting with their GF3 Ti line, instead of using "Ultra" to denote the faster of the boards, since the Ti 200 was slower than the plain GF3 yet far from being called an "MX."

Now they have a 5800 Ultra, so I guess they are trying to get the best of both worlds.
 

merlocka

Platinum Member
Originally posted by: EdipisReks
Originally posted by: merlocka

Well, it sure as heck isn't their 5th generation core, no matter what marketing sells ya ;)

It is certainly their fifth-generation GPU. Whether you want to call the GeForce4 an entire generation over the GeForce3 is your problem, not nVidia's.

Actually, I want to call you a buttfungus. But I won't!

nVidia has designed more than 8 "cores" since they started calling them GPUs. As far as GPUs go, if you want to be specific, there have been the NV10, 11, 15, 17, 20, 24, 25, and 30, most of which have had their GL variants as well. And don't forget the pre-"GPU" cores (since I mentioned only cores). Just because marketing tags them as a "generation" doesn't mean they are: the GeForce4 MX, for example.

But since marketing just uses the numbers to sound good, it doesn't really matter. If it were GeForce7, it would be the GeForce7 7700, because that sounds big and it leaves room for a 7900 (which is faster) and the 7500 and 7300 (NV31 and NV34).

You guys act like the marketing guys really sweat that stuff out.
 

EdipisReks

Platinum Member
merlocka, only those numbers that are divisible by 5 are generations. Those that are not divisible by 5 are simply cut-down bargain versions. YOU are the buttfungus.
 

tapir

Senior member
So... the 5800 will have the same issue as the 8500 and get incredible performance gains later when better drivers are released! muahahhahaha

... except that doesn't make any sense, because "nVidia drivers are incredible"
 

merlocka

Platinum Member
Originally posted by: EdipisReks
merlocka, only those numbers that are divisible by 5 are generations. Those that are not divisible by 5 are simply cut-down bargain versions. YOU are the buttfungus.

You ->




Point ->



Since you missed it, I'll clarify. The NV numbers are internal identifiers; marketing puts whatever name on a product they see fit. My (apparently overlooked) example was the GeForce4 MX. The "GeForce4" portion of the name would be indicative of a "4th generation" core, yet it's based on the NV17 core, which is neither divisible by 5 nor at all similar to the NV25 core that is its namesake.

Stepping outside the world of nVidia, we see that ATI has chosen a naming scheme which (supposedly) places the DirectX version as the initial digit, with the series identifier as the following three digits; hence the Radeon 7000, 7200, and 7500 being DX7 parts and the 8500 being a DX8 part. Naturally, the marketeers would have you believe that the Radeon 9000 is a DX9 part? Oh, and the 8500 is deftly renamed to the 9100 to demonstrate that it is indeed a DX9 part? Or perhaps they meant that the drivers are DX9 compatible? Or perhaps they are just keeping up with nVidia, whose cruel marketing tactics forced them to name it as such.
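(Purely for illustration: a minimal Python sketch of the "first digit = DirectX version" reading described above, assuming four-digit model numbers. The parse_radeon helper and the actual_dx table are made up for this example, and the DX levels listed cover only the parts mentioned in this thread.)

# Hypothetical sketch of the naming scheme described above; parse_radeon and
# actual_dx are illustrative assumptions, not anything ATI published.
def parse_radeon(model):
    """Split a four-digit Radeon model into (implied DX version, series number)."""
    return model // 1000, model % 1000

# Actual DirectX generation of the parts mentioned in the thread.
actual_dx = {7000: 7, 7200: 7, 7500: 7, 8500: 8, 9000: 8, 9100: 8}

for model, dx in sorted(actual_dx.items()):
    implied, series = parse_radeon(model)
    note = "" if implied == dx else f" <- name implies DX{implied}, part is DX{dx}"
    print(f"Radeon {model}: series {series:03d}{note}")

Run it and the 9000 and 9100 are the only rows flagged, which is exactly the renaming game being complained about here.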

I'll simplify it as much as I can... the names don't mean anything.

Kinda like how AMD called the Palomino the "Athlon 4"... I'm sure it had nothing to do with the naming of the Pentium 4.

Oh yeah, and I clearly stated that I wouldn't call you a buttfungus, but you seemed to miss that too ;)
 

bunnyfubbles

Lifer
You babies need to quit crying over what they name it. If anything, they named it 5xxx because the GeForce4s were 4xxx and 4xx. The GF3 Tis were the 200 and 500; the GF4 MXs are the 420, 440, and 460, trying to get you to believe they are in that range of GeForce3s, while the Tis are far above and beyond, eclipsing the 1000s!

merlocka - you are right, nVidia does have more than 5 generations of cores, but you are also wrong, because they don't have more than 5 generations in the GeForce line of GPUs.

#1 GeForce 1 - GeForce 256 (SDR and DDR)
#2 GeForce 2 - GeForce 2 MX, GeForce 2 GTS, GeForce 2 Ti, GeForce 2 Ultra
#3 GeForce 3 - GeForce 3, GeForce 3 Ti 200, GeForce 3 Ti 500
#4 GeForce 4 - GeForce 4 Ti 4200, GeForce 4 Ti 4400, GeForce 4 Ti 4600/4800, GeForce 4 MX 420-460
#5 GeForce 5 - GeForce FX 5800/Ultra

It seems as if they are dropping the numbers - no more GeForce "number goes here." Instead, the Ti/MX theme seems to be the way they'll title their cards, now going from Ti to FX and using a numbering system like ATI's to indicate the different levels. Who knows, maybe there will be a low-cost, stripped-down GeForce Ti 5xxx board that doesn't have the capabilities of the FX but is still just as fast as far as gaming goes. Who knows...