New NV20 Specs!!!! (semi-confirmed by someone near nvidia!)

_Silk_

Junior Member
Mar 2, 2000
22
0
0
I am 99% sure that those specs are faked/wrong - as well as that ridiculous price tag.

I think we already heard the real specs earlier.
300Mhz
4 Pipelines
2 TMU per pipeline
1200Mpix/sec
2400Mtex/sec
With Hidden Surface Removal (HSR), assuming an overdraw factor of 4, they call it 4800Mpix/sec
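
For reference, the arithmetic behind those numbers is trivial to check; here's a quick sketch using only the rumored figures (nothing here is confirmed):

```python
# Quick check of the rumored fill-rate numbers above -- all inputs are the
# rumored figures, not confirmed specs.

core_clock_mhz = 300        # rumored core clock
pipelines = 4               # rumored pixel pipelines
tmus_per_pipeline = 2       # rumored texture units per pipeline
overdraw_factor = 4         # overdraw assumption behind the HSR claim

pixel_fill = core_clock_mhz * pipelines          # 1200 Mpix/sec
texel_fill = pixel_fill * tmus_per_pipeline      # 2400 Mtex/sec
hsr_claim = pixel_fill * overdraw_factor         # 4800 "effective" Mpix/sec

print(f"{pixel_fill} Mpix/s, {texel_fill} Mtex/s, {hsr_claim} Mpix/s claimed with HSR")
```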

First, these spec rumors started floating around, and then there was an interview with a Microsoft rep on the X-Box. The rep said that the X-Box was actually based on the NV20 instead of the NV25, as originally thought. They basically took the NV20 and included chipset functions in it.

Then Microsoft announced the final official specs for the X-Box, which matched up perfectly with the rumored NV20 specs. Coincidence... I think not.

All I'm trying to say is not to get too worked up over this latest rumor. I suggest taking those new rumored specs and their price with a mountain of salt.
 

Hardware

Golden Member
Oct 9, 1999
1,580
0
0
Let's see the numbers: 100 fps at 1600x1200x32 in Q3 vs. 60 fps from an Ultra.
The numbers (speed) are OK!!!

 

RGN

Diamond Member
Feb 24, 2000
6,623
6
81
I'll never spend ~$800 for a vid card. He11, I'll never spend $400 for one.
 

Marty

Banned
Oct 11, 1999
1,534
0
0
The numbers may be reasonable, but the specs themselves are obviously fabricated.

Marty
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81
I doubt they'll charge $800. Almost no one would buy it. That's like the cost of two 1.2GHz T-Birds. Yeah right.

Hardware, you said nvidia has no competition. What are you talking about? What do you call ATI, 3dfx, and Matrox? The Radeon was better than the GF2 all around. If the V5 6K ever comes out, it may be close in performance to the GF2U. And you said the Radeon looks pale compared to the NV20, duh, they are a generation apart. Doesn't the V3 look pale compared to a GeForce? Also, who's to say the Rampage or G800 or a future ATI card won't be as good or better? Remember the leaked specs for the Rampage? They were 1600 Mpixels/6400 Mtexels for the 2-chip mainstream part. I doubt it will be that, but still, both of these are unofficial leaked specs.
 

Hardware

Golden Member
Oct 9, 1999
1,580
0
0
Matrox? Competition? LOL, with their new G450 DDR(-M64)?
3dfx? I don't know if they have any money left to finish the Rampage.
Radeon? Yes, maybe a competitor right now, but only at a lower-fps level.
With the NV20 the 3D quality should be the same as the Radeon's.

I don't know if I'd buy the NV20 at $800 (I am still fighting right now just to buy the GTS Ultra).
 

Sephiroth_IX

Diamond Member
Oct 22, 1999
5,933
0
0
Hardware - Radeon is not a video card company.

1) Matrox
2) 3dfx
3) Radeon

DOESN'T WORK.

ATi is a video card company, and ATi will release a new card, just as Nvidia is doing, to compete with the NV20.
 

VladTrishkin

Senior member
Sep 11, 2000
421
0
0
We still have the 3dFX Rampage, the Matrox G800 and the ATI Radeon MAX... I'd say the next 4-8 months will be very competitive and expensive...
 

superbaby

Senior member
Aug 11, 2000
464
0
0
The two people who will be able to buy it:

Bill Gates
and
The freak who just robbed a bank.

That price is ridiculous. That's $1200 CDN!!!!! THAT'S THE PRICE OF 1GB OF PC133 SDRAM!!!

NO OEM would put that in their system. NO 14-28 year old gamer (their target audience!!!!) would be able to buy that! Are they NUTZZZ?
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
They can't make the NV20 cost 800 US dollars, because no one would buy it. They can make more money selling 1000 cards at $500 than selling 40 at $800. There is also no chance in hell those cards would sell outside the US, since they would cost more than a thousand Canadian bucks (imagine the Australian price); that's just insane. They won't make money on a card no one will buy.
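
The math there is easy to run; the unit counts are just made-up illustrative numbers, not sales data:

```python
# Illustrative numbers only -- the point is the revenue comparison, not the counts.
revenue_at_500 = 1000 * 500   # $500,000 if 1000 people buy at $500
revenue_at_800 = 40 * 800     # $32,000 if only 40 people buy at $800
print(revenue_at_500, revenue_at_800)
```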
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,395
8,558
126
no way in hell they would set that at $800. they can't set the price to whatever they want; like Hardware said, they set the price at the profit-maximizing point. since no one would buy this card at $800 (is it really 3x to 4x faster than a GTS? no... so why the h3ll would anyone pay that much extra?), that is obviously not the profit-maximizing point. of course, nvidia could have waaay too many engineers and just price themselves out of the market. there is that possibility.

if that ram is there then it might just be that expensive, but then you have to ask: if it has 2x the bandwidth of a GTS, then what was the point of HSR? HSR is an attempt to reduce the dependency on exotic memory tech so that cards can perform better with the same cheapo ram. this DDR256 stuff just blows that theory out of the water.
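
quick back-of-the-envelope on that point; the 128-bit numbers are the shipping GTS, the 256-bit case is purely the rumor:

```python
# Back-of-the-envelope bandwidth comparison. The GTS figures are the shipping
# GeForce2 GTS; the 256-bit case is nothing but the rumor.

def bandwidth_gb_s(bus_bits, clock_mhz, ddr=True):
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_bits / 8) * clock_mhz * (2 if ddr else 1) / 1000

gts = bandwidth_gb_s(128, 166)      # ~5.3 GB/s on the GeForce2 GTS
rumored = bandwidth_gb_s(256, 166)  # ~10.6 GB/s if the clock stayed the same

print(f"GTS 128-bit DDR:     {gts:.1f} GB/s")
print(f"rumored 256-bit DDR: {rumored:.1f} GB/s (~{rumored / gts:.0f}x the GTS)")

# the HSR point: skipping an average overdraw of 4 means roughly a quarter of
# the pixel/texture traffic for the same scene -- which is exactly why pairing
# it with exotic RAM looks redundant.
```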
 

Blackhawk2

Senior member
May 1, 2000
455
0
0
<< They can't make the NV20 cost 800 US dollars, because no one would buy it... >>

That's what they said about the GeForce2 Ultra, but apparently people are buying it. :confused:

The chip probably costs $35-$50 to make. How the h*ll do the vendors come up with a price of $800?!?!?! Not to mention that there is practically no competition between the vendors of these cards, as GeForce2s are still in the $300+ range at which they started selling.

I think there needs to be an FTC investigation into these pricing practices, as it seems there is no stopping this stupid price escalation/lack of competition.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,395
8,558
126
i heard the ultra was $500... that's a little bit lower than $800. $300 is a chunk of change. are those things even for sale yet?
 

Hardware

Golden Member
Oct 9, 1999
1,580
0
0
And why is Intel releasing a PIV for $1200-$1500?
Hey boys, wake up: there is always a market for top speed.
 

Soccerman

Elite Member
Oct 9, 1999
6,378
0
0
"I'm sticking with 300MHz. That's the number we've all been hearing for the past 6 months."

I agree these specs are fake.

1st, if anyone implements a 256-bit RAM bus width, they've got to be insane! it would be MUCH easier to go the Gigapixel/PowerVR route! hmm.. makes me wonder if nVidia would buy PowerVR for the tech?

2nd, there's that whole XBOX issue; it was announced as 4800 Mpixels. even that # is bull, straight from the PR, because they're trying to take into account their HSR technique (which DaveB3D taught me is only ONE of MANY features that PowerVR and Gigapixel use to reduce memory bandwidth requirements).

3rd, they'd probably need to shrink the die even more to get 300MHz out of a 6-pipeline core.

as for the price.. hehe.. yeah right!

it's funny to see people defending that price, saying: oh it'll go down before it gets released!

by that logic, I guess the Voodoo 5 6K should go down in price too? oh wait.. that IS logical! the older the tech, the less useful, therefore it's worth less money (unless it's extremely rare).

maybe people should think before they state prices (cough about the V5 6K cough).
 

KarsinTheHutt

Golden Member
Jun 28, 2000
1,687
0
0
Ok ok... the only way to bring vid card prices down is to NOT buy top of the line products. If card makers can't sell any cards for $400, they won't price them so high.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,395
8,558
126
p4s will be ~$800 at launch. the $1200 ones are the "engineering samples" in japan that show up a month before the official launch and are probably unlocked as well. and that's a CPU, something you use every single second you're on the computer. that has a lot larger demand at $800 than a gaming video card does.
 

VladTrishkin

Senior member
Sep 11, 2000
421
0
0


<< The chip probably costs $35-$50 to make. How the h*ll do the vendors come up with a price of $800?!?!?! Not to mention that there is practically no competition between the vendors of these cards, as GeForce2s are still in the $300+ range at which they started selling. >>



-Yes, the GPU chip does ;) Add the memory and the rest of the PCB components and you'll run up to around $150+.


---------------


Hey Soc ;)




<< 1st, if anyone implements a 256-bit RAM bus width, they've got to be insane! it would be MUCH easier to go the Gigapixel/PowerVR route! hmm.. makes me wonder if nVidia would buy PowerVR for the tech? >>



-I don't agree here. They could simply implement two (double) data-width paths to the memory, clock-synchronized, and still be able to boast two 128-bit memory buses on the same channel. Of course this will (probably) require a 2+ GPU chip solution, because one chip won't be able to render this much data at a higher latency (on one data bus). The story changes if the latency of the data bus can decrease to, say, 3 (one data fetch per 3 clock cycles); then we might see a powerful GPU and an astonishing amount of bandwidth throughput.
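
A toy model of what I mean, latency included; every number below is hypothetical, just to illustrate the point:

```python
# Toy model of the two-channel idea above -- every number is hypothetical.

bus_bytes = 128 // 8      # one 128-bit path moves 16 bytes per fetch
channels = 2              # two clock-synchronized 128-bit paths
clock_mhz = 200           # illustrative memory clock, not a rumored spec
cycles_per_fetch = 3      # "one data fetch per 3 clock cycles"

raw_peak_gb_s = channels * bus_bytes * clock_mhz * 2 / 1000   # DDR peak
# one 16-byte transfer every 3 cycles per channel:
effective_gb_s = channels * (clock_mhz * 1e6 / cycles_per_fetch) * bus_bytes / 1e9

print(f"raw DDR peak:         {raw_peak_gb_s:.1f} GB/s")
print(f"with 3-cycle fetches: {effective_gb_s:.1f} GB/s")
```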



<< 3rd, they'd probably need to shrink the die even more to get 300MHz out of a 6-pipeline core. >>



That's my original point, but shrinking the die still might not do enough to produce high enough chip yields. We might see an advanced rendering pipeline implementation here. I am guessing that the new GPU will have a smaller (15-30mm²?) T&L unit on the chip, and a longer, more complicated pipeline array. I believe that the 6 pipelines will be able to produce this fillrate by sharing the assigned value data. This basically means that the 6 pipelines will be dependent on each other and will each render their own fetched data. I.e.:

Pipeline 1 renders one (or more) instruction per clock,
the 2nd pipeline renders the 2nd+ instructions, and so on...

This creates quite a few problems (i.e.: latency times, slow fetch unit, etc).
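
A toy sketch of the kind of instruction hand-off I mean (purely illustrative, obviously not the real NV20 design):

```python
# Toy round-robin hand-off of fetched work across 6 pipelines, as described
# above. Purely illustrative -- not the actual NV20 design.

NUM_PIPELINES = 6

def distribute(instructions):
    """Give instruction i to pipeline i % NUM_PIPELINES: pipeline 1 gets the
    1st, pipeline 2 the 2nd, and so on."""
    assignments = [[] for _ in range(NUM_PIPELINES)]
    for i, instr in enumerate(instructions):
        assignments[i % NUM_PIPELINES].append(instr)
    return assignments

work = [f"instr_{n}" for n in range(12)]
for pipe, instrs in enumerate(distribute(work), start=1):
    print(f"pipeline {pipe}: {instrs}")

# the problems mentioned above: one shared fetch unit feeding all six, plus
# dependencies between pipelines, mean a stall in any one holds up the rest.
```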

NV20 reminds me of the P4 :)
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
those guys who got V2 SLI set a FREAKING BAD example: that we are willing to pay $600 for ANYTHING that is high-end.
now the more people buy this $800 NV20, the more we are saying your NV25 should cost $1000. we are telling nvidia "set the price of your NV25 to $1000 and we will buy them".