GeForce 4

Binge

Junior Member
Nov 19, 2001
Do we have any idea of the approximate time frame for the next-gen offering from Nvidia?
Any links, comments, or ideas on what to expect as far as the card's general capabilities?
I know Nvidia seems to be on a 6-month cycle, so is next spring/summer the time frame we should expect?

I'm interested in any and all comments, links, musings, bashings, etc.
 

vash

Platinum Member
Feb 13, 2001
Expect to see the next card from Nvidia come April-May. Teasers will start showing up in March.

vash
 

flexy

Diamond Member
Sep 28, 2001

Well... 'til then I'll stick with my Radeon 8500 and get an impression of whether the Radeon works for me, how compatible it is, etc. ... and I can see whether ATI is able to release decent drivers. If not (let's say by March or April)... I'll sell my Radeon and get a GeForce 4 (hehe ;) )... assuming its price is OK and not insanely overpriced like the GF3 Ti 500... which isn't even a new card... it's merely another name and a few more MHz... and whoop! "Yeah... let's sell it for $320!"

I think it's cool that ATI is starting to k*ck Nvidia's a$$... in the end we can only benefit from it, meaning Nvidia will think twice about selling their new GeForce 4 for $450 or something in that area just because there are no alternative brands out there...



 

Diable

Senior member
Sep 28, 2001
Flexy, cards using Nvidia's top-of-the-line chip will still sell in the $400 range when they're first introduced, because Nvidia and the card makers know there are guys who will pay $400+ to be the first on their block with the "world's fastest graphics card".
 

flexy

Diamond Member
Sep 28, 2001


<< Flexy, cards using Nvidia's top-of-the-line chip will still sell in the $400 range when they're first introduced, because Nvidia and the card makers know there are guys who will pay $400+ to be the first on their block with the "world's fastest graphics card". >>



Not if there are equal products out there (maybe not only equal... maybe even superior ;) ) which are priced more attractively... because then people would 'just' go out and buy the better product, which may even be cheaper at the same time. (Yeah... it's called competition... and the consumer usually benefits from it ;)

I don't know if I'd go out and buy a gfx card for $400... on the other hand it's likely, since I already planned to get a Ti 500 for $325 (incl. s/h)... and the Ti 500 is by far NO new card AT ALL... but I personally was willing to sacrifice that just to have 'the best out there'... OK, 'til I changed my mind and ordered a Radeon 8500.

I just think that Nvidia should think twice about what products they want to sell/release, with what features... and at what price!! $100 more for an 'old' GF3 ('Titanium' added to the name) and a few more MHz... it's still no NEW chip... but it costs as much as one!

That's what I meant by 'I hope ATI will kick Nvidia's a$$ a bit'... Nvidia will have no choice but to price their products with ONE eye on ATI and on the features of the Radeon... because if not, then suddenly people COULD begin to ask what the **** they're actually paying their money for???? "Oh hey... here I get a similar card... with cooler features, for $120 less."
(And I think success will still be decided by the 'masses'... and not by a few geeks who spend insane amounts just to get 5 fps more... the most successful card will be the one with the best price/performance...)

 

ec98214

Junior Member
Jun 18, 2001
Obviously it won't have dual-vision, since it will be a high-end board and Nvidia doesn't do high-end boards with dual-vision (or TwinView or HydraVision or cyclops-vision or fuzzy-vision or...)

Second, it will obviously be a GeForce XP! Everybody is doing it, so why can't they?
XP will mean Xtreme Pricey! ;)

Third, it will give us a frame rate of 400 fps in Quake3 at 640x480!
Whether we use that or not is up to us! :)

Fourth: I'm just kidding! Don't get me wrong!
I have an Asus GeForce DDR Deluxe and I'm very happy with it (although it's a little slow nowadays!)

Take care!
 

NeonFlak

Senior member
Sep 27, 2000
Can't believe people are already worried about what Nvidia is coming out with next =) I'm a happy puppy with my 8500... maybe that's just me.
 

MadRat

Lifer
Oct 14, 1999
NVidia GeForce4 Datasheet

Process: .10u
Transistors: 75M
Pixel Pipelines: 8
Simultaneous Textures Per Pixel: 4
Maximum Active Textures Per Pixel per Pass: 4
Number of Pixel Shading Operations per Pass: 36 (4 per combiner, 9 combiners)
Vertex Instructions Per Vertex per Pass: 128
Core Frequency Of Released Card: 400 MHz
Memory Frequency Of Released Card: 533 MHz 32-bit RDRAM
Frame Buffer Of Released Card: 128MB
Supported Memory: RDRAM / DDR266
Memory Interface Width: 512-bits/256-bits
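
Taking those rumored numbers at face value, here's a quick back-of-the-envelope peak-bandwidth check; this is my own arithmetic, not part of the sheet, and the GeForce3 line is only there for scale:

```python
# Peak memory bandwidth implied by the rumored specs above.
# All "GeForce4" numbers are speculation from the sheet; GeForce3 shown for scale.

def bandwidth_gb_s(effective_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s: transfers/second * bytes per transfer."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Rumored: 533 MHz RDRAM across a 512-bit interface
print(bandwidth_gb_s(533, 512))   # ~34.1 GB/s
# Shipping GeForce3: 230 MHz DDR (460 MHz effective) on a 128-bit bus
print(bandwidth_gb_s(460, 128))   # ~7.4 GB/s
```

If real, that would be roughly 4.6x the memory bandwidth of a shipping GeForce3, which goes some way toward explaining the skepticism further down the thread.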
 

Phuz

Diamond Member
Jul 15, 2000


<< $400 for a graphics card is insane. I can buy a whole modern computer for less money. >>



Don't move to Canada then.. :)

It's pretty hard not to spend $300-400 on a decent card here... and that's AFTER it's been out for a while...
 

Mem

Lifer
Apr 23, 2000


<< NV25 Speculation

--------------------------------------------------------------------------------
My contact at a major motherboard/video card company just let me know that the NV25 will be showing up in January and that the NV17 will bridge the gap, showing up next month. From what he told me, the NV25 will deliver a lot more punch than the current GeForce3 line-up. Get ready for NVIDIA to deliver another knock-out punch in the graphic card battle.
>>



From nVNews.
 

vash

Platinum Member
Feb 13, 2001


<< Yeah, that's cool, but what about the same level games? Good games are extremely few nowadays and I don't think it makes sense to rush to the store and pay $300 just to play a couple of games. nVidia should be making games too, b/c their cards are much too powerful but nothing to play on them. >>

Most games, unfortunately, aren't written to fully take advantage of a card's existing hardware. Developers make games that try to run on as many cards as possible. Any optimization for one card's features/functions helps the game look good on XXX's hardware, but it may never run (or run properly) on YYY's card. So, in order to keep a relatively level playing field, they have to make games that will run on everyone's system.

The only way I can justify buying a $300-$400 card every other year is the fact that I play online FPS games that are really fast-paced and require a high framerate to stay competitive (120+ fps constantly, never below 100). Sure, 98%+ of the population out there doesn't need Quake3 at very high resolution with all the eye candy on and a really incredible framerate, but I do -- that's why I buy the cards.
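
To put those targets in frame-time terms (my own arithmetic, just to illustrate the point):

```python
# Convert framerate targets into per-frame time budgets.
for fps in (60, 100, 120):
    print(f"{fps:3d} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms, 100 fps -> 10.0 ms, 120 fps -> 8.3 ms per frame.
# Holding "never below 100" means every frame (game logic + rendering) must
# finish in ~10 ms; the sustained minimum, not the average, is the hard part.
```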

vash
 

flexy

Diamond Member
Sep 28, 2001


<< NVidia GeForce4 Datasheet

Process: .10u
Transistors: 75M
Pixel Pipelines: 8
Simultaneous Textures Per Pixel: 4
Maximum Active Textures Per Pixel per Pass: 4
Number of Pixel Shading Operations per Pass: 36 (4 per combiner, 9 combiners)
Vertex Instructions Per Vertex per Pass: 128
Core Frequency Of Released Card: 400 MHz
Memory Frequency Of Released Card: 533 MHz 32-bit RDRAM
Frame Buffer Of Released Card: 128MB
Supported Memory: RDRAM / DDR266
Memory Interface Width: 512-bits/256-bits
>>




MadRat,

It doesn't impress me a BIT! They can crank the core MHz to 2 GHz and the mem to 3 GHz... I think it's time to rethink the philosophy behind graphics cards... do you think there is a real-life difference whether you play GameXYZ at 120 fps... at 150, or at 250 fps?

Instead of cranking up the MHz (yawn...), Nvidia should finally come up with something similar to the Radeon... did you ever look at its specs... what TruForm really does... e.g. how efficiently the bus/memory is used, with almost NO performance loss and at the same time a HUGE improvement in image quality... new rendering algorithms, etc. ... THAT'S what is interesting... and... believe me... if someone thinks he needs to spend $100 more on a GF (say a GF4, whose specs/technology I still don't know)... then so be it.
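
Side note on what TruForm actually is, since it's the crux here: ATI's N-patch tessellation. The game sends its normal low-poly mesh, and the chip subdivides each triangle on the fly using the vertex normals. Below is a minimal sketch of the bus-traffic argument, using made-up illustrative sizes, not measured TruForm figures:

```python
# Why on-chip tessellation saves bus bandwidth: the AGP bus only ever carries
# the base mesh; extra triangles are generated on the GPU. Sizes are assumptions.

VERTEX_BYTES = 32  # position + normal + one texture coordinate, a common layout

def mesh_mb(triangles: int) -> float:
    """Raw vertex traffic in MB for an unindexed triangle list."""
    return triangles * 3 * VERTEX_BYTES / 1e6

base_tris = 10_000   # triangles the game actually sends
level = 3            # N-patch subdivision: each triangle becomes level**2 triangles

print(f"over the bus: {mesh_mb(base_tris):.2f} MB per frame")          # 0.96 MB
print(f"on screen:    {base_tris * level ** 2} triangles, which would be "
      f"{mesh_mb(base_tris * level ** 2):.2f} MB if sent pre-tessellated")  # 8.64 MB
```

The point is that rendered detail scales with level**2 while bus traffic stays constant, which is the "almost NO performance loss" part of the pitch.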

I for my part will for sure get the card with the best FEATURES, rendering techniques... DirectX N.N support... because I really don't give a [pleep] anymore if a card gets 10 fps more or less in 5Dmark2004... I think it's time to focus on what a card actually CAN do... in terms of image quality, rendering... I am sooooo tired of those blobby, edgy-looking Direct3D models in almost every 3D game... it still looks the same as an eternity ago when I had a TNT... and ATI right now is the ONLY company getting off their a$$ and doing something that could REALLY improve the visual appearance of games, both new games to come and even older ones!!!!! If NV decides to go with plain MHz... so be it... but the GF4 then for sure won't be mine ;)




 

Binge

Junior Member
Nov 19, 2001
NV25 by January?? Seems early to me, but I hope you're correct. By spring I'll be ready to replace my GF256 DDR.
I figure the NV25 card will carry me through Doom 3 and DNF (DNF looks to be a 2002 game now! Will that game ever be finished?).
 

Diable

Senior member
Sep 28, 2001
Flexy, I don't think there's going to be a quantum shift in the minds of most hardcore gamers (the people who buy $300+ video cards) away from fps. Fps are a quantifiable thing, whereas image quality is very subjective: what looks good to me on my monitor may not look good to you on yours, and vice versa.
 

Spook

Platinum Member
Nov 29, 1999
Well, don't count out the Bitboys.... they might have another bright idea by then.....


 

Athlex

Golden Member
Jun 17, 2000
Bitboys... Did they ever release a product? Interesting technology on paper, but did any of it see the light of day?
 

BFG10K

Lifer
Aug 14, 2000
<< Memory Frequency Of Released Card: 533 MHz 32-bit RDRAM >>

That's definitely false speculation. There is no way in hell anyone would use RDRAM on video cards.
 

AA0

Golden Member
Sep 5, 2001
That spec sheet does not look right. AMD won't have a .13 micron process out by then, and you're saying Nvidia is going to have .10? Intel doesn't plan on having their next die shrink for quite some time either.
 

jeffrey

Golden Member
Jun 7, 2000
It's not up to AMD, INTC, or even NVIDIA what micron process the GF4 will be produced on. NVIDIA outsources their chip manufacturing to Taiwan Semiconductor Manufacturing Company (TSMC). If you hear that TSMC is sampling .10 micron, you can bet NVIDIA will be manufacturing chips on those lines the moment they are ready.

 

AA0

Golden Member
Sep 5, 2001
Still, these guys are going to be more than a year ahead of anyone else? I don't think so. .10 isn't really feasible right now.
Also, as already said, RDRAM would be extremely poor to incorporate on a video card for heat and latency reasons.
 

Athlex

Golden Member
Jun 17, 2000
Latency isn't as big a deal in a video card as bandwidth, which is why Sony went with RDRAM in the PS2.
Heat probably wouldn't be an issue either, since they're already using heatsinks on many of their SDRAM-based cards.
Price is probably the most prohibitive aspect of RDRAM in this case... that, and the fact that all their cards up to now use SDRAM. I'd be surprised if they switched...
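
A toy model of the latency-vs-bandwidth point; the latency and bandwidth figures below are illustrative assumptions, not real RDRAM timings:

```python
# Streaming workloads (textures, framebuffer) amortize first-access latency over
# long bursts; small random reads do not. Figures are assumptions, not datasheet values.

def access_time_us(num_bytes: int, latency_ns: float, gb_per_s: float) -> float:
    """Total time for one transfer: fixed latency plus bytes over bandwidth."""
    return latency_ns / 1e3 + num_bytes / (gb_per_s * 1e3)

# 4 KB texture burst: the 50 ns latency is under 4% of the total time
print(access_time_us(4096, latency_ns=50, gb_per_s=3.2))  # ~1.33 us
# 32-byte random read: the same latency is over 80% of the total time
print(access_time_us(32, latency_ns=50, gb_per_s=3.2))    # ~0.06 us
```

Whether RDRAM's real latency was livable for graphics is exactly what the ATI/nVidia comments quoted further down push back on.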
 

Mats

Senior member
Jul 10, 2001

"b/c their cards are much too powerful but nothing to play on them."


^^^^ This smells like BS. :confused:
 

Fandu

Golden Member
Oct 9, 1999


<< Latency isn't as big a deal in a video card as bandwidth, which is why Sony went with RDRAM in the PS2. >>



Heheheh, hahahaha... ROFLMAO. I think it was Kyle Bennett who interviewed both ATI and nVidia and asked about RDRAM; they both stated flat out that the latency was much too high to work in a graphics-card environment. You're definitely right on the cost factor, though, although I'm quite sure that if they really wanted to, they could eat those few dollars less profit per card.
 

daweasel

Senior member
May 29, 2001
AA0,

I think I've read that AMD is considering outsourcing some of their fab work to TSMC. TSMC is usually the leader in putting products out at a lower micron spec, but they aren't putting out the volume, complexity, and yield that someone like Intel does. They of course offer high-spec stuff too (which of course would be cheaper for the consumer).

I bet if they pushed it hard they could tape something out and have it ready by January. The Xbox chip is already more complex than the GF3, and that chip is already in use.