
OMG nVidia is going all 3dfx on us :(


BentValve

Diamond Member
Dec 26, 2001
4,190
0
0
Originally posted by: human2k
on-board graphics is good enuf here. Can invest that $500 on my education instead of stupid graphx card to play quake 3 or some dumb FPS really fast.



www.emachines.com
Your Source For the Best High Performance Computer Parts/Rigs


Nuff said, no? :D
 

Darien

Platinum Member
Feb 27, 2002
2,817
1
0
Originally posted by: BentValve
Originally posted by: human2k
on-board graphics is good enuf here. Can invest that $500 on my education instead of stupid graphx card to play quake 3 or some dumb FPS really fast.



www.emachines.com
Your Source For the Best High Performance Computer Parts/Rigs


Nuff said, no? :D



Hahaha :D
 

sandorski

No Lifer
Oct 10, 1999
70,785
6,345
126
Originally posted by: jasonsRX7
Originally posted by: sandorski
Ah yes, once again 3DFX is vindicated as former criticism comes full circle back to those who first spake it. Extra power, huge real estate, and FSAA; who's the schmuck now? :D

Seriously, the Voodoo 5/6 were omens of the future. One day, being upset at a videocard's need for a power connector will be like being upset that a hard drive uses a power connector now: who cares? As long as the thing works, pumps out fps in a gratuitous manner, and causes/satisfies gaming addiction, are we really going to care or fret over one lost power connector or one (mostly useless anyway) PCI slot? I'm not.

3dfx was requiring the power brick when their competitors were producing products that performed as well as or better than the 3dfx cards and did NOT require the extra power. That was proof that 3dfx's design was not an "omen of the future" but rather a product that was rushed and not refined.

If requiring an external power source on a video card is going to be an industry-wide solution, then so be it. I'll only place blame if all the other manufacturers are able to get equal performance without it.

Nvidia, having bought 3dfx after their failure, would not be the schmuck. ATI, having caught up to Nvidia and surpassed them in many areas for the time being, would also not be the schmuck.

This is not true; the Voodoo5/6 were designed to come out alongside the original GeForce, and they were far superior to it. Anyway, the point is that power consumption for video cards is/was exceeding the capacity of the interface; if anything, video cards are advancing faster than chip and interface capacity. That's the point.

BTW, the schmuck is anyone (corporate or user) who slagged 3DFX (in this case) for doing what was necessary but now finds themselves forced to do the same thing. Besides the power issue, FSAA was considered back then to be a senseless feature; now it's an integral part of every serious vidcard's feature set. ;)
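The power-budget point above can be put in rough numbers. This is a minimal sketch only: the per-rail limits are approximate figures commonly cited for the AGP specification, and the 70 W card draw is a hypothetical enthusiast-card estimate, not a measured value.

```python
# Rough sketch of the "power is outgrowing the slot" argument.
# AGP-slot rail limits here are approximate spec figures
# (3.3 V @ 6 A, 5 V @ 2 A, 12 V @ 1 A); the card draw is hypothetical.

AGP_RAILS = {3.3: 6.0, 5.0: 2.0, 12.0: 1.0}  # volts -> max amps

# Total power the slot itself can deliver (~41.8 W on these numbers).
slot_budget_w = sum(volts * amps for volts, amps in AGP_RAILS.items())

card_draw_w = 70.0  # assumed draw of a hypothetical high-end card
deficit_w = card_draw_w - slot_budget_w

print(f"slot can supply ~{slot_budget_w:.1f} W")
print(f"card needs {card_draw_w:.0f} W, so {deficit_w:.1f} W "
      f"must come from an auxiliary (Molex) connector")
```

On those assumed numbers, the slot alone covers barely more than half the draw, which is the whole case for the auxiliary connector.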
 

Bovinicus

Diamond Member
Aug 8, 2001
3,145
0
0
I don't like losing a PCI slot and an HDD power connector either, but it doesn't really have any effect on me. I never use all the molex connectors or the first PCI slot anyway. Plus, with more and more equipment moving onto the motherboard, 5 or 6 PCI slots rarely get used. I'm only using 2 in my main system, and 2 in my secondary system.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
It's an enthusiast-level card; the requirements seem fine to me as long as it fits.
 

Jeff7

Lifer
Jan 4, 2001
41,596
20
81
Can someone also explain why they can't put the components on the other side of the videocard? Is it some EMI issue? Yes, it's nice to have the blower take the hot air right out of the system, but making it mandatory and having it take up an expansion slot? Not all motherboards have 6 PCI slots. You could have a more effective heatsink on the card, one that's taller, leaving the fan farther away from the heatsink's base to avoid the dead zone caused by the fan hub. That could also allow for increased surface area right over the core, instead of relying on heat pipes.
 

mrman3k

Senior member
Dec 15, 2001
959
0
0
Look people, let me set the majority of you straight. Only continue reading my post if you are a power user and/or want system stability.

You should NEVER install any PCI card next to the AGP slot, for 2 main reasons.
1). The 1st PCI slot often has IRQ conflicts when both it and the AGP slot are populated.
2). AIRFLOW! This seems like a difficult concept for most of you to grasp. It is very important with today's graphics cards: always, always leave the slot next to the AGP slot empty so that there is plenty of airflow for the fan; otherwise you create a pocket of hot, stagnant air that gets trapped and recirculates through the GPU HSF. I actually applaud Nvidia for having a larger cooling apparatus that uses the extra PCI slot, since it should allow for much better cooling performance.

Finally, on the extra power connector: both ATI and Nvidia use it because their cards pull more juice, which is okay with the majority of us since we like high-clocked graphics cards. So if you do not like it, then stick with your GFMX, R9000, etc.

Any questions, please post and I would be happy to answer.
 

jasonsRX7

Senior member
Aug 9, 2000
290
0
0
Originally posted by: sandorski
Originally posted by: jasonsRX7
Originally posted by: sandorski

This is not true; the Voodoo5/6 were designed to come out alongside the original GeForce, and they were far superior to it. Anyway, the point is that power consumption for video cards is/was exceeding the capacity of the interface; if anything, video cards are advancing faster than chip and interface capacity. That's the point.

BTW, the schmuck is anyone (corporate or user) who slagged 3DFX (in this case) for doing what was necessary but now finds themselves forced to do the same thing. Besides the power issue, FSAA was considered back then to be a senseless feature; now it's an integral part of every serious vidcard's feature set. ;)

The Voodoo5 did not come out with the original GeForce, regardless of when they hoped to release it, and it was outperformed by the original GeForce in many areas as well. I'm not seeing how they were "far superior" by any means, although they did win a few benchmarks.

At that time, it was only necessary for 3dfx, not for the other vendors who could and did outperform 3dfx without the additional power requirements. Now, it has become an issue for other vendors as well, which is understandable. It's just that at the time 3dfx did it, no one else needed it.

AA was not a useless feature and never has been. But 3dfx was trying to push it too early, when there were more important features to take care of first.
 

Jeff7

Lifer
Jan 4, 2001
41,596
20
81
Originally posted by: mrman3k
Look people, let me set the majority of you straight. Only continue reading my post if you are a power user and/or want system stability.

You should NEVER install any PCI card next to the AGP slot, for 2 main reasons.
1). The 1st PCI slot often has IRQ conflicts when both it and the AGP slot are populated.
2). AIRFLOW! This seems like a difficult concept for most of you to grasp. It is very important with today's graphics cards: always, always leave the slot next to the AGP slot empty so that there is plenty of airflow for the fan; otherwise you create a pocket of hot, stagnant air that gets trapped and recirculates through the GPU HSF. I actually applaud Nvidia for having a larger cooling apparatus that uses the extra PCI slot, since it should allow for much better cooling performance.

Finally, on the extra power connector: both ATI and Nvidia use it because their cards pull more juice, which is okay with the majority of us since we like high-clocked graphics cards. So if you do not like it, then stick with your GFMX, R9000, etc.

1) Never had any problems whatsoever with this - performance or IRQ.
2) Good point. Of course, I prefer to have a 120mm fan blowing onto the expansion slots though.:D
Any questions, please post and I would be happy to answer.

Edit: Happened upon this thread while looking for something else; no idea why I just quoted what I did above like this.:eek:
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: jasonsRX7
AA was not a useless feature and never has been. But 3dfx was trying to push it too early, when there were more important features to take care of first.


Like what? 32-bit color? That was the only knock on the V5 (which could do 24-bit hacked and had better 2D than any ATI or Nvidia cards)....plus the Voodoos kicked ass in Glide games, and back then, that was a big deal.

Chiz
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Originally posted by: chizow
Originally posted by: jasonsRX7
AA was not a useless feature and never has been. But 3dfx was trying to push it too early, when there were more important features to take care of first.


Like what? 32-bit color? That was the only knock on the V5 (which could do 24-bit hacked and had better 2D than any ATI or Nvidia cards)....plus the Voodoos kicked ass in Glide games, and back then, that was a big deal.

Chiz

I think you're mistaking the V5 for a V3.
The V5 could do 32-bit color; the V3 used its 22-bit postfilter.

And when the V3 came out, Glide was a big deal; by the time the GTS/V5s were out, it wasn't, unless you were into really old games. But by that logic, DOS compatibility was a big deal as well.
 

sandorski

No Lifer
Oct 10, 1999
70,785
6,345
126
Originally posted by: jasonsRX7
Originally posted by: sandorski
Originally posted by: jasonsRX7
Originally posted by: sandorski

This is not true; the Voodoo5/6 were designed to come out alongside the original GeForce, and they were far superior to it. Anyway, the point is that power consumption for video cards is/was exceeding the capacity of the interface; if anything, video cards are advancing faster than chip and interface capacity. That's the point.

BTW, the schmuck is anyone (corporate or user) who slagged 3DFX (in this case) for doing what was necessary but now finds themselves forced to do the same thing. Besides the power issue, FSAA was considered back then to be a senseless feature; now it's an integral part of every serious vidcard's feature set. ;)

The Voodoo5 did not come out with the original GeForce, regardless of when they hoped to release it, and it was outperformed by the original GeForce in many areas as well. I'm not seeing how they were "far superior" by any means, although they did win a few benchmarks.

At that time, it was only necessary for 3dfx, not for the other vendors who could and did outperform 3dfx without the additional power requirements. Now, it has become an issue for other vendors as well, which is understandable. It's just that at the time 3dfx did it, no one else needed it.

AA was not a useless feature and never has been. But 3dfx was trying to push it too early, when there were more important features to take care of first.

I don't want this to turn into a 3DFX vs NVidia (or whoever) thread, and I do agree that when the cards actually came out is a valid issue, but the point still stands: they were meant to target the original GeForce. It was other reasons (not technical, but managerial) that made the cards late. Also, not only was FSAA *not* brought out too early, it took Nvidia/ATI 2 generations to match the speed and quality of Voodoo5/6 FSAA. On top of that, FSAA was usable (speed-wise) on the Voodoo5/6 right from the start, unlike 32-bit colour or Hardware T&L when they were first introduced. If anyone brought out features before their time, it wasn't 3dfx.
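A side note on why running FSAA at speed was hard back then: 4x supersampling renders the scene at twice the width and height and averages each 2x2 block down to one output pixel, so it needs roughly four times the fill rate and memory bandwidth. A minimal illustrative sketch of just the downsample step (plain Python, not how any real card implements it):

```python
# 4x supersampling, downsample step: average each 2x2 block of the
# oversized render target into one output pixel. Illustrative only;
# real hardware does this (and rotated-grid variants) in silicon.

def downsample_2x(img):
    """img: 2D list of grayscale values with even dimensions;
    returns the half-size, box-filtered image."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
             img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

# A hard edge rendered at 2x resolution...
super_sampled = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
# ...comes out with an intermediate "smoothed" value along the edge.
print(downsample_2x(super_sampled))  # -> [[0.0, 255.0], [127.5, 255.0]]
```

The 127.5 in the output is the antialiased edge pixel; the cost is that four shaded samples were paid for every one pixel shown.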
 

buleyb

Golden Member
Aug 12, 2002
1,301
0
0
Originally posted by: Jeff7
Originally posted by: mrman3k
Look people, let me set the majority of you straight. Only continue reading my post if you are a power user and/or want system stability.

You should NEVER install any PCI card next to the AGP slot, for 2 main reasons.
1). The 1st PCI slot often has IRQ conflicts when both it and the AGP slot are populated.
2). AIRFLOW! This seems like a difficult concept for most of you to grasp. It is very important with today's graphics cards: always, always leave the slot next to the AGP slot empty so that there is plenty of airflow for the fan; otherwise you create a pocket of hot, stagnant air that gets trapped and recirculates through the GPU HSF. I actually applaud Nvidia for having a larger cooling apparatus that uses the extra PCI slot, since it should allow for much better cooling performance.

Finally, on the extra power connector: both ATI and Nvidia use it because their cards pull more juice, which is okay with the majority of us since we like high-clocked graphics cards. So if you do not like it, then stick with your GFMX, R9000, etc.

1) Never had any problems whatsoever with this - performance or IRQ.
2) Good point. Of course, I prefer to have a 120mm fan blowing onto the expansion slots though.:D
Any questions, please post and I would be happy to answer.

1) Agreed, never had any problems. Back in the 95/98 days, it was a common recommendation to put NICs in PCI 1 so they got the highest priority over things like a modem, soundcard, etc. In XP, I've had cards in PCI 1 and NEVER had a problem with it. I've also never heard of this problem before...

2) This is the reason I don't have PCI 1 used in my system...
 

Xionide

Diamond Member
Apr 20, 2002
8,679
2
81



Dude, you're getting a bad reseller rating: http://www.resellerratings.com/seller1867.html
 

thorin

Diamond Member
Oct 9, 1999
7,573
0
0
Originally posted by: majewski9
I dont think Nvidia has done a very good job with GeforceFX!

1- It uses up two PCI slots
2- Ungodly huge card
Based on what? I haven't seen anything that would lead me to believe it's bigger than a GF4. You got pics?
3- Ram runs too hot
Again where'd you get this? Please link it up.

Originally posted by: jasonsRX7
I dont think Nvidia has done a very good job with GeforceFX!
3dfx was requiring the power brick when their competitors were producing products that performed as well as or better than the 3dfx cards and did NOT require the extra power. That was proof that 3dfx's design was not an "omen of the future" but rather a product that was rushed and not refined.
I completely agree.

Thorin
 

Darien

Platinum Member
Feb 27, 2002
2,817
1
0
Originally posted by: thorin
Originally posted by: majewski9
I dont think Nvidia has done a very good job with GeforceFX!

1- It uses up two PCI slots
2- Ungodly huge card
Based on what? I haven't seen anything that would lead me to believe it's bigger than a GF4. You got pics?



To me this seems like point 1 leads to point 2. If I had a card that filled up 2 slots, I'd call it huge.



link -- confirmed that it will take a pci slot. release date 3/9/2003 eh...
 

thorin

Diamond Member
Oct 9, 1999
7,573
0
0
Originally posted by: Darien
Originally posted by: thorin
Originally posted by: majewski9
I dont think Nvidia has done a very good job with GeforceFX!

1- It uses up two PCI slots
2- Ungodly huge card
Based on what? I haven't seen anything that would lead me to believe it's bigger than a GF4. You got pics?
To me this seems like point 1 leads to point 2. If I had a card that filled up 2 slots, I'd call it huge.
link -- confirmed that it will take a pci slot. release date 3/9/2003 eh...
That doesn't make the card huge; that makes the HSF huge. It's not like the card is extraordinarily long or wide (not like the V5 6K).

Thorin
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Well, we can scratch one FUD-tastically popular rumor off of the ATI Fanboi To-Troll List:

GF FX won't be available until March at the earliest

I guess I'll have to live with a GF3 for a few weeks.....this 9700pro is going back to Best Buy, unless someone wants it for $325 shipped?

Chiz

Edit: Link courtesy of BentValve, if you haven't noticed, he's been doing his homework on the GF FX ;)
 

Clockwurk

Junior Member
Mar 23, 2002
9
0
0
I don't think a single person at Anandtech is even remotely capable of questioning the architecture of any Nvidia or ATI board.
*****************************************************

I'll question architecture/design... Why doesn't Nvidia place the chip on the other side of the card? There's usually a backplane slot opening above the AGP slot, with no card to block it. I'm really not sure why no one has thought of this. It would place the cooling closer to the rear exhaust fan on most cases and have nearly unlimited space. Sheesh.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Originally posted by: Clockwurk
I don't think a single person at Anandtech is even remotely capable of questioning the architecture of any Nvidia or ATI board.
*****************************************************

I'll question architecture/design... Why doesn't Nvidia place the chip on the other side of the card? There's usually a backplane slot opening above the AGP slot, with no card to block it. I'm really not sure why no one has thought of this. It would place the cooling closer to the rear exhaust fan on most cases and have nearly unlimited space. Sheesh.

Cause they wanna follow the AGP spec unlike some mobo manufacturers?
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Clockwurk
I don't think a single person at Anandtech is even remotely capable of questioning the architecture of any Nvidia or ATI board.
*****************************************************

I'll question architecture/design... Why doesn't Nvidia place the chip on the other side of the card? There's usually a backplane slot opening above the AGP slot, with no card to block it. I'm really not sure why no one has thought of this. It would place the cooling closer to the rear exhaust fan on most cases and have nearly unlimited space. Sheesh.

Thankfully you don't work for nVidia.

 

buleyb

Golden Member
Aug 12, 2002
1,301
0
0
Originally posted by: Sunner
Originally posted by: Clockwurk
I don't think a single person at Anandtech is even remotely capable of questioning the architecture of any Nvidia or ATI board.
*****************************************************

I'll question architecture/design... Why doesn't Nvidia place the chip on the other side of the card? There's usually a backplane slot opening above the AGP slot, with no card to block it. I'm really not sure why no one has thought of this. It would place the cooling closer to the rear exhaust fan on most cases and have nearly unlimited space. Sheesh.

Cause they wanna follow the AGP spec unlike some mobo manufacturers?

Well, I think it's fair to say that by taking a PCI slot, they are kinda smearing that AGP spec a bit :)