Is 3dfx up to their old tricks?

Mikewarrior2

Diamond Member
Oct 20, 1999
7,132
0
0
The only really funny thing is Bubba making fun of nVidia for missing a product cycle when they clearly missed the V5 cycle (originally claimed to ship 12/99). This isn't bashing. It's the truth.



Mike
 

steelthorn

Senior member
Jul 2, 2000
252
0
0
You nVidia guys always have to put 3dfx down because you are jealous of them, just admit it. 3dfx puts out a much nicer card that is more compatible and stable than nVidia's!
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
Of course 3dfx makes a more 'reliable' product.

They follow the PCI specs closely when they make an AGP card.

How can that not be stable?? :)
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< I'm just pointing out the fact that what Bubba said in this interview IS NOT BS at all, even though you are claiming it is. >>


You've got to be kidding me ;) Bubba is BS'ing his own BS ;) Wasn't it not too long ago that 3dfx was screaming that SPEED IS KING and nothing else mattered but SPEED? We need more SPEED and FILLRATE. Screw image quality, we want SPEED and breakneck FPS. Then he throws out this:


<< Sustaining 60 FPS is the key. Anything over a sustained and consistent 60 FPS is a waste. >>


Hmmm, quite a change of pace.

He then goes on to bitch about nVidia's regurgitation of chip design.



<< Nothing Nvidia does will change what we are doing with the 6000. What is clear is that they missed their product cycle. This is the third regurgitation of GeForce! It was our belief that a product cycle meant introducing a new product. How many times does Nvidia expect the consumer to buy the same product? >>


More BS (it's no different from 3dfx's own philosophy): stacking more chips onto a V4 4500 to make a 5500 or 6000 is the same kind of recycling. They just go about it in different ways. nVidia refines the core process, adds features, and beefs up the memory; 3dfx just adds more chips. The VSA-100 sounds to me like a Voodoo3 core with FSAA, 32-bit color, and multi-chip capability. That holds even more true when you look at the 16-bit scores of a Voodoo3 3000 versus a Voodoo4 4500.
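
To put rough numbers on the "just adds more chips" point, here's a quick Python sketch. The 166MHz clock and two pixel pipelines per VSA-100 are the commonly quoted specs, assumed here for illustration rather than taken from the interview:

    # Peak fillrate of a multi-chip board is just chips x pipelines x clock.
    def fillrate_mpixels(chips, pipelines, clock_mhz):
        return chips * pipelines * clock_mhz

    print(fillrate_mpixels(1, 2, 166))  # Voodoo4 4500:  ~333 Mpixels/s
    print(fillrate_mpixels(2, 2, 166))  # Voodoo5 5500:  ~666 Mpixels/s
    print(fillrate_mpixels(4, 2, 166))  # Voodoo5 6000: ~1333 Mpixels/s

Same core, same clock; the board-level numbers only go up because the chip count does.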


<< Are we dependent on multi-chip solutions? Certainly not. >>


I'll let you answer that one ;)
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76


<< Sustaining 60 FPS is the key. Anything over a sustained and consistent 60 FPS is a waste. >>



If you can stay above 60 FPS, or set it to run at 60 FPS all the time with no variation, why do you need to go higher? You certainly won't see a difference.

Once we can run at 1600*1200 @ 32bit @ 4x FSAA @ 60 FPS, there is no reason to go above that speed or make a more powerful card. Your eyes only refresh something like 23 or 24 times a second.
 

TSDible

Golden Member
Nov 4, 1999
1,697
0
76
NFS4: Good points indeed.

According to 3Dfx (once upon a time)

Who wants image quality anyway? 16-bit is good enough for any gamer. What gamers really want is speed. NOT.

Of course Bill Gates once said that "640K ought to be enough for anyone" :)

As for the last question... if you don't need multiple chips, then why produce cards with them? Duh.

I don't praise or bash any of the companies. I will buy whatever I feel is the best card at the time. These companies could learn a bit from the likes of Matrox (and other companies)...

Keep the foot away from the mouth lest it should be thrust therein.

Just my $.02

 

sd

Golden Member
Feb 29, 2000
1,968
0
0
God, you NVIDIA fanboys make me wanna puke with your "GTS is the greatest" BS. I have a CLA2, and for me, it sucks. UT and Deus Ex DO NOT maintain consistent frame rates. I won't even get into the issues with Irongate chipsets. The GTS might have more raw power, but it is far from the best overall video card.



<< nVidia refines the core process, adds features, and beefs up the memory; 3dfx just adds more chips. The VSA-100 sounds to me like a Voodoo3 core with FSAA, 32-bit color, and multi-chip capability. >>



So those features plus the T-Buffer and large texture support aren't refining the core? ;)
 

Radboy

Golden Member
Oct 11, 1999
1,812
0
0
NFS4 is a master at starting threads. :)

On a similar topic, I read a review yesterday comparing the V5-5500 vs GF2-GTS vs Radeon HERE

In it, the reviewer says this (copy & paste):

After dethroning 3dfx as the speed king and tearing some very large OEM contracts away from ATI, NVIDIA is the company to watch. The company's stock soars while many in the high-tech community compare it to Intel. Not only did NVIDIA steal the speed crown from 3dfx, it added some semi-innovative features at the same time. The real innovation has been done by NVIDIA's marketing who are legendary for their underhanded tactics, and ability to get the public excited about their new features.

end paste

What do they mean by "legendary for their underhanded tactics"?
 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
I dunno who the hell says our eyes only capture 24fps.

I can sure as hell see the difference between playing at 120fps and 60fps, let alone 30fps.

Also, why is 3dfx talking about sustaining 60fps when nVidia cards are reaching 100fps+?

Talking about nVidia 'overclocking their 230MHz DDR ram' when they themselves use 166MHz SDR.

Talking about nVidia 'showing off' their new GF2 Ultra when 3dfx has NOTHING to show the press.

Talking about how nVidia missed one cycle, when 3dfx missed many cycles and is still MISSING them.

Talking about how good FSAA is when you can't even play with it at acceptable performance. And for those who would 'rather' play at 640 with 4xFSAA: those guys don't think about how much detail you've already lost by going to 640+FSAA instead of 1280.

Talking about how useless T&L is since their VSA-100 doesn't have it, while never talking about how 'useful' and 'well-implemented' their T-Buffer effects are. (But then I don't know why they are putting T&L in their Rampage if it is so 'useless'.)



 

BigToque

Lifer
Oct 10, 1999
11,700
0
76


<< Are we dependent on multi-chip solutions? Certainly not. >>



They use multiple chips because, even though it's more complicated to design, they already have experience doing it (i.e. Voodoo2 SLI). Also, like they said, it helps alleviate memory bandwidth bottlenecks.

Would you go against nVidia if they decided to use a multi-chip solution to help with their memory bandwidth? (in addition to the tile-based rendering that should be in the NV20)
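
For what it's worth, the bandwidth math does work out. A rough Python sketch, assuming the commonly quoted 128-bit bus per chip (these figures are my assumptions, not from the thread):

    # Peak memory bandwidth in GB/s: bytes per transfer x transfers per
    # second, summed across chips. DDR moves data twice per clock.
    def bandwidth_gbs(bus_bits, clock_mhz, ddr=False, chips=1):
        transfers_m = clock_mhz * (2 if ddr else 1)  # million transfers/s
        return chips * (bus_bits / 8) * transfers_m / 1000

    print(bandwidth_gbs(128, 166, chips=2))   # V5 5500 (2x 166MHz SDR): ~5.3 GB/s
    print(bandwidth_gbs(128, 166, ddr=True))  # GF2 GTS (166MHz DDR):    ~5.3 GB/s
    print(bandwidth_gbs(128, 230, ddr=True))  # GF2 Ultra (230MHz DDR):  ~7.4 GB/s

Two cheap SDR chips land in the same ballpark as one expensive DDR setup.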
 

Doomguy

Platinum Member
May 28, 2000
2,389
1
81
SD: LOL, so it's nVidia's fault the Unreal engine has poor D3D and OpenGL performance? The Unreal engine looks like crap and runs like crap.
 

Hardware

Golden Member
Oct 9, 1999
1,580
0
0
NFS4: "Due to the fact that the 6 ns chips used on all DDR GeForce cards are rated to perform at 166 MHz"
Can you give me any FACTS that the GeForce DDR "6 ns" RAM IS rated at 166MHz?

 

BigToque

Lifer
Oct 10, 1999
11,700
0
76
xtreme2k

Does your TV look choppy? Nope. It runs at a consistent 30fps. I'm sure if you could run every game at a consistent 30fps (no dipping below or above) it would look completely smooth.

At either 23 or 24 fps (someone please tell me the correct number), the human eye can distinguish each separate image. Movies take advantage of this by drawing millions of images and running them at 30fps, which means the eye doesn't see every image, and it becomes an animation.
 

sd

Golden Member
Feb 29, 2000
1,968
0
0


<< SD: LOL, so it's nVidia's fault the Unreal engine has poor D3D and OpenGL performance? The Unreal engine looks like crap and runs like crap. >>



Oh, then it must be the fault of all those programmers designing games using that engine?

As for UT running and looking like crap, I think I can speak for many when I say you are a total IDIOT for that statement :|
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76
Hardware

6ns RAM is rated at 166.6666666667 MHz.

Just use this formula:

1000/ns = Rated MHz

1000/6 = 166.6666666667 MHz
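
Or, the same conversion as a couple of lines of Python (the 5ns case is just an extra example):

    # Cycle time to rated clock: 1000 / ns = MHz.
    def rated_mhz(cycle_time_ns):
        return 1000 / cycle_time_ns

    print(rated_mhz(6))  # 166.666... MHz for 6ns chips
    print(rated_mhz(5))  # 200.0 MHz for 5ns chips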

 

xtreme2k

Diamond Member
Jun 3, 2000
3,078
0
0
30fps on TV looks smooth because of something called a 'temporal blur' effect.

3D games don't have that, and therefore require a lot more frames.

Temporal blur is like a slight blurring of the image in the direction of motion. TV has that very slightly, while in 3D games there is no such thing; images are 'exact'.

I don't know how to explain it any better, but it is WRONG to compare TV to 3D games in terms of frame rate.
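
For anyone who wants to see it rather than take my word: a toy Python sketch (the 1-D "renderer" here is completely made up for illustration), showing that averaging over an exposure window smears motion the way film/TV does, while a game frame is one exact instant:

    import numpy as np

    # Stand-in renderer: a single bright dot moving across a 1-D "screen".
    def render(t):
        frame = np.zeros(100)
        frame[int(t * 90) % 100] = 1.0
        return frame

    # Game-style frame: one exact instant in time.
    sharp = render(0.5)

    # Film/TV-style frame: the average of many instants across a 1/24s
    # exposure window, which smears the dot along its motion path.
    times = np.linspace(0.5, 0.5 + 1/24, 8)
    blurred = np.mean([render(t) for t in times], axis=0)

    print(np.count_nonzero(sharp), np.count_nonzero(blurred))  # 1 vs several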
 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76
The blur effect on a TV is done by holding the camera exposure open for 1/24 of a second. TVs run at 24FPS, but since the frame was over-exposed there is blur, and it all smooths out in our eyes. When images get sharper, like on DVD, the framerate has to be increased to 30fps or it would seem juddery. HDTV will run at 60FPS.

As for the whole superior and inferior thing:

I think this is nothing more than a bunch of people trying to reassure themselves that they made the right purchase. EACH card has its good points. To deny this is only to be biased. Oh well.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
3dfx was screaming "speed is king" at the same time the hardware websites were screaming "speed is king." Then 3dfx put out the V3, very fast, but all the hardware sites, including this one, liked the TNT2 better because it had a lot of speed and 32-bit rendering, so the sites screamed "image quality is king."

So 3dfx recognized that and set out to make a chip where image quality was king, the VSA-100. Problem is that in the year it took 3dfx to actually bring it to market, the sites started screaming "speed is king" again. But since 3dfx has superior image quality, they still have to scream "image quality is king." It's as simple as that.
 

RagingGuardian

Golden Member
Aug 22, 2000
1,330
0
0
Bubba is an utter idiot in the way he constantly contradicted himself in the interview. No matter what he says, the GTS and Radeon beat the V5 in raw performance. The statements about the Radeon's 16bit and awful 32bit rendering are lies. The Radeon beats the V5 in 16bit and stomps it in 32bit. Not to mention it's pretty close in Glide, which is the only reason anyone should pick up a 3DFX card.

As far as fps go, I think the highest the human eye acknowledges is 60fps. Doesn't really matter, since the V5 can't do 60fps with 4X FSAA at anything above 800*600.

BTW, I don't hate 3DFX, I hate BS. I have a V3 3000 in my box right now.
 

Eug

Lifer
Mar 11, 2000
24,046
1,674
126


<< No doubt most (if not all) Voodoo 3 2000's hit at least 166MHz and reviews pointed that out. >>



Yes and no. As is, my guess is that many if not most V3 2000's won't hit 166 MHz stably. They require additional cooling, while V3 3000's usually don't when clocked normally at 166. It may be partly due to the heatsink, but still, you shouldn't expect any Joe Schmoe to stick in a 2000 and have it run properly at 3000 speeds.

 

IBMer

Golden Member
Jul 7, 2000
1,137
0
76


<< The Radeon beats the V5 in 16bit and stomps it in 32bit. Not to mention it's pretty close in Glide, which is the only reason anyone should pick up a 3DFX card. >>



Apparently you don't read the reviews on this site much.

The V5 STOMPS the Radeon in 16-bit color, not only in speed but in quality as well. And with the newer drivers it's only about 6fps slower at 1024x768x32. Where are you getting your information?
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< UT and Deus Ex DO NOT maintain consistent frame rates. >>


That's because UT is a manifestation of the Unreal engine, which was designed when GLIDE was king. Then they decided to tack on D3D when people bitched. It's not nVidia's, ATi's, or Matrox's fault that performance is subpar on an engine that was never designed for them from the onset.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< They use multiple chips because, even though it's more complicated to design, they already have experience doing it (i.e. Voodoo2 SLI). Also, like they said, it helps alleviate memory bandwidth bottlenecks.

Would you go against nVidia if they decided to use a multi-chip solution to help with their memory bandwidth? (in addition to the tile-based rendering that should be in the NV20)
>>


I didn't say that it was bad... just that nVidia and 3dfx go about increasing performance in different ways. Do I want to see a multi-chip nVidia card?? HELL NO. Do you know how much an Ultra costs?? Can you imagine how much it would cost with TWO chips!!?? And I seriously doubt that the NV20 will be dual chip. nVidia knows how to get performance out of a single chip, not multiple ones.