Is 3dfx up to their old tricks?

Page 4

IBMer

Golden Member
Jul 7, 2000
1,137
0
76


<< That's because UT is a manifestation of the Unreal engine which was designed when GLIDE was king. Then they decided to add on D3D when people bitched. It's not nVidia, ATi, or Matrox's fault for subpar performance for an engine that was never designed from the onset for them. >>



Funny how little people know about hardware and APIs. The V5 runs UT and Deus Ex just fine in D3D. Why? Because it supports a very basic form of texture compression called palettized textures. The Radeon can do this as well, hence the Radeon scoring pretty high on the UT benchmarks. Funny how all the cards that don't support this run like crap.
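The bandwidth argument here comes down to simple arithmetic. A minimal sketch (the 256x256 texture size and 256-entry palette are illustrative assumptions, not figures from the thread):

```python
# Sketch: memory footprint of a palettized vs. a true-color texture.
# A palettized texture stores one byte per texel (an index into a
# small color table) instead of four bytes of raw color.

WIDTH, HEIGHT = 256, 256  # assumed texture size, common in UT-era games

# 32-bit true color: 4 bytes per texel
truecolor_bytes = WIDTH * HEIGHT * 4

# 8-bit palettized: 1 byte per texel, plus the palette itself
# (256 entries x 4 bytes each)
palettized_bytes = WIDTH * HEIGHT * 1 + 256 * 4

print(truecolor_bytes)                     # 262144
print(palettized_bytes)                    # 66560
print(truecolor_bytes / palettized_bytes)  # ~3.9x less memory and bandwidth
```

Roughly a 4x saving per texture, which is why cards with hardware palettized-texture support had an edge in Glide-era engines like Unreal's.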
 

Radboy

Golden Member
Oct 11, 1999
1,812
0
0
Small point - TVs (at least NTSC TVs in the USA) run at 60 half-frames (interlaced) per sec, which = 30 FULL fps (not 24). Movie theaters show movies at 24 (full) fps, which is a standard decided upon long ago, because film is/was very expensive, and this was the *minimum* fps they could get & still have a smooth viewing experience. Film's exposure characteristics take advantage of motion blur: when the film is exposed and Arnold walks across the screen, his image is blurred, which gives ur eye (connected to ur brain) additional information about movement. Games don't have motion blur, so we need higher fps to make it look 'smooth' to our eyes/brains.

Europe uses PAL, which (I think) is 25fps (more 'film-like'), but they have more lines of rez, and (I hear) it looks better than our NTSC TV in the USA.
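The rates above work out to these per-frame (or per-field) display intervals. A quick sketch (NTSC is really ~59.94 fields/s; 60 is used here for simplicity):

```python
# Per-frame or per-field interval for the display standards discussed.
standards = {
    "film": 24,          # full frames per second
    "ntsc_fields": 60,   # interlaced half-frames -> 30 full fps
    "pal_fields": 50,    # interlaced half-frames -> 25 full fps
}

for name, rate in standards.items():
    # milliseconds each frame/field stays on screen
    print(f"{name}: {1000 / rate:.2f} ms")
```

Film holds each frame nearly 42 ms, which only looks smooth because of the motion blur baked into the exposure.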
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Funny how little people know about hardware and APIs. The V5 runs UT and Deus Ex just fine in D3D. Why? Because it supports a very basic form of texture compression called palettized textures. The Radeon can do this as well, hence the Radeon scoring pretty high on the UT benchmarks. Funny how all the cards that don't support this run like crap. >>


Benchmarking is one thing...actual gameplay is another; why can't people understand that? It's one thing to benchmark the hell out of something and get high framerates, but if the actual gameplay isn't fluid, WTF is the point?

By all means, Deus Ex should run like a champ on my machine (Athlon 600 @ 735, 256MB RAM, SDR GeForce), but it runs like a dog. UT on the other hand works just fine for me in OpenGL mode. Go figure.
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0
Case in point about motion blur: I was watching a special on the making of Jurassic Park. Originally, Spielberg had planned on using stop-motion animation instead of CGI for shots of the entire dinosaurs. But by its very nature (since the objects in each frame are motionless when the frame is exposed), stop-motion animation does not have motion blur. They showed two demos of a T-Rex running (one using stop-motion, the other using CGI + motion blur), and even though both were running at 24 fps, the CGI was perfectly smooth, while the stop-motion was very jerky.

Consumer 3D graphics is a lot like stop-motion animation, since we don't have motion blur (I don't consider 3dfx's "motion blur" true motion blur, since it looks more like mouse trails). If you do a 360 degree spin in one second in a game running at 30 fps, that gives a change of 12 degrees per frame, which IMHO is not enough to be considered flawlessly smooth. When done correctly, motion blur can be very useful. Until then, 60+ fps is very important.
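The 12-degrees-per-frame figure generalizes easily. A minimal sketch of the same calculation at a few framerates:

```python
# Sketch: angular change between consecutive frames during a
# constant-speed spin, for the framerates argued about in this thread.

def degrees_per_frame(spin_degrees, spin_seconds, fps):
    """Degrees the view rotates between two consecutive rendered frames."""
    frames_during_spin = fps * spin_seconds
    return spin_degrees / frames_during_spin

for fps in (24, 30, 60, 120):
    print(fps, "fps ->", degrees_per_frame(360, 1, fps), "degrees/frame")
# At 30 fps a 1-second 360 spin jumps 12 degrees per frame;
# 60 fps halves that to 6, and 120 fps to 3.
```

Without motion blur to fill in the gaps, each extra frame directly shrinks the visible jump between images.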
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
<< If you can stay above 60 FPS, or set it to run at 60 FPS all the time with no variation, why do you need to go higher? You certainly won't see a difference. >>

I don't believe there are still people out there that spread this bullsh*t. Most people who say this sort of thing are people who have never played at high framerates, or in fact have never even played 3D games at all. As a consequence they don't know what the hell they are talking about.

<< Once we can run at 1600*1200 @ 32bit @ 4x FSAA @ 60 FPS, there is no reason to go above that speed or make a more powerful card. Your eyes only refresh like 23 or 24 times a second. >>

Then I would seriously get your eyes checked, because you may be in the final stages of going blind. Tell me, if your eyes refresh 24 times a second, why doesn't your monitor have a 24 Hz refresh rate? Anything above that is wasteful, isn't it? Why do we need 75 Hz, 85 Hz, 120 Hz? They're just "wasteful" aren't they?

This issue has been beaten to death so many times it just isn't worth explaining it to yet another "maximum fps" person. Stop talking trash and at least do some research about it before prattling on about "maximums".
 

Sohcan

Platinum Member
Oct 10, 1999
2,127
0
0


<< Then I would seriously get your eyes checked, because you may be in the final stages of going blind. Tell me, if your eyes refresh 24 times a second, why doesn't your monitor have a 24 Hz refresh rate? Anything above that is wasteful, isn't it? Why do we need 75 Hz, 85 Hz, 120 Hz? They're just "wasteful" aren't they? >>

Good point....Any refresh under 85Hz hurts my eyes after an extended period. At 60Hz, I can see the screen flicker if I move my head.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
<< Good point....Any refresh under 85Hz hurts my eyes after an extended period. At 60Hz, I can see the screen flicker if I move my head. >>

Yes I'd believe that. Some people can see the screen flickering at 72 Hz. I can see 67 Hz flicker but 72 Hz seems to be alright for me. I run my monitor at 75 Hz and I don't have any problems with it.

Now with 3D games you want your fps to be as high as possible at all times. Movies have no interactivity at all. You don't have to move/jump/fire or do anything like that. Anybody who claims there is no difference above 60 fps is either lying or full of crap.
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76
BFG10K

I've seen games running at 120FPS and games running at 60 FPS and can hardly notice a difference. I have my refresh rate set to 60Hz, and see no problems.

BTW... I have perfect vision. I'm not about to go blind.

 

lsd

Golden Member
Sep 26, 2000
1,184
70
91
You must be blind already, because a 60Hz refresh rate will hurt anybody's eyes.
You are right, however, that there isn't much of a difference between 120 and 60 fps. The only time I can see the difference is in Q3.
 

Scorpion

Senior member
Oct 10, 1999
748
0
0
What the hell happened? Now we are in a debate on FPS between TVs and computers?

Back to the topic at hand...

I like this quote by Bubba:
"This is the third regurgitation of GeForce! It was our belief that a product cycle meant introducing a new product. How many times does Nvidia expect the consumer to buy the same product?"

Ok Bubba, so what the Hell has 3DFX been doing?

A Voodoo 1, is a Voodoo 2, is a Voodoo 3, is a Voodoo 4, is a Voodoo 5.

So essentially, you've regurgitated the Voodoo core 5 times! At least the GeForce had significant improvements over the TNTs.
The VSA-100 is about the same. Improvements, but it's still based on the foundation of the Voodoo 1.

So Bubba, how many times do YOU expect consumers to buy the same 3DFX product?
 

bernse

Diamond Member
Aug 29, 2000
3,229
0
0
As soon as the 5500 drops $100 I will pick one up. Decent card, great quality, but a little too high priced for what it is.

My .02

 

kami

Lifer
Oct 9, 1999
17,627
5
81
Another sh!tass interview...

Christ, I think more people would respect this company if they didn't lie out of every single orifice.

They should just say "yeah, we fscked up... we are working on fixing our problems with our next product" and not give stupid, dumbfounded excuses why their card is better than everyone else's.
 
Jun 18, 2000
11,197
769
126
You guys already mentioned that the Ultra was shipped to Anand (among other sites) with memory clocked at 250MHz.

Won't that skew nVidia's scores in their favor? So that means at the slower clock the Ultra should be about double (or slightly less) the speed of the V5 at high rez and 32 bit color.

That should give some hope to 3dfx with their 6000.



<< We really can't say anything, so just wait another "2 weeks" to "slightly over a month" to know and understand it all... that's all we can say about 3dfx and Rampage etc... >>

-Kristof from Beyond3d
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Won't that skew nVidia's scores in their favor? >>


If it's clocked at the default 230/460 during testing, what's the problem? It's not nVidia's fault that they have better memory :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
<< I've seen games running at 120FPS and games running at 60 FPS and can hardly notice a difference. I have my refresh rate set to 60Hz, and see no problems. >>

The keyword being seen. Did you play these games?

<< BTW... I have perfect vision. I'm not about to go blind. >>

And yet you can't see 60 Hz flicker? I don't think I know anybody that isn't annoyed by 60 Hz monitors.

<< The only time I can see the difference is in Q3. >>

ie 3D games. My point exactly. Heck I would gladly take 300 fps if I was able to. Lock the VSYNC on and prepare for ultra smooth gaming.

<< What the hell happened? Now we are in a debate on FPS between TVs and computers? >>

Unfortunately there is always someone who brings up the TV/movie argument and compares it to 3D games.
 
Jun 18, 2000
11,197
769
126


<< If it's clocked at the default 230/460 during testing, what's the problem? It's not nVidia's fault that they have better memory >>

Oops, I didn't know it was still clocked at 230MHz. I'm curious, though: why use the higher-rated RAM if it's going to be clocked 20MHz lower than its spec? Doesn't that greatly add cost to the boards?
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
NFS4 - the price of an Ultra is due to the RAM. The GPU itself doesn't cost all that much. Assuming nVidia splits the 64 that's on there, it should only cost another $50 to make. Sounds reasonable to get rid of a lot of bandwidth limitations.
 

SSP

Lifer
Oct 11, 1999
17,727
0
0
OMG, Hardware has a Radeon. Just weeks or months ago he was saying how crappy ATI was. :)

Anyway, for me, frame rates aren't everything. Give me good FPS + good features... and I'll show you a happy man. I'd probably go for a Radeon over a 5500 or a GF2.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< Why use the higher rated RAM if its going to be clocked 20mhz lower than its spec? Doesn't that greatly add cost to the baords? >>


For the 99th time ;) nVidia gets yields that can hit 500, and ones that will only do 460. Instead of dumping the ones that can only do 460, they drop all memory down to 460 and leave it at that.
 

rickn

Diamond Member
Oct 15, 1999
7,064
0
0


<< Why does 3dfx let this guy speak? >>



they have to have someone with their foot in their mouth, because the rest of 3dfx has their heads up their a$$
 

BigToque

Lifer
Oct 10, 1999
11,700
0
76
BFG10K

Yes I did play the games (actually only Q3, on different systems).

My screen setup in Windows 2000 is like this: 1024x768 @ 32bit @ 60Hz, and I do not have any problems with this flickering you're talking about. It's very possible my eyes are not as sensitive as yours under certain conditions, but I assure you I have no problems with my vision.
 

NFS4

No Lifer
Oct 9, 1999
72,636
47
91


<< 1024x768 @ 32bit @ 60Hz, and I do not have any problems with this flickering you're talking about. >>


Gawd damn!! 60Hz KILLS my eyes. Even 70Hz is a pain. 75Hz and higher for me...