
The ATI Radeon 9700 versus NVIDIA NV30 article is up, just to let everyone know!

Wolfsraider, I am not an AnandTech member, but the AnandTech forums always load up in 1 sec or less. The topics always load in 1 sec too, even the long ones.
 
Just finished reading the article...excellent read. Thanks. 🙂

Did that render of someone's eye impress anyone else? That level of rendering in real time is extremely impressive. Just imagine the games we will be playing in say...1-2 years from now...🙂
 
Originally posted by: Insane3D
Just finished reading the article...excellent read. Thanks. 🙂 Did that render of someone's eye impress anyone else? That level of rendering in real time is extremely impressive. Just imagine the games we will be playing in say...1-2 years from now...🙂

That's the thing with video cards. Sure you can do impressive stuff with the R300 and upcoming NV30, but for gaming, you won't see anything utilizing their capabilities until well beyond their technical lifespan, and at that point there's obviously going to be better video cards. People should look at what the R300/NV30 cards can do for games today, and to me they're pretty much going to be equal. The only advantage really is that the R300 is practically available now, and NV30 is unknown.
 
How is it that Intel was able to beat everyone to the punch with a .13mu part? They've been cranking out .13mu CPUs since January and it looks like everyone else won't be able to do the same until January 2003 at least.
 
Originally posted by: GTaudiophile
How is it that Intel was able to beat everyone to the punch with a .13mu part? They've been cranking out .13mu CPUs since January and it looks like everyone else won't be able to do the same until January 2003 at least.
First of all, Intel has been mass producing .13um CPUs since Q1 2001 (the Tualatin Celeron). And there is no doubt about it: Intel has some of the best R&D teams in the world for new manufacturing processes (one of whom, our local Wingznut Pez, played a part in the development of the .13um process). I don't know how Intel does it, but I imagine they either have more money than TSMC and UMC or just have more talented engineers. I dunno.
 
That's the thing with video cards. Sure you can do impressive stuff with the R300 and upcoming NV30, but for gaming, you won't see anything utilizing their capabilities until well beyond their technical lifespan, and at that point there's obviously going to be better video cards. People should look at what the R300/NV30 cards can do for games today, and to me they're pretty much going to be equal. The only advantage really is that the R300 is practically available now, and NV30 is unknown.

That is the old 3dfx philosophy: who gives a crap about new technology? I just want to play current games at high speed! Well, not everyone upgrades their video card every 6 months or even every year, and this is a pretty big step. For those who bought GeForce3s instead of GeForce2 Ultras (there was no performance difference early on), the GeForce3 is now doing much better than the GeForce2 Ultra.

If every company were focusing on CURRENT technology, we'd still be playing Quake 1 type games at 300 fps.
 
Originally posted by: dexvx
That's the thing with video cards. Sure you can do impressive stuff with the R300 and upcoming NV30, but for gaming, you won't see anything utilizing their capabilities until well beyond their technical lifespan, and at that point there's obviously going to be better video cards. People should look at what the R300/NV30 cards can do for games today, and to me they're pretty much going to be equal. The only advantage really is that the R300 is practically available now, and NV30 is unknown.

That is the old 3dfx philosophy: who gives a crap about new technology? I just want to play current games at high speed! Well, not everyone upgrades their video card every 6 months or even every year, and this is a pretty big step. For those who bought GeForce3s instead of GeForce2 Ultras (there was no performance difference early on), the GeForce3 is now doing much better than the GeForce2 Ultra.

If every company were focusing on CURRENT technology, we'd still be playing Quake 1 type games at 300 fps.
Definitely a valid point from both of you.

I personally would rather buy a GeForce4 than a GeForce3 because it has more features, features that I expect to be able to use within its lifetime (as in 2-3 years max). With the upcoming R300, I doubt games will start to use its DX9 features until near the end of its lifespan, and with the NV30 it's even further away. DirectX 9 and all the cards that support it are probably the biggest step in real-time computer 3D since the Voodoo1. But like so many have said, we won't see games utilizing that until DirectX 9 cards are sold under $100.

 
Could anyone tell me if even old games like Quake 3 Arena, Max Payne, RTCW, etc. will just look better on the R300? Like more movie-like graphics, or no? I know maxing out FSAA and AF is supposed to improve image quality, but I can't even notice 4x FSAA and 8x AF at 1024x768 or 1600x1200. Also, will the R300 still make games like GTA3, Max Payne, SOF2, etc. look better at 1600x1200 without FSAA and AF on?
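(A quick aside for readers wondering what FSAA actually does: it renders the scene at a higher internal resolution and then averages neighboring samples down to the display resolution, which smooths jagged edges. A minimal Python sketch of 4x ordered-grid supersampling follows; this is purely illustrative, assuming a simple box filter and grayscale values, and is not how any particular card implements its AA.)

```python
def supersample_4x(render_hi, width, height):
    """Downsample a 2x-by-2x oversampled image by box filtering.

    render_hi(x, y) returns the color (a single float here, for
    simplicity) of the high-resolution pixel at (x, y); the high-res
    image is (2*width) x (2*height). Each output pixel averages its
    2x2 block of subsamples -- 4x ordered-grid supersampling.
    """
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            s = (render_hi(2 * x,     2 * y) +
                 render_hi(2 * x + 1, 2 * y) +
                 render_hi(2 * x,     2 * y + 1) +
                 render_hi(2 * x + 1, 2 * y + 1))
            row.append(s / 4.0)
        out.append(row)
    return out

# A hard vertical edge (white on the left, black on the right) lands
# halfway inside the first output pixel, which comes out gray -- that
# gray transition pixel is the "smoothed" edge FSAA produces.
edge = supersample_4x(lambda x, y: 1.0 if x < 1 else 0.0, 2, 1)
```

At 1024x768 those transition pixels are one dot wide, which is why the effect can be hard to spot on a small screen even at 4x.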

 
Originally posted by: Athlon4all
Originally posted by: GTaudiophile
How is it that Intel was able to beat everyone to the punch with a .13mu part? They've been cranking out .13mu CPUs since January and it looks like everyone else won't be able to do the same until January 2003 at least.
First of all, Intel has been mass producing .13um CPUs since Q1 2001 (the Tualatin Celeron). And there is no doubt about it: Intel has some of the best R&D teams in the world for new manufacturing processes (one of whom, our local Wingznut Pez, played a part in the development of the .13um process). I don't know how Intel does it, but I imagine they either have more money than TSMC and UMC or just have more talented engineers. I dunno.

They have more money than God. They spend around 7 billion annually on R&D, and a good chunk of that is process R&D. Correct me if I'm wrong, pm or wingnut.
 
Originally posted by: dexvx
That's the thing with video cards. Sure you can do impressive stuff with the R300 and upcoming NV30, but for gaming, you won't see anything utilizing their capabilities until well beyond their technical lifespan, and at that point there's obviously going to be better video cards. People should look at what the R300/NV30 cards can do for games today, and to me they're pretty much going to be equal. The only advantage really is that the R300 is practically available now, and NV30 is unknown.
That is the old 3dfx philosophy: who gives a crap about new technology? I just want to play current games at high speed! Well, not everyone upgrades their video card every 6 months or even every year, and this is a pretty big step. For those who bought GeForce3s instead of GeForce2 Ultras (there was no performance difference early on), the GeForce3 is now doing much better than the GeForce2 Ultra. If every company were focusing on CURRENT technology, we'd still be playing Quake 1 type games at 300 fps.

You got me wrong there. I am all for new technologies; in fact, I'm always on the bleeding edge myself. I was strictly comparing the R300 and NV30 as the next-gen cards. Both being new, you won't see either have any sort of leg up on the other for current apps and the near-term software that's coming out. As for future games that use DX9/OGL2, that won't happen within the technical lifespan of the R300/NV30, so why bother having hypothetical comparisons of which is better? The true answer is the video card released a year down the road... What you do get from the R300/NV30 tech is the ability to run existing games better now, i.e. at 1024x768+ with all imaginable eye candy on, something you can't do with earlier tech.
 
Originally posted by: Czar
Originally posted by: dexvx
That's the thing with video cards. Sure you can do impressive stuff with the R300 and upcoming NV30, but for gaming, you won't see anything utilizing their capabilities until well beyond their technical lifespan, and at that point there's obviously going to be better video cards. People should look at what the R300/NV30 cards can do for games today, and to me they're pretty much going to be equal. The only advantage really is that the R300 is practically available now, and NV30 is unknown.

That is the old 3dfx philosophy: who gives a crap about new technology? I just want to play current games at high speed! Well, not everyone upgrades their video card every 6 months or even every year, and this is a pretty big step. For those who bought GeForce3s instead of GeForce2 Ultras (there was no performance difference early on), the GeForce3 is now doing much better than the GeForce2 Ultra.

If every company were focusing on CURRENT technology, we'd still be playing Quake 1 type games at 300 fps.
Definitely a valid point from both of you.

I personally would rather buy a GeForce4 than a GeForce3 because it has more features, features that I expect to be able to use within its lifetime (as in 2-3 years max). With the upcoming R300, I doubt games will start to use its DX9 features until near the end of its lifespan, and with the NV30 it's even further away. DirectX 9 and all the cards that support it are probably the biggest step in real-time computer 3D since the Voodoo1. But like so many have said, we won't see games utilizing that until DirectX 9 cards are sold under $100.

I don't mean to split hairs, but the R9000/R9000 Pro are DX9 cards that cost less than $100. I realize that their performance in most games isn't as good as the 8500 (though they do best the 8500 in UT2K3), but they are DX9 cards, and if they're popular, they could conceivably cause DX9 games to come to market faster, particularly if nvidia responds with a similarly priced DX9 card.
 
I don't mean to split hairs, but the R9000/R9000 Pro are DX9 cards that cost less than $100. I realize that their performance in most games isn't as good as the 8500 (though they do best the 8500 in UT2K3), but they are DX9 cards, and if they're popular, they could conceivably cause DX9 games to come to market faster, particularly if nvidia responds with a similarly priced DX9 card.

The 9000 Pro and non-Pro are DirectX 8.1 cards. There are no cards currently on the market that are DX9 parts... hell, DX9 hasn't even been released yet. AFAIK, the 9700 will be the first DX9 part, with NV30 being the second. The Parhelia has some support for DX9 features, but it is not a fully DX9 part either. As far as DX9 goes, the last I heard was that Microsoft was shooting for an October release...

🙂
 