
Two high-end chips from NVIDIA in 2009

Does DX10 even matter yet? Screenshots comparing DX10 and DX9 look almost identical. If the 4870 just happened to be DX9-only, I still think people would have bought it and would still be buying it for the performance...
 
Originally posted by: Insomniator
Does DX10 even matter yet? Screenshots comparing DX10 and DX9 look almost identical. If the 4870 just happened to be DX9-only, I still think people would have bought it and would still be buying it for the performance...

exactly.

Now, tell me DX11 is a free upgrade for Vista users? I sure as hell didn't pay real money for this OS to be told I have to buy ANOTHER OS to use DX11. I think they learned their lesson with Vista in that regard.
 
Originally posted by: Insomniator
Does DX10 even matter yet? Screenshots comparing DX10 and DX9 look almost identical. If the 4870 just happened to be DX9-only, I still think people would have bought it and would still be buying it for the performance...

DX10 mainly lets you do the same things as DX9, but at much less of a performance hit. DX9 does have legitimate issues with scaling once you start to add more objects to a scene, etc.

An example of this is the new DWM in Windows 7: it has been switched to DX10 and uses about half the resources that the DX9 Aero DWM does in Vista.
 
WARP10 will be available for DirectX 10 on Windows 7, which means you can render DX10 effects on your quad-core without a DX10 card. 😀 But I doubt the performance will be good.
 
Originally posted by: nRollo
Originally posted by: thilan29
http://www.fudzilla.com/index....view&id=11390&Itemid=1

Lol...Geforce 11. 🙂

Isn't this just the same rumors we've all seen before?

40nm GT200 variant- check

DX11 late 2009- check.

I also disagree with "NVIDIA needs DX11 more than ATi" - if DX11 offers additional features, they both will need it.

Fudzilla and AMD have about as much pro-AMD spin as you have pro-NV spin. They probably have a bunch of contacts with them. Maybe AMD even pays them for marketing, who knows. I sometimes wonder if MS pays this site.
 
Originally posted by: SickBeast


Fudzilla and AMD have about as much pro-AMD spin as you have pro-NV spin. They probably have a bunch of contacts with them. Maybe AMD even pays them for marketing, who knows. I sometimes wonder if MS pays this site.


I think we read different blurbs. Anyway, a quick look had NVIDIA graphics ads outnumbering the one Sapphire banner ad, so I must have proved that Fud is a good journalist. :laugh:

 
DX11 adds GPU programmability... something that both AMD and NVIDIA have already... There should be no reason for them not to be able to implement it all the way back to the GeForce 8.

DX10 allows you to do things much more efficiently, and it also adds a few new, highly intensive features. Programmers have been going nuts with it, adding new effects that bog it down, but a few games use DX10 only for the performance increase while maintaining the same quality.
Many things you can do in DX9, DX10 will do at a higher FPS.
 
40nm is really needed
these things are getting too hot for my liking
200 watts is a lot for me
I like to keep my cards under 100 watts apiece

just the way I feel about it, at least
 
Originally posted by: Ocguy31
Originally posted by: Soulkeeper
I like to keep my cards under 100 watts apiece




So you don't like good performance? 😕


The 3870 and 9800 GT both run under 100 watts and offer excellent performance, IMO.
The 9800 GTX+ and GTX 260 (55nm version) both pull around 100 watts.
 
Originally posted by: Soulkeeper
40nm is really needed
these things are getting too hot for my liking
200 watts is a lot for me
I like to keep my cards under 100 watts apiece

just the way I feel about it, at least

Why did you pick that arbitrary number? Five years ago you could have said under 50W apiece; in five years, who knows, maybe 600W cards will be the norm.

I like the US with 50 states... just how I feel about it.

 
Originally posted by: Insomniator
Originally posted by: Soulkeeper
40nm is really needed
these things are getting too hot for my liking
200 watts is a lot for me
I like to keep my cards under 100 watts apiece

just the way I feel about it, at least

Why did you pick that arbitrary number? Five years ago you could have said under 50W apiece; in five years, who knows, maybe 600W cards will be the norm.

I like the US with 50 states... just how I feel about it.

You gotta draw the line somewhere; just my personal choice.
 
Originally posted by: Insomniator
Does DX10 even matter yet? Screenshots comparing DX10 and DX9 look almost identical. If the 4870 just happened to be DX9-only, I still think people would have bought it and would still be buying it for the performance...

That, and maybe it would have had better compatibility with old games than the DX10 parts we have now. 😛
 
Originally posted by: Insomniator
Does DX10 even matter yet? Screenshots comparing DX10 and DX9 look almost identical. If the 4870 just happened to be DX9-only, I still think people would have bought it and would still be buying it for the performance...

http://www.hardocp.com/article.html?art=MTU5MSw5LCw=

The most benefit to this performance improvement came when antialiasing was enabled, especially on the GTX 280. We experienced a very large 20% performance improvement in DX10 with AA compared to DX9 with AA.
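That 20% figure is easy to put in concrete terms with a quick bit of arithmetic. A minimal sketch (the DX9 frame rate below is a made-up number purely for illustration; only the 20% improvement comes from the [H]ardOCP article):

```python
# Hypothetical DX9-with-AA frame rate (made up for illustration).
dx9_aa_fps = 50.0

# [H]ardOCP reports roughly a 20% improvement in DX10 with AA
# versus DX9 with AA on the GTX 280.
improvement = 0.20

dx10_aa_fps = dx9_aa_fps * (1 + improvement)
print(f"{dx10_aa_fps:.1f} FPS")  # 60.0 FPS
```

So a card averaging 50 FPS in DX9 with AA would land around 60 FPS in DX10 with AA, if the 20% figure holds.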
 