Best graphics card for around $250


TC91

Golden Member
Jul 9, 2007
Originally posted by: evolucion8
Originally posted by: TC91
DX10.1 will do nothing in 99% of today's games. It's just another checkmark on the feature list. PhysX and CUDA > DX10.1, IMO. DX11 is pretty close too, so I don't really think DX10.1 is going to make any noise.

Wrong. Games like S.T.A.L.K.E.R.: Clear Sky, Tom Clancy's HAWX, BattleForge, Stormrise, Cloud 9 and even Assassin's Creed, for example, got between 18% and 25% performance improvement when anti-aliasing was used, and that can make the difference between playable and unplayable, along with the eye candy. DX10.1 also supports real-time global illumination, which cannot be implemented in real time with DX10.0; DX10.0 can use some tricks through driver queries, but the moment the vendor removes them, it won't work anymore.

Wrong. DX10.1 is DEAD right now. DX11 does everything that DX10.1 does and is just around the corner (new cards are rumoured to be out as soon as August), so no one is really gonna push for DX10.1 anymore. And why exactly would you at this point? There's no reason to when DX11 is SO CLOSE. Global illumination was just a TECH DEMO; show me a good REAL GAME that supports it on DX10.1 hardware. DX10.1 does not do anything in terms of extra eye candy. You list those games, but there are many, many more games with PhysX support, which does change the eye candy in games. Look here: http://www.nzone.com/object/nzone_physxgames_home.html

That is a ton of games vs. ONLY the 6 games you listed (which aren't great games either). AC had rendering issues in DX10.1 mode; that is why Ubisoft stepped in and took it away.

This will be my last post in this thread about this topic, since the thread has already served its purpose for the OP.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: TC91
*snip*

I'm not gonna argue with a troll who's babbling misleading information. DX11 is an INCREMENTAL UPDATE, which means most of its benefits will also work on DX10, and especially on DX10.1. Ubisoft removed AC's DX10.1 path thanks to nVidia. DX10.1 does do things for eye candy: it can implement anti-aliasing much more efficiently when a deferred rendering engine is used, so it will look better. Play STALKER in DX10 and then in DX10.1 and see the difference; the trees look smoother. Oops, sorry, you have an obsolete nVidia card which can't support the "useless" DX10.1 and has had the same feature set since 2006.
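To be concrete about that point: whether a game can even take the DX10.1 path is a startup check against the card's feature level. A minimal C++ sketch using the documented D3D10.1 entry point (my own illustration, not code from any shipping engine) - ask for feature level 10_1 first, fall back to 10_0:

#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

// Try to create a Direct3D 10.1 device; fall back to plain 10.0 if the
// GPU doesn't expose feature level 10_1.
ID3D10Device1* CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // HD 3000/4000-class Radeons
        D3D10_FEATURE_LEVEL_10_0,  // DX10-only parts
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                    // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,
            nullptr,                    // no software rasterizer
            0,                          // no creation flags
            level,
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;              // first level that succeeds wins
    }
    return nullptr;                     // no DX10-class hardware found
}

A game that gets a 10_1 device back can enable things like per-sample reads from multisample buffers, which is what makes MSAA work efficiently with deferred shading; on a 10_0 device it has to keep the slower workaround path.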

And when normal rendering is used, the performance improvement is still there. If the HD 4000 series is already beating nVidia when 8x FSAA is used, I can't imagine how badly ATi will beat nVidia when DX10.1 is used. ATi offers performance in games, not gimmicks like CUDA or PhysX, which, as Anandtech states, aren't good enough to give you a reason to buy nVidia hardware when the ATi counterpart offers similar performance at a cheaper price. The OP made the right choice and bought the video card that offers the best bang for the buck; thanks to ATi, you don't have to pay $600.00 for a GTX 280, lol.
 

Keysplayr

Elite Member
Jan 16, 2003
Originally posted by: evolucion8
*snip*

A guy whose opinion differs from yours is instantly a troll now, evolucion8?
You just accused him of babbling misleading information, and then you did exactly that in the next breath. Let me know if you need me to point out what that is, but I think you already know. Check yourself, dude.
 

evolucion8

Platinum Member
Jun 17, 2005
Originally posted by: Keysplayr
A guy whose opinion differs from yours is instantly a troll now, evolucion8?
You just accused him of babbling misleading information, and then you did exactly that in the next breath. Let me know if you need me to point out what that is, but I think you already know. Check yourself, dude.

Your credibility is at stake since you work for nVidia. I called him that because he's spreading misinformation without links to back it up, and I'm not doing that. We all know that before the HD 4800 series debuted, nVidia was charging up to $600.00 for the GTX 280. I own STALKER: Clear Sky, and those differences are easy to spot. When DX10.1 is used in that game, the foliage and trees get anti-aliased, which makes quite a difference, along with a performance boost if anti-aliasing is used sparingly; used heavily, it incurs about a 3 fps drop. Is that wrong for you?

DX10 with Anti Aliasing: http://www3.picturepush.com/ph...r-Sky/DX10-DX10-AA.jpg

DX10.1 with Anti Aliasing: http://www1.picturepush.com/ph...Sky/DX101-DX101-AA.jpg

Easy to spot, right? Unless you're getting shot.

http://alienbabeltech.com/main/?p=2344&page=7

"If you thought DX10 was demanding on your PC on the 'regular' DX10 pathway for Clear Sky, wait until we check 'MSAA for A-tested objects' - which requires still higher performance from your video card. Notice that the GTX 280 takes a bigger performance hit - percentage-wise - than the 4870. Also, we note there is an additional highest setting available for Radeon cards that is not available to Nvidia cards - DX10.1. Fortunately for ATi owners, when we use a powerful card like the 4870x2, its performance hit is slightly less than with the lower setting that the GTX 280 GeForce can run."

The performance hit is minimal for DX10.1 cards, unlike the GTX 280, when anti-aliasing is applied to alpha-tested objects.

http://www.guru3d.com/article/...ith-ati-dave-baumann/3

"we should remember that DirectX 11 is a superset of DirectX 10 and DirectX 10.1, meaning that by including DirectX 10.1 developers are already paving the way to DirectX 11 features and compatibility.

By providing support for DirectX 10.1 we're helping developers be ready for DirectX 11 sooner than if they only limited current development to DirectX 10.

As for DirectX 11, naturally we're happy to see the DirectX API continue to evolve in the manner it has. My feelings are that it offers a sensible evolution of the feature-set capabilities, in line with the directions the IHVs are taking from a hardware perspective and where the ISVs want to go on the software side, whilst also addressing some of the points that were lacking in DirectX 10.

One such element that gets updated in DirectX 11 is that of Display Lists, a new driver model to more effectively multithread graphics workloads over multi-core CPUs, natively within the API. This is something that we know developers have been requesting. The advantage here is that although this is a DirectX 11 API feature, the functionality will move down to DirectX 10 hardware, so all DirectX 10 hardware users that update to the DirectX 11 runtime will get the benefits of this feature."
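The "Display Lists" point Baumann makes is easier to see in code. A minimal sketch of the multithreaded recording model as it appears in the D3D11 API (the device/context calls are the real API; RecordSceneChunk is a hypothetical placeholder for your actual draw calls):

#include <d3d11.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d11.lib")

// Hypothetical: binds state and issues the draw calls for one slice of the scene.
void RecordSceneChunk(ID3D11DeviceContext* ctx, int chunkIndex);

// Each worker thread records into its own deferred context (the device
// itself is free-threaded); the immediate context then plays the finished
// command lists back in order on the main thread.
void RenderFrameMultithreaded(ID3D11Device* device,
                              ID3D11DeviceContext* immediate,
                              int workerCount)
{
    std::vector<ID3D11CommandList*> lists(workerCount, nullptr);
    std::vector<std::thread> workers;

    for (int i = 0; i < workerCount; ++i) {
        workers.emplace_back([&, i] {
            ID3D11DeviceContext* deferred = nullptr;
            if (FAILED(device->CreateDeferredContext(0, &deferred)))
                return;
            RecordSceneChunk(deferred, i);  // record, don't execute
            deferred->FinishCommandList(FALSE, &lists[i]);
            deferred->Release();
        });
    }
    for (auto& t : workers) t.join();

    // Submission stays serialized here, so draw order is preserved.
    for (auto* cl : lists) {
        if (cl) {
            immediate->ExecuteCommandList(cl, FALSE);
            cl->Release();
        }
    }
}

And as the quote says, this API-level feature is meant to benefit DX10-class hardware too, once the DX11 runtime is installed.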

http://www.hardwarecanucks.com...-passive-review-2.html

"Even though DX10.1 is a minor update to the Vista-exclusive DX10, ATI feels that its implementation will benefit gamers quite a bit in today?s market. Let?s cut right to the chase: DX10.1 doesn?t offer us anything particularly new in terms of outlandishly new features but it does offer new paths for developers to simplify their code which in turn has the potential to increase performance in certain areas. At present, among the ?big two? graphics processor manufacturers, ATI is the only one which supports DX10.1

Even though we run the risk of editorializing here we have to say that ATI?s acceptance of the DX10.1 API seems to be the right thing to do in today?s graphics card industry. After seeing first-hand the performance benefits it brings when applying AA to a DX10 environment in games like Assassin?s Creed we can only express disappointment and outright shock that other GPU manufacturers haven?t followed ATI?s lead. Consumers have been left high and dry without any reason to purchase an OS with DX10 for the simple fact that the performance in impact of DX10 is does not justify minor graphical benefits. DX10.1 works to alleviate those performance hurdles by offering developers more options when producing their games. We can only hope that ATI?s present generation cards become widespread enough that more game developers will implement DX10.1 into their titles."


http://ati.amd.com/products/pd...hitePaperv1.0FINAL.pdf

If you bothered to read the DX10.1 whitepaper, you would understand what I mean. Here is its feature table:

FEATURE - FUNCTION - BENEFITS

- Cube Map Arrays - Allow reading and writing of multiple cube maps in a single rendering pass. - Efficient global illumination in real time for complex, dynamic, interactive scenes; enables many ray-trace-quality effects including indirect lighting, color bleeding, soft shadows, refraction, and high-quality glossy reflections.

- Separate Blend Modes per MRT - Allow pixel shaders to output to multiple render targets (MRTs), each with its own blend mode. - Efficient deferred shading for improved performance in complex 3D scenes.

- Increased Vertex Shader Inputs & Outputs - Doubled from 16 to 32 128-bit values per shader. - Improved performance for complex shaders.

- Gather4 - Allows a 2x2 block of unfiltered texture values to be fetched in place of a single bilinear filtered texture lookup (see the sketch after this list). - Higher-quality shadows with improved filtering; fast procedural noise computation to add more visual variety to 3D scenes; higher-order filtering for high-quality fluid simulation on the GPU; improved performance for stream computing applications.

- LOD Instruction - New shader instruction that returns the level of detail for a filtered texture lookup. - Custom texture filtering techniques for optimized performance and quality; fast parallax occlusion mapping for improved 3D surface detail.

- Multi-sample Buffer Reads and Writes - Allow individual color and depth samples in a multisample buffer to be accessed directly by a shader.

- Pixel Coverage Masks - Enable programmable anti-aliasing in a pixel shader. - Custom edge-detect filters for high-quality anti-aliasing with optimized performance and reduced memory footprint; faster adaptive anti-aliasing; improved anti-aliasing quality with HDR rendering; improved anti-aliasing compatibility and performance with deferred shading; high-quality volumetric rendering for atmospheric effects; high-quality depth-of-field post-processing effects.

- Programmable AA Sample Patterns - Allow programmers to define their own sample patterns for each pixel. - Temporal anti-aliasing; improved image quality for multi-GPU anti-aliasing.

- FP32 Filtering Required - Filtering of 128-bit floating-point texture formats is now a requirement instead of an optional feature.

- Int16 Blending Required - Blending of 64-bit integer pixel formats is now a requirement instead of an optional feature. - Together with FP32 filtering, encourages use of these high-precision data formats by ensuring hardware compatibility.

- Minimum 4x MSAA Support Required - Multi-sample anti-aliasing with at least 4 samples per pixel must be supported for all 32-bit and 64-bit pixel formats.

- Standardized AA Sample Patterns - Pre-defined sample locations for the 2x/4x/8x/16x AA modes that hardware must support. - Ensures anti-aliasing can behave identically on all DirectX 10.1 GPUs; encourages support for anti-aliasing by improving consistency.

- Increased Precision for Floating-Point Operations - 0.5 ULP precision required for all floating-point math (add/subtract/multiply/divide) and blending operations. - Eliminates rounding errors; matches IEEE standard requirements for these operations.
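To make the Gather4 row concrete: a bilinear lookup collapses a 2x2 texel footprint into one weighted value, while Gather4 returns all four raw texels of that same footprint so the shader can apply its own filter (shadow comparisons, procedural noise, etc.). A toy CPU-side sketch of the difference (my own illustration, not code from the whitepaper):

#include <algorithm>
#include <array>
#include <cmath>

// A toy single-channel texture; Fetch clamps coordinates to the edge.
struct Texture {
    int w, h;
    const float* texels;
    float Fetch(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return texels[y * w + x];
    }
};

// Bilinear: the 2x2 footprint is weighted and collapsed into ONE value.
float SampleBilinear(const Texture& t, float u, float v) {
    float x = u * t.w - 0.5f, y = v * t.h - 0.5f;
    int x0 = (int)std::floor(x), y0 = (int)std::floor(y);
    float fx = x - x0, fy = y - y0;
    float top = t.Fetch(x0, y0)     * (1 - fx) + t.Fetch(x0 + 1, y0)     * fx;
    float bot = t.Fetch(x0, y0 + 1) * (1 - fx) + t.Fetch(x0 + 1, y0 + 1) * fx;
    return top * (1 - fy) + bot * fy;
}

// Gather4: the SAME 2x2 footprint, but the four texels come back raw.
// (The real instruction defines a fixed component order; omitted here.)
std::array<float, 4> Gather4(const Texture& t, float u, float v) {
    float x = u * t.w - 0.5f, y = v * t.h - 0.5f;
    int x0 = (int)std::floor(x), y0 = (int)std::floor(y);
    return { t.Fetch(x0,     y0),
             t.Fetch(x0 + 1, y0),
             t.Fetch(x0,     y0 + 1),
             t.Fetch(x0 + 1, y0 + 1) };
}

With one gather replacing four separate fetches, something like percentage-closer shadow filtering needs a quarter of the texture instructions - which is where the "higher quality shadows with improved filtering" benefit comes from.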

And please don't tell me that's ATi marketing, because Microsoft is the one who sets the DirectX standards, not ATi or nVidia. So why buy outdated hardware? The OP made the right choice getting an HD 4890, period.