Originally posted by: Keysplayr
A guy whose opinion differs from yours is instantly a troll now, Evolution8?
You just accused him of babbling misleading information, and then you did exactly that in the next breath. Let me know if you need me to point out what that is. But I think you know already. Check yourself, dude.
Your credibility is at stake, since you work for nVidia. I called him that because he's spreading misinformation without links to back it up, which is something I don't do. We all know that before the HD 4800 series debuted, nVidia was charging up to $600.00 for the GTX 280. I own STALKER: Clear Sky and the differences are easy to spot. When DX10.1 is used in that game, the foliage and trees get anti-aliased, which makes quite a visual difference, along with a performance boost when anti-aliasing is used sparingly; even when it's used heavily, it only incurs about a 3 fps drop. Is that wrong to you?
DX10 with Anti Aliasing:
http://www3.picturepush.com/ph...r-Sky/DX10-DX10-AA.jpg
DX10.1 with Anti Aliasing:
http://www1.picturepush.com/ph...Sky/DX101-DX101-AA.jpg
Easy to spot, right? Unless you're busy getting shot at.
http://alienbabeltech.com/main/?p=2344&page=7
"If you thought DX10 was demanding on your PC on the 'regular' DX10 pathway for Clear Sky, wait until we check 'MSAA for A-tested objects', which requires still higher performance from your video card. Notice that the 280GTX takes a bigger performance hit, percentage-wise, than the 4870. Also, we note there is an additional highest setting available for Radeon cards that is not available to Nvidia cards: DX10.1. Fortunately for ATi owners, when we use a powerful card like the 4870 X2, its performance hit is slightly less than with the lower setting that the GTX 280 GeForce can run."
When anti-aliasing is applied to alpha-tested objects like foliage, the performance hit on DX10.1 cards is minimal, unlike on the GTX 280.
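Part of why the hit differs is that DX10.1 lets a shader read the individual samples of a multisample buffer directly and perform the resolve itself. Here is a minimal sketch of that kind of custom resolve pass; the code is my own illustration (not Clear Sky's), the names and box filter are assumptions, and it compiles the shader against ps_4_1, the DX10.1 shader profile:

// Minimal sketch: a custom MSAA resolve for DX10.1-class hardware.
// The shader fetches each sample of a multisample buffer individually,
// so the resolve logic is programmable instead of fixed-function.
// Illustrative code only; names are assumptions, not from any game.
#include <d3dcompiler.h>
#include <cstdio>
#pragma comment(lib, "d3dcompiler.lib")

static const char kResolvePS[] = R"(
Texture2DMS<float4, 4> gScene : register(t0);   // 4x MSAA color buffer

float4 main(float4 pos : SV_Position) : SV_Target
{
    int2 p = int2(pos.xy);
    float4 sum = 0;
    [unroll]
    for (int i = 0; i < 4; ++i)     // read each sample individually
        sum += gScene.Load(p, i);
    return sum * 0.25;              // plain box resolve; a game could weight
                                    // edge or alpha-tested samples differently
}
)";

int main()
{
    ID3DBlob *code = nullptr, *errors = nullptr;
    // ps_4_1 is the Shader Model 4.1 profile introduced with Direct3D 10.1.
    HRESULT hr = D3DCompile(kResolvePS, sizeof(kResolvePS) - 1, "resolve.hlsl",
                            nullptr, nullptr, "main", "ps_4_1", 0, 0,
                            &code, &errors);
    std::printf(SUCCEEDED(hr) ? "ps_4_1 resolve shader compiled\n" : "%s\n",
                errors ? (const char *)errors->GetBufferPointer()
                       : "compile failed");
    if (code) code->Release();
    if (errors) errors->Release();
}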
http://www.guru3d.com/article/...ith-ati-dave-baumann/3
"we should remember that DirectX 11 is a superset of DirectX 10 and DirectX 10.1, meaning that by including DirectX 10.1 developers are already paving the way to DirectX 11 features and compatibility.
By providing support for DirectX 10.1 we're helping developers be ready for DirectX 11 sooner than if they only limited current development to DirectX 10.
As for DirectX 11, naturally we're happy to see the DirectX API continue to evolve in the manner it has. My feelings are that it offers a sensible evolution of the feature-set capabilities, in line with the directions the IHVs are taking from a hardware perspective and where the ISV's want to go on the software side, whilst also addressing some of the points that were lacking in DirectX 10.
One such element that gets updated in DirectX 11 is that of Display Lists, a new driver model to more effectively multithread graphics workloads over multi-core CPUs, natively within the API. This is something that we know developers have been requesting. The advantage here is that although this is a DirectX 11 API feature, the functionality will move down to DirectX 10 hardware, so all DirectX 10 hardware users that update to the DirectX 11 runtime will get the benefits of this feature."
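That "Display Lists" feature shipped in Direct3D 11 as deferred contexts and command lists. As a rough sketch of the model Baumann describes (illustrative code only; device creation and all resource setup are omitted):

#include <d3d11.h>
#include <thread>
#pragma comment(lib, "d3d11.lib")

// Sketch: a worker thread records draw commands on a deferred context into
// a command list; the main thread then replays it on the immediate context.
void RecordAndSubmit(ID3D11Device *device, ID3D11DeviceContext *immediateCtx)
{
    ID3D11DeviceContext *deferredCtx = nullptr;
    ID3D11CommandList   *commandList = nullptr;

    device->CreateDeferredContext(0, &deferredCtx);

    std::thread worker([&] {
        deferredCtx->ClearState();
        // ... bind shaders and buffers, issue Draw() calls here ...
        deferredCtx->FinishCommandList(FALSE, &commandList);  // bake the list
    });
    worker.join();

    // Replaying the recorded list is the only part that touches the GPU queue.
    immediateCtx->ExecuteCommandList(commandList, FALSE);

    commandList->Release();
    deferredCtx->Release();
}

And, as the quote says, DX10-class hardware gets this path too once the game runs on the DX11 runtime.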
http://www.hardwarecanucks.com...-passive-review-2.html
"Even though DX10.1 is a minor update to the Vista-exclusive DX10, ATI feels that its implementation will benefit gamers quite a bit in today?s market. Let?s cut right to the chase: DX10.1 doesn?t offer us anything particularly new in terms of outlandishly new features but it does offer new paths for developers to simplify their code which in turn has the potential to increase performance in certain areas. At present, among the ?big two? graphics processor manufacturers, ATI is the only one which supports DX10.1
Even though we run the risk of editorializing here we have to say that ATI?s acceptance of the DX10.1 API seems to be the right thing to do in today?s graphics card industry. After seeing first-hand the performance benefits it brings when applying AA to a DX10 environment in games like Assassin?s Creed we can only express disappointment and outright shock that other GPU manufacturers haven?t followed ATI?s lead. Consumers have been left high and dry without any reason to purchase an OS with DX10 for the simple fact that the performance in impact of DX10 is does not justify minor graphical benefits. DX10.1 works to alleviate those performance hurdles by offering developers more options when producing their games. We can only hope that ATI?s present generation cards become widespread enough that more game developers will implement DX10.1 into their titles."
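And it's worth noting how little developer-side code the 10.1 path actually costs. Here is a rough sketch, with a function name of my own invention, of how a renderer can ask for a DX10.1 device and fall back to DX10.0 on hardware that lacks it:

#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

// Sketch: try for a 10.1 device first (Radeon HD 3000/4000 class), then
// fall back to 10.0 so GeForce 8/9/GTX 200 cards still get a device.
ID3D10Device1 *CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_FEATURE_LEVEL_10_0,
    };
    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        ID3D10Device1 *device = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                         nullptr, 0, level,
                                         D3D10_1_SDK_VERSION, &device)))
            return device;  // caller branches on GetFeatureLevel()
    }
    return nullptr;
}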
http://ati.amd.com/products/pd...hitePaperv1.0FINAL.pdf
If you bother to read the DX10.1 whitepaper, you will understand what I mean. Here are the features it lists:
Feature: Cube Map Arrays
Function: Allow reading and writing of multiple cube maps in a single rendering pass.
Benefits: Efficient global illumination in real time for complex, dynamic, interactive scenes; enables many ray-trace-quality effects, including indirect lighting, color bleeding, soft shadows, refraction, and high-quality glossy reflections.

Feature: Separate Blend Modes per MRT
Function: Allows pixel shaders to output to multiple buffers (MRTs), each with its own blend mode.
Benefits: Efficient deferred shading for improved performance in complex 3D scenes.

Feature: Increased Vertex Shader Inputs & Outputs
Function: Doubled from 16 to 32 128-bit values per shader.
Benefits: Improved performance for complex shaders.

Feature: Gather4
Function: Allows a 2x2 block of unfiltered texture values to be fetched in place of a single bilinear filtered texture lookup.
Benefits: Higher-quality shadows with improved filtering; fast procedural noise computation to add more visual variety to 3D scenes; higher-order filtering for high-quality fluid simulation on the GPU; improved performance for stream computing applications.

Feature: LOD Instruction
Function: New shader instruction that returns the level of detail for a filtered texture lookup.
Benefits: Custom texture filtering techniques for optimized performance and quality; fast parallax occlusion mapping for improved 3D surface detail.

Feature: Multi-sample Buffer Reads and Writes
Function: Allow individual color and depth samples in a multisample buffer to be accessed directly by a shader.

Feature: Pixel Coverage Masks
Function: Enable programmable anti-aliasing in a pixel shader.
Benefits: Custom edge-detect filters for high-quality anti-aliasing with optimized performance and a reduced memory footprint; faster adaptive anti-aliasing; improved anti-aliasing quality with HDR rendering; improved anti-aliasing compatibility and performance with deferred shading; high-quality volumetric rendering for atmospheric effects; high-quality depth-of-field post-processing effects.

Feature: Programmable AA Sample Patterns
Function: Allows programmers to define their own sample patterns for each pixel.
Benefits: Temporal anti-aliasing; improved image quality for multi-GPU anti-aliasing.

Feature: FP32 Filtering Required
Function: Filtering of 128-bit floating-point texture formats is now a requirement instead of an optional feature.

Feature: Int16 Blending Required
Function: Blending of 64-bit integer pixel formats is now a requirement instead of an optional feature.
Benefits: Encourages use of these high-precision data formats by ensuring hardware compatibility.

Feature: Minimum 4x MSAA Support Required
Function: Multi-sample anti-aliasing with at least 4 samples per pixel must be supported for all 32-bit and 64-bit pixel formats.

Feature: Standardized AA Sample Patterns
Function: Pre-defined sample locations for 2x/4x/8x/16x AA modes that hardware must support.
Benefits: Ensures anti-aliasing behaves identically on all DirectX 10.1 GPUs; encourages support for anti-aliasing by improving consistency.

Feature: Increased Precision for Floating-Point Operations
Function: 0.5 ULP precision required for all floating-point math (add/subtract/multiply/divide) and blending operations.
Benefits: Eliminates rounding errors; matches IEEE standard requirements for these operations.
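To make one of those rows concrete, here is roughly what Gather4 buys you. The shader is a hypothetical 4-tap shadow-map filter of my own (not from any shipping title), compiled against the DX10.1 ps_4_1 profile:

// Sketch: Gather4 fetches the 2x2 quad of unfiltered shadow-map depths that
// one bilinear lookup would touch, replacing four separate Sample() calls.
// Illustrative code only; all names here are assumptions.
#include <d3dcompiler.h>
#include <cstdio>
#pragma comment(lib, "d3dcompiler.lib")

static const char kShadowPS[] = R"(
Texture2D<float> gShadowMap    : register(t0);
SamplerState     gPointSampler : register(s0);

float4 main(float4 pos : SV_Position,
            float3 shadowUVDepth : TEXCOORD0) : SV_Target
{
    // One instruction returns four neighboring depth texels (SM 4.1+).
    float4 depths = gShadowMap.Gather(gPointSampler, shadowUVDepth.xy);
    float4 lit    = step(shadowUVDepth.z, depths);   // per-texel depth test
    float  shadow = dot(lit, float4(0.25, 0.25, 0.25, 0.25)); // 4-tap PCF
    return float4(shadow.xxx, 1.0);
}
)";

int main()
{
    ID3DBlob *code = nullptr, *errors = nullptr;
    HRESULT hr = D3DCompile(kShadowPS, sizeof(kShadowPS) - 1, "shadow.hlsl",
                            nullptr, nullptr, "main", "ps_4_1", 0, 0,
                            &code, &errors);
    std::printf(SUCCEEDED(hr) ? "ps_4_1 Gather4 shader compiled\n" : "%s\n",
                errors ? (const char *)errors->GetBufferPointer()
                       : "compile failed");
    if (code) code->Release();
    if (errors) errors->Release();
}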
And please don't tell me that's ATi marketing, because Microsoft is the one who makes the DirectX standards, not ATi or nVidia. So why buy outdated hardware? The OP made the right choice getting an HD 4890, period.