ATI's Boundless Gaming is bounded


Sc4freak

Guest
Oct 22, 2004
953
0
0
Originally posted by: Crusader
Angle-independent AF isn't nearly as big a deal as guys like you make it out to be. IQ is comparable in most games. HDR+AA may look nicer, but only 3 games support it, and one of them (Oblivion) has known render errors.

It's true, there are many rendering errors in Oblivion. However, most of them only occur on Nvidia cards. If you take a look at the official forums, you'll see that many people have experienced graphical corruption in Oblivion regardless of their settings, and all of them have Nvidia GeForce 7 series cards. The graphical corruption includes broken textures and stretched, broken, spiked, erroneous polygons. And this occurs on stock, non-overclocked, sufficiently cooled GeForce 7 cards, and is rare on ATI's cards.
 

c0d1f1ed

Member
Jan 10, 2005
49
0
0
Who's going to win the 'physics war'?

Intel. Then Microsoft together with NVIDIA and ATI.

Seriously. The Intel Core 2 Duo has massive floating-point performance. You don't need anything extra for physics processing. Dedicated hardware is nonsense; there isn't a single operation that is unique to physics. AGEIA's PhysX P1 does additions and multiplications just like any generic programmable processor. Core 2 Duo has the advantage of a much higher clock frequency (thanks to 65 nm technology and Intel's expertise), and even one of its cores beats the P1 in GFLOPS. Furthermore, the communication between cores is far faster than going over a PCI bus, in both bandwidth and latency. And it's much easier for game developers to program for.
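As a rough illustration of that last point, here is a minimal C++ sketch of a physics integration step split across two CPU threads, the way a dual-core chip could handle it in shared memory with no bus round trip. The Particle struct and integrateRange function are made up for this example (and it uses modern std::thread), not taken from AGEIA or any real engine.

    // Minimal sketch: integrate particle positions on two CPU threads.
    // Names are illustrative only; this is not code from any real engine.
    #include <cstddef>
    #include <functional>
    #include <thread>
    #include <vector>

    struct Particle {
        float x, y, z;    // position
        float vx, vy, vz; // velocity
    };

    // Advance one contiguous range of particles by a single time step.
    void integrateRange(std::vector<Particle>& p, std::size_t begin,
                        std::size_t end, float dt) {
        for (std::size_t i = begin; i < end; ++i) {
            p[i].x += p[i].vx * dt;
            p[i].y += p[i].vy * dt;
            p[i].z += p[i].vz * dt;
        }
    }

    int main() {
        std::vector<Particle> particles(100000, Particle{0, 0, 0, 1, 0, 0});
        const float dt = 1.0f / 60.0f;
        const std::size_t half = particles.size() / 2;

        // Second core takes the back half; this core takes the front half.
        std::thread worker(integrateRange, std::ref(particles), half,
                           particles.size(), dt);
        integrateRange(particles, 0, half, dt);
        worker.join();
        return 0;
    }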

Why Microsoft? Well, GPUs will get very good at GPGPU applications, including physics, once Vista and Direct3D 10 are available. Direct3D 10 allows GPUs to do their own context switches and have virtualized memory, running multiple 'processes' on the GPU concurrently with minimal overhead. And Microsoft is already working on a physics API. Proprietary solutions from NVIDIA or ATI are just not going to survive long. And this Microsoft physics API can be implemented with dual/multi-core CPUs or Direct3D 10 GPUs; which one gets used depends on your CPU/GPU configuration and the game's workload. Either way, Microsoft and the big hardware manufacturers win, while AGEIA loses.
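To make that last idea concrete, here is a hypothetical C++ sketch of the kind of abstraction being described: one physics interface with interchangeable CPU and GPU back ends, chosen at runtime. None of these class or function names come from Microsoft's actual API; they are placeholders for the concept.

    // Hypothetical sketch: a single physics API with a CPU or GPU back end
    // selected at runtime. Placeholder names only, not a real Microsoft API.
    #include <cstdio>
    #include <memory>

    class PhysicsBackend {
    public:
        virtual ~PhysicsBackend() = default;
        virtual void step(float dt) = 0; // advance the simulation one frame
    };

    class CpuBackend : public PhysicsBackend {
    public:
        void step(float dt) override { std::printf("CPU step, dt=%.4f\n", dt); }
    };

    class GpuBackend : public PhysicsBackend {
    public:
        void step(float dt) override { std::printf("GPU step, dt=%.4f\n", dt); }
    };

    // Pick a back end based on what the machine has and how busy the GPU is.
    std::unique_ptr<PhysicsBackend> selectBackend(bool hasFastGpu, bool gpuIsBusy) {
        if (hasFastGpu && !gpuIsBusy)
            return std::make_unique<GpuBackend>();
        return std::make_unique<CpuBackend>();
    }

    int main() {
        auto physics = selectBackend(/*hasFastGpu=*/true, /*gpuIsBusy=*/false);
        physics->step(1.0f / 60.0f); // the game just calls step() each frame
        return 0;
    }

The point of the sketch is only the dispatch: the game calls one step() per frame, and whichever back end is cheapest on that particular machine does the work.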

$250 for a card that can only do physics is absurd. They need to enter the mainstream market to get any success, but for that they need to get below $100 and offer much better performance. It's not going to happen any time soon. Intel, AMD, NVIDIA and ATI have their eyes on the same market, and will release extremely cost-effective solutions in just a few months. For just a fraction of $250 you'll be able to buy a better CPU and/or GPU and get the required amount of physics processing without losing performance on other fronts. A better CPU helps any application, while a better GPU improves graphics at the times when 'dedicated' physics hardware would not be used 100%. $250 for a physics processor, please... It sounded like a joke when I first heard about it, and apart from actually getting it on the market it's still a joke.

Just save your money and buy a next-generation CPU and GPU when physics-intensive games hit the market, and you'll get the most bang for your buck.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Oh, argumentative little Josh...

Originally posted by: josh6079
Coming from someone who doesn't play with a card that supports it, that is rather interesting. It doesn't change the fact that Nvidia can't even do it, regardless whether or not you think it is worth it.
I've seen it many, many times playing on my friends' computers. Also, I can always refer to the online reviews that say AF IQ is similar in most games?

Originally posted by: josh6079
False. It does look nicer, no matter how you try and avoid it.
Sure. On all three games that support it. I'm playing Far Cry a lot these days, because I missed it last year and the year before? :roll:

Originally posted by: josh6079
Um, that's twice as many as you said support it, and on 4 of those Nvidia can't do both HDR + AA.
I wasn't aware TRL supported HDR+AA, or even that the HDR was working very well these days? Got a link?

If not, we're back to the 3 I mentioned. By the time there are games that support this, the X1900s will seem very, very slow.

Originally posted by: josh6079
Like Nvidia never has render errors? Seriously Crusader, what render errors are you talking about? I don't see any when playing Oblivion.
Maybe you should look on their website at the release notes for the unsupported patch and see the notes on the render errors with shadows on the grass, or visit some other forums and see the user complaints?

Originally posted by: josh6079
The GX2 does not have a good multicard option either. Quad gets beat by Crossfire, and even its own SLI.
So do you often go by benchmarks of brand new, unreleased tech?

Originally posted by: josh6079
Lack of HDCP, like that was something useful for the 7950
It will likely be far more useful than HDR+AA on any single ATI card in the future?

I think we are going to have to agree to disagree, Josh6079. You put far more weight on a feature that doesn't make any difference in most games on most hardware (angle-independent AF) and one that for sure only makes a difference in a couple of games (HDR + AA).

I look at HDR+AA now sort of like I did when the 6800s had HDR: nice to have, but not a dealbreaker. The difference here is that the 6800Us and X800XT PEs had roughly comparable performance, while the GX2 totally owns the X1900XTX in most new games. I would have bought the X800XT PE back then if the performance delta was similar, HDR or no.
 
Apr 6, 2006
32
0
0
Originally posted by: Crusader

Originally posted by: josh6079
The GX2 does not have a good multicard option either. Quad gets beat by Crossfire, and even its own SLI.
So do you often go by benchmarks of brand new, unreleased tech?

NV and Dell bragged about the awesome quad SLI more than six months ago.

I wouldn't trust anything Nvidia says it will be supporting in the future; remember the broken PureVideo.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Crusader, it's okay if you're blind and think that angle-independent AF is not worth anything. The fact that you've seen it and "don't care" doesn't change the fact that NO Nvidia card can do it, nor HDR + AA (you forgot Lineage 2). When Nvidia finally supports it, you'll love it. Then again, if Nvidia supported high power consumption (like both companies seem to need for the first generation of DX10) you'll love that too. :roll:

Yes, HDR + AA is not a deal breaker... to YOU. That doesn't mean that you can label it worthless just because it does not fit your mold. There are many others, including me, who love being able to do both right now. Your pathetic attempt to say Far Cry is too old to care about anyway just shows how you're trying anything you can to discredit a technology that Nvidia didn't want to do. Many people still play HL2, CSS, and many other games that are "old". Just recently I saw a thread with tons of Far Cry pics, talking about how it is the best game "visually" ever, etc.

The heat issue is another sorry attempt too. I've got AS5 and the stock cooler, and my temps never go above 58 °C when playing Oblivion. Granted, the idle is normally around 48 °C. My 7800s, though, would idle around 38-42 °C and get as hot as 68-73 °C. So in my experience, my ATI is running cooler than my Nvidias were.

You can read many reviews and preach about what you haven't experienced. The fact is that your expectations of a video card are different from mine, and that is that. It doesn't mean that one company or product is better than the other; it just means that you have found a company and a product that gives you what you want, just as one company and product has given me what I want. I wouldn't be so "argumentative" if you would not make such false and attacking claims about a product you haven't owned. Fact is, Nvidia and ATI compete with each other; one doesn't constantly blow the other out of the water like you want to believe.

Thanks for the link, Sc4freak. That just proves that ATI has some ideas that Nvidia likes as well. I'm actually glad to see ATI playing Sega instead of it being reversed.