Tim Sweeney Prefers G70?


nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: DfiDude
So the 6800 Ultra will run better than the X850 XT next year (PCI Express)?

If you're replying to me, note the "SLI" in my post. Yes, 6800U SLI will run everything much better than an X850XT PE this year, and next.
 

Malak

Lifer
Dec 4, 2004
14,696
2
0
Sweeney is an idiot. In an interview a while back, he said "most" gamers use Nvidia, so they are coding the U3 engine around that assumption. I guess he missed the fact that ATI ships 50% more units per quarter than Nvidia... If he can't see that ATI has had the best video card for the last two gens and controls more of the market, I'm afraid to see how this engine turns out, no matter how good it may look in tech demos.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Sweeney also said he was targeting the 9800P (and the 6200, incongruously enough) as the low end for UE3. I don't think ATI is shipping 50% more 9800Ps and up, and I somewhat doubt they have since the GF6 debuted.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Rollo


LOL - Sweeney has always preferred nVidia, because they have always had more advanced GPUs.

Except for the "glory year", when 24-bit was considered "full precision".

You were wrong at Hard, and you're still wrong here.

PS 1.4 and 1.1: which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: KruptosAngelos
Sweeney is an idiot. In an interview a while back, he said "most" gamers use Nvidia, so they are coding the U3 engine around that assumption. I guess he missed the fact that ATI ships 50% more units per quarter than Nvidia... If he can't see that ATI has had the best video card for the last two gens and controls more of the market, I'm afraid to see how this engine turns out, no matter how good it may look in tech demos.

The latest Mercury Research numbers showed 70% of the high-end DX9 market is nVidia. Having 30% of the high-end market is not "controlling it". No one here cares how many $10 GPUs ATI sells to motherboard OEMs.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Originally posted by: Rollo


LOL - Sweeney has always preferred nVidia, because they have always had more advanced GPUs.

Except for the "glory year", when 24-bit was considered "full precision".

You were wrong at Hard, and you're still wrong here.

PS 1.4 and 1.1: which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.

I was right at Hard, and I'm right here. For the most part, nVidia brought more advanced functionality to market first.

In the rare instances they did not (like your example of the 8500's PS 1.4), ATI's drivers were regarded as so inferior that developers wouldn't use their products. Or are you forgetting just how bizarre ATI's drivers were pre-9700?

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
You are not right. Now you've backtracked and say "for the most part". First it was "always", now it's down to "for the most part", which is more accurate.

You can't argue the driver problem when comparing it to the GF4. By then the 8500's drivers were ironed out, and it was a great card. At launch, yes, it sucked and features didn't work properly, but a few months afterwards it was working great.

Perhaps you'll learn to stop making blanket statements.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
PS 1.4 and 1.1: which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.

Fire up Splinter Cell with that 8500 next to a GF4, set both to the highest-quality shadow setting, and see what you get. Hell, use an X850 XT PE for that one next to a GF3 Ti200 ;)

Sweeney is an idiot. In an interview a while back, he said "most" gamers use Nvidia, so they are coding the U3 engine around that assumption. I guess he missed the fact that ATI ships 50% more units per quarter than Nvidia...

I must have missed the class where they taught that "gamers" was a synonym for "units"...
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: BenSkywalker
PS 1.4 and 1.1: which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.

Fire up Splinter Cell with that 8500 next to a GF4, set both to the highest-quality shadow setting, and see what you get. Hell, use an X850 XT PE for that one next to a GF3 Ti200 ;)

What does that matter? The fact is, NV doesn't always have more advanced hardware. If you can't admit that, then there really isn't any hope.

But since you brought it up: which card will work properly with BF2, the 8500 or the GF3/4? Hint: not the GeForces.
 

DfiDude

Senior member
Mar 6, 2005
627
0
0
What if you had one 6800 Ultra and one X850 XT? Would the 6800 Ultra be better next year?
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Epic and Nvidia have always had a good relationship - everything I've heard from Mark Rein, Tim Sweeney, and Cliffy B is that they like Nvidia simply because Nvidia treats them well. As Cliff put it, "They've shown us a lot of love, and they've been really good to us, so we return the favor a bit."
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: KruptosAngelos
Sweeney is an idiot. In an interview a while back, he said "most" gamers use Nvidia, so they are coding the U3 engine around that assumption. I guess he missed the fact that ATI ships 50% more units per quarter than Nvidia... If he can't see that ATI has had the best video card for the last two gens and controls more of the market, I'm afraid to see how this engine turns out, no matter how good it may look in tech demos.


I have never seen a post make someone look so foolish. You call the man who founded, and has continued to advance, arguably the most versatile, well-rounded engine in the industry an idiot?

When you code a COMPLETE DEVELOPMENT TOOLKIT that can compare with Unreal Technology for ease of use, features, and graphic fidelity, then come talk to me. Until then, peace.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Hah, why are you taking Sweeney's comments so personally, KruptosAngelos? It's just business as usual.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
My personal view is that I've seen more innovation on Nvidia's end - they were the first to introduce dedicated hardware T&L, the first to do 32-bit precision (ATi still doesn't have it), the first to offer a highly programmable architecture with NV30 (despite it being a performance stinker), the first to introduce a developer relations program (Get in the Game followed TWIMTBP), and they brought SLI back before ATI followed with multi-rendering...

To me, I see Nvidia take more risks, which I think is what we need in games and gaming hardware right now. We still need to flesh out those features and keep steaming toward photorealism, and I see Nvidia go out on a limb regularly to try to accomplish that - sometimes they succeed, sometimes they don't, but the point is they're willing to give it a go, which is something I rarely see ATi take the first step on.

I still like ATi and I think they make excellent products, but I wish they'd stop playing wait-and-see and start taking some more chances. Once the playing field is leveled in terms of rendering ability and features, we can start looking for better ways to do what we're already doing, but while there are still things we can't do visually, I'd like to see IHVs trying to find ways to bring those features to market.

I like both the companies, but Nvidia seems more willing, in my view, to stick their neck out on the chopping block to push digital entertainment forward, and that counts for something in my book. They got slammed with NV30, and now they're taking heat from some people for SLI (despite the fact that ATi is following suit), but I have to say, I think they're doing the right thing as a major player in the industry, and I continue to support them not just because their products are impressive, but also because of this pioneering spirit.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
You are not right. Now you've backtracked and say "for the most part". First it was "always", now it's down to "for the most part", which is more accurate.

You can't argue the driver problem when comparing it to the GF4. By then the 8500's drivers were ironed out, and it was a great card. At launch, yes, it sucked and features didn't work properly, but a few months afterwards it was working great.

Perhaps you'll learn to stop making blanket statements.


Bah. You nitpick statements that are 99% true to further your agenda. You know that nV has been first to market with tech most of the time, and that their driver history is the other reason most developers have favored nVidia (at least as far back as I can remember), but you pipe in with minutiae like "The 8500 had 1.4 first!" to pretend you have a point and pimp ATI.

The 8500 was NOT a fine card. It didn't work at stock speed for me and many others, and it had more driver issues than practically any card ever produced by ATI, yet here you are talking as if any developer anywhere said, "The 8500 is the way!" It was a piece of junk, IMO, one of very few cards I returned for not operating correctly.

Links please.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Wow, still can't admit you were wrong.

FACT: NV doesn't always have the "more advanced GPU". More often than not? Sure, but not always. You can either admit that's right, or continue with your ignorance. Either way, I'm done arguing it.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Wow, still can't admit you were wrong.

FACT: NV doesn't always have the "more advanced GPU". More often than not? Sure, but not always. You can either admit that's right, or continue with your ignorance. Either way, I'm done arguing it.

Do you know what "pettifogging" means, Ackmed?
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Insomniak
My personal view is that I've seen more innovation on Nvidia's end - they were the first to introduce dedicated hardware T&L, the first to do 32-bit precision (ATi still doesn't have it), the first to offer a highly programmable architecture with NV30 (despite it being a performance stinker), the first to introduce a developer relations program (Get in the Game followed TWIMTBP), and they brought SLI back before ATI followed with multi-rendering...

To me, I see Nvidia take more risks, which I think is what we need in games and gaming hardware right now. We still need to flesh out those features and keep steaming toward photorealism, and I see Nvidia go out on a limb regularly to try to accomplish that - sometimes they succeed, sometimes they don't, but the point is they're willing to give it a go, which is something I rarely see ATi take the first step on.
I'm in the mood for quibbling, so don't take this too personally. I don't see FP32 as any more innovative than FP24, FP16, or even FX16. GPUs will get more precision when manufacturing makes it economically feasible. I do believe that 3dfx was slated to be the first with higher precision, though, with either Rampage or Spectre (I'm beginning to forget the 3dfx fable, sorry :)). And 3dfx was the first to SLI, though I understand you meant nV was the first to bring it back (though that was to be expected, given that they basically bought 3dfx).

I guess I can see that nV is more willing to take risks (NV1, anyone? :)) whereas ATI seemed to focus more on the business aspect. Hopefully the competition between them will force ATI to step up as much as nV, and I think R500/Xenos is a promising move in the direction of innovation.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
So if nV comes to market with more advanced GPUs than anyone else, wouldn't that mean they dictate the prices the competition can set?

Also, Pete: as you said, I thought ATi was a business/industry GPU manufacturer first, while nVidia started out in the gaming market before moving into the business and industry segment?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What does that matter?

The ATi parts can't handle it due to lack of feature support. There are still certain things that the NV20 can do in hardware that the X850 XT PE can't. That matters when discussing which part is more 'advanced'. You have to determine what element you consider to be the most important.

The fact is, NV doesnt always have more advanced hardware.

Then you should bring up the R300 versus the NV25, not the R200. The R200 was seriously lacking on a long list of features vs. the NV25 (real AF, as an example). There are certain elements where the NV25 had an advantage over the R300, but overwhelmingly the R300 was the superior part.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Supposedly 3Dc is going to be prominent in the future; even the devs are asking nVidia to incorporate it, and ATi released it for free.

So at least they did something to enhance graphics and increase performance, free for anyone to use.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
There are still certain things that the NV20 can do in hardware that the X850 XT PE can't.

Just out of curiosity, what features does the old NV20 have over the X850 XT?

To me, it sounds like some BS features that have never and will never come into use.

Just curious.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Ultimately, performance dictates price more than features, at least to gamers. You could say that the NV30 offered more flexible shaders than R300, but that really only matters to devs. OTOH, I'd say the FP blending in GF6 (and the resultant handful of titles with HDR exclusive to the GF6) warrants a small premium over a comparably fast ATI card.

Then again, 6xAA may be preferable to HDR for someone else.

I don't know the whole history of ATI, but I believe they were big into IGPs and 2D cards before the 3D card market (and the games that pushed it) became the PC big dog. AFAIK, their transition to competing in 3D wasn't as smooth or successful as nVidia's (they were caught cheating in some ZD benchmark, IIRC). Heck, wasn't nVidia's first PC offering a "Saturn on a card?" That's a pretty aggressive intro to 3D gaming. :)
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Just out of curiosity, what features does the old NV20 have over the X850 XT?

To me, it sounds like some BS features that have never and will never come into use.

Certain lower-level functions that can be utilized by developers if they are explicitly targeted (outside of the direct DX or OpenGL specs). As far as them never coming into use, check out Splinter Cell: the NV2x and up parts from nVidia can run the highest-detail shadows in the game, while all ATi parts are incapable. If you do a search over at B3D, one of the developers explains exactly why it can be done on nV hardware and why it can't be done on ATi's parts.