ATI months ahead of NVIDIA with DirectX 11 GPU schedule?

Page 8 - AnandTech Forums

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: error8
Originally posted by: chizow
If Nvidia was selling their competing part for $450 (GTX 260), why not sell their part for $400 and still beat it in price/performance?

Because they could. Even at this price, the 4870 was a very profitable card, and it also gained many more customers this way. I guess that at $400 they would still have made a profit, but selling it much cheaper attracted more buyers and the overall profit was even larger.
ATi is not an altruistic firm by any means; they just used aggressive pricing to win more money.

Right. Chizow loves to say this, but has no way of backing it up. We have no idea if pricing the 4850/4870 higher would have been better or worse for AMD's bottom line. AMD could have priced them higher and sold fewer of them at more profit per unit, or gone for market share and sold more cards at less profit per unit.

If the economy were booming, Nvidia were making money, and AMD were losing money, we could say their way of doing things was probably not right. As it is, a lot of companies aren't making money, so we have no way of knowing whether pricing the cards higher would have been a better or worse move.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: error8
Because they could. Even at this price, the 4870 was a very profitable card, and it also gained many more customers this way. I guess that at $400 they would still have made a profit, but selling it much cheaper attracted more buyers and the overall profit was even larger.
ATi is not an altruistic firm by any means; they just used aggressive pricing to win more money.
Exactly. This is why manufacturing process is important and both are trying to get there. (currently towards 40nm) Thanks to ATI I didn't have to pay $650 for a GTX 280.
 

solofly

Banned
May 25, 2003
1,421
0
0
If they release it that early, they might be able to release a refresh right after nv releases theirs, around holiday time and Win7's final release.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Keysplayr
You don't have to be just a gamer anymore to consider an Nvidia card. You can be a doctor, astrophysicist, engineer, geologist, research scientist, or a dedicated protein folder (check our very own Distributed Computing forum).

From a gaming-ONLY standpoint, the GTX275 and the HD 4890 are super close. And that is primarily what this forum is concerned with. That's really great, but then there is the rest of the world. Nvidia's vision is far larger and broader than ATI's. This cannot be argued as we sit here and look at the vast capabilities of, say, the current GT200 series.

AMD came out with a very good gaming GPU at the 4xxx launch, and in doing so did many gamers a huge favor with competitive performance and pricing, creating a price war. Always good for gamers, and I commend them for that. Gaming. Everything else, though: the 4xxx series kind of falls to its knees in the GPGPU arena when compared to GT200.

It is what it is. So the folks in here can argue which has the better gaming GPU til doomsday. The 275/4890 is too close to call in most cases. That's done. Move on. You can choose not to consider the overall capabilities of the GPUs because they don't concern you. Why? Because it's not in your best interests when arguing the true worldly question: "Which is the better GPGPU overall, including gaming?"
There is a much bigger world out there than first-person shooters and MMORPGs. Nvidia saw this long ago.

I would have to say that the percentage of video cards sold to doctors, astrophysicists, engineers, geologists, research scientists and dedicated protein folders primarily for GPGPU purposes is minuscule in the extreme compared to the number sold for gaming purposes. That is the function these cards are primarily designed for and that is the function they are primarily marketed towards.

While having the ability to do GPGPU functions is great for those who want them, it's still the ability to render video games that drives the enthusiast market.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Originally posted by: Creig
I would have to say that the percentage of video cards sold to doctors, astrophysicists, engineers, geologists, research scientists and dedicated protein folders primarily for GPGPU purposes is minuscule in the extreme compared to the number sold for gaming purposes. That is the function these cards are primarily designed for and that is the function they are primarily marketed towards.

You're probably right but I think the market is gonna get larger soon as more and more apps are written for GPUs.

I personally don't care but I believe the GPGPU market is gonna get much bigger than it is currently.
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
I wonder if we will be looking at a 2GB card for the reference design for GT300. I'm looking at the VRAM counter in Crysis and it's already hitting 1.4GB, and people are saying GTA4 with max view distance is about the same as well. The less paging, the better.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: error8
Just found this link: 5870, 5870X2 specs

Looks nice if it's going to be true.

1200 shaders @ 900 MHz, 256-bit, 4400 MHz GDDR5, 2.1 TFLOPS.

This sounds a lot cheaper to make than GT300. The die size will probably be the same too, which will be good for people transferring an aftermarket cooler from an HD48xx.
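For reference, the 2.1 TFLOPS figure in those rumored specs lines up with the usual peak-throughput arithmetic: stream processors × 2 FLOPs per clock (one multiply-add) × core clock. A quick sketch in Python, using only the numbers quoted above:

```python
# Peak single-precision throughput from the rumored specs:
# each stream processor issues one multiply-add (2 FLOPs) per clock.
shaders = 1200          # rumored stream processor count
core_clock_ghz = 0.9    # 900 MHz core clock

peak_tflops = shaders * 2 * core_clock_ghz / 1000
print(f"{peak_tflops:.2f} TFLOPS")  # 2.16, matching the quoted ~2.1 TFLOPS
```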

 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Just learning
Originally posted by: error8
Just found this link: 5870, 5870X2 specs

Looks nice if it's going to be true.

1200 shaders @ 900 MHz, 256-bit, 4400 MHz GDDR5, 2.1 TFLOPS.

This sounds a lot cheaper to make than GT300. The die size will probably be the same too, which will be good for people transferring an aftermarket cooler from an HD48xx.

The only thing that looks a bit unrealistic is that I don't think there is enough memory bandwidth at 4.4 GHz for that powerful a GPU. But I might be wrong.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Creig
Originally posted by: Keysplayr
You don't have to be just a gamer anymore to consider an Nvidia card. You can be a doctor, astrophysicist, engineer, geologist, research scientist, or a dedicated protein folder (check our very own Distributed Computing forum).

From a gaming-ONLY standpoint, the GTX275 and the HD 4890 are super close. And that is primarily what this forum is concerned with. That's really great, but then there is the rest of the world. Nvidia's vision is far larger and broader than ATI's. This cannot be argued as we sit here and look at the vast capabilities of, say, the current GT200 series.

AMD came out with a very good gaming GPU at the 4xxx launch, and in doing so did many gamers a huge favor with competitive performance and pricing, creating a price war. Always good for gamers, and I commend them for that. Gaming. Everything else, though: the 4xxx series kind of falls to its knees in the GPGPU arena when compared to GT200.

It is what it is. So the folks in here can argue which has the better gaming GPU til doomsday. The 275/4890 is too close to call in most cases. That's done. Move on. You can choose not to consider the overall capabilities of the GPUs because they don't concern you. Why? Because it's not in your best interests when arguing the true worldly question: "Which is the better GPGPU overall, including gaming?"
There is a much bigger world out there than first-person shooters and MMORPGs. Nvidia saw this long ago.

I would have to say that the percentage of video cards sold to doctors, astrophysicists, engineers, geologists, research scientists and dedicated protein folders primarily for GPGPU purposes is minuscule in the extreme compared to the number sold for gaming purposes. That is the function these cards are primarily designed for and that is the function they are primarily marketed towards.

While having the ability to do GPGPU functions is great for those who want them, it's still the ability to render video games that drives the enthusiast market.

And I'd have to respond that 95% of all percentage statistics are made up on the spot. ;)
Things that concern forum members here, that I have seen talked about as well as participated in, would include:
gaming, folding, audio/video encoding, some CAD applications, Adobe's latest, and a few scant other apps. That alone covers a lot of area.

Now consider the rest of the planet. Don't you think that there have been, are, and will be scientists, geologists, doctors, etc. who might be interested in speeding up their crunching power and speed? Instead of waiting 40 minutes for a result from a CAT scan, ground sonar, fossil fuel exploration or whatever the application may be, perhaps it could mean real-time results? Don't you think that might raise some interest among them? Well, we already know that answer. Look how many of them are working with CUDA programming for their apps. You've seen many demos from various professional fields about ways CUDA enhances productivity, reduces time to completion, etc. How many years do you think have been shaved off certain cancer research projects thanks to international distributed computing using Nvidia CUDA-enabled GPUs? Let's not forget PS3s and ATI cards. They are all great.

Someone like you, Creig, has no use for productivity, so you are only concerned with gaming, as are most members here in this forum. It's understandable that the "other stuff" doesn't concern you and others here. But you really shouldn't ignore what CUDA-enabled GPUs are, and what has gone into them to make them what they are. Give credit where it is due and I'll be fine with that.

cheers.



 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Keysplayr
Someone like you, Creig, has no use for productivity, so you are only concerned with gaming, as are most members here in this forum. It's understandable that the "other stuff" doesn't concern you and others here. But you really shouldn't ignore what CUDA-enabled GPUs are, and what has gone into them to make them what they are. Give credit where it is due and I'll be fine with that.

cheers.

Personally, I agree with you that it's fantastic that GPUs are becoming more than a "for gamers only" piece of hardware. We've only begun to scratch the surface of what GPGPUs are capable of, both from ATI and Nvidia. These are interesting times for hardware enthusiasts like us.

Have a nice weekend, Keys. :beer:
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: chizow
Originally posted by: Elfear
Market share? Lowering prices to increase market share is a tried and true business practice. Probably a good idea to do it now, when their GPUs likely cost less than the competition's to produce, so they have a little wiggle room as far as price goes.
But they didn't increase market share, check the latest Peddie figures.

Never heard of Peddie, sorry. Besides, intent and outcome are two very different animals. Who's to say AMD's intent wasn't to gain market share? I have no idea if they succeeded or failed there, but it certainly looks like they had the intent.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: error8
Originally posted by: Just learning
Originally posted by: error8
Just found this link: 5870, 5870X2 specs

Looks nice if it's going to be true.

1200 shaders @ 900 MHz, 256-bit, 4400 MHz GDDR5, 2.1 TFLOPS.

This sounds a lot cheaper to make than GT300. The die size will probably be the same too, which will be good for people transferring an aftermarket cooler from an HD48xx.

The only thing that looks a bit unrealistic is that I don't think there is enough memory bandwidth at 4.4 GHz for that powerful a GPU. But I might be wrong.

Yeah, it almost seems like this chip would have only a little more bandwidth than the 4890, yet 50% more shaders (clocked 50 MHz faster) at the same time.
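That bandwidth point can be put in numbers with the standard formula (bus width in bytes × effective transfer rate). A rough Python comparison, assuming the rumored 256-bit / 4400 MT/s memory against the HD 4890's stock 256-bit / 3900 MT/s:

```python
def bandwidth_gbs(bus_bits: int, effective_mts: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_bits / 8 * effective_mts / 1000

rumored = bandwidth_gbs(256, 4400)  # rumored HD 5870 memory
hd4890 = bandwidth_gbs(256, 3900)   # HD 4890 stock memory

print(f"rumored: {rumored:.1f} GB/s")      # 140.8
print(f"HD 4890: {hd4890:.1f} GB/s")       # 124.8
print(f"gain: {rumored / hd4890 - 1:.0%}") # ~13% more bandwidth for 50% more shaders
```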

 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: dguy6789

There are far, far more reviews that favor the x800xt cards and there are also far more people on various forums who agree.

Also, if cost isn't a concern, then why not compare that 7900GX2 with X1900 crossfire?

You are completely right. There are quite a lot of nVidia fans here hiding the truth: the ill-fated 7900/7950GX2 died prematurely without driver optimizations and got trounced by the X1950XTX in many scenarios, the same fate the 9800GX2 is currently suffering. R300 simply trounced the NV30. R420 was faster than the NV40, and still is today, given NV40's pathetic SM 3.0 implementation, which didn't benefit from SM 3.0's performance improvements. The R520/R580 was simply a much more elegant design, with its ring bus and ultra-threaded architecture whose performance got a boost when SM 3.0 was used, unlike the G70 and its derivatives, which still had the NV40 disease of taking a big performance hit when SM 3.0 dynamic branching was used. nVidia did score a big win with the G80 and then the GT200, but ATi is there right now with its RV770 and RV790, which perform very close to the GTX 285 and yet are much cheaper. We all know who they are, just ignore them and keep moving. ATi's perfect balance of performance and price is what is keeping nVidia selling their $600 GTX 280/285 under $350 :laugh:

Originally posted by: Wreckage
It worked better than AVIVO (which did not exist at the time). Also to this day AVIVO is still very CPU dependent.

LOLL, a very funny post. ATi isn't the one who can't decode encrypted VC-1 entirely in the pipeline, so who's gonna have the higher CPU utilization? LOLL

Originally posted by: BenSkywalker

According to Anandtech, the 4670 is equal to the 9600GT. Is that bias or just straight stupidity in your estimation?

You are the one who's biased. You seem to forget that the HD 4670 is as fast as the HD 3870, which was as fast as the 9600GT; both the HD 4670 and HD 3870 have the same 320 stream processors, and the main difference is that the HD 4670 has better anti-aliasing performance. Try harder next time.

Originally posted by: error8
Yes, but Nvidia's features are better. :roll:

LOLL, I share your thoughts :roll: It's the best thing we can do when other people are married to brands rather than the truth, which we all know :roll:
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: error8
Yes, but Nvidia's features are better. :roll:

If anyone can argue that nV's board partners aren't better, I would be interested in hearing it.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Originally posted by: OCguy
If anyone can argue that nV's board partners aren't better, I would be interested in hearing it.

It got a lot better once XFX went over to ATI, but overall I'd say the nV partners are much better.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: evolucion8
Originally posted by: dguy6789

There are far, far more reviews that favor the x800xt cards and there are also far more people on various forums who agree.

Also, if cost isn't a concern, then why not compare that 7900GX2 with X1900 crossfire?

You are completely right. There are quite a lot of nVidia fans here hiding the truth: the ill-fated 7900/7950GX2 died prematurely without driver optimizations and got trounced by the X1950XTX in many scenarios, the same fate the 9800GX2 is currently suffering. R300 simply trounced the NV30. R420 was faster than the NV40, and still is today, given NV40's pathetic SM 3.0 implementation, which didn't benefit from SM 3.0's performance improvements. The R520/R580 was simply a much more elegant design, with its ring bus and ultra-threaded architecture whose performance got a boost when SM 3.0 was used, unlike the G70 and its derivatives, which still had the NV40 disease of taking a big performance hit when SM 3.0 dynamic branching was used. nVidia did score a big win with the G80 and then the GT200, but ATi is there right now with its RV770 and RV790, which perform very close to the GTX 285 and yet are much cheaper. We all know who they are, just ignore them and keep moving. ATi's perfect balance of performance and price is what is keeping nVidia selling their $600 GTX 280/285 under $350 :laugh:

Originally posted by: Wreckage
It worked better than AVIVO (which did not exist at the time). Also to this day AVIVO is still very CPU dependent.

LOLL, a very funny post. ATi isn't the one who can't decode encrypted VC-1 entirely in the pipeline, so who's gonna have the higher CPU utilization? LOLL

Originally posted by: BenSkywalker

According to Anandtech, the 4670 is equal to the 9600GT. Is that bias or just straight stupidity in your estimation?

You are the one who's biased. You seem to forget that the HD 4670 is as fast as the HD 3870, which was as fast as the 9600GT; both the HD 4670 and HD 3870 have the same 320 stream processors, and the main difference is that the HD 4670 has better anti-aliasing performance. Try harder next time.

Originally posted by: error8
Yes, but Nvidia's features are better. :roll:

LOLL, I share your thoughts :roll: It's the best thing we can do when other people are married to brands rather than the truth, which we all know :roll:

6800U vs. X800XT
Digital Daily
3DExtreme
Nordic Hardware
PCStats
ixbt labs
Xbit

Dude, a quick google shows review sites calling X800XT/PE and 6800U too close to recommend one or the other. Kind of similar to what we are seeing today with 4890 and GTX275. What's your problem?

X1800XT 512 vs 7800GTX 512:
Xbit
Firing Squad
Same thing here dude. Trades blows. What's the problem?

X1900XTX vs. 7900GTX:
Xbit

Same here. I'm getting bored with this.

And comments like this: "we all know who they are, just ignore them and keep moving,"
Absolutely applies to you on the ATI side. You are one of the "they" from the flip side of the coin. Even I, as a focus group member with a preference for Nvidia graphics cards, look at things the way they are and give credit where it is due. That is apparently something you cannot boast.

And anyway, why the hell is everyone going back to 2003/4 arguments? What brought this on? It's downright depressing!! There is plenty going on today to talk about.

 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Dude, a quick google shows review sites calling X800XT/PE and 6800U too close to recommend one or the other. Kind of similar to what we are seeing today with 4890 and GTX275. What's your problem?

Maybe his problem has to do with a few posters claiming that in that generation NVIDIA's cards were much better? And that the 7800GTX and the 7900GTX were much better than ATI's counterparts, when according to your links they were trading blows instead of one being clearly superior to the other?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Pantalaimon
Dude, a quick google shows review sites calling X800XT/PE and 6800U too close to recommend one or the other. Kind of similar to what we are seeing today with 4890 and GTX275. What's your problem?

Maybe his problem has to do with a few posters claiming that in that generation NVIDIA's cards were much better? And that the 7800GTX and the 7900GTX were much better than ATI's counterparts, when according to your links they were trading blows instead of one being clearly superior to the other?

Yes, and the equally f-ed up thing is that a few posters are firing back that the ATI cards were much better instead of actually doing the legwork, getting links and crushing the FUD that runs rampant in here. Instead of doing the right thing, they do the antagonistic and "easy" thing instead of finding correct data. I admit it is boring as hell to look up 5-year-old benchmarks, reviews and conclusions, but it is necessary to stop this kind of crap being posted, or at least crush this self-imposed amnesia most members involved in these types of conversations here seem to develop.

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
What would you rather play a modern game with today, an X1900XTX or a 7900GTX?

Not sure why this matters, as just about everyone here has moved well beyond that generation of cards, but since it's being discussed I'll ask: which one has aged better? So we have two cards that were close in their generation, and one card that has aged relatively well.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: SlowSpyder
What would you rather play a modern game with today, an X1900XTX or a 7900GTX?

Not sure why this matters, as just about everyone here has moved well beyond that generation of cards, but since it's being discussed I'll ask: which one has aged better? So we have two cards that were close in their generation, and one card that has aged relatively well.


Well, it matters to someone, apparently. You. Why? I have no idea. You're talking about DX9 cards that are 4, almost 5, generations old. This is 2009, folks. If you wish, find 2009 benchmarks for the X1900XTX and 7900GTX. I'm not going to bother with that hunt. Knock yourself out. ;)
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Keysplayr
Same here. I'm getting bored with this.

And comments like this: "we all know who they are, just ignore them and keep moving,"
Absolutely applies to you on the ATI side. You are one of the "they" from the flip side of the coin. Even I, as a focus group member with a preference for Nvidia graphics cards, look at things the way they are and give credit where it is due. That is apparently something you cannot boast.

And anyway, why the hell is everyone going back to 2003/4 arguments? What brought this on? It's downright depressing!! There is plenty going on today to talk about.

Cherry-picking benchmarks is an ability of yours; keep going, nVidia Focus Group member. Your credibility was never there, because there's no impartiality when you work for a company. But at least you have more objectivity than Wreckage's renegade posts. Thumbs down to both of you for derailing the thread, though.

Originally posted by: BFG10K


http://www.xbitlabs.com/articl...on-hd4650_7.html#sect0

It looks to me like ATi has the lowest CPU utilization in those tests overall, even the 4550, which is an extremely low-end part compared to the nVidia parts being tested.

That would be the X1900 era. I'm not sure how that could be any clearer.
Again, I'm not sure what you're basing your claims on. The benchmarks I linked to refuted you on several levels, so I'll link them again:

http://www.computerbase.de/art...rmancerating_qualitaet

The X1800XT crushes the 7800 GTX 256 MB (which it originally competed with) and matches the 7800 GTX 512 MB, whose availability was rare at best.

Now the previous generation:

http://www.computerbase.de/art...rmancerating_qualitaet

The X850 XT (non-PE edition, i.e. it was widely available, unlike the 7800 GTX 512 MB) is a lot faster than the 6800 Ultra; heck, even ATi's second-best card of that generation (X800XL) is a tad faster than the 6800 Ultra.

Like I said, the 2xxx/3xxx series is where ATi dropped the ball. R4xx and R5xx parts using later drivers generally outperform nVidia's competing parts of the time, in some cases significantly. Revisionist history will get you nowhere when the facts speak for themselves.

So what does that make the 7800 GTX 256 MB which it directly competed against, and crushed?

Or how about the 7800 GTX 512 MB, which nVidia was forced to release and which generally wasn't available outside of review space?

http://www.hardocp.com/article...EwLCxoZW50aHVzaWFzdA==

http://www.anandtech.com/showdoc.aspx?i=2044&p=11

And the performance/price ratio winner is: ATi

http://www.anandtech.com/guides/showdoc.aspx?i=3538

Read thoroughly and see that ATi has the best bang for the buck, was the first to implement DX9, the first to implement a unified architecture (Xbox 360), the first to implement DX10.1, and now will be the first to implement DX11. Is that so hard for you to swallow? I'm getting bored by nVidia fans who always derail the ATi threads with their own nVidia marketing campaign. Boring. :roll:
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: evolucion8
Originally posted by: Keysplayr
Same here. I'm getting bored with this.

And comments like this: "we all know who they are, just ignore them and keep moving,"
Absolutely applies to you on the ATI side. You are one of the "they" from the flip side of the coin. Even I, as a focus group member with a preference for Nvidia graphics cards, look at things the way they are and give credit where it is due. That is apparently something you cannot boast.

And anyway, why the hell is everyone going back to 2003/4 arguments? What brought this on? It's downright depressing!! There is plenty going on today to talk about.

Cherry-picking benchmarks is an ability of yours; keep going, nVidia Focus Group member. Your credibility was never there, because there's no impartiality when you work for a company.

Originally posted by: BFG10K


http://www.xbitlabs.com/articl...on-hd4650_7.html#sect0

It looks to me like ATi has the lowest CPU utilization in those tests overall, even the 4550, which is an extremely low-end part compared to the nVidia parts being tested.

That would be the X1900 era. I'm not sure how that could be any clearer.
Again, I'm not sure what you're basing your claims on. The benchmarks I linked to refuted you on several levels, so I'll link them again:

http://www.computerbase.de/art...rmancerating_qualitaet

The X1800XT crushes the 7800 GTX 256 MB (which it originally competed with) and matches the 7800 GTX 512 MB, whose availability was rare at best.

Now the previous generation:

http://www.computerbase.de/art...rmancerating_qualitaet

The X850 XT (non-PE edition, i.e. it was widely available, unlike the 7800 GTX 512 MB) is a lot faster than the 6800 Ultra; heck, even ATi's second-best card of that generation (X800XL) is a tad faster than the 6800 Ultra.

Like I said, the 2xxx/3xxx series is where ATi dropped the ball. R4xx and R5xx parts using later drivers generally outperform nVidia's competing parts of the time, in some cases significantly. Revisionist history will get you nowhere when the facts speak for themselves.

So what does that make the 7800 GTX 256 MB which it directly competed against, and crushed?

Or how about the 7800 GTX 512 MB, which nVidia was forced to release and which generally wasn't available outside of review space?

http://www.hardocp.com/article...EwLCxoZW50aHVzaWFzdA==

http://www.anandtech.com/showdoc.aspx?i=2044&p=11

And the performance/price ratio winner is:

http://www.anandtech.com/guides/showdoc.aspx?i=3538

Read thoroughly and see that ATi has the best bang for the buck, was the first to implement DX9, the first to implement a unified architecture (Xbox 360), the first to implement DX10.1, and now will be the first to implement DX11. Is that so hard for you to swallow? I'm getting bored by nVidia fans who always derail the ATi threads with their own nVidia marketing campaign. Boring. :roll:

Hey, know-nothing wise guy. The thread title???? Hello? Derailing what???? I didn't cherry-pick a damn thing, just the first Google results that came up. And I can only assume you did not do that. And if all you can do is try to use my focus group affiliation to win an argument, have at it. I'm a big boy. But you, however, are starting to resemble a broken record. My credibility is just fine because I can and do try to look at things objectively. You can't. So how is yours? And how is the high road treating you?

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Please stay on topic. A big boy won't derail a thread for pure marketing reasons like you are doing, so if you don't have anything to say related to the thread, please don't say anything at all. It's annoying.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: evolucion8
Please stay on topic. A big boy won't derail a thread for pure marketing reasons like you are doing, so if you don't have anything to say related to the thread, please don't say anything at all. It's annoying.

But you have..

Heh, now suddenly you care what's discussed, after all your postings? Why? Apparently I took the topic in a direction in which you clearly do not wish the conversation to go. Ok.
Basically, this topic started and ended with the title. Not much more to say on that. ATI will be first to market with DX11 products. Pretty much sums it all up, don't you think? And you think that is hard for me to swallow because?

 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: evolucion8
Cherry picking the benchmarks is an ability of yours, keep going nVidia Focus Member. Your credibility was never there because there's no impartiality when you work for a company. But at least you have more objectivity than Wreckage's renegade posts, but thumbs down for both of you for derailing the thread.

Well, there is NV in the title, which justifies his posts. Just remember, a pile of dog shit on the street has more value than what he has to say. I'll take Wreckage's fanboyism over a salesman any day...



There is nothing that justifies the comments you just made. This isn't the first time you've gone at it with people because of their video card brand preferences - take a few days off to cool your heels.

The rest of you, take it to PM if you want to continue bickering with each other.

AmberClad
Video Moderator