NDA for Cayman is lifted 22nd?!


Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Because of the complaints? Or because the GTX 460 generally overclocks better, and that would have been another perceived pro-NV article by AnandTech?

I'm thinking the complaints. I remember when the 4890 came out, it also had a follow-up article about its overclocking. Then after that, the 275 had an article like that, and AnandTech included the numbers from the 4890 overclocking article. That's what I was hoping to see this time.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
You are telling me that most people that post in this forum (more than 1/2) are running a 30" display with 2560x1600 resolution? Is that what you are seriously saying? There is absolutely no way that is true. Keep in mind you said "higher than 1920x1200" and not "1920x1200 and higher".

And if you intend to include 'multi' monitor setups in the expanded resolution, that would not count, because 95% of all multi-monitor setups are two independent displays, not two displays showing one single image. In other words, the game is played on one of the screens only, with the other screen used for browsing, writing, etc...

I'd be more than willing to bet the majority of people on this forum are using 1680x1050 and lower, with the next largest part of the pie going to 1920x1080-1200 and a very small part of the pie going to the 2560x1600 crowd.

A poll should confirm that.
 

tyl998

Senior member
Aug 30, 2010
236
0
0
You are telling me that most people that post in this forum (more than 1/2) are running a 30" display with 2560x1600 resolution? Is that what you are seriously saying? There is absolutely no way that is true. Keep in mind you said "higher than 1920x1200" and not "1920x1200 and higher".

And if you intend to include 'multi' monitor setups in the expanded resolution, that would not count, because 95% of all multi-monitor setups are two independent displays, not two displays showing one single image. In other words, the game is played on one of the screens only, with the other screen used for browsing, writing, etc...

I'd be more than willing to bet the majority of people on this forum are using 1680x1050 and lower, with the next largest part of the pie going to 1920x1080-1200 and a very small part of the pie going to the 2560x1600 crowd.
I have no figures or hard numbers, but that feels right. Can we have a poll to see what resolution people's monitors are at? I chose my 23-inch 1920x1080 due to price.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
The only controversy is in the minds of a few fanboys who are worried about their favourite manufacturer being seen in a negative light. It just shows they have lost the plot somewhat and now care more about some big ugly corporation than they do about discovering the best graphics card for their money.

For the rest of us it's more information - we can see how the 6800s compare with both a stock and an o/c 460. More information is good; there is no downside to it.

If there was no controversy then Anand would not have felt compelled to personally address the issue. If AT didn't have a policy that they explicitly ignored, then they would not have caught any heat over it, but it did look pretty bad that NV asked them to ignore their policy "just this once" and they did. Do I think that AT is biased? Absolutely not. However, I do think that Ryan might have been cajoled/wheedled/etc. by NV into throwing in their FTW card because of all the mitigating circumstances. It was a bad move by AT, and I expect that they won't do it again regardless of the level of crying/complaining/cajoling/etc. from AMD/Nvidia.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I'm thinking the complaints. I remember when the 4890 came out, it also had a follow-up article about its overclocking. Then after that, the 275 had an article like that, and AnandTech included the numbers from the 4890 overclocking article. That's what I was hoping to see this time.

Yes I remember those articles, and both were excellent.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
You guys don't see what is going on here? What fools. AnandTech has Nvidia cards benchmarked in ATi launch reviews, and ATi cards benchmarked in Nvidia launch reviews. Every once in a while we even see Intel graphics benchmarked against red/green. Ryan is attempting to ruin every card launch.

This is where the conspiracy rears its ugly head, though. There is one manufacturer that Ryan has never, ever benchmarked against the other three. Not once has any of its cards had their launches spoiled by comparison benchmarks against Intel/AMD/Nvidia.

Ryan Smith has been playing us for fools! He is clearly working for MATROX!!!!!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
TPU tested a load of games and found the 580 to be, on average, 25% faster across all the games they tested. So can't we just use 25% as how much faster it is?

I linked TPU's review summing all resolutions: GTX 580 = 100, HD 5870 = 77. Mathematically, this means the GTX 580 is 30% faster, or the HD 5870 is 23% slower.
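
A quick sanity check of that arithmetic, as a minimal Python sketch (the 100 and 77 index values are TPU's, from the review above; everything else is just the ratio math):

[code]
# TPU all-resolution performance index from the linked review
gtx580 = 100
hd5870 = 77

faster = (gtx580 / hd5870 - 1) * 100  # GTX 580 relative to HD 5870
slower = (1 - hd5870 / gtx580) * 100  # HD 5870 relative to GTX 580

print(f"GTX 580 is {faster:.0f}% faster")  # -> 30%
print(f"HD 5870 is {slower:.0f}% slower")  # -> 23%
[/code]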

I do not think they are biased. By the way, I think it is a bit misleading to average out all resolutions, with AA on and then off. As others noted, I would object to this because 1) the extra video memory would have given it a distinct advantage when running the insane settings (2560x1600 w/AA).
Averaging all resolutions doesn't help anyone determine anything, especially if you are trying to inform people. I don't care what the difference between A and B is at 1280x1024, 1680x1050, 1920x1200 and 2560x1600 all averaged together if I'm playing at 1920x1200.

In my #1 perspective point, you can view any resolution by itself, with and without AA. I was not trying to provide misleading results. An average is a typical amount, rate, quantity, rating, or the like that represents or approximates an arithmetic mean; that is how I arrived at the 36% "all resolutions average".

However, here is the complete breakdown (GTX580 vs. HD5870):
1680x1050 0AA = 32%
1920x1200 0AA = 30%
2560x1600 0AA = 31%
1680x1050 4AA/16AF = 40%
1680x1050 8AA/16AF = 33% (shows that AMD is more efficient at 8AA)
1920x1200 4AA/16AF = 38%
1920x1200 8AA/16AF = 31% (shows that AMD is more efficient at 8AA)
2560x1600 4AA/16AF = 53%

If you consider any of the above resolutions individually, the GTX 580 is at least 30% faster. Even if we discount 2560x1600 due to VRAM limitations, you are still looking at a 34% difference across what remains (that's why I said the GTX 580 is approximately 35% faster).
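
For anyone who wants to verify the 36% and 34% figures, here is a minimal Python sketch using only the eight per-resolution deltas listed above (nothing else is assumed):

[code]
# GTX 580 vs. HD 5870 deltas (%) from the breakdown above
deltas = {
    "1680x1050 0AA": 32,
    "1920x1200 0AA": 30,
    "2560x1600 0AA": 31,
    "1680x1050 4AA/16AF": 40,
    "1680x1050 8AA/16AF": 33,
    "1920x1200 4AA/16AF": 38,
    "1920x1200 8AA/16AF": 31,
    "2560x1600 4AA/16AF": 53,
}

overall = sum(deltas.values()) / len(deltas)
sans_2560 = [v for k, v in deltas.items() if not k.startswith("2560x1600")]

print(f"All resolutions averaged: {overall:.0f}%")                     # -> 36%
print(f"Excluding 2560x1600: {sum(sans_2560) / len(sans_2560):.0f}%")  # -> 34%
[/code]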

The initial review had the GTX 480 barely faster than the 5870 (less than 10% overall) and now it appears to be a solid 20% faster. This, when combined with the faster GTX 580, makes the 5870 trail by at least 35-40% in my opinion.
The first thing I would like to point out is that the 480's performance has seen quite a few increases since launch, thanks to driver updates. If any of the reviews used launch drivers, then I believe at this point they should be ignored. I think a good effort was made to show evidence of this 35% claim, using multiple sources.
At the end of the day, the differences will be smaller or larger depending on the games, settings and resolutions. AMD cards tend to perform well in Crysis, BF:BC2 and F1 2010, so I definitely expect the HD 6970 to win in those 3 games. If the current PowerColor rumours of the HD 6970 being 30-50% faster than the HD 6870 are true, then the HD 6970 may even come out with a 10%+ performance lead overall.
 

JoshGuru7

Golden Member
Aug 18, 2001
1,020
0
0
JoshGuru7 said:
I'd bet half or more were running a resolution higher than 1920x1200, with most of those being people using multi-monitor setups.
ArchAngel777 said:
You are telling me that most people that post in this forum (more than 1/2) are running a 30" display with 2560x1600 resolution?
I bolded the apparent source of confusion in my post.

ArchAngel777 said:
I'd be more than willing to bet the majority of people on this forum are using 1680x1050 and lower, with the next largest part of the pie going to 1920x1080-1200 and a very small part of the pie going to the 2560x1600 crowd.
This seems reasonable to me if you are talking only about primary displays, although I would probably switch 1920x? with 1680x1050. I think you are missing the point of my post, however. Is there one person on this sub-forum with a 1680x1050 display who is also going to be purchasing either a 6870 or a 480/580? A more appropriate poll to address my point would be: "What resolution do users who are looking at purchasing the highest-end video cards run at?" I agree with you that 2560x1600 is a small piece of the pie, but it is probably very heavily represented among the people who are actually buying these cards.

ArchAngel777 said:
And if you intend to include 'multi' monitor setups in the expanded resolution, that would not count, because 95% of all multi-monitor setups are two independent displays, not two displays showing one single image.
True enough for straight gaming benchmarks. However, I imagine there are plenty of people besides myself who tend to play video (Hulu, Netflix, etc.) while gaming on the first display, and there are plenty of people who game using multiple monitors. As above, these people may be the minority, but they are also more likely to be looking at a high-end video card than people who run a single display at a medium resolution.

Honestly, if I had thought this would be a bone of contention, I would have elaborated the point more carefully in my original post. It just seems obvious to me that you would put most of the weight on high-end resolutions when testing high-end video cards, on mid-range resolutions when testing mid-range video cards, and so on.
 

Triggaaar

Member
Sep 9, 2010
138
0
71
Basically, by including a clearly labeled OC'd card in the review, Anand single-handedly swayed thousands of idiots into buying nV instead of AMD.
The point is that some will look at the review and see that AMD has just brought out new cards that are no faster, and no cheaper, than an old Nvidia card. Regardless of what price range that person is buying at, it gives the impression that AMD isn't as good as Nvidia (new card no better than old). Those who frequent forums like this know a lot more about each card, but general users who stumble across a review while checking out the latest cards will make subconscious judgements about the manufacturers. That's why brands spend so much on marketing, and why Nvidia asked for their OC'd card to be reviewed. If adding the 'clearly labeled OC'd card' to the review made no difference to Nvidia's sales, why would they ask for it? They wouldn't.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Yes. Basically, by including a clearly labeled OC'd card in the review, Anand single-handedly swayed thousands of idiots into buying nV instead of AMD.

[translation]Everyone but me is a blithering moron. I believe that this specific article will be misunderstood and misinterpreted by thousands of people because they are very stupid (I get it because I am smarter than they are). They will now buy Nvidia (which I think is evil and inferior) because they misunderstood the article; had they not misunderstood it, they would have purchased AMD instead.[/translation]
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It just seems obvious to me that you would put most of the weight on high-end resolutions when testing high-end video cards, on mid-range resolutions when testing mid-range video cards, and so on.

Makes sense. Why test a $500 GTX 580 / HD 5970 on a $90 1280x1024 monitor? The majority of gamers who are buying $500 videocards are probably running at least 1920x1080. With PC gaming graphics stagnating (IMO) since Crysis (2007), even mid-range cards today such as the HD 6870 or GTX 470 are capable of smooth gameplay at 1920x1080.

Interestingly enough, Hardware Heaven actually dropped the 2560x1600 resolution when testing multiple videocards, citing the fact that HD 5870 CF / GTX 580 SLI setups are more likely to be used with multiple monitors given the power reserve. They have added some 5760x1080 results.

The point is that some will look at the review and see that AMD has just brought out new cards that are no faster, and no cheaper, than an old Nvidia card.

Ironically, that is the truth. The $190 HD 6850 is no faster than the cheaper $170 715MHz GTX 460. Similarly, the $250 HD 6870 is no faster than the cheaper $220 850MHz GTX 460. If AMD produces a 950MHz factory pre-overclocked HD 6850 tomorrow for $230, and you can readily buy it in your home market, it should be included in the review when the GTX 560 launches.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Ironically, that is the truth. The $190 HD 6850 is no faster than the cheaper $170 715MHz GTX 460. Similarly, the $250 HD 6870 is no faster than the cheaper $220 850MHz GTX 460. If AMD produces a 950MHz factory pre-overclocked HD 6850 tomorrow for $230, and you can readily buy it in your home market, it should be included in the review when the GTX 560 launches.

AMD fans will say they still won't want this, and ironically it wouldn't matter, because a 950MHz HD 6850 probably won't be as fast as a reference GTX 560. ZING!
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
You are telling me that most people that post in this forum (more than 1/2) are running a 30" display with 2560x1600 resolution? Is that what you are seriously saying? There is absolutely no way that is true. Keep in mind you said "higher than 1920x1200" and not "1920x1200 and higher".

And if you intend to include 'multi' monitor setups in the expanded resolution, that would not count, because 95% of all multi-monitor setups are two independent displays, not two displays showing one single image. In other words, the game is played on one of the screens only, with the other screen used for browsing, writing, etc...

I'd be more than willing to bet the majority of people on this forum are using 1680x1050 and lower, with the next largest part of the pie going to 1920x1080-1200 and a very small part of the pie going to the 2560x1600 crowd.

Well, the point I was making was simply that most people who purchase the cards we are actually talking about have at least 1920x1080 or better.
 
Feb 19, 2009
10,457
10
76
AMD fans will say they still won't want this, and ironically it wouldn't matter, because a 950MHz HD 6850 probably won't be as fast as a reference GTX 560. ZING!

I would welcome any similarly priced OC cards in reviews. Users can make their own minds up, just give them extra info.

What do you all think the GTX 560 is? GF104 with all cores enabled, running at ~750MHz or higher?
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
I have no figures or hard numbers, but that feels right. Can we have a poll to see what resolution people's monitors are at? I chose my 23-inch 1920x1080 due to price.

From the LCD poll it appears that nearly half of users have 1920x1080 or better.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Well, the point I was making was simply that most people who purchase the cards we are actually talking about have at least 1920x1080 or better.

Not to split hairs here, but the point you were 'trying' to make was "most people who purchase the cards we are actually talking about have at least 1920x1080 or better", and I'd agree with that statement. However, your original statement said "higher than 1920x1200", which excludes everything 1920x1200 and below.

As for the poll results, only 3 people out of 40+ on THIS forum have a monitor with 2560x1600 resolution.
 

manimal

Lifer
Mar 30, 2007
13,559
8
0
Another thread devolving into wasted time.


It's official....

May as well go elsewhere for real analysis....
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Another thread devolving into wasted time.


It's official....

May as well go elsewhere for real analysis....

There isn't much that could keep this thread on topic:

Guy #1: Yo dawgs, it's the 23rd, why isn't info being released?
Guy #2: Oh, it's been changed to Dec 13th.
/thread

Since we are such organized people on this forum, we decided to condense multiple topics together as a space-saving measure. This thread is now deciding whether Ryan's including the EVGA FTW was FTL.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
What do you all think the GTX 560 is? GF104 with all cores enabled, running at ~750MHz or higher?

Isn't it supposed to be GF114, a revised/updated GF104 core? Either way, yeah, the cores and clock speeds sound about right. I personally would like to see a more aggressive core clock. And AMD should counter with a third-tier Cayman, so competition will be really healthy! :D
 

JoshGuru7

Golden Member
Aug 18, 2001
1,020
0
0
Not to split hairs here, but the point you were 'trying' to make was "most people who purchase the cards we are actually talking about have at least 1920x1080 or better", and I'd agree with that statement. However, your original statement said "higher than 1920x1200", which excludes everything 1920x1200 and below.

As for the poll results, only 3 people out of 40+ on THIS forum have a monitor with 2560x1600 resolution.
I don't see how the poll is relevant, as it says nothing about whether the voters have one of the high-end graphics cards being discussed. You are still missing the point, which is that 2560x1600 results may not be important when reviewing a budget card, but they are extremely important when reviewing a high-end graphics card.
 
Sep 19, 2009
85
0
0
I would welcome any similarly priced OC cards in reviews. Users can make their own minds up, just give them extra info.

What do you all think the GTX 560 is? GF104 with all cores enabled, running at ~750MHz or higher?
I doubt GF114 exists. I mean, GF104 was already fixed as far as we know.
Therefore, I think the GTX 560 is a fully enabled GF104, maybe with slightly higher clocks. I really don't think they can make a lot of 750MHz parts that fit the power envelope.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I don't see how the poll is relevant, as it says nothing about whether the voters have one of the high-end graphics cards being discussed. You are still missing the point, which is that 2560x1600 results may not be important when reviewing a budget card, but they are extremely important when reviewing a high-end graphics card.

I didn't say anything about that resolution not being important. What I said is that it is not important to my setup, as I do not run that resolution.

If I do not have to drive a 2560x1600 display, why would I particularly care about those results as they apply to me? Sure, it is good information for seeing which card can handle the fill rate or whatnot, but in the end it will not affect my purchasing decision.

I don't think anyone should stop testing at that resolution; I just don't think (as stated earlier by another member) that they should average all the resolutions together for a final percentage.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
If there was no controversy then Anand would not have felt compelled to personally address the issue. If AT didn't have a policy that they explicitly ignored, then they would not have caught any heat over it, but it did look pretty bad that NV asked them to ignore their policy "just this once" and they did. Do I think that AT is biased? Absolutely not. However, I do think that Ryan might have been cajoled/wheedled/etc. by NV into throwing in their FTW card because of all the mitigating circumstances. It was a bad move by AT, and I expect that they won't do it again regardless of the level of crying/complaining/cajoling/etc. from AMD/Nvidia.

Forget *policy* for a second here - as a reader I want as much information as possible. Give me a real reason why I, the reader, shouldn't be allowed to see a review containing both a stock and an o/c 460 as well as the 6800s.

Are you saying I'm too thick to work out that it's o/c'd, and that I might get *confused* into buying an Nvidia card? Does it put too many lines in the figures and leave me unable to see the ATI cards because too many cards were reviewed? What?

All I've heard is a technicality which, using common sense, doesn't apply in this case.

As for the outrage - I think you'll find the only people who were *outraged* were a bunch of sad fanboys who need to grow up and realise AMD, Nvidia, Intel, whoever, really don't care about them beyond the money they spend. If they want to put their heart and soul into defending something, they should pick something that really matters. In the meantime I hope AnandTech ignores them and writes for the people who just want a comprehensive review of graphics cards so they can pick which one to buy next.