AMD Radeon HD 6970 already benchmarked? Enough to beat GTX480 in Tessellation?

Page 14 - AnandTech Forums

buckshot24

Diamond Member
Nov 3, 2009
9,916
85
91
You are making it a personal issue. I am just having a technical discussion about tessellation. This isn't about me, nor is it really about AMD vs nVidia (it seems that people are mostly making it console vs PC). It's just about why tessellation is (or will be) important.
I won't look bad, because I support good tessellation performance. If the Cayman delivers that, all the better.
What I am saying is that, with Cayman coming out so soon and by some accounts fixing the very issue you are arguing about, you are jumping the gun.

I do think you will look bad when/if Cayman releases with superior tessellation performance, simply because you have been so vocal about AMD's strategy, and you seem to be jumping the gun by a few weeks.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I don't think he'll look bad. He's stated several times that he doesn't care if it's made by AMD or Nvidia-- he just wants everyone to support tessellation at a useful level so developers adopt it faster. He's not being unreasonable.
 

buckshot24

Diamond Member
Nov 3, 2009
9,916
85
91
Speculation is half the fun! :D
He isn't even speculating though.


This thread isn't about Scali, but pretty much the entirety of your posts in it are about him and his opinions...

Stop the thread-derail/thread-crap/personal-attack...either post something technical and related to the thread's topic or don't post in this thread.

Moderator Idontcare
 
Last edited by a moderator:

Scali

Banned
Dec 3, 2004
2,495
1
0
Wait for it to come out, then evaluate the situation.

As I said, that's what I'm doing. I haven't made any kind of comments, let alone evaluations regarding Cayman. I'm waiting for the (official) benchmarks before I draw my conclusions.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
I do think you will look bad when/if Cayman releases with superior tessellation performance, simply because you have been so vocal about AMD's strategy, and you seem to be jumping the gun by a few weeks.

Can you be more specific? In what way have I been "vocal about AMD's strategy"? What are you referring to?
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Perhaps a brief perusal of your blog posts can shed some light on his comment....
I particularly liked the "Are AMD fans idiots?" masterpiece.


Member call-outs and personal attacks are not acceptable.

Moderator Idontcare
 
Last edited by a moderator:
Feb 19, 2009
10,457
10
76
http://forum.beyond3d.com/showpost.php?p=1488243&postcount=4276

[Image: 94189504.jpg]


Groove ought to be pleased with the 2GB GDDR5 standard. ;)

Scali ought to be pleased that geometry is only 2x the previous generation and not up to 8x like Fermi was. ;)

VLIW4 is almost for sure now, not just because of this but because of driver details.

No definite statement on tessellator though.

A bit late, but I'll chime in. :)

The info on Cayman since the August AIB briefings has always been 4D shaders: a brand new architecture focusing on CrossFire scaling and Eyefinity, with a heavily tweaked tessellator that scales very well. There was no conflicting info in the briefings to different partners like there was with Barts, so these slideshow leaks are more likely to be right.

As I've said previously, there's a reworked set of drivers optimized for the 6K series to boost DX11 performance. Initial leaks (Cat 10.10) put Cayman only ~20% faster than the GTX 480. If the GTX 580 is 20% faster than the GTX 480, then they'll need the extra performance.
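The relative-performance arithmetic in that last paragraph can be sketched out as a back-of-the-envelope check. All figures below are the rumored percentages from the post, normalized to a GTX 480 = 1.0; they are not real benchmark numbers.

```python
# Rumored figures from the post, normalized to a GTX 480 baseline of 1.0.
gtx480 = 1.0
cayman_initial = gtx480 * 1.20  # Cayman ~20% faster on early Cat 10.10 drivers
gtx580 = gtx480 * 1.20          # GTX 580 rumored ~20% ahead of the GTX 480

# If both land ~20% ahead of the GTX 480, they sit at rough parity,
# so any gain from the reworked 6K-series drivers becomes the margin.
needed_gain = gtx580 / cayman_initial
print(f"Cayman vs GTX 580 before driver gains: {needed_gain:.2f}x")  # 1.00x
```

In other words, on the rumored numbers the reworked drivers are not a bonus but a requirement for Cayman to pull ahead.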
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
How much smaller are those 4D shaders than the 4+1 ones currently used? I've heard people talk a lot about the 4D shaders, but apart from the claim that they should be smaller with almost the same performance... is it just something AMD is doing to save die space?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
How much smaller are those 4D shaders than the 4+1 ones currently used? I've heard people talk a lot about the 4D shaders, but apart from the claim that they should be smaller with almost the same performance... is it just something AMD is doing to save die space?

They are smaller and more efficient, so AMD can fit more shaders into the same amount of space. Being more efficient also means the architecture will likely be faster than Barts per mm2 - that is, if they employed the same space-saving techniques as Barts.
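The trade-off being described can be illustrated with a toy slot-utilization model. The numbers here are purely hypothetical, not AMD specs: the idea is simply that the 5th slot of a VLIW5 ("4+1") unit often sits idle when shader code doesn't contain enough independent operations per cycle to fill it, so a VLIW4 unit wastes less area on idle slots.

```python
def utilization(independent_ops, slots):
    """Fraction of VLIW slots doing useful work when the compiler can
    only schedule `independent_ops` independent operations per bundle."""
    return min(independent_ops, slots) / slots

# Hypothetical: typical shader code averages ~3.5 independent ops per bundle.
avg_ilp = 3.5
vliw5 = utilization(avg_ilp, 5)  # 4 simple slots + 1 transcendental slot
vliw4 = utilization(avg_ilp, 4)

print(f"VLIW5 utilization: {vliw5:.0%}")  # 70%
print(f"VLIW4 utilization: {vliw4:.0%}")  # 88%
```

Under this (made-up) workload, the narrower unit keeps more of its silicon busy, which is the "faster per mm2" argument in a nutshell.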
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Just out of curiosity, is this true:
AMD and Nvidia both at the same image quality settings, but the Nvidia one looks like it has less?
His coat looks different on Nvidia cards.

"AMD should learn from NV, secretly reduce the picture quality to increase speed."
from: http://translate.googleusercontent.com/translate_c?hl=da&ie=UTF-8&sl=auto&tl=en&u=http://www.chiphell.com/forum-viewthread-tid-133732-extra-page%255C%255C%253D1-page-2.html&prev=_t&rurl=translate.google.com&usg=ALkJrhj5x7nPnTWsOKy21Yw4-yAH4t5ATQ

[Image: 1004092010416b81a5ba0fb4eb.jpg]
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
So the age old adage that ATi has superior image quality is still true?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Just out of curiosity, is this true:
AMD and Nvidia both at the same image quality settings, but the Nvidia one looks like it has less?
His coat looks different on Nvidia cards.

"AMD should learn from NV, secretly reduce the picture quality to increase speed."
from: http://translate.googleusercontent.com/translate_c?hl=da&ie=UTF-8&sl=auto&tl=en&u=http://www.chiphell.com/forum-viewthread-tid-133732-extra-page%255C%255C%253D1-page-2.html&prev=_t&rurl=translate.google.com&usg=ALkJrhj5x7nPnTWsOKy21Yw4-yAH4t5ATQ

[Image: 1004092010416b81a5ba0fb4eb.jpg]

The lighting is clearly different in the AMD screenie versus the NV one... just look at how saturated the guy's face is on the right (his left) in the AMD screenshot versus the Nvidia one. The Nvidia one keeps all the facial shadowing and structure; it's one big blob of a cheekbone in the AMD screenshot.

Personally, it looks to me like someone juiced up the contrast in the AMD screenie after the fact (photoshop FTW), or did not set the same gamma for the game.

You guys might want to checkout BFG10K's article:
nVidia 400 Series Image Quality Analysis

by BFG10K

So the age old adage that ATi has superior image quality is still true?

Without a doubt, we've even got a google translated source saying as much with a screenshot to prove it! What more proof does one need?
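For what it's worth, the contrast-versus-detail question raised above is the kind of thing you can sanity-check numerically instead of eyeballing. A minimal pure-Python sketch (the pixel values are made-up luma samples; in practice you would extract them from the two screenshots with an imaging library):

```python
def luma_stats(pixels):
    """Mean and standard deviation of a flat list of 0-255 luma values."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return mean, var ** 0.5

def contrast_stretch(pixels, factor=1.3, pivot=128):
    """Simulate 'juiced up contrast': stretch values around a mid-gray pivot."""
    return [max(0, min(255, pivot + (p - pivot) * factor)) for p in pixels]

original = [90, 110, 128, 150, 170]   # hypothetical luma samples
juiced = contrast_stretch(original)

m0, s0 = luma_stats(original)
m1, s1 = luma_stats(juiced)
# A similar mean with a noticeably larger stddev is consistent with a
# post-hoc contrast/gamma tweak rather than a genuine rendering difference.
```

If instead the two screenshots differed in actual rendered detail (textures, shadows), you would expect the differences to be local rather than a uniform global stretch of the histogram.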
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
It was a user on chiphell... when they got into talking about Metro 2033, he posted those.
I assume the guy has both a 480 and a 5850. There are some in-game shots from him that show it much better; I didn't post those because they were pretty big pictures, though the size does make it easier to compare the quality.

**edit: I'll go look for them and post them:

Nvidia 480:
[Image: 235617f6bnddbboasbffdi.jpg]



amd 5850:
[Image: 235648offyfyfpnpgegofs.jpg]
 
Last edited:

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I think he was talking about texture quality, not any other settings.

The textures do look somewhat worse in Nvidia's sample, more blurry. Then again, it's hard to tell with the tiny pictures.

edit: LOL I'm loading this site at 6KB/sec and there's two 2MB pictures on it D:
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
Why do people feel the need to announce that they have put someone on ignore? it just comes off as a juvenile "getting the last word".

It obviously is; on top of that, he couldn't explain anything, hence using Scali's (false) arguments as a cover - that's even more juvenile, I think.

And Phil1977, I'm with you on this (so far). Scali has commented that we are likely to see a lot more tessellation in the future, so if that turns out to be the case, perhaps the effects of much higher tessellation levels (and the hardware to support it) will be far more obvious to the relevant gamer.

In the future - yes, obviously, thanks to DX11.
In the immediate future - no, obviously, thanks to DX11.

In short, it will shine on next-gen cards, and this is why AMD was smart not to bother with tessellation in the 58xx series: they knew they had the 69xx series in the pipeline a year later, and nobody writes a new engine in less than 2-3 years. So to counter NV's PR nonsense there's the 69xx series from now on, while real games with heavy tessellation won't arrive before late 2011-2012 - nothing en masse before next-gen CryEngine, Unreal Engine, Frostbite, X-Ray, etc. implement it (probably the NV-supported CryEngine 3 will be the first to make heavy use of it).

This entire thread-derail thread-crap campaign needs to stop.

Keep the scope of your postings in this thread on-topic or you will be infracted for thread-derail.

Moderator Idontcare
 
Last edited by a moderator:

Scali

Banned
Dec 3, 2004
2,495
1
0
It obviously is; on top of that, he couldn't explain anything, hence using Scali's (false) arguments as a cover - that's even more juvenile, I think.

I haven't yet seen anyone invalidate any of my arguments... especially not you, Mr "I keep repeating that AMD has fixed tessellation while nVidia uses shaders" no matter how often that was debunked.