More R520 "delay" rumors

nRollo

Banned
Jan 11, 2002
10,460
0
0
http://www.theinquirer.net/?article=24203

Then comes the nastiest one: performance. "Fudo" is reported to be able to push 10K in 3DMark05, but which version? I have heard it is a 24-pipe version with 32 available if need be, but in the interest of yield, and therefore money, 24 will be the number if at all possible. I would bet that 10K is for 32 pipes, which would put the 24-pipe version about on top of the 7800GTX.

Well, that would be interesting: a part everyone could buy in September that was comparable to a 7800GTX, and a "Phantom Edition" that those of us willing to part with $800 on eBay get to enjoy.

 

Beiruty

Senior member
Jun 14, 2005
282
0
0
So it's more confusion... R520 in late July at 10K with 24 pipes? Or wait a sec, the R520 is unified shader, so it will be 24+8. The next step is a 32+12 unit.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Originally posted by: Rollo
Well, that would be interesting: a part everyone could buy in September that was comparable to a 7800GTX, and a "Phantom Edition" that those of us willing to part with $800 on eBay get to enjoy.

From what I have read, the "speculative" release date for the 32-pipe R520 is September. I would guess ATI is scrambling to figure out whether a 24-pipe version is possible with the current silicon spin, which they could release sooner, but I have also read that they are spinning an improved (fixed?) 32-pipe R520.

A 24-pipe version might bring ATI stock into level flight from its downward spiral, which would be good reason to do it :)
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Beiruty
So it's more confusion... R520 in late July at 10K with 24 pipes? Or wait a sec, the R520 is unified shader, so it will be 24+8. The next step is a 32+12 unit.

R520 is not unified shader; R600 most likely will be. Microsoft probably wouldn't let a unified shader architecture out before the Xbox 360 is established.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
It is interesting that almost everyone agrees Nvidia will soon release a card that easily beats the 7800GTX. It should be a hell of a card, but it can't be helping sales of the dated 6800 line. With the 7800GTX over $800.00 Canadian, I'm firmly on hold, waiting for the vanilla versions to hit reasonable levels.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: fierydemise
Originally posted by: Beiruty
So it's more confusion... R520 in late July at 10K with 24 pipes? Or wait a sec, the R520 is unified shader, so it will be 24+8. The next step is a 32+12 unit.

R520 is not unified shader; R600 most likely will be. Microsoft probably wouldn't let a unified shader architecture out before the Xbox 360 is established.
ATI has already said that the R520 will not use a unified shader. I believe it was someone commenting on the Xbox 360's architecture who mentioned that it had been developed separately from the PC products and was very different from what was currently being done on the PC. FWIW, if the rumours are to be believed, the 'Xenos' GPU for the Xbox is a 24-pipeline device whereas the R520 is 32 pipes wide.

This is purely speculation on my part, but if they are getting poor yields at 32 pipes, they might just release a card that rivals the 7800 with 24 pipes, and a PE card with 32 pipes at an astronomical MSRP to limit demand on that card.

It should be interesting to see how ATI handles the situation regardless of which rumours are true and which aren't. nVidia pulled a fantastic coup with the release day availability of the 7800, we'll see how ATI responds.

edit: nVidia's faster card is likely to be just a clocked-up version of the 7800; you can't just paste 8 extra pipelines onto an existing core. Even if it were possible, adding 8 pipes would increase die size by roughly 30%, and I'm not sure how feasible that is on a 110nm process. It's likely that nVidia will leverage their development of the RSX on 90nm to eventually release a 32-pipe GPU on the PC, but I can't imagine that happening in the short term.
 

legcramp

Golden Member
May 31, 2005
1,671
113
116
Originally posted by: ronnn
It is interesting that almost everyone agrees Nvidia will soon release a card that easily beats the 7800GTX. It should be a hell of a card, but it can't be helping sales of the dated 6800 line. With the 7800GTX over $800.00 Canadian, I'm firmly on hold, waiting for the vanilla versions to hit reasonable levels.

Well, a vanilla version would be comparable to a 6800GT+ card... so I guess you can enjoy that NOW :)
 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
Not sure what to make of the article. I, for one, like the INQ; they deliver often enough that I can usually believe what they say.

It could be worse: if more games like BF2 are released during the delay, ATI is going to get slammed. Right now there aren't enough games pushing the limit, so ATI has a little breathing room. This is also the slow time of year.

They still need to liquidate the 111-day inventory.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Prediction

ATI releases a 24-pixel-pipe card. It is on par with the 7800GTX, but the GTX has been out for 3-4 months and nobody is terribly impressed. Nvidia may or may not release a 32-pixel-pipe card to counter; they might just call it a victory due to the head start they got and work on their next release for next summer.

Unified shader cards won't show up until closer to Longhorn and DirectX Next, the iteration of DX that will expose unified shaders.

 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: RIFLEMAN007
Originally posted by: ronnn
It is interesting that almost everyone agrees Nvidia will soon release a card that easily beats the 7800GTX. It should be a hell of a card, but it can't be helping sales of the dated 6800 line. With the 7800GTX over $800.00 Canadian, I'm firmly on hold, waiting for the vanilla versions to hit reasonable levels.

Well, a vanilla version would be comparable to a 6800GT+ card... so I guess you can enjoy that NOW :)

Why would I want old, loud tech with reduced IQ? Nvidia has come a long way with this release and really upped the ante. I would not be surprised to see the GTX take over the spot of the 6800GT when the Ultra is released.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ronnn
Originally posted by: RIFLEMAN007
Originally posted by: ronnn
It is interesting that almost everyone agrees Nvidia will soon release a card that easily beats the 7800GTX. It should be a hell of a card, but it can't be helping sales of the dated 6800 line. With the 7800GTX over $800.00 Canadian, I'm firmly on hold, waiting for the vanilla versions to hit reasonable levels.

Well, a vanilla version would be comparable to a 6800GT+ card... so I guess you can enjoy that NOW :)

Why would I want old, loud tech with reduced IQ? Nvidia has come a long way with this release and really upped the ante. I would not be surprised to see the GTX take over the spot of the 6800GT when the Ultra is released.


Link to 6800GTs being "loud"? They're also a single-slot design.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
ATI is going to be in trouble if the R520 isn't 32 pipes... otherwise, everyone will just get a 7800GTX because it's actually available.
Personally, I think ATI has already lost this generation of cards, but maybe they have something up their sleeve. Who knows.

I can't see a 24-pipeline card getting 10K in 3DMark05 UNLESS ATI did something to the card to make it do really well on 3DMark05... which could happen, just so they could impress people. You never know. I'm just guessing it's the 32-pipe card they're talking about.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: hans030390
I can't see a 24-pipeline card getting 10K in 3DMark05 UNLESS ATI did something to the card to make it do really well on 3DMark05... which could happen, just so they could impress people. You never know.

Actually, artificially boosting 3DMark scores is Nvidia's specialty, not ATI's.

3DMark03 Driver chart with/without anti-Nvidia cheat patch applied

Dark gray bars show performance of Nvidia card using their Detonator FX 44.03 and 43.51 WHQL drivers.
Light gray bars show actual performance of Nvidia card after anti-cheat patch is applied.

Audit Report: Alleged NVIDIA Driver Cheating on 3DMark03
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Creig
Originally posted by: hans030390
I can't see a 24-pipeline card getting 10K in 3DMark05 UNLESS ATI did something to the card to make it do really well on 3DMark05... which could happen, just so they could impress people. You never know.

Actually, artificially boosting 3DMark scores is Nvidia's specialty, not ATI's.

3DMark03 Driver chart with/without anti-Nvidia cheat patch applied

Dark gray bars show performance of Nvidia card using their Detonator FX 44.03 and 43.51 WHQL drivers.
Light gray bars show actual performance of Nvidia card after anti-cheat patch is applied.

Audit Report: Alleged NVIDIA Driver Cheating on 3DMark03

Meh. ATI cheated too; I notice you failed to mention it?

http://forums.anandtech.com/messageview...atid=31&threadid=1625020&enterthread=y

Earlier this year both NVIDIA and ATI Technologies were caught cheating in 3DMark game benchmarks, and even though the latter removed its cheats...

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: Rollo
Meh. ATI cheated too; I notice you failed to mention it?

http://forums.anandtech.com/messageview...atid=31&threadid=1625020&enterthread=y

Earlier this year both NVIDIA and ATI Technologies were caught cheating in 3DMark game benchmarks, and even though the latter removed its cheats...


And since when have YOU ever posted anything bad about Nvidia? Hmmmmm? Well then, since you're suddenly so interested in giving equal air time to both companies, let's compare the two:


What Is the Performance Difference Due to These Cheats?

A test system with GeForceFX 5900 Ultra and the 44.03 drivers gets 5806 3DMarks with 3DMark03 build 320. The new build 330 of 3DMark03, in which the 44.03 drivers cannot identify 3DMark03 or the tests in that build, gets 4679 3DMarks, a 24.1% drop. Our investigations reveal that some drivers from ATI also produce a slightly lower total score on this new build of 3DMark03. The drop in performance on the same test system with a Radeon 9800 Pro using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to an 8.2% difference in the game test 4 result, which means that the test was also detected and somehow altered by the ATI drivers. We are currently investigating this further.
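[Editorial aside, not part of the audit quote: the audit's "24.1% drop" is measured against the patched build-330 score rather than the original, as this quick sanity check of the quoted numbers shows.]

```python
# Scores quoted in Futuremark's audit for a GeForceFX 5900 Ultra with 44.03 drivers.
build_320 = 5806  # 3DMark03 build 320 (driver app-detection still works)
build_330 = 4679  # 3DMark03 build 330 (app-detection defeated)

# The audit's "24.1% drop" uses the lower, patched score as the baseline.
drop_pct = (build_320 - build_330) / build_330 * 100
print(round(drop_pct, 1))  # 24.1
```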


ATI's 1.9% performance gain came from optimization of the two DirectX 9.0 shaders (water and sky) in Game Test 4.


Whereas Nvidia's 24.1% increase came from:


1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the workload. However, if the loading screen is rendered in a different manner, the driver seems to fail to detect 3DMark03, and performs the back buffer clear command as instructed.

2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case the driver uses instructions contained in the driver to determine when to obey the back buffer clear command and when not to. If the back buffer would not be cleared at all in game test 2, the stars in the view of outer space in some cameras would appear smeared as have been reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so that the back buffer is cleared only when the default benchmark cameras show outer space. In free camera mode one can keep the camera outside the spaceship through the entire test, and see how the sky smearing is turned on and off.

3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds two static clipping planes to reduce the workload. The clipping planes are placed so that the sky is cut out just beyond what is visible in the default camera angles. Again, using the free camera one can look at the sky to see it abruptly cut off. Screenshot of this view was also reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51 drivers as far as we know.

4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this detection to artificially achieve a large performance boost - more than doubling the early frame rate on some systems. In our inspection we noticed a difference in the rendering when compared either to the DirectX reference rasterizer or to those of other hardware. It appears the water shader is being totally discarded and replaced with an alternative more efficient shader implemented in the drivers themselves. The drivers produce a similar looking rendering, but not an identical one.

5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the shader is being totally discarded and replaced with an alternative more efficient shader in a similar fashion to the water pixel shader above. The rendering looks similar, but it is not identical.

6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection proved to reduce the frame rate with these drivers, but we have not yet determined the cause.

7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection drops the scores with these drivers. This cheat causes the back buffer clearing to be disregarded; we are not yet aware of any other cheats.

8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the driver. When we prevented this detection, the performance dropped by more than a factor of two in the 2.0 pixel shader test.



Did ATI "cheat" on 3DMark2003?

Version 3.30 of 3DMark03 not only defeated the DetonatorFX driver's app detection mechanisms, but it also managed to uncover an ATI optimization that landed the company in a bit of hot water.

ATI's optimization involves using a more optimized shader program in Game Test 4 for the water shader program, but according to ATI, this water shader renders "the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's, and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture... However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST."

http://www.extremetech.com/article2/0,1558,1105259,00.asp

Did ATI "cheat" on 3DMark2003?

The line between a bona fide optimization and a cheat is sometimes not well-defined, but in our estimation, ATI's optimization is suspect, and at the end of the day, it bought them all of a 9% increase in performance, or a barely mention-worthy 1.7fps.

This small gain leads us to question why ATI would implement this optimization in the first place, since the risk of being perceived as cheating would far outweigh a less than 2fps performance increase. However, ATI has been forthcoming in its public statements about the nature of this optimization, and has stated that because of the potential perception of being considered a cheat (and probably because it didn't buy them much performance anyway), that this optimization will be pulled from the next Catalyst driver release.

Meanwhile, nVidia's "optimizations" are far more aggressive, and we believe they go beyond what can be considered reasonable driver optimization, and into the realm of cheating. nVidia has thus far been unable, or perhaps unwilling, to address these issues, since it apparently still refuses to rejoin FutureMark's beta program to gain access to the developer version of 3DMark03, and the company maintains that what ExtremeTech uncovered are driver bugs, and nothing more.



I'd say that's a bit more than "Meh".
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: Rollo

Link to 6800GTs being "loud"? They're also a single-slot design.
A quick search turned up these: link
link

I'm sure if I researched it further, I could find other people complaining about the noise. Certainly, too much noise is a subjective matter, so spare me the posts about how you don't notice GPU fan noise over your other 15 fans. :beer:
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Originally posted by: Creig
I'd say that's a bit more than "Meh".
Neat! So, since their cheating was less successful, it's not as bad that they did it.

They're both cheaters, and ATi are also incompetent cheaters. Yes, a place in video heaven is definitely reserved for those so pure.

Originally posted by: ronnn
Originally posted by: Rollo
Link to 6800GTs being "loud"? They're also a single-slot design.
A quick search turned up these: link
link

I'm sure if I researched it further, I could find other people complaining about the noise. Certainly, too much noise is a subjective matter, so spare me the posts about how you don't notice GPU fan noise over your other 15 fans. :beer:
The first link was written off as a bad fan on that particular card; no one else with the same brand and model had any issues.

The second link is more generic, and the worst comment I could find about the 6800GT was that "you have to use high fan settings with the Zalman and a 6800GT, but it is still completely silent at that setting." Hardly a condemnation.

Doing my own 30 seconds of research, I found comments that the X800 Pro gets very noisy when the fan goes into high mode after 30 minutes, unless it's "in a regular tower case with 10 other fans" to drown it out.

Gee, what do you know: you're right about it being subjective, and nVidia isn't the only loud one, depending on your own personal ears. The "nVidia cards are noisy" mentality is a holdover from the FX days, IMHO. The 6800 and X800 cards I've heard are pretty comparable.

(Cue comments that I'm an nVidia fanboy because god forbid BOTH companies be guilty of something.)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I wasn't commenting on the X800 Pro. :roll: I was commenting on new-gen Nvidia versus old gen. As I said, silence is a subjective matter, and more important to some.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Is the 7800GTX really any quieter than the 6800GT? While the 7800GTX has lower power consumption than the 6800U, I don't think it's lower than the 6800GT's. (The GT runs at a lower voltage than the Ultra.)

I've only seen graphs comparing the 6800U and 7800GTX, though; I never noticed the GT on the same graph, so I could be wrong.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: Xentropy
Originally posted by: Creig
I'd say that's a bit more than "Meh".
Neat! So, since their cheating was less successful, it's not as bad that they did it.

They're both cheaters, and ATi are also incompetent cheaters. Yes, a place in video heaven is definitely reserved for those so pure.

Wow, just wow.

If you could shave 25 seconds off your lap time on a particular track by taking a shortcut when no one is looking, and it usually takes you 60 seconds to run, then you've run it in 35 seconds and everyone believes you. On the other hand, if someone else can run it in 38 seconds without the shortcut but takes some 'roids to shave off 3 seconds, they've also been clocked at 35 seconds.

However, on all the other tracks, where there's no guarantee they can cheat, my money is on the guy who can run it in 38 seconds before cheating over the one who does it in 60. There isn't always going to be a shortcut or 'roids to take.

Granted, the analogy isn't exact, but there is a big difference: yes, they both cheated, but if one does it to the extent that they look far better for it, then it is actually a worse "crime".
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Hmm. We'll just have to agree to disagree that failing to cheat well ranks better on the morality scale than succeeding.

If someone attempted to rob a bank and only got away with twenty bucks, they're still a threat to society, no less than if they had run off with millions. Whether the guy who got $20 out of it already makes six figures at his normal job and the other guy is unemployed is beside the point.