
R680 To Score 20K In 3DMark06

JPB

Diamond Member
Fudzilla reported that AMD plans to introduce a dual-chip card called the Radeon HD 2950X2X, with two RV670 chips on a single PCB. They called this dual-GPU solution R670. On the other hand, we are still hearing R680 from our sources, but both of us could be referring to the same thing. We have even heard, faintly, that R680 could be AMD's ambitious plan to integrate two RV670s into a single die, if not onto the same package.

VR-Zone recently got hold of a presentation slide that gave some clues that R680 is a dual-GPU part (the RV670 slide shows only one GPU). Our good buddy CJ told us R680 is indeed a dual-RV670 solution and is 1.5x faster than a Radeon HD 2900 XT CrossFire setup. Another source told us that R680 could reach 20K in 3DMark06, or rather that this is the target AMD is trying to achieve.
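For what it's worth, the two rumored numbers can be cross-checked against each other. A minimal sketch in Python, using nothing beyond the figures quoted in the rumor itself:

```python
# Cross-checking the rumor: if R680 targets ~20K in 3DMark06 and is
# claimed to be 1.5x as fast as a 2900 XT CrossFire setup, what
# CrossFire score does that imply? (Pure arithmetic on rumored figures;
# neither number is a measured result.)

target_r680 = 20_000     # rumored 3DMark06 target for R680
speedup_vs_cf = 1.5      # rumored advantage over 2900 XT CrossFire

implied_cf_score = target_r680 / speedup_vs_cf
print(round(implied_cf_score))  # 13333
```

If real 2900 XT CrossFire setups score well above that at default settings, the two rumored figures are in tension; if they land around it, the rumor is at least internally consistent.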

 
Wow, I sure hope AMD isn't so stupid that they're engineering their card around a 3DMark benchmark! Try actually achieving performance in games for a change! R600 wins no meaningful benchmarks against any competition (i.e. at settings that actually matter, with AA in a game where you would use it, etc.).
 
It kinda seems like they are.

Look at the 2900s.

They all do ridiculously well in 3DMock, yet comparatively terribly in games 🙁

Dual GPU scares me... if it requires CF to be enabled, it'd better be dirt cheap, since we all know how well CF works... not.
 
I could actually see this type of thing happening and being of great benefit. AMD should be able to easily 'attach' multiple ATI GPUs together, just like they would multiple CPU cores. It would probably be faster than CrossFire, and much, much cheaper: one PCB instead of two, and you wouldn't waste the memory of one of the cards.
 
What's happening to ATI, FFS? They were doing brilliantly from the 9700 series on; every card since then has been spot on until the 2000 series. I think that 512-bit crap makes no difference over 256-bit cards, which is why both Nvidia and ATI made their mid-range cards 128-bit and, ewwwww, 64-bit. If the 2600 XT and the 8600 GT were 256-bit, they would have been just as fast as their big brothers.

Here's a tip for you, ATI: test your cards with various games, new and old, at different resolutions and settings, and try to make them as fast as you can with all the bells and whistles turned on. I think it might help.

Another way ATI could make a statement is to fuck DX10 right off, cos it's pants and a big waste of bandwidth and card speed. If they were to bring out a DX9 card NOW that was faster than the 8800s and 2900s at DX9 games (every game out there), then I'd buy it hands down. DX10 is a really poor, buggy gimmick that hasn't paid off for MS.

Rant over......lol
 
Originally posted by: SniperDaws
What's happening to ATI, FFS? They were doing brilliantly from the 9700 series on; every card since then has been spot on until the 2000 series. I think that 512-bit crap makes no difference over 256-bit cards, which is why both Nvidia and ATI made their mid-range cards 128-bit and, ewwwww, 64-bit. If the 2600 XT and the 8600 GT were 256-bit, they would have been just as fast as their big brothers.

This is what the 8800 GT is going to be: 8600 GTS power with a 256-bit memory bus and 32 more shader units.

The 2600 XT just doesn't have the raw horsepower to pull it off.

The 512-bit memory bus on the 2900 XT was a waste, IMHO. Now, if it were the 8800 GTX, I think it could have used some more bandwidth.
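To put rough numbers on the bus-width argument: peak memory bandwidth is just the bus width (in bytes) times the effective memory clock. A quick sketch; the clock figures are approximate launch specs and should be treated as assumptions, since partner boards varied:

```python
# Peak theoretical memory bandwidth:
#   GB/s = (bus_width_bits / 8) * effective_clock_MHz / 1000

def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s from bus width and effective memory clock."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

cards = {
    "HD 2900 XT (512-bit)": (512, 1650),  # ~825 MHz GDDR3, double-pumped
    "8800 GTX (384-bit)":   (384, 1800),  # ~900 MHz GDDR3
    "8600 GTS (128-bit)":   (128, 2000),  # ~1000 MHz GDDR3
}
for name, (bits, clock) in cards.items():
    print(f"{name}: {bandwidth_gbps(bits, clock):.1f} GB/s")
```

On these assumed clocks the 2900 XT gets roughly 105 GB/s against the 8800 GTX's ~86 GB/s, which is exactly the paper advantage posters here argue R600 never translated into game performance.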
 
The funny thing is that if R680 really were RV670 x2, it'd consume no more power than a single 2900 XT. For that reason alone, I believe this may be a possibility.
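Rough arithmetic makes the point, if you assume RV670's 55 nm shrink lands somewhere near half the board power of the 80 nm R600. Both wattages below are illustrative assumptions, not published specs:

```python
# Hypothetical power comparison: two RV670 chips vs. one HD 2900 XT.
# Both figures are assumed for illustration only.

rv670_board_watts = 105   # assumed per-GPU board power for a 55 nm RV670
r600_board_watts = 215    # assumed board power for an 80 nm HD 2900 XT

dual_rv670_watts = 2 * rv670_board_watts
print(dual_rv670_watts, "W vs", r600_board_watts, "W")  # 210 W vs 215 W
```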
 
Oh god, here we go with the "WTF, 3dmark is not a game!"

Of course it's not, but does anyone have a better indicator of performance at this point in time?
 
Originally posted by: Corporate Thug
Oh god, here we go with the "WTF, 3dmark is not a game!"

Of course it's not, but does anyone have a better indicator of performance at this point in time?

many

as a single benchmark it is nearly worthless ... taken together with many others it begins to provide a picture of a particular GPU's relative performance

3DMarkXX's particular strength lies in tracking changes within a single rig.
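A common way to "take many benchmarks together" is the geometric mean of per-benchmark performance ratios, which keeps one lopsided result (a 3DMark outlier, say) from dominating the summary. A minimal sketch with invented ratios:

```python
import math

# Summarize card A vs. card B across several benchmarks with a
# geometric mean of per-benchmark ratios (A's score / B's score).
# The ratios below are invented purely for illustration.

def geomean(ratios):
    """Geometric mean of a list of positive ratios."""
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

ratios = [1.10, 0.85, 0.92, 1.30, 0.78]  # hypothetical A/B ratios
print(f"overall: {geomean(ratios):.2f}x")
```

With a geometric mean, one inflated 3DMark-style ratio shifts the overall figure far less than it would in a simple average.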
 
Originally posted by: apoppin
Originally posted by: Corporate Thug
Oh god, here we go with the "WTF, 3dmark is not a game!"

Of course it's not, but does anyone have a better indicator of performance at this point in time?

many

as a single benchmark it is nearly worthless ... taken together with many others it begins to provide a picture of a particular GPU's relative performance

3DMarkXX's particular strength lies in tracking changes within a single rig.

Actually, if you are comparing G80 to R600, 3DMark06 appears more as an anomaly in a group of video benchmarks, since the R600's 3DMark06 score is much higher than its performance in game benchmarks relative to G80 would suggest. I think that is the real issue anyone could have with judging anything R6xx-related based on 3DMark06 scores.
 
Originally posted by: nitromullet
Originally posted by: apoppin
Originally posted by: Corporate Thug
Oh god, here we go with the "WTF, 3dmark is not a game!"

Of course it's not, but does anyone have a better indicator of performance at this point in time?

many

as a single benchmark it is nearly worthless ... taken together with many others it begins to provide a picture of a particular GPU's relative performance

3DMarkXX's particular strength lies in tracking changes within a single rig.

Actually, if you are comparing G80 to R600, 3DMark06 appears more as an anomaly in a group of video benchmarks, since the R600's 3DMark06 score is much higher than its performance in game benchmarks relative to G80 would suggest. I think that is the real issue anyone could have with judging anything R6xx-related based on 3DMark06 scores.

so are some DX10 benchmarks if we use your reasoning ... it just appears that either 1) the r600's architecture is simply especially well suited to 3DMarkXX; or 2) AMD optimized for 3DMarkXX

i am inclined to pick Number1 ... and "we'll see" how well G80 and r600 run 3DMark08 .. i bet the results will be very different and perhaps closer to more 'real-world' gaming benches.
 
Originally posted by: apoppin
... it just appears that either 1) the r600's architecture is simply especially well suited to 3DMarkXX; or 2) AMD optimized for 3DMarkXX

i am inclined to pick Number1 ... and "we'll see" how well G80 and r600 run 3DMark08 .. i bet the results will be very different and perhaps closer to more 'real-world' gaming benches.

I'm inclined to pick #2.

Why?

At 3DMark06 defaults, my HD 2900 XT 1 GB beat my 8800 GTX.

At every other resolution/setting I tried, it lost.
I don't believe in that kinda coincidence.
 
All the bickering you guys are doing, and meanwhile I'm astounded to see ATI take the huge leap towards dual-core GPUs. That's where the focus should be. 2 GPUs, 1 die = fuck Nvidia once again 🙂

Imagine the damage you could do to folding with a quad-core CPU and 2 dual-core high-end video cards 🙂 That's 8 cores folding away 😵.

Also, that's either 2 cores for rendering, or 1 core for rendering and one for physics 😵

OMFG, the possibilities are fn endless 😀
 
Originally posted by: Dazed and Confused
All the bickering you guys are doing, and meanwhile I'm astounded to see ATI take the huge leap towards dual-core GPUs. That's where the focus should be. 2 GPUs, 1 die = fuck Nvidia once again 🙂

Imagine the damage you could do to folding with a quad-core CPU and 2 dual-core high-end video cards 🙂 That's 8 cores folding away 😵.

Also, that's either 2 cores for rendering, or 1 core for rendering and one for physics 😵

OMFG, the possibilities are fn endless 😀

I'll get excited when I see this on shelves for a decent price, actually doing well in games, not just 3DMock.

I really thought the HD 2900 XTs had awesome potential when they were released, but thus far, they've proven to be very disappointing in general, especially in DX10 games.

I hate to bash AMD, but seriously, for the last year now, if not longer, AMD has been all talk, no action.

Hence my lack of interest in believing rumors or marketing BS from AMD...
 
Originally posted by: n7
Originally posted by: apoppin
... it just appears that either 1) the r600's architecture is simply especially well suited to 3DMarkXX; or 2) AMD optimized for 3DMarkXX

i am inclined to pick Number1 ... and "we'll see" how well G80 and r600 run 3DMark08 .. i bet the results will be very different and perhaps closer to more 'real-world' gaming benches.

I'm inclined to pick #2.

Why?

At 3DMark06 defaults, my HD 2900 XT 1 GB beat my 8800 GTX.

At every other resolution/setting I tried, it lost.
I don't believe in that kinda coincidence.

Let's not forget that ATI has been caught in the past releasing drivers specifically tuned to get higher scores in 3DMark. Nvidia has done it too; they've both been guilty of it. But let's not forget that.
 
Originally posted by: bfdd
Originally posted by: n7
Originally posted by: apoppin
... it just appears that either 1) the r600's architecture is simply especially well suited to 3DMarkXX; or 2) AMD optimized for 3DMarkXX

i am inclined to pick Number1 ... and "we'll see" how well G80 and r600 run 3DMark08 .. i bet the results will be very different and perhaps closer to more 'real-world' gaming benches.

I'm inclined to pick #2.

Why?

At 3DMark06 defaults, my HD 2900 XT 1 GB beat my 8800 GTX.

At every other resolution/setting I tried, it lost.
I don't believe in that kinda coincidence.

Let's not forget that ATI has been caught in the past releasing drivers specifically tuned to get higher scores in 3DMark. Nvidia has done it too; they've both been guilty of it. But let's not forget that.

so has nvidia ... so what ... they are allowed to "optimize" - but not at the expense of IQ ... or other forms of cheating.

the *reason* i am inclined to believe AMD's r600 architecture "favors" 3DMark is because nvidia would love to have the highest 3DMark score and no doubt the G80 is already also optimized for it ... you don't think nvidia is ignoring it, do you? ... they are downplaying it [rightly, imo]

let's also not forget that nvidia does better in some games and AMD does better in others ... that is related to architecture imo

I really thought the HD 2900 XTs had awesome potential when they were released, but thus far, they've proven to be very disappointing in general, especially in DX10 games.
i am not the slightest bit disappointed as i was never expecting a GTX-killer ... after the first delay ... and the flood of AMD FUD ... i was expecting worse.

As it is, 2900xt turned out perfectly for me ... i got a cheap [$320] card in June that *matches* my [pre-planned] 16x10 LCD resolution exactly and gives me excellent FPS in all my maxed-out plus 4xAA/16xAF DX9c games.
--as to DX10 ... an Ultra wouldn't satisfy me - even at 16x10 😛
-i'd still be looking to SLI for huge bucks.

I hate to bash AMD, but seriously, for the last year now, if not longer, AMD has been all talk, no action.
Pray tell ... what has nvidia been doing since last December ... can you point to anything - besides "talk"?
 
Anyway, like I was saying, who cares about default 3DMark? Let the new default be 1920x1200 with 4xAA and let's see who the 3DMark champion is. 2900 XTs can't win anything with AA enabled. MEANINGFUL BENCHMARKS ARE WHAT MATTER
 
Originally posted by: votelibertarian35

Anyway, like I was saying, who cares about default 3DMark? Let the new default be 1920x1200 with 4xAA and let's see who the 3DMark champion is. 2900 XTs can't win anything with AA enabled. MEANINGFUL BENCHMARKS ARE WHAT MATTER

well, in that case, with your new 'default' of 1920x1200 with 4xAA in the DX10 pathway, even the Ultra sucks in real games

"Games" are what matter ... 3DMarkXX is not a "meaningful benchmark" in comparing nvidia to AMD GPUs - only for tracking changes in a single system ... or to get a 'ballpark' compared to similar systems.
 
Originally posted by: bfdd
Originally posted by: n7
Originally posted by: apoppin
... it just appears that either 1) the r600's architecture is simply especially well suited to 3DMarkXX; or 2) AMD optimized for 3DMarkXX

i am inclined to pick Number1 ... and "we'll see" how well G80 and r600 run 3DMark08 .. i bet the results will be very different and perhaps closer to more 'real-world' gaming benches.

I'm inclined to pick #2.

Why?

At 3DMark06 defaults, my HD 2900 XT 1 GB beat my 8800 GTX.

At every other resolution/setting I tried, it lost.
I don't believe in that kinda coincidence.

Let's not forget that ATI has been caught in the past releasing drivers specifically tuned to get higher scores in 3DMark. Nvidia has done it too; they've both been guilty of it. But let's not forget that.
maybe they should have spent a little bit more energy optimizing for actual games...

 
Originally posted by: apoppin
Originally posted by: Corporate Thug
Oh god, here we go with the "WTF, 3dmark is not a game!"

Of course it's not, but does anyone have a better indicator of performance at this point in time?

many

as a single benchmark it is nearly worthless ... taken together with many others it begins to provide a picture of a particular GPU's relative performance

3DMarkXX's particular strength lies in tracking changes within a single rig.

I meant a better indicator of R680's performance, not general video performance...

 
All this bitching about 3DMock... I know it's meaningless, so why doesn't everyone use some standard benchmark that we all agree on? Oh, that's right, because there isn't one!
 