How far back in time for HD4600 to be top dog?


el etro

Golden Member
Jul 21, 2013
1,584
14
81
Again, on what? Real-world performance is not the same as purely theoretical numbers. Theoretical FLOPS would put the HD 4600 on par with an early 2006 high-end card, but some other numbers are much lower... and the HD 4600 can do a lot more, and it's probably much faster at running something like Crysis 2 (a DX9c game)

Early 2006?? What card?
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Again, on what? Real-world performance is not the same as purely theoretical numbers. Theoretical FLOPS would put the HD 4600 on par with an early 2006 high-end card, but some other numbers are much lower... and the HD 4600 can do a lot more, and it's probably much faster at running something like Crysis 2 (a DX9c game)

Crysis 2 didn't exist at a time when the highest-end card had the performance of the HD 4600.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
We're talking about full theoretical performance, independent of the graphics technologies added over time. The OP's purpose is to "make the HD 4600 the GPU king", no?

I get where you're going with that, but what we're finding is that unfortunately we can't directly use those comparisons. Old games like Quake 3 basically stress the pixel and texture performance of a pipeline and nothing more, while most of the advancements and transistor budgets in GPU technology over the last five years have gone into shading performance and programmability. That's why Balla's overclocked HD 4600 is almost as fast as my GeForce GTX 285 in Quake 3: pixel/texture performance hasn't changed a whole lot outside of improved clock speeds, but real-world performance is nowhere near that (see the rough numbers at the end of this post).

Case in point: I played Quake 3 for half an hour last night just for old time's sake, and my GPU never went above 50C. If I play Planetside 2 for 30 minutes, it'll hit 90C in no time. That's why I proposed HL2 as a more accurate comparison, because at least that engine makes hardware calls that will utilize 100% of a GPU from either era.

I think if we actually started benchmarking Q3 with newer cards, what we'd see is a relative plateauing of framerates that isn't consistent with real-world performance. It's only now that 4K is being pushed that we'll start to see actual improvements on the pixel performance side of things.
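For a rough sense of that shift, here's some back-of-envelope spec-sheet math. The ROP counts, clocks, and GFLOPS figures below are ballpark public specs, not measurements:

```python
# Back-of-envelope look at why Q3 plateaus: pixel fillrate has grown far
# more slowly than shader throughput. All figures are rough public specs.

def pixel_fillrate_gpix(rops: int, core_mhz: int) -> float:
    """Peak pixel fillrate in Gpixels/s (ROPs * core clock)."""
    return rops * core_mhz / 1000.0

cards = {
    # name: (ROPs, core clock MHz, approx. shader GFLOPS)
    "GeForce 7900 GTX (2006)": (16, 650, 250),
    "GeForce GTX 285 (2009)":  (32, 648, 1060),
}

for name, (rops, mhz, gflops) in cards.items():
    fill = pixel_fillrate_gpix(rops, mhz)
    print(f"{name}: {fill:.1f} Gpix/s fill, ~{gflops} GFLOPS, "
          f"~{gflops / fill:.0f} FLOPS per pixel")
```

Peak fillrate roughly doubled between those two cards while shader throughput roughly quadrupled, which is why a fillrate-bound game like Q3 barely notices the newer hardware.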
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Crysis 2 didn't exist at a time when the highest-end card had the performance of the HD 4600.

Well, a lot of the HD 4600's design is for things that didn't exist back in the day, so it's hardly worth comparing. What was the most demanding game in early 2006? I can't remember; Oblivion? COD2?

It's hard to find any data comparing old cards and the HD 4600 with the same software, but using 3DMark06 it looks like the HD 4600 is significantly faster in the SM3.0 tests and a little slower in the SM2.0 tests (not a surprise). So there's no easy comparison here: newer games favor the HD 4600, older games favor the 7900 GTX (and other high-end cards from 2005/2006).

Early 2006?? What card?


Pre-G80 stuff; I was mainly thinking of the 7900 GTX and X1900 XT...
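For what it's worth, the theoretical-FLOPS math looks roughly like this. The per-clock throughputs and clocks are my assumptions from public spec sheets, and as said above, peak FLOPS is a poor proxy for real-world performance:

```python
# Spec-sheet FLOPS math; all numbers are rough and assume stock/boost clocks.

# HD 4600 (Haswell GT2): 20 EUs, each with 2 SIMD-4 FPUs; counting an FMA
# as 2 ops gives 2 * 4 * 2 = 16 FLOPS per EU per clock. Boost clock varies
# by SKU; 1.2 GHz assumed here.
hd4600_flops_per_clock = 20 * 16              # 320 FLOPS/clock
hd4600_gflops = hd4600_flops_per_clock * 1.2  # ~384 GFLOPS at 1.2 GHz

# 7900 GTX (G71): 24 pixel pipes, each with 2 vec4 MADD ALUs, so
# 24 * 2 * 4 * 2 = 384 FLOPS/clock at 650 MHz (pixel shaders only; the
# 8 vertex units add a bit more on top).
g71_flops_per_clock = 24 * 2 * 4 * 2          # 384 FLOPS/clock
g71_gflops = g71_flops_per_clock * 0.65       # ~250 GFLOPS at 650 MHz

print(f"HD 4600: ~{hd4600_gflops:.0f} GFLOPS")
print(f"7900 GTX: ~{g71_gflops:.0f} GFLOPS")
```

On paper that puts the HD 4600 ahead, which fits the 3DMark06 pattern above: the shader-heavy SM3.0 tests favor it, while the older tests don't.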
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Anand did an interesting comparison of smartphone GPUs and the (low-power tablet) HD 4000 against the GPUs of yesteryear: http://www.anandtech.com/show/6877/the-great-equalizer-part-3/2

That should give you a rough idea of how the HD4600 would perform by comparison.

I remember that article, but I took it with a big grain of salt, since newer benchmarks are of course going to be better suited to newer GPUs in terms of hardware compatibility. The "3DMark Graphics Test 2" is probably the closest thing to putting them in a sensible order and representing relative graphics processing power in standard workloads. The HD 4000 and even the E-350 are both held back by their bandwidth restrictions.
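On the bandwidth point, the spec-sheet math is simple enough. These are peak numbers from public specs, and the iGPUs have to share theirs with the CPU:

```python
# Peak memory bandwidth in GB/s = bus width in bytes * transfer rate in GT/s.
# Spec-sheet peaks only; effective bandwidth is lower, especially for the
# iGPUs, which share the memory bus with the CPU.
def bandwidth_gbs(bus_bits: int, mts: int) -> float:
    return (bus_bits / 8) * mts / 1000.0

print(bandwidth_gbs(128, 1600))  # HD 4000, dual-channel DDR3-1600: 25.6
print(bandwidth_gbs(64, 1066))   # E-350, single-channel DDR3-1066: ~8.5
print(bandwidth_gbs(256, 1600))  # 7900 GTX, 256-bit GDDR3-1600: 51.2
```

Even a 2006 flagship had roughly twice the HD 4000's peak bandwidth, and it had it all to itself.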

It's still impressive to know that most of what we see on screen in many PS3 games is indeed handled by the RSX, which is essentially a heavily cut-down 7800. Even with the Cell BE used as a graphics aid, it goes to show that dedicated pixel and vertex shaders can be made to compete against today's massive unified shader arrays.

I'd also really love to see a real breakdown of how the iPad Air's and iPad 4's graphics compare against PC GPUs.