Is my Phenom bottlenecking my 295?

AzN

Banned
Nov 26, 2001
Originally posted by: toyota
Originally posted by: Schmide
Don't take this the wrong way, but I think the use of "bottleneck" is semantically poor for this type of computing relationship. I know it's the common term, but I think it's a skewed version of it. The CPU may be the limiting factor in terms of maximum FPS; however, it is the video card that would limit the system at the point of minimum FPS. The only time a CPU would truly limit the video card in this relationship is if it attempted to compute some particle system, which, for good reason, no modern game attempts to do. In a bottlenecked system, one component continuously prevents another from ever reaching saturation.

sorry but a cpu can and does affect minimum framerates. also you really have things backwards, as a faster video card can indeed increase max framerate (to a certain point) even if the cpu is the limiting part of the system.

for example, if you had a 2.6 P4 and an 8600GT and then replaced it with a 9600GT, your min framerate would still be the same in many games but your max framerate would go up, especially at higher res.

That's not quite right when the 8600GT is GPU-limited in many of the games out there now.
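
Both sides of this exchange fit one simple model: each frame costs roughly the time of whichever side, CPU or GPU, is slower on that frame. Here is a minimal sketch in Python (all frame costs are invented for illustration) of why a faster card lifts the average and maximum while a CPU-side stall still sets the minimum:

import random

def simulate(cpu_ms, gpu_ms, frames=1000, spike_every=100, spike_ms=60.0):
    """Return (min, avg, max) fps for a run where every `spike_every`-th
    frame the CPU stalls (AI burst, scene load) - a stall no GPU upgrade can hide."""
    times = []
    for i in range(frames):
        cpu = cpu_ms + (spike_ms if i % spike_every == 0 else 0.0)
        gpu = gpu_ms * random.uniform(0.8, 1.2)  # scene-dependent GPU load
        times.append(max(cpu, gpu))              # the slower side sets the frame time
    fps = [1000.0 / t for t in times]
    return min(fps), sum(fps) / len(fps), max(fps)

print(simulate(cpu_ms=12, gpu_ms=25))  # slower card: GPU-bound on most frames
print(simulate(cpu_ms=12, gpu_ms=14))  # faster card: avg and max rise, min barely moves

With the faster "card" the average and maximum climb, but the minimum stays pinned by the 72 ms CPU spike, which is toyota's point; on frames where the GPU cost exceeds the CPU cost the card still sets the pace, which is Azn's.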
 

Schmide

Diamond Member
Mar 7, 2002
Originally posted by: Astrallite
Originally posted by: Schmide
Stuff

Depends on the game of course, although in my experience, whether it's an RPG (long view distance/cutscenes) or a shooter, typically the CPU is limiting the minimum framerate.

PCGH has the Crysis review with the GTX280 where the average fps remained at 17 during a timedemo run on an E8600. They ran the cpu from 2.4GHz to 3.6GHz and the minimum framerate varied from 6 to 12 fps.

That said, this is a pretty worthless example since it's unplayable either way. Perhaps a better example: typically, in RPG cutscenes where the screen is generating far view distance and 100,000s of polygons, the CPU becomes a serious bottleneck.

When Dragon Age Origins comes out I would imagine you'd do well OC'ing your cpu through the roof.

I just read their September 22, 2008 Crysis Warhead CPU-scalability article, and it's very telling as to what level of CPU will bottleneck this game. I think you're right, as I understand what you're saying, that RPGs and large scene environments will stress the CPU more. The cutscenes and dynamic generation of content certainly stall the rendering pipeline at the CPU when they happen.
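
The PCGH numbers quoted above can be turned into a crude bottleneck indicator. A small helper, my own sketch rather than anything from PCGH's methodology: measure what fraction of a CPU clock increase actually shows up as frame rate.

# Roughly 1.0 -> fully CPU-bound, roughly 0.0 -> not CPU-bound at all.
def scaling_gain(fps_base, fps_oc, clock_base, clock_oc):
    """Fraction of a clock increase that shows up as frame rate."""
    return ((fps_oc / fps_base) - 1.0) / ((clock_oc / clock_base) - 1.0)

# Minimum fps went 6 -> 12 as the E8600 went from 2.4 to 3.6 GHz:
print(scaling_gain(6.0, 12.0, 2.4, 3.6))   # 2.0: minimums scale faster than the clock
# The average stayed at ~17 fps over the same range:
print(scaling_gain(17.0, 17.0, 2.4, 3.6))  # 0.0: the average didn't budge

In that one test the minimums were living on the CPU while the average was living on the GPU, which is exactly the split being argued over in this thread.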
 

toyota

Lifer
Apr 15, 2001
Originally posted by: Azn
That's not quite right when the 8600GT is GPU-limited in many of the games out there now.

well, it doesn't really matter if the cpu is too weak to do anything better with a faster card when it comes to minimum framerates. moving to a 9600GT from an 8600GT while still having a really slow single-core cpu like a 2.6 P4 will not deliver a better gaming experience in most cases. an 8600GT may be a weak card, but it's certainly much, much faster than the cards that were around when the P4 @ 2.6 came out.

when I used a 4670 or 9600GT in my 5000 X2 system, my minimum framerates were no better in games like Crysis, Far Cry 2 and UT3 than they were with the 8600GT. averages and certainly maximums went up, and I could play with more settings turned up, but playability wasn't improved because of the low minimum framerates. just to be clear though, even running the EXACT same settings as with the 8600GT, I could get no better minimum framerate in many games than with the wimpy 8600GT. in other words, all the newer games that I wanted to make faster also needed a little more cpu power to get those minimum framerates up with a faster card.
 

Schmide

Diamond Member
Mar 7, 2002
Originally posted by: toyota
well, it doesn't really matter if the cpu is too weak to do anything better with a faster card when it comes to minimum framerates. [...]

I think you're getting stuck on one particular sample time to judge the overall gaming performance and experience. Those minimum frame rates are most likely occurring during dynamic scene generation, where a much slower device (the disk) is stalling an already burdened CPU (as Astrallite pointed out to me). I would bet, though, that the other dips in FPS would be smaller with the greater-GPU, same-CPU combination.

Edit: cut down the quotes...
 

toyota

Lifer
Apr 15, 2001
Originally posted by: Schmide
I think you're getting stuck on one particular sample time to judge the overall gaming performance and experience. [...]

and I think that you like to talk a lot of theory without anything concrete to back it up. those minimums were due to the cpu, and that's a fact. I put that same 4670 in a Dell with a Core 2 9300 and the minimums and overall framerate went way up. it wasn't all in my head, and I still have some of the benchmarks on a thumb drive. all the demanding games ran with ease and were smooth, and the card performed across the board just like what all the sites were getting, where before it wasn't even close with the 5000 X2. you really need to look at cpu scaling in modern games, because what you have been saying so far is not accurate. it's a fact that a better cpu will indeed give you better minimums and overall framerate compared to a slow cpu in the same system if you have a relatively good gpu.
 

SlowSpyder

Lifer
Jan 12, 2005
Originally posted by: toyota
and I think that you like to talk a lot of theory without anything concrete to back it up. those minimums were due to the cpu, and that's a fact. [...]

Here is the article I was referring to earlier:

http://www.anandtech.com/mb/showdoc.aspx?i=3506&p=4

Looking at that game, you can see that the PhII has the highest minimum frame rate at 1680x1050 despite being slower in average frame rate. As the resolution moves up, the minimum frame rate drops as well, showing that it is more likely a GPU limitation causing the minimum frame rate. There isn't a black-and-white right answer for all scenarios. But to say the PhII is not competitive at real-world resolutions with multi-GPU is incorrect. Depending on the game and the resolution, you can see that the CPUs are usually quite competitive.
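
SlowSpyder's resolution test generalizes into a quick rule of thumb. A rough sketch, with a threshold and sample numbers that are my own arbitrary choices rather than anything from the article: if the frame rate falls noticeably as resolution rises on the same CPU, the GPU is the limiter at that setting; if it stays flat, look at the CPU.

def likely_limiter(fps_low_res, fps_high_res, tolerance=0.10):
    """Same scene, same CPU, two resolutions: does fps track the GPU load?"""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "GPU-limited" if drop > tolerance else "CPU-limited (or balanced)"

print(likely_limiter(60.0, 41.0))  # big drop as resolution rises -> GPU-limited
print(likely_limiter(55.0, 53.0))  # flat across resolutions -> CPU-limited (or balanced)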
 

Schmide

Diamond Member
Mar 7, 2002
Originally posted by: toyota
and I think that you like to talk a lot of theory without anything concrete to back it up. those minimums were due to the cpu, and that's a fact. [...]

What does it matter how I talk? I would say you have a hostile reaction to constructive criticism; I've seen it in other threads. When discussing unknowns, you should avoid using the word "fact". Your observations can only be classified as assumptions.

A benchmark can only give you a limited window into the inner workings of a rendering system. You only get three numbers (maximum, minimum and average), which really tell you very little about the up-and-down swing of the rendering pipeline. The weakest CPU can feed a GPU to the point of bottleneck. Look at FurMark, hardly a CPU-bound program.

I more than concede that the CPU can be a major bottleneck in some games. I assert that the GPU can be an equal bottleneck depending on the workload of the system.
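
That "three numbers" complaint is easy to make concrete. A sketch with invented frametime traces: percentile "lows", the kind of metric later benchmarking adopted, expose the swing that min/avg/max hide.

def summarize(frame_times_ms):
    """Boil a run of per-frame times (ms) down to min/avg/max plus 5% lows."""
    fps = sorted(1000.0 / t for t in frame_times_ms)
    n = len(fps)
    lows = fps[: max(1, n // 20)]  # the worst 5% of frames
    return {
        "min": round(fps[0], 1),
        "avg": round(sum(fps) / n, 1),
        "max": round(fps[-1], 1),
        "5%_low_avg": round(sum(lows) / len(lows), 1),
    }

# Two runs with identical min and max but a very different feel in play:
one_stutter = [16.7] * 98 + [50.0, 12.5]                 # one isolated hitch
many_stutters = [16.7] * 60 + [50.0] * 20 + [12.5] * 20  # constant hitching

print(summarize(one_stutter))    # 5%_low_avg ~52: smooth apart from a single frame
print(summarize(many_stutters))  # 5%_low_avg 20.0: the lows dominate the experience

Both runs report the same minimum (20 fps) and maximum (80 fps), so the three-number summary cannot tell them apart.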
 

toyota

Lifer
Apr 15, 2001
Originally posted by: Schmide
What does it matter how I talk? I would say you have a hostile reaction to constructive criticism. [...] I more than concede that the CPU can be a major bottleneck in some games. I assert that the GPU can be an equal bottleneck depending on the workload of the system.

sorry, I wasn't trying to sound hostile. I just didn't want to be in an argument just for argument's sake. we are at a point where many people have really slow and outdated cpus but keep upgrading to relatively high-end gpus, and in some cases multiple high-end gpus. this can be a problem in several current games. sure, if you have a decent Core 2 or Phenom, then most single graphics cards are a non-issue. yes, the gpu or cpu can both be a major bottleneck. it totally depends on the game and the user's particular setup as to where the biggest limitation is. I always like to have a very balanced system, which of course is not always possible.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: TidusZ
With a Yorkfield @ 4.25, a GTX 280 was bottlenecking me hardcore. With the 295, it feels like a much more even match.

only in very specific situations. There are places where a faster CPU would not make any difference. And you can even load down your GTX295 with maxed-out details and filtering - in Clear Sky, for example - where it still slows down badly

that said, i also prefer CrossFireX-3 over GTX280 with my Q9550S at 4 Ghz for many games where i need the added details or AA

i have a friend who has GTX280 SLI - similar performance to the OP's - but who had an Athlon X2 6000+. His frame rates were *crippled* by his CPU; when he upgraded to Phenom II, his frame rates - in a few cases - doubled with the same SLI'd GTX280s

 

toyota

Lifer
Apr 15, 2001
Originally posted by: apoppin
i have a friend who has GTX280 SLI - similar performance to the OP's - but who had an Athlon X2 6000+. His frame rates were *crippled* by his CPU; when he upgraded to Phenom II, his frame rates - in a few cases - doubled with the same SLI'd GTX280s
yep, even with a single gtx285, having a 6000 X2 can easily limit you from getting all the performance that a card like that is capable of, especially in some of the newer, more demanding games. as you and your friend could see, sticking a second gtx285 in there did little, and likely close to nothing in some cases, for performance with that relatively slow cpu in there. good thing he has a better cpu now to actually get the performance he desired. :cool:

only bad thing is that the Phenom II cpus don't scale very well with current tri-SLI and above. http://hardocp.com/article.htm...w1LCxoZW50aHVzaWFzdA== so this means the next time your friend upgrades to very high-end multi-gpu setups in the future, he will be right back in a similar situation, although likely not as bad, of course. ;)

 

apoppin

Lifer
Mar 9, 2000
Phenom II cpus don't scale very well with current tri-SLI and above

How many gamers does this impact?

Tri-SLI is probably 0.1% of gaming rigs; if you have that for graphics, you will no doubt go for the fastest i7 money can buy

for the rest of us, *generally* the performance of the GPU "matches" the CPU
- as if the GPU designers actually took the current speed of the CPU into consideration when making a balanced design
;)

When the next true generation of cards comes out - GT300 series [graphics Tesla Core 3]; and ATi has a competitive solution for DX11 - by THEN, Phenom will likely be plenty fast .. for all but Tri-SLi

good bang for buck

 

toyota

Lifer
Apr 15, 2001
Originally posted by: apoppin
Phenom II cpus don't scale very well with current tri-SLI and above

How many gamers does this impact?

Tri-SLI is probably 0.1% of gaming rigs; if you have that for graphics, you will no doubt go for the fastest i7 money can buy [...]

I was only referring to tri-SLI because of what it hints about future gpu power. gtx280s in tri-SLI could be a hint at what type of cpu will be needed to fully push just regular SLI with 2 next-gen high-end cards. if the Phenom II is having trouble with current cards in tri-SLI, then it could likely have trouble with the next gen in regular SLI, and maybe even single high-end cards 2 years down the road. of course there are variables such as SLI scaling itself here, but it is possible to end up in that same situation again in a year or 2. remember, his 6000 X2 was also a decent cpu 2 years ago. that's just the nature of pcs in general, and that's why we upgrade.
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: toyota
I was only referring to tri-SLI because of what it hints about future gpu power. [...] remember, his 6000 X2 was also a decent cpu 2 years ago. that's just the nature of pcs in general, and that's why we upgrade.

Of course, and when the Next Gen GPUs get way more powerful, the Phenom CPU will also be faster - then.
- it has been this way as long as i can remember

 

Insomniator

Diamond Member
Oct 23, 2002
Pretty sure an X2 5000+ would not keep a 9600GT from dominating an 8600GT in every aspect of your frame rate, including the minimum. I only game at 1900x1200, but my X2 5000 and 4850 match the minimums of systems with better cpus. I have a 9600GSO sitting on my desk that I can use to compare to the 4850. Also have a 6600GT.. but that's too old to prove anything...
 

toyota

Lifer
Apr 15, 2001
Originally posted by: apoppin
Of course, and when the Next Gen GPUs get way more powerful, the Phenom CPU will also be faster - then.
- it has been this way as long as i can remember

if all the Phenom does is increase in speed with no architectural improvements, then no, it won't be enough for future high-end cards. that link I showed you even had a Phenom II at 3.8, which made little difference. to be fair, even the Core 2 architecture looked quite inferior to the i7 when it came to pushing those 3 gtx280 cards in tri-SLI.
 

toyota

Lifer
Apr 15, 2001
Originally posted by: Insomniator
Pretty sure an X2 5000+ would not keep a 9600GT from dominating an 8600GT in every aspect of your frame rate, including the minimum. [...]

well, sorry, but that didn't happen when it came to minimums. in every other aspect, yes, the 9600GT was easily a large improvement. there are games where your 5000 X2 is only giving you 3/4 of what your card is capable of, especially in the minimum-framerate department. of course most games are still playable, but I promise you that your 4850 would give you a much better experience with a better cpu. you just don't know what you are missing.
 

Insomniator

Diamond Member
Oct 23, 2002
Originally posted by: toyota
well, sorry, but that didn't happen when it came to minimums. [...] I promise you that your 4850 would give you a much better experience with a better cpu. you just don't know what you are missing.

stop giving me reasons to upgrade! :p
 

toyota

Lifer
Apr 15, 2001
Originally posted by: Insomniator
stop giving me reasons to upgrade! :p

you know you want to. lol :cool:

I really want to upgrade myself so I can crank up the settings in Clear Sky with enhanced DX10 maxed. at 1920 it's going to take something like a gtx295 to get close to what I want, though, so I might just have to wait. funny thing is that I got a bigger boost in the Clear Sky benchmark at 1920 by overclocking my cpu than my gpu. that means if I get something like a gtx295, I will really need to oc my cpu or go ahead and make the move to i7. it never ends. :shocked:
 

apoppin

Lifer
Mar 9, 2000
Originally posted by: toyota
if all the Phenom does is increase in speed with no architectural improvements, then no, it won't be enough for future high-end cards. [...]

IF .. IF .. IF..

who really believes that is gonna happen? Phenom II actually has a few advantages with 64-bit vista over C2D if i remember right
- i expect their architecture to evolve as intel's does :p
 

toyota

Lifer
Apr 15, 2001
Originally posted by: apoppin
IF .. IF .. IF..

who really believes that is gonna happen? Phenom II actually has a few advantages with 64-bit vista over C2D if i remember right
- i expect their architecture to evolve as intel's does :p


as far as I know, AMD has nothing architecturally new coming anytime soon for their cpus. I guess we will both see in a few months, when the next high-end cards come out, exactly where they stand cpu-wise.
 
soccerballtux

Dec 30, 2004
Originally posted by: toyota
Originally posted by: garritynet
Originally posted by: toyota
well, in general the Phenom II doesn't do all that great with multi-gpu setups compared to the i7. at 4.0 and with a gtx295, though, there shouldn't be much of an issue, if any. 3 gtx285s in tri-SLI or 2 gtx295s for quad SLI would certainly show some weakness from the Phenom II, though.

Thanks. This is what I was looking for.

yeah, the i7 architecture really shines with tri-SLI or better. http://hardocp.com/article.htm...w1LCxoZW50aHVzaWFzdA==

Would be a nice article, but please stop linking to HardOCP. We already know they have a hard-on for Intel. They were, and still are (you'd think they'd learn), showing AMD Ph2s nearly 20% slower clock-for-clock compared with the Penryn series. We know this not to be true; nearly every other website shows them performing more like 5-8% slower clock-for-clock.
 

garritynet

Senior member
Oct 3, 2008
Originally posted by: soccerballtux
Would be a nice article, but please stop linking to HardOCP. We already know they have a hard-on for Intel. [...]

Hey nobody start talking about anything Intel, AMD or HardOCP. Please.
 

toyota

Lifer
Apr 15, 2001
Originally posted by: soccerballtux
Would be a nice article, but please stop linking to HardOCP. We already know they have a hard-on for Intel. They were, and still are (you'd think they'd learn), showing AMD Ph2s nearly 20% slower clock-for-clock compared with the Penryn series. [...]

yeah, they just didn't run the Phenom II at normal settings where it actually looked good. they first ran benchies at low res, which made the Phenom II look horrible, and then used tri-SLI, where the i7 clearly is superior. at the settings and cards that most people use, the Phenom II is a good cpu. heck, most sites had it actually beating the low-end i7 at realistic settings in Far Cry 2, and I brought that up on their forums. it's not an intel-or-amd debate, as even the Core 2 didn't do much better in tri-SLI scaling. again, the only reason I linked to that was to say that it could be an indicator of a cpu bottleneck for the next gen of cards in regular SLI.
 

apoppin

Lifer
Mar 9, 2000
as far as I know, AMD has nothing architecturally new coming anytime soon for their cpus. I guess we will both see in a few months, when the next high-end cards come out, exactly where they stand cpu-wise.
they have a few very nice things coming that no one knew about last month; i don't think they are telegraphing what they are doing any more

http://www.techreport.com/articles.x/16448
http://www.fudzilla.com/index....iew&id=12086&Itemid=35

AMD is becoming competitive again and they do not really provide roadmaps like intel does

when it comes to gaming .. except in *extreme* cases like tri-SLI .. there were only brief periods when they were not so competitive; the Athlon 6000+ worked fine except for GTX280 SLI .. there were also gaps in the Pentium 4 line, where the Athlon 64 was better suited for gaming .. but still it was generally OK

i would say that HardOCP exaggerates this 'lack' with Phenom II; there were flaws in that review; and when Phenom II finally routinely hits over 4.0 - 4.5 GHz with O/C'ing, they should be fine for most gaming - even with GTX285 SLI or even the next gen

- However, when you are talking END of THIS year for DX11, who can know for sure? ..
but if i guess .. i'd say Phenom II will be ready for it at least in upper-midrange gaming
.. again who cares about Tri-SLi? .. by the time the future it portends really comes around in a single GPU, the CPU is always ready for it
[the GPU makers plan for it, imo]

 
soccerballtux

Dec 30, 2004
I can't wait to see Xbitlabs' reviews of these 2- and 3-core 6MB L2 cache CPUs. Their reviews seem a little more down-to-earth (sorry, anand).

edit: I take that back. Xbit didn't increase the HT link to the usual 2.2GHz, which has been proven to improve performance by 10+ percent. So maybe anand wasn't fudging.