Phenom II > i7


taltamir

Lifer
Mar 21, 2004
13,576
6
76
So when CPU bound, the i7 is 40 FPS faster (60 to 100+)... but when GPU bound the PII is under half an FPS faster? (30.49 to 30.74 FPS)...
When I benchmark, just repeating the same benchmark on the same system, I see some deviation in FPS, and it does the same for everyone else. It is called a margin of error, and those results are within the margin of error.
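
A minimal sketch (Python) of that margin-of-error check, assuming you log several repeated runs of the same benchmark; the FPS numbers below are made up for illustration:

# Decide whether a small FPS difference is within run-to-run noise.
# The sample numbers are hypothetical; substitute your own repeated runs.
from statistics import mean, stdev

phenom_runs = [30.6, 30.9, 30.4, 31.0, 30.8]   # repeated runs, same system and settings
i7_runs     = [30.3, 30.7, 30.2, 30.9, 30.4]

gap = abs(mean(phenom_runs) - mean(i7_runs))
noise = max(stdev(phenom_runs), stdev(i7_runs))

print(f"mean gap: {gap:.2f} FPS, run-to-run stdev: {noise:.2f} FPS")
if gap < 2 * noise:
    print("Within the margin of error -- treat the results as a tie.")
else:
    print("Gap exceeds run-to-run variation -- likely a real difference.")

If the gap between the two CPUs is smaller than a couple of standard deviations of your own repeated runs, the comparison is telling you nothing.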
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Modular
Originally posted by: apoppin
Looks like AMD has a new Viral marketing campaign running:

http://www.youtube.com/user/AMDUnprocessed

That's pretty ineffective marketing, I don't know that I'd call it viral at all. It's boring and there's that annoying background whine throughout the whole thing as well. Meh.

viral in that it is not "paid for"
--there is nothing wrong with viral unless it is a disease .. or malicious



.. and that appears to be AMD's campaign .. pretty similar to Nvidia's:

For playing games, basically the GPU is the most important piece of HW to upgrade
- see, they own ATi and are most glad to sell you any part of their complete platform

Now clearly when you get superfast graphics, you need a superfast CPU
. . . how many people have TriSli ?
- not many - so for most, a C2D or C2Quad or phenomII is plenty

 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Originally posted by: Zstream

That is wrong; HT on the P4 was turned off for many, many applications. The same technology is used in many Xeon servers.

While you are correct, I believe that MS issued numerous HT updates to various operating systems to deal with scheduler problems that were ultimately to blame for HT hurting performance. Of course my memory may be bad here.

Nevertheless this looks like a scheduler problem that may not be an issue in Windows 7. We hope.

Or you could just turn off HT while gaming, but who wants to reboot?

It would be nice if virtual cores could be switched off or on dynamically based on how the physical cores are loaded and how many threads they're handling. Seems like it would be less work to just redo the scheduler.
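
A rough sketch (Python, using psutil) of that idea as a user-side workaround instead of a reboot: pin the game to one logical CPU per physical core so the scheduler can't put two of its threads on the same hyperthreaded core. The process name is hypothetical, and it assumes logical CPUs are exposed in sibling pairs (0/1, 2/3, ...), which is common but not guaranteed, so check your topology first.

# Pin a running game to one logical CPU per physical core.
import psutil

GAME_EXE = "game.exe"   # hypothetical process name, substitute your game

physical = psutil.cpu_count(logical=False)
logical = psutil.cpu_count(logical=True)
one_per_core = list(range(0, logical, logical // physical))  # e.g. [0, 2, 4, 6]

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(one_per_core)   # restrict the game to these CPUs
        print(f"Pinned PID {proc.pid} to CPUs {one_per_core}")

It is a blunt instrument compared to a smarter scheduler, but it avoids the BIOS round trip.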
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
Originally posted by: apoppin

Now clearly when you get superfast graphics, you need a superfast CPU
. . . how many people have TriSli ?
- not many - so for most, a C2D or C2Quad or phenomII is plenty

Ummmm, actually a lot nowadays, since F@H is getting more popular with CUDA.

Look at Mark, I think 2 of his systems have 3 GPUs in them.

My system is quadfire.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: aigomorla
Originally posted by: apoppin

Now clearly when you get superfast graphics, you need a superfast CPU
. . . how many people have TriSli ?
- not many - so for most, a C2D or C2Quad or phenomII is plenty

Ummmm, actually a lot nowadays, since F@H is getting more popular with CUDA.

Look at Mark, I think 2 of his systems have 3 GPUs in them.

My system is quadfire.

Lol, using members of a somewhat enthusiast forum as proof that 'a lot' of people have multi-GPU computers is simply way off. The enthusiast market is a tiny fraction of total hardware sales; the majority of hardware sales come from OEMs and 3rd-party vendors to people who don't know what a GPU is.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
anyway... reviving this thread....

sometimes I take a look at those benchmarks and I wonder if those things are true... or if those sites sold themselves out... lol... after all, I play on an AMD/ATI computer, considered crap by some, and it runs well, and the games that run with problems are ones with well-known engine issues and such... or they have TOO MUCH MOD STUFF, like my Sims 3 lol!
 

daw123

Platinum Member
Aug 30, 2008
2,593
0
0
Since someone else revived this thread:

Originally posted by: Sylvanas
Originally posted by: aigomorla
Originally posted by: apoppin

Now clearly when you get superfast graphics, you need a superfast CPU
. . . how many people have TriSli ?
- not many - so for most, a C2D or C2Quad or phenomII is plenty

Ummmm, actually a lot nowadays, since F@H is getting more popular with CUDA.

Look at Mark, I think 2 of his systems have 3 GPUs in them.

My system is quadfire.

Lol, using members of a somewhat enthusiast forum as proof that 'a lot' of people have multi-GPU computers is simply way off. The enthusiast market is a tiny fraction of total hardware sales; the majority of hardware sales come from OEMs and 3rd-party vendors to people who don't know what a GPU is.

I have quad-fire, so add another +1 to your X-fire / SLI tally.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: palladium
Linky

Got the link from a friend.

It's in German, but the graphs are quite clear. In low res the i7 pwns the PII, but as you crank up to more GPU-dependent resolutions it seems the PII starts to take the lead, in some cases by a significant amount.

My thoughts: the testing methodology must have some flaws somewhere; I mean, in Crysis @ 16x10 the i7 965 is slower than the 920 (WTF!).

I believe guys at AMDZone are all hyped over this.

Yep, AMD's viral marketing at work


expect them to stress "value"
.. don't you see it here and in many many tech sites ?

"My $100 Dual core unlocked to a Quad" :p
- and more BS in imitation of Nvidia and Intel

Only they are not good at it

 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Actually, the site isn't wrong in any way. Take a look at this: http://www.techspot.com/review...l-core-i7-920-940-965/


High Quality Gaming: http://www.techspot.com/review...20-940-965/page11.html

Low Quality Gaming: http://www.techspot.com/review...20-940-965/page12.html

Take note of how in Far Cry 2, at the low settings, even the i7 920 beats CPUs based on every other architecture, but at the high settings even the i7 940 is beaten by the Q9650. Even in other games the gap is far closer.

There's something here, and I don't think it's as simple as lack of cache or PCI Express. I believe this is because AMD has had integrated memory controllers for a long time, while Intel hasn't, and Nehalem is a rather radical change in that regard. At low resolutions the CPU is simply too powerful for driver bottlenecks/PCI Express to matter, but at high resolutions even small slowdowns in communicating and interfacing with the GPU driver and game code will cause performance degradation.

Think about it: if it's cache, why does AMD perform better than the i7 at high resolutions? Refer also to the post earlier in this thread. Didn't somebody say that Core 2 used to be slower at high resolutions?
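
One way to picture that argument is a toy frame-time model where each frame costs roughly the slower of the CPU's work and the GPU's work; the millisecond figures below are invented purely for illustration:

# Toy bottleneck model: per-frame time is roughly the slower of the CPU's
# work and the GPU's work (they overlap to a first approximation).
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpus = {"Core i7 (faster CPU work)": 8.0, "Phenom II (slower CPU work)": 10.0}  # ms per frame, hypothetical

for label, gpu_ms in [("800x600 (light GPU load)", 5.0),
                      ("1920x1200 (heavy GPU load)", 20.0)]:
    print(label)
    for name, cpu_ms in cpus.items():
        print(f"  {name}: {fps(cpu_ms, gpu_ms):.0f} FPS")

With a light GPU load the CPU gap shows up in full; once the GPU cost dominates, both CPUs land on the same frame rate, which is the pattern in the high-quality charts.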
 

veri745

Golden Member
Oct 11, 2007
1,163
4
81
Originally posted by: Sylvanas
Originally posted by: aigomorla
Originally posted by: apoppin

Now clearly when you get superfast graphics, you need a superfast CPU
. . . how many people have TriSli ?
- not many - so for most, a C2D or C2Quad or phenomII is plenty

Ummmm, actually a lot nowadays, since F@H is getting more popular with CUDA.

Look at Mark, I think 2 of his systems have 3 GPUs in them.

My system is quadfire.

Lol, using moderators of a somewhat enthusiast forum as proof that 'a lot' of people have multi-GPU computers is simply way off. The enthusiast market is a tiny fraction of total hardware sales; the majority of hardware sales come from OEMs and 3rd-party vendors to people who don't know what a GPU is.

fixed

If you continue to attack MODS, you will get a vacation. The fact that we are enthusiasts has nothing to do with being MODs.

Markfw900
Anandtech Moderator
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: IntelUser2000
Actually, the site isn't wrong in any way. Take a look at this: http://www.techspot.com/review...l-core-i7-920-940-965/


High Quality Gaming: http://www.techspot.com/review...20-940-965/page11.html

Low Quality Gaming: http://www.techspot.com/review...20-940-965/page12.html

Take note of how in Far Cry 2, at the low settings, even the i7 920 beats CPUs based on every other architecture, but at the high settings even the i7 940 is beaten by the Q9650. Even in other games the gap is far closer.

There's something here, and I don't think it's as simple as lack of cache or PCI Express. I believe this is because AMD has had integrated memory controllers for a long time, while Intel hasn't, and Nehalem is a rather radical change in that regard. At low resolutions the CPU is simply too powerful for driver bottlenecks/PCI Express to matter, but at high resolutions even small slowdowns in communicating and interfacing with the GPU driver and game code will cause performance degradation.

Think about it: if it's cache, why does AMD perform better than the i7 at high resolutions? Refer also to the post earlier in this thread. Didn't somebody say that Core 2 used to be slower at high resolutions?

*Where* is AMD performing better in hi res?
--They only test the Ph II X4 at stock, 2.6 GHz anyway, and it is a bottom feeder in your hi-res examples :p

http://www.techspot.com/review...20-940-965/page11.html

and You are talking about FEAR, not FC2, right? - *one* OLD game
:confused:

where there is a 1 FPS difference, there is NO practical difference
-- it could also be a driver issue; I don't see the Phenom II X4 beating Core i7




 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Originally posted by: apoppin

*Where* is AMD performing better in hi res?
--They only test the Ph II X4 at stock, 2.6 GHz anyway, and it is a bottom feeder in your hi-res examples :p

http://www.techspot.com/review...20-940-965/page11.html

and You are talking about FEAR, not FC2, right? - *one* OLD game
:confused:

where there is a 1 FPS difference, there is NO practical difference
-- it could also be a driver issue; I don't see the Phenom II X4 beating Core i7

Then you aren't paying attention. Look at Far Cry 2: at low resolution even the i7 920 beats everything, and at the high-quality settings the Phenom X4 beats the 920. Not by an insignificant amount either, but by something like 10%.

Look also at how, in other games at low resolutions, Core i7's lead over the other architectures is bigger than the frequency difference between the various Core i7 models, but at high resolutions it doesn't even manage that.

Funny, you say FEAR is an old game, but all the CPUs perform almost identically in it. I'd say that's current enough, don't you think?

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: IntelUser2000
Originally posted by: apoppin

*Where* is AMD performing better in hi res?
--They only test the Ph II X4 at stock, 2.6 GHz anyway, and it is a bottom feeder in your hi-res examples :p

http://www.techspot.com/review...20-940-965/page11.html

and You are talking about FEAR, not FC2, right? - *one* OLD game
:confused:

where there is a 1 FPS difference, there is NO practical difference
-- it could also be a driver issue; I don't see the Phenom II X4 beating Core i7

Then you aren't paying attention. Look at Far Cry 2: at low resolution even the i7 920 beats everything, and at the high-quality settings the Phenom X4 beats the 920. Not by an insignificant amount either, but by something like 10%.

Look also at how, in other games at low resolutions, Core i7's lead over the other architectures is bigger than the frequency difference between the various Core i7 models, but at high resolutions it doesn't even manage that.

Funny, you say FEAR is an old game, but all the CPUs perform almost identically in it. I'd say that's current enough, don't you think?

First, what *gamer* in their RIGHT mind gives a crap about 800x600 in any modern game?
:confused:

- and in HIGH res, the 965 gets 60-something FPS; the 940 (55 fps) even beats the X4 (53 fps) ... only the 920 *chokes* (48 fps)



Find me more than ONE dubious benchmark that is not from LAST YEAR - 10 months old - where the newly released FC2 was having driver issues :p



edit: when I tested FEAR with a Ph II 550 X2 at 3.1 and 3.9 GHz vs a Ph II 720 X3 at 2.8 and 3.5 GHz vs a Q9550S at 2.83, 3.1 and 4.0 GHz at 19x12 [maxed-out details with a GTX 280 - THIS month], the averages only varied from 128 FPS to 134 FPS; the minimums varied by 2 FPS
- a little CPU bound, perhaps, with the GTX 280? .. I also tested with a 4870-X2, where there is a bit more variance
:roll:

that review is still in my sig - page 13 for FEAR, and I tested FC2 also
Athlon II vs. Phenom II (x2/x3) vs. C2Q

New AMD build - *budget* high performance gamer for 19x12
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: IntelUser2000
Actually, the site isn't wrong in any way. Take a look at this: http://www.techspot.com/review...l-core-i7-920-940-965/


High Quality Gaming: http://www.techspot.com/review...20-940-965/page11.html

Low Quality Gaming: http://www.techspot.com/review...20-940-965/page12.html

Take note of how in Far Cry 2, at the low settings, even the i7 920 beats CPUs based on every other architecture, but at the high settings even the i7 940 is beaten by the Q9650. Even in other games the gap is far closer.

There's something here, and I don't think it's as simple as lack of cache or PCI Express. I believe this is because AMD has had integrated memory controllers for a long time, while Intel hasn't, and Nehalem is a rather radical change in that regard. At low resolutions the CPU is simply too powerful for driver bottlenecks/PCI Express to matter, but at high resolutions even small slowdowns in communicating and interfacing with the GPU driver and game code will cause performance degradation.

Think about it: if it's cache, why does AMD perform better than the i7 at high resolutions? Refer also to the post earlier in this thread. Didn't somebody say that Core 2 used to be slower at high resolutions?

*edit - It was IE6. :p Anyway, wrong link below; try this link to their E8500 review, with more CPUs tested. You'll see that in a few games, as the resolution goes up, the Phenom pulls ahead. Not in every game and not always by a lot, but here and there the Phenom does much better than you'd think it would... and this is the old Phenom I at a fairly slow 2.6GHz.

This site is the one I think I was talking about earlier. I can't get the graphs to load; not sure if it is my work connection to the internet, their site, or the good ol' IE6 standard that we use here. :p At any rate, if I recall, some of the games are faster on a lowly 2.6GHz Ph I than on much faster Intel C2Qs. I'm not sure whether there is something to be said about Phenoms pulling ahead of the Intel chips at higher resolutions for some unknown reason, but the topic appears to pop up from time to time and shows up in reviews here and there.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I found *one* game out of the 14 that I test - ET-QW - where the Athlon II and the Ph II both beat the C2Q (mostly clock for clock, at 19x12/16x10 res)
(and it could be drivers; but I tested with both a GTX 280 and a 4870-X2)

BUT ..
1) IF money is no object, pick Core i7 or an OC'd C2Q
2) IF you are running Tri-Fire or SLI, then pick Core i7 or an OC'd C2Q (see reason No. 1 above)

BUT ..
IF you are building a VALUE PC, then Ph II is a winner

 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: apoppin
First, what *gamer* in their RIGHT mind gives a crap about 800x600 in any modern game?
:confused:
One that graduated high school?

http://en.wikipedia.org/wiki/D...able#Use_in_statistics
In the design of experiments, an independent variable's values are controlled or selected by the experimenter to determine its relationship to an observed phenomenon (i.e., the dependent variable). In such an experiment, an attempt is made to find evidence that the values of the independent variable determine the values of the dependent variable.

When we're trying to measure raw computing power, CPU is your independent variable, frame rate is your dependent variable, and the game's graphical demand is the extraneous variable.
In the link you provided, the authors admit that they did not account for extraneous variables (i.e. the graphical demands of the game). From your article:
Our gaming tests were performed using a single GeForce GTX 280 graphics card which is almost maxed out at 100% of its capacity
[...]
It appears that the GeForce GTX 280 had run out of legs in F.E.A.R Perseus Mandate at 1920x1200 using its maximum visual settings. Interestingly, the Core 2 Duo E8600 produced the best result here, beating the Core i7 965 EE by a single frame per second, and the slowest processor by just 3fps.


The test they performed is perfectly valid but it's not testing what you think it's testing. What the author is testing is whether or not the CPU is a limitation when your computer is using a GTX 280 graphics card. The varying frame rates in Far Cry 2 show that the CPU is one of the limitations in that game, so if you currently own a GTX280, you will see improvements by getting an i7. The uniform frame rates in Perseus Mandate show that this game is almost entirely limited by the video card, and the author explicitly states this when he says "it appears that the GeForce GTX 280 had run out of legs". What he means is that all of these processors tested still have lots of head room in that game, so upgrading to a GTX 295 would improve the frame rate.

If you want to test raw computing power, you need to eliminate the extraneous variables. Ideally, a gaming CPU test should be done at lowest resolution with the most powerful video card you can find. This completely eliminates the GPU bottleneck and gives us a picture of how fast the processor really is. When you look at something like Far Cry 2 where the game is obviously bottlenecked by both the CPU and the GPU, what does that data tell us? Basically nothing. It might say something like the i7 is 10% faster than a C2Q, but a completely CPU-bound test will show the i7 is maybe 20-30% faster. Simply put, doing a high resolution test is a complete waste of time.
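
To put a number on that last point, here is a toy calculation with hypothetical millisecond costs, assuming CPU and GPU work simply add up per frame (no overlap), showing how a real 25% CPU advantage shrinks in the measured frame rate as the GPU load grows:

# How a fixed GPU cost per frame masks the CPU's real speedup.
cpu_fast, cpu_slow = 8.0, 10.0        # ms of CPU work per frame (true gap: 25%)

for gpu_ms in (0.0, 5.0, 15.0, 30.0): # growing GPU cost as resolution/AA rise
    fps_fast = 1000.0 / (cpu_fast + gpu_ms)
    fps_slow = 1000.0 / (cpu_slow + gpu_ms)
    measured = (fps_fast / fps_slow - 1.0) * 100.0
    print(f"GPU work {gpu_ms:4.1f} ms -> measured CPU advantage: {measured:4.1f}%")

The printed advantage falls from 25% with no GPU cost to a few percent once the GPU dominates, which is why a mixed CPU/GPU-bound test understates the difference between processors.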
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ShawnD1
Originally posted by: apoppin
First, what *gamer* in their RIGHT mind gives a crap about 800x600 in any modern game?
:confused:
One that graduated high school?

http://en.wikipedia.org/wiki/D...able#Use_in_statistics
In the design of experiments, an independent variable's values are controlled or selected by the experimenter to determine its relationship to an observed phenomenon (i.e., the dependent variable). In such an experiment, an attempt is made to find evidence that the values of the independent variable determine the values of the dependent variable.

When we're trying to measure raw computing power, CPU is your independent variable, frame rate is your dependent variable, and the game's graphical demand is the extraneous variable.
In the link you provided, the authors admit that they did not account for extraneous variables (i.e. the graphical demands of the game). From your article:
Our gaming tests were performed using a single GeForce GTX 280 graphics card which is almost maxed out at 100% of its capacity
[...]
It appears that the GeForce GTX 280 had run out of legs in F.E.A.R Perseus Mandate at 1920x1200 using its maximum visual settings. Interestingly, the Core 2 Duo E8600 produced the best result here, beating the Core i7 965 EE by a single frame per second, and the slowest processor by just 3fps.


The test they performed is perfectly valid but it's not testing what you think it's testing. What the author is testing is whether or not the CPU is a limitation when your computer is using a GTX 280 graphics card. The varying frame rates in Far Cry 2 show that the CPU is one of the limitations in that game, so if you currently own a GTX280, you will see improvements by getting an i7. The uniform frame rates in Perseus Mandate show that this game is almost entirely limited by the video card, and the author explicitly states this when he says "it appears that the GeForce GTX 280 had run out of legs". What he means is that all of these processors tested still have lots of head room in that game, so upgrading to a GTX 295 would improve the frame rate.

If you want to test raw computing power, you need to eliminate the extraneous variables. Ideally, a gaming CPU test should be done at lowest resolution with the most powerful video card you can find. This completely eliminates the GPU bottleneck and gives us a picture of how fast the processor really is. When you look at something like Far Cry 2 where the game is obviously bottlenecked by both the CPU and the GPU, what does that data tell us? Basically nothing. It might say something like the i7 is 10% faster than a C2Q, but a completely CPU-bound test will show the i7 is maybe 20-30% faster. Simply put, doing a high resolution test is a complete waste of time.

let me try one more time

I know what it tests :p
- I have my own review site and I graduated HS; I DO these very same tests that you guys quote as though they are your bible
- when *I* test games, I do not use a single video card; that is why I test with a GTX 280 as well as a 4870-X2 and Tri-Fire - it eliminates those variables and gives a pragmatic look at what a gamer can expect when actually PLAYING those games
:roll:

What gamer PLAYS at 800x600 so that these tests matter at all to the practical experience of playing that game?
:confused:
 

imported_Scoop

Senior member
Dec 10, 2007
773
0
0
They should compare SLI performance with these platforms as it works better than Crossfire... oh yeah... too bad for AMD.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I can see Phenoms doing better than Core 2s at high res for one main reason... they have much higher read bandwidth. Once GPU memory is exceeded and data starts transferring over the PCI Express bus and hitting system RAM, AMD is going to have an advantage.
As for the i7s, supposedly QPI treats PCI Express as a second-class citizen, while HyperTransport puts it on equal footing with the CPU, so HT might be lower latency, but QPI should still be lower latency than Core 2. That, and the i5 should outperform the i7 if that's the case.
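
A back-of-the-envelope version of that spill-over argument, with rough, era-appropriate ballpark bandwidth figures rather than measured numbers:

# How much a frame slows down when textures spill out of VRAM and must come
# across PCIe from system RAM. Bandwidth figures are approximate.
vram_bw_gbps = 140.0   # roughly a GTX 280's local memory bandwidth (GB/s)
pcie_bw_gbps = 8.0     # roughly PCIe 2.0 x16 one-way peak (GB/s)

spill_mb = 64.0        # hypothetical texture data fetched per frame after the spill
ms_local = spill_mb / 1024.0 / vram_bw_gbps * 1000.0
ms_pcie  = spill_mb / 1024.0 / pcie_bw_gbps * 1000.0

print(f"Fetching {spill_mb:.0f} MB from VRAM: ~{ms_local:.2f} ms")
print(f"Fetching {spill_mb:.0f} MB over PCIe: ~{ms_pcie:.2f} ms extra per frame")

Even a modest spill adds several milliseconds per frame over the bus, which is where differences in each platform's CPU-side memory path could plausibly start to show.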
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: apoppin
First, what *gamer* in their RIGHT mind gives a crap about 800x600 in any modern game?
:confused:

- and in HIGH res, the 965 gets 60-something FPS; the 940 (55 fps) even beats the X4 (53 fps) ... only the 920 *chokes* (48 fps)

The only "gamers" I can think of who use that resolution are those who have only integrated graphics systems...and that's a whole nother kettle of fish.

1024x768 is probably still used by a very few, but 1280x1024 is probably the current "sweet spot" for the majority of "poor" gamers out there...19" LCDs are dirt cheap and look acceptable to most.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Viditor
Originally posted by: apoppin
First, what *gamer* in their RIGHT mind gives a crap about 800x600 in any modern game?
:confused:

- and in HIGH res, the 965 gets 60-something FPS; the 940 (55 fps) even beats the X4 (53 fps) ... only the 920 *chokes* (48 fps)

The only "gamers" I can think of who use that resolution are those who have only integrated graphics systems...and that's a whole nother kettle of fish.

1024x768 is probably still used by a very few, but 1280x1024 is probably the current "sweet spot" for the majority of "poor" gamers out there...19" LCDs are dirt cheap and look acceptable to most.
agreed; 12x10 is the most popular resolution reported by Steam gamers; 14x9 is probably the most popular widescreen resolution, followed by 16x10

But WHO wants to show core i7 gamers what it can do at 8x6? ... 1990s style :p
:roll:

My comment was directed to LAZY tech sites that drop the resolution to 8x6 and attempt to prove something about PC gaming and CPU performance
- it *really* shows they are too lazy to be thorough - or have no time to really test the HW with more than 1 GPU


imo, of course

 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: apoppin
What gamer PLAYS at 800x600 so that these tests matter at all to the practical experience of playing that game?
:confused:

Many of us don't care about practical tests because many of the games we'll be playing don't exist yet. A $100 processor and a $300 processor will have very similar frame rates in today's games (Perseus Mandate), so the question is really how well they'll play tomorrow's games. Since we can't test games that don't exist, the best we can do is run impractical 800x600 tests to see how fast the processors are in a CPU-bound scenario. 2-3 years from now, buying a new video card for your old i7 or Phenom II system will create the same CPU limitations as an 800x600 test done today.

Doing practical tests works great if your system is upgraded every few months, but practical tests can mislead people who expect their hardware to last a bit longer. The classic example of this is when the 8800 GTS video card came in 320MB and 640MB variants. Anandtech's practical tests showed the cards performing the same, but the impractical tests (very high res with 4x AA) showed one being much better than the other. Lots of people bought the 320MB version because they said "I don't play at super high res with AA", and those same people had to buy a new card after a year because it turned out those impractical tests reflected how well the card would run future games.

What's impractical and unrealistic today is practical and realistic tomorrow.
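
The 8800 GTS 320MB/640MB case comes down to simple VRAM arithmetic. The framebuffer sizes below follow from resolution and bytes per pixel; the texture working set is a made-up figure just to show how quickly 320MB disappears at the "impractical" settings:

# Rough VRAM budget at 1920x1200 with 4xAA; the texture figure is hypothetical.
def surface_mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 * 1024)

w, h = 1920, 1200
color = surface_mb(w, h, 4, samples=4)   # 4xAA color buffer
depth = surface_mb(w, h, 4, samples=4)   # 4xAA depth/stencil buffer
back  = surface_mb(w, h, 4)              # resolved back buffer
textures = 300.0                         # hypothetical texture working set (MB)

total = color + depth + back + textures
print(f"Framebuffers: {color + depth + back:.0f} MB, textures: {textures:.0f} MB")
print(f"Total ~{total:.0f} MB -> spills out of a 320MB card, fits on a 640MB card")

Lower the resolution, AA, and texture detail and the total drops back under 320MB, which is why the "practical" tests of the day made the two cards look identical.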
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ShawnD1
Originally posted by: apoppin
What gamer PLAYS at 800x600 so that these tests matter at all to the practical experience of playing that game?
:confused:

Many of us don't care about practical tests because many of the games we'll be playing don't exist yet. A $100 processor and a $300 processor will have very similar frame rates in today's games (Perseus Mandate), so the question is really how well they'll play tomorrow's games. Since we can't test games that don't exist, the best we can do is run impractical 800x600 tests to see how fast the processors are in a CPU-bound scenario. 2-3 years from now, buying a new video card for your old i7 or Phenom II system will create the same CPU limitations as an 800x600 test done today.

Doing practical tests works great if your system is upgraded every few months, but practical tests can mislead people who expect their hardware to last a bit longer. The classic example of this is when the 8800 GTS video card came in 320MB and 640MB variants. Anandtech's practical tests showed the cards performing the same, but the impractical tests (very high res with 4x AA) showed one being much better than the other. Lots of people bought the 320MB version because they said "I don't play at super high res with AA", and those same people had to buy a new card after a year because it turned out those impractical tests reflected how well the card would run future games.

What's impractical and unrealistic today is practical and realistic tomorrow.

that sounds like a lame excuse to me for incomplete testing :p
- two to three years from now the video cards of today will be like today's DX9 video cards, and there should be NEW tests for the "old" Core i7 *then*

the guys in video knew better about the GTS 320 because we were there to warn them about high res and/or high AA - we knew the limitations of AT's (so-called "practical") testing way back then; if anything, I am pragmatic

 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
well... it doesn't matter... I don't remember where, I think it was in the gaming forums, but I posted a link with some screenshots and their frame rates... the point was that games run well on the Ph II and 4870X2 without OC lol!... but if you're only going to play and not tinker with the computer, then what is the point of having a computer at all... better to go PS3 or 360 or any other kind of that BS if you're just going to play...

console = mob
computer = smarties...