
Why does the P4 suck in games??

The 9xx series is generally pretty competitive, but it gets stomped on in gaming benchmarks, sometimes with a 30+ FPS difference between it and the A64s. Why? What is it about games it can't do?

Is it more accurate to say it's always been the same at gaming, and the A64 is simply better, rather than that it sucks at gaming?
 
Originally posted by: Soviet
The 9xx series is generally pretty competitive, but it gets stomped on in gaming benchmarks, sometimes with a 30+ FPS difference between it and the A64s. Why? What is it about games it can't do?

Is it more accurate to say it's always been the same at gaming, and the A64 is simply better, rather than that it sucks at gaming?

The A64 is a more efficient architecture for gaming, as is the Pentium M. My Dothan @ 2.4 GHz blew away my P4 @ 3.91 GHz in gaming. But at higher resolutions you are going to be much more limited by the graphics card(s) anyway and won't see that big of a difference.
 
Basically, the P4's design and modest clock speed increases have meant Intel hasn't gained much in gaming performance - there's only so much a few extra MHz can do on that architecture - whereas AMD made a big leap from the Athlon XP to the Athlon 64, where every MHz increase is much more useful, so they've pulled much further ahead.
 
Originally posted by: Dark Cupcake
It does not suck; it's the same speed as before. It's just that the Athlon 64s are now better for games.

QFT. The Pentium 4s are quite good at games (considering that even the lowest-end models can run any game out there without issue). It's just that the Athlon 64s happen to be better at them.
 
Most published benchmarks, like SYSmark and PCMark, are highly optimized for the P4's quirks and concentrate on tasks and code that it does particularly well, while avoiding things that expose its weaknesses. The same is true of the applications mostly used in published benchmarking, like 3DS and video encoding.

This means they paint a one-sided and very rosy picture of the P4's performance; in fact, as favorable a picture as possible.

It does not correspond to general performance on general, diverse code. That fact is then brutally exposed by gaming benchmarks. It could also be exposed by benchmarking some other applications, though this is never done these days.

Do you remember how poorly the P4 did in the early Willamette days? Most people think the Northwood changed all that. But while Northwood was somewhat better, the main thing that happened at the time was that the benchmarks and benchmark code changed.

The Pentium M, Core, and Core 2 are not Pentium 4s, so you will not see the same discrepancy with them.

All that said, I'm not exactly a gamer, but in my limited experience an old 2.8 GHz P4 seems to do well enough, even on fairly modern games like Far Cry. You don't seem to need an A64 to play games, just enough RAM and a decent video card.
 
Pentium 4s had too-long pipelines (31 stages, I think). That was their biggest mistake. They thought clock speed was everything, and AMD proved them very wrong.
 
An AMD boss once said, "Intel would kill to have our integrated (on-chip) memory controller."
Sorry, but I don't have a link or even remember exactly where I read it, but it was worded just like that (maybe 'kill' wasn't in it?).

It's a combination of the memory controller, a shorter instruction pipeline, and the higher per-clock efficiency of the K8 that made NetBurst a bad choice for gaming. I think ExtremeTech once had an article suggesting a conspiracy theory: that games have to be specifically compiled for certain CPU architectures (for performance), and that most developers could increase performance by offering different executables for various CPUs.
 
Originally posted by: Vee
Most published benchmarks, like SYSmark and PCMark, are highly optimized for the P4's quirks and concentrate on tasks and code that it does particularly well, while avoiding things that expose its weaknesses. The same is true of the applications mostly used in published benchmarking, like 3DS and video encoding.

This means they paint a one-sided and very rosy picture of the P4's performance; in fact, as favorable a picture as possible.

It does not correspond to general performance on general, diverse code. That fact is then brutally exposed by gaming benchmarks. It could also be exposed by benchmarking some other applications, though this is never done these days.

Do you remember how poorly the P4 did in the early Willamette days? Most people think the Northwood changed all that. But while Northwood was somewhat better, the main thing that happened at the time was that the benchmarks and benchmark code changed.

The Pentium M, Core, and Core 2 are not Pentium 4s, so you will not see the same discrepancy with them.

All that said, I'm not exactly a gamer, but in my limited experience an old 2.8 GHz P4 seems to do well enough, even on fairly modern games like Far Cry. You don't seem to need an A64 to play games, just enough RAM and a decent video card.


Actually, the Northwoods were quite a bit better than the Athlon XP, and HT really improved general performance. Not all code was optimised, but a lot of it used SSE2 (which AMD has now anyway).

Northwood and Willamette had a 20-stage pipeline; Prescott has a 31-stage one. It's not until Prescott that the P4 started to suck. The Willamette parts were not clocked high enough, and saying it was beaten on a per-clock basis isn't a fair comparison, since the NetBurst architecture was designed to be clocked higher; that's how it gained performance, by having a significantly higher clock speed.

Early Athlon XPs were better than the P4s they targeted; on the other hand, the later ones, e.g. the 3200+ Barton, got beaten by the 2.8 GHz Northwood in a lot of things, while the 3.0 GHz one was better in almost everything (the 800 MHz FSB Northwoods with HT).
 
Been said already, but it's the memory controller, and 128 KB of L1 vs. 32 KB-ish. Maybe more registers help too? I don't think the older Athlon XP cores are much different, sans the memory controller. But the 64s are better for gaming.
 
Originally posted by: n19htmare
can you tell the difference between 160fps and 190fps... if so... Bravo.

I can tell the difference. The longer graph means better and more clear frames at the per-second thingy :laugh:
 
Originally posted by: n19htmare
can you tell the difference between 160fps and 190fps... if so... Bravo.

:roll: Most people don't buy Athlon 64s to play Quake 3 @ 1024x768. Even high-end SLI rigs choke on modern titles with moderate to high detail settings.
 
The Pentium 4 is an architecture that is best at "instruction throughput". As long as the instruction stream keeps flowing through the processor, the P4 performs beautifully. However (and this is more visible with the newer P4 architecture), the long pipeline makes it very costly to run code with lots of jumps.
The idea is that an instruction enters the processor, walks through the stages of the pipeline (a good part of them, at least), and is retired. The processor doesn't know ahead of time which instruction to run next, so it assumes the one at the next memory location. When there is a jump, the P4 must discard all the instructions that were loaded and partially executed before the jump address was known. An Athlon 64 must do just the same, BUT its pipeline is shorter, so the correct next instruction starts executing sooner.
The Pentium 4 (the processor itself) is penalized in the worst case by something like 25 clocks when a jump instruction "fools" it. The Athlon 64 is penalized by something like 10 clocks. So, taking the Athlon 64's longer clock cycle into account, an Athlon can miss three jumps for every two a P4 misses. Quite a difference.
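To put rough numbers on that penalty, here is a sketch of the arithmetic. The cycle counts come from the estimates above; the clock speeds are assumptions picked for illustration, not measurements of specific parts:

```python
# Illustrative figures only: ~25-cycle worst-case misprediction penalty on a
# P4 assumed to run at 3.0 GHz, vs. ~10 cycles on an Athlon 64 at 2.0 GHz.
P4_CLOCK_GHZ = 3.0
A64_CLOCK_GHZ = 2.0
P4_PENALTY_CYCLES = 25
A64_PENALTY_CYCLES = 10

def penalty_ns(cycles, clock_ghz):
    """Wall-clock time lost per mispredicted jump, in nanoseconds."""
    return cycles / clock_ghz

p4_ns = penalty_ns(P4_PENALTY_CYCLES, P4_CLOCK_GHZ)     # ~8.3 ns
a64_ns = penalty_ns(A64_PENALTY_CYCLES, A64_CLOCK_GHZ)  # 5.0 ns
print(f"P4 loses {p4_ns:.1f} ns per miss, A64 loses {a64_ns:.1f} ns "
      f"(ratio {p4_ns / a64_ns:.2f})")
```

With these assumed clocks the P4 loses roughly 1.5 to 1.7 times as much wall-clock time per mispredicted jump, which is in the same ballpark as the "three misses for every two" estimate; the exact ratio depends on which clock speeds you plug in.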

Also, the Athlon has some advantages in how quickly it can bring data in from main memory, and it is well served by memory with rapid access. The Pentium 4 tries to load as much data as possible, in case it might be needed. This requires a lot of bandwidth; while the difference in speed between an Athlon 64 with single-channel memory and one with dual-channel memory is quite small (about 10%, mostly from doubling the bandwidth), and the difference between AM2 and an Athlon 64 with dual-channel DDR is not significant (again a doubling of bandwidth), the P4 of the 1600 MHz variety was helped a lot going from single-channel SDR to dual-channel SDR, and later (2400+ MHz) to dual-channel DDR.
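The bandwidth side of that argument is just bus width times transfer rate times channel count. A minimal sketch; the module speeds below are common parts of the era, chosen as assumptions for illustration:

```python
def peak_bw_gb_s(bus_bits, mt_per_s, channels=1):
    """Theoretical peak bandwidth: bus width (bytes) x transfers/s x channels."""
    return (bus_bits / 8) * mt_per_s * channels / 1000  # MB/s -> GB/s

sdr_single = peak_bw_gb_s(64, 133)     # PC133, one channel:   ~1.06 GB/s
sdr_dual   = peak_bw_gb_s(64, 133, 2)  # PC133, two channels:  ~2.13 GB/s
ddr_dual   = peak_bw_gb_s(64, 400, 2)  # DDR-400, two channels: 6.4 GB/s
print(sdr_single, sdr_dual, ddr_dual)
```

Doubling channels doubles the theoretical peak either way; the point is that NetBurst's prefetch-heavy design actually needed that headroom, while the K8's on-die controller made it less sensitive to raw bandwidth.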

So, in the end, as long as your code is friendly to the P4, it runs better than on an Athlon 64, and Photoshop plugins are very friendly to the P4. The Athlon 64 runs much better with non-optimized code (non-optimized for the P4, that is).
 
Originally posted by: Dark Cupcake
Actually, the Northwoods were quite a bit better than the Athlon XP, and HT really improved general performance. Not all code was optimised, but a lot of it used SSE2 (which AMD has now anyway).

Northwood and Willamette had a 20-stage pipeline; Prescott has a 31-stage one. It's not until Prescott that the P4 started to suck. The Willamette parts were not clocked high enough, and saying it was beaten on a per-clock basis isn't a fair comparison, since the NetBurst architecture was designed to be clocked higher; that's how it gained performance, by having a significantly higher clock speed.

Early Athlon XPs were better than the P4s they targeted; on the other hand, the later ones, e.g. the 3200+ Barton, got beaten by the 2.8 GHz Northwood in a lot of things, while the 3.0 GHz one was better in almost everything (the 800 MHz FSB Northwoods with HT).

Willamette had 256 KB of cache, and Northwood had 512 KB. Northwood was the star of the P4 line.
HT improved performance in some cases and even decreased it in others. Mostly it had a positive effect, but not a big one; maybe 10 to 15 percent.
 
Originally posted by: Calin
Willamette had 256 KB of cache, and Northwood had 512 KB. Northwood was the star of the P4 line.
HT improved performance in some cases and even decreased it in others. Mostly it had a positive effect, but not a big one; maybe 10 to 15 percent.


But what HT allowed was a system almost as responsive as a dual-core one when running one heavy and one light task, while single-core processors choked on just the heavy task.
It was not a huge performance difference and, like you said, in some situations it actually reduced performance, but it let you, say, watch a DVD while encoding, which isn't possible on a single core without lag (unless you set the encoding priority to low, that is).
 
Originally posted by: Dark Cupcake
Actually, the Northwoods were quite a bit better than the Athlon XP, and HT really improved general performance. Not all code was optimised, but a lot of it used SSE2 (which AMD has now anyway).

Northwood and Willamette had a 20-stage pipeline; Prescott has a 31-stage one. It's not until Prescott that the P4 started to suck. The Willamette parts were not clocked high enough, and saying it was beaten on a per-clock basis isn't a fair comparison, since the NetBurst architecture was designed to be clocked higher; that's how it gained performance, by having a significantly higher clock speed.

Early Athlon XPs were better than the P4s they targeted; on the other hand, the later ones, e.g. the 3200+ Barton, got beaten by the 2.8 GHz Northwood in a lot of things, while the 3.0 GHz one was better in almost everything (the 800 MHz FSB Northwoods with HT).

Vee has a point about the Northwood numbers being inflated because of the BAPCo SYSmark 2002 controversy, where the benchmark suite was overhauled in a way that favoured Intel processors.
 