Black96ws6
Member
http://www.legitreviews.com/article/1766/
Ack. It's getting beat left and right by lower clocked Llano and PhenomII quad cores...
IMO this is why the only bulldozer-based server part we see AMD mentioning is the full 16-core Interlagos SKU.
Anything less than the full monte is a chip that sucks in low-threaded apps AND sucks in multi-threaded apps compared to the 45nm lineup.
What about server software that is licensed on a per core (not per socket) basis (not sure if this is still practiced)? Will their marketing of the modules as "2 cores" hurt?
Terrible review; it only has benchmarks for 3 actual programs, and the CPU does fine in them.
Resident Evil 5: 136 fps. I guess that sucks because a $1000 990X can get 156 fps, but honestly, who cares?
HAWX 2: 92 fps. Bottom of the list, but again, who the hell cares about the difference between 92 fps and 101 fps? Are you going to pay $800 extra to get another 10 fps?
S.T.A.L.K.E.R.: CoP: 55 fps. Uh oh, below 60, so this might be detectable in actual play... except even the fastest tested CPU only got 57 fps. Again, nobody is going to spend an extra $800 for 2 more fps on average.
I don't even bother looking at artificial benchmarks anymore. I agree the 4100 probably sucks, but it would really be nice if the review actually showed it sucking in some relevant program. At least get some game benchmarks for games that are occasionally CPU-limited below 60 fps, like Battlefield 3 or StarCraft 2.
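The upgrade math above boils down to marginal dollars per extra average fps. A quick sketch using the review's HAWX 2 numbers; the prices are placeholders chosen to match the thread's own "$800 extra" gap, not exact street prices:

```python
def dollars_per_extra_fps(price_a: float, fps_a: float,
                          price_b: float, fps_b: float) -> float:
    """Marginal cost of each additional average fps when stepping up
    from CPU A to the pricier, faster CPU B."""
    return (price_b - price_a) / (fps_b - fps_a)

# HAWX 2: 92 fps vs 101 fps, with a ~$800 price gap (placeholder prices).
print(dollars_per_extra_fps(200, 92, 1000, 101))  # ~$89 per extra fps
```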
You ever get tired of parroting this stuff over and over?
You ever get tired of parroting this stuff over and over?
That is a very interesting point. Anyone who works with Tivoli software is probably familiar with IBM's per-core PVU cost system.
http://www-01.ibm.com/software/lotus/passportadvantage/pvu_licensing_for_customers.html
What is interesting is that Opteron cores are rated 50 PVUs each, while Intel's "better" cores actually cost closer to 70 or even 100 PVUs each.
I wouldn't be surprised if IBM updates their sheet with sub-50-PVU cores once the Bulldozer Opterons come out.
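To make the per-core licensing math concrete: under PVU licensing the bill scales with cores times the PVU rating per core, so a cheaper per-core rating can offset a higher core count. A minimal sketch using the rough per-core ratings mentioned above; the specific core counts are just illustrative:

```python
def license_pvus(cores: int, pvu_per_core: int) -> int:
    """Total PVUs a server consumes under per-core PVU licensing."""
    return cores * pvu_per_core

# Illustrative comparison: a 16-core Opteron box at 50 PVUs/core ends up
# licensed the same as an 8-core Intel box at 100 PVUs/core.
opteron_pvus = license_pvus(16, 50)   # 800 PVUs
intel_pvus = license_pvus(8, 100)     # 800 PVUs
print(opteron_pvus, intel_pvus)
```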
I hate this. I've been a long-time AMD fanboy and I was so looking forward to Bulldozer. Although after the third year of waiting, I kind of didn't care too much anymore and was expecting it to be a flop, but I never would've thought it would be this bad, to the point where it's actually SLOWER than Phenom II!! That's just insane.
But with AMD cleaning house and laying off a boatload of people, maybe they'll get the ship righted and come back with something decent. I don't think they'll ever take the performance crown from Intel again because Intel is just too far ahead at this point, but hopefully they can remain competitive and a viable alternative to Intel. We as enthusiasts need that, because Intel (or any company, for that matter) with no competition is not a good thing. I have to think that if AMD had never existed, we'd all still be gaming on single-core Pentium 4s that we paid $500 for! :\
Priced lower, it's not as bad as some are making it out to be, but for a new architecture it should do better than that. They had better release Bulldozer II soon...
I don't think we would be stuck with P4 if AMD didn't exist. The Core architecture originated in mobile platforms with the need to increase performance/watt and lower power consumption. We probably wouldn't have Sandy Bridge today, but we would have better than a P4.
Piledriver is supposed to improve performance per watt by 10-15%; AFAIK they've never claimed IPC would improve by that much. Last I heard it was due late Q3 or early Q4 2012. On the plus side, that gives AMD time to move BD to Piledriver and hopefully fix a fair number of issues. Trinity is "supposed" to improve IPC by ~10-15%. Of course, AMD needs this now, but all we'll get is BD B3 (next quarter?).
One can only hope that with the extra time BDII will see a 15-20% bump in IPC, which is reasonable considering the number of bottlenecks reviewers have found in BDI's implementation and the fact that Piledriver has already been implemented in Trinity.
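For what an IPC bump actually buys: at unchanged clock speed, single-thread throughput scales directly with IPC, so a 15-20% IPC gain is a 15-20% single-thread speedup. A trivial sketch with a made-up baseline score of 100 points:

```python
def projected_score(baseline: float, ipc_gain: float) -> float:
    """Single-thread score at unchanged clocks after a relative IPC bump."""
    return baseline * (1.0 + ipc_gain)

# Made-up 100-point baseline; a 15-20% IPC bump lands at ~115-120 points.
for gain in (0.15, 0.20):
    print(projected_score(100.0, gain))
```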