Legit Reviews - FX-4100 Quad Core 3.6GHz Bulldozer Processor Review

What an unmitigated disaster. An eBay AMD Athlon II X4 620 at $65 would ker-stomp this thing. This chip should be the same price as the Athlon II X4 631, i.e. $87.

It kind of pisses me off that they didn't run many overclocking tests. What do they think we're gonna do with this chip? Of course we're gonna overclock it. So it would be really frickin' helpful to know how much better than an i3 this will be when overclocked, AND how much power it's gonna cost. Also, I'd like a comparison with the i5-2400 (the cheapest i5) and with an overclocked 1035T. My hunch is the 1035T would win, but it would be nice to know for sure.

And what is the deal with that memory bandwidth? That has to be because of the really slow L3, right? That is truly awful. Can we disable that L3?
 
Well, they are supposed to have the A8-3870K released in Q4. I think an unlocked Llano would be awesome; just waiting to see it show up, if they didn't cancel it.
 
IMO this is why the only bulldozer-based server part we see AMD mentioning is the full 16-core Interlagos SKU.

Anything less than the full monty is a chip that sucks in low-threaded apps AND sucks in multi-threaded apps compared to the 45nm lineup.

Yeah, seems like Interlagos is only going to be useful for large VM clusters if priced right :|
 
What about server software that is licensed on a per core (not per socket) basis (not sure if this is still practiced)? Will their marketing of the modules as "2 cores" hurt?
 
Terrible review; it only has benchmarks for three actual programs, and the CPU does fine in them.

Resident Evil 5: 136 fps. I guess that sucks because a $1000 990X can get 156 fps, but honestly, who cares?

HAWX 2: 92 fps. Bottom of the list, but again, who the hell cares about the difference between 92 fps and 101 fps? Are you going to pay $800 extra to get another 10 fps?

S.T.A.L.K.E.R.: CoP, 55 fps. Uh oh, below 60, this might be detectable in actual play... except even on the fastest tested CPU, it only got 57 fps... again nobody is going to care to spend an extra $800 for 2 more fps on average.

I don't even bother looking at artificial benchmarks anymore. I agree, the 4100 probably sucks, but it would really be nice if the review actually showed it sucking in some relevant program. At least get some game benchmarks for games that are occasionally CPU-limited below 60 fps, like Battlefield 3 or Starcraft 2.
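The cost-per-frame argument above can be made concrete with a quick back-of-the-envelope calculation. The frame rates and the $1000 flagship price come from the numbers quoted in the post; the ~$115 FX-4100 street price is my assumption, and the helper function is purely illustrative:

```python
# Marginal dollars paid per extra frame per second when stepping up
# from a budget CPU to a flagship. Frame rates are from the post above;
# the FX-4100 price (~$115) is an assumed street price, not a quoted one.

def dollars_per_extra_fps(price_cheap, fps_cheap, price_fast, fps_fast):
    """Extra cost divided by the extra frames it buys."""
    return (price_fast - price_cheap) / (fps_fast - fps_cheap)

# Resident Evil 5: FX-4100 at 136 fps vs. a $1000 990X at 156 fps
print(round(dollars_per_extra_fps(115, 136, 1000, 156), 2))  # 44.25
```

At over $44 per extra frame, the "who is going to pay for that" verdict is easy to see.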
 
What about server software that is licensed on a per core (not per socket) basis (not sure if this is still practiced)? Will their marketing of the modules as "2 cores" hurt?

That is a very interesting point. For anyone that works with Tivoli software you're probably familiar with their "per core" PVU cost system.

http://www-01.ibm.com/software/lotus/passportadvantage/pvu_licensing_for_customers.html

What's interesting is that Opteron cores are 50 pts each, while Intel's "better" cores actually cost closer to 70 or even 100 pts each.

I wouldn't be surprised once the Opteron Bulldozers come out if IBM updates their sheet with sub 50pt cores.
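To illustrate why per-core (PVU-style) licensing matters here, a minimal sketch using the per-core point values mentioned above (50 PVU per Opteron core vs. up to 100 for some Intel cores); the dollar rate per PVU is a hypothetical placeholder, not real Passport Advantage pricing:

```python
# Rough per-core (PVU-style) license cost comparison. The point values
# (50 vs. 100 PVU/core) come from the post above; the dollar rate per
# PVU is a hypothetical placeholder.

def license_cost(cores, pvu_per_core, dollars_per_pvu):
    """Total license cost for one host under per-core PVU licensing."""
    return cores * pvu_per_core * dollars_per_pvu

RATE = 1.0  # hypothetical $ per PVU

opteron_16c = license_cost(16, 50, RATE)   # 16-core Interlagos at 50 PVU/core
xeon_8c = license_cost(8, 100, RATE)       # 8-core Xeon at 100 PVU/core
print(opteron_16c, xeon_8c)  # 800.0 800.0 -- same cost despite twice the cores
```

Under a scheme like this, whether each Bulldozer module gets counted as one core or two makes a real difference to the bill.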
 
Terrible review; it only has benchmarks for three actual programs, and the CPU does fine in them.

Resident Evil 5: 136 fps. I guess that sucks because a $1000 990X can get 156 fps, but honestly, who cares?

HAWX 2: 92 fps. Bottom of the list, but again, who the hell cares about the difference between 92 fps and 101 fps? Are you going to pay $800 extra to get another 10 fps?

S.T.A.L.K.E.R.: CoP, 55 fps. Uh oh, below 60, this might be detectable in actual play... except even on the fastest tested CPU, it only got 57 fps... again nobody is going to care to spend an extra $800 for 2 more fps on average.

I don't even bother looking at artificial benchmarks anymore. I agree, the 4100 probably sucks, but it would really be nice if the review actually showed it sucking in some relevant program. At least get some game benchmarks for games that are occasionally CPU-limited below 60 fps, like Battlefield 3 or Starcraft 2.

You ever get tired of parroting this stuff over and over?
 
If anything, this review gave the CPU a pretty decent write-up. For the price, though, any other existing quad core is better. It essentially traded blows with the Intel dual-core. A sub-$100 X4 is also a better buy.

This CPU needs to be under $80 to really be considered.
 
I hate this. I've been a long-time AMD fanboy and I was so looking forward to Bulldozer. Although after the third year of waiting, I kinda didn't care too much anymore and was expecting it to be a flop, but I never would've thought it would be this bad, to the point where it's actually SLOWER than Phenom II!! That's just insane.

But with AMD cleaning house and laying off a boatload of people, maybe they'll get the ship righted and come back with something decent. I don't think they'll ever take the performance crown from Intel again 'cause Intel is just too far ahead at this point, but hopefully they can remain competitive and a viable alternative. We as enthusiasts need that, 'cause Intel (or any company, for that matter) with no competition is not a good thing. I gotta think, if AMD had never existed, we'd all still be gaming on Pentium 4 single cores that we paid $500 for! :\
 
That is a very interesting point. For anyone that works with Tivoli software you're probably familiar with their "per core" PVU cost system.

http://www-01.ibm.com/software/lotus/passportadvantage/pvu_licensing_for_customers.html

What's interesting is that Opteron cores are 50 pts each, while Intel's "better" cores actually cost closer to 70 or even 100 pts each.

I wouldn't be surprised once the Opteron Bulldozers come out if IBM updates their sheet with sub 50pt cores.

Argh, Tivoli. I'd rather be stabbed in the liver than have to deal with that again :thumbsdown:
 
I hate this. I've been a long-time AMD fanboy and I was so looking forward to Bulldozer. Although after the third year of waiting, I kinda didn't care too much anymore and was expecting it to be a flop, but I never would've thought it would be this bad, to the point where it's actually SLOWER than Phenom II!! That's just insane.

But with AMD cleaning house and laying off a boatload of people, maybe they'll get the ship righted and come back with something decent. I don't think they'll ever take the performance crown from Intel again 'cause Intel is just too far ahead at this point, but hopefully they can remain competitive and a viable alternative. We as enthusiasts need that, 'cause Intel (or any company, for that matter) with no competition is not a good thing. I gotta think, if AMD had never existed, we'd all still be gaming on Pentium 4 single cores that we paid $500 for! :\

I don't think we would be stuck with the P4 if AMD didn't exist. The Core architecture originated in mobile platforms out of the need to increase performance per watt and lower power consumption. We probably wouldn't have Sandy Bridge today, but we would have something better than a P4.
 
Priced lower, it's not as bad as some are making it out to be, but being a new architecture it should do better than that. They'd better release Bulldozer II soon...


Last I heard, late 3Q / early 4Q '12. On the plus side, that gives AMD time to move BD to Piledriver and hopefully fix a fair number of issues. Trinity is 'supposed' to improve IPC by ~10-15%. Of course, AMD needs this now, but we'll only get BD B3 (next quarter?).

One can only hope that with the extra time BDII will see a 15-20% bump in IPC (which is reasonable considering the number of bottlenecks that reviewers have found in BDI's implementation and the fact that Piledriver has already been implemented in Trinity).
 
I don't think we would be stuck with the P4 if AMD didn't exist. The Core architecture originated in mobile platforms out of the need to increase performance per watt and lower power consumption. We probably wouldn't have Sandy Bridge today, but we would have something better than a P4.

Yeah, I know. I was just exaggerating on that, but we wouldn't be too far ahead of it. Even with AMD dragging up the rear all these years, they were still close enough to give Intel a reason to step it up. I mean, we've gone from Nehalem to Sandy Bridge, to Sandy Bridge-E in a few weeks, to Ivy Bridge in a few months, all within a couple of years. I don't think Intel would be cranking out new lines of newer and better procs this fast if AMD weren't around.

I mean, why is Intel making us pay $220 to get the cheapest overclockable Intel processor? Because they can. If AMD were right there behind them, I have a feeling all of Intel's processors would be "K" models.
 
Last I heard, late 3Q / early 4Q '12. On the plus side, that gives AMD time to move BD to Piledriver and hopefully fix a fair number of issues. Trinity is 'supposed' to improve IPC by ~10-15%. Of course, AMD needs this now, but we'll only get BD B3 (next quarter?).

One can only hope that with the extra time BDII will see a 15-20% bump in IPC (which is reasonable considering the number of bottlenecks that reviewers have found in BDI's implementation and the fact that Piledriver has already been implemented in Trinity).
Piledriver is supposed to improve performance per watt by 10-15%. AFAIK they've never claimed IPC would improve by that much.
 