Actually, our primary business function is to monitor the real-world performance of equities-trading workstations over time. We deploy an array of metrics-tracking agents that sample the runtime environment every 15 to 60 seconds over a period of several weeks. The data is then aggregated into a central database for analysis through our custom Report Card templates.
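For what it's worth, the basic sample-then-aggregate loop at the heart of an agent like this can be sketched in a few lines. This is only an illustrative sketch: the metric names, intervals, and function names below are placeholders I'm making up for the example, not our actual instrumentation.

```python
import random
import statistics
import time

def sample_metrics():
    """Hypothetical stand-in for one agent probe of the runtime
    environment; real agents would read OS performance counters."""
    return {
        "cpu_pct": random.uniform(0.0, 100.0),
        "disk_queue": random.uniform(0.0, 4.0),
    }

def collect(duration_s, interval_s):
    """Sample at a fixed interval for the given duration, returning the
    accumulated records ready for upload to a central database."""
    records = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        records.append(sample_metrics())
        time.sleep(interval_s)
    return records

def aggregate(records, key):
    """Roll a list of sample records up into the mean of one metric."""
    return statistics.mean(r[key] for r in records)
```

In production the interval would be the 15- to 60-second cadence described above and the records would land in a database rather than a list, but the shape of the pipeline is the same.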
Our workloads are modeled after nearly one billion sample records taken from numerous financial services institutions (some with over 30K employees). You'd be surprised what these people run on a daily basis (WinTV anyone?).
The test project is being conducted at the request of the InfoWorld Test Center, a consulting customer of ours and also a publication to which I personally contribute on a regular basis (check the masthead). As for the 248 quip, that's what IBM sent us (though the performance deltas we're seeing would likely not be negated by a minor uptick in Opteron clock frequency).
Also, everyone seems up in arms over the media encoding component. We added that to diversify our workload mixture so that it reflects *more* than just our financial services models - again, per request of the Test Center. Since another leading market for these systems is high-end multimedia content creation (this per the vendors' own sales data), it makes sense to test their handling of this type of task.
Regardless, the final score is derived by aggregating the results from each of the workloads. No matter how you slice it - individual workload scores or the average across a mixed set of tasks (including Database, Workflow *and* Media Encoding) - the Opteron is slower. Our methodology is sound and our results are easily reproducible. Enough said. Now go read the article when it hits in a couple of weeks...
RCK