
FEAR to be optimised for dual core..

Originally posted by: jiffylube1024
I think they're just saying this to save face so that people buy the game despite horrid performance on 95% of the hardware out there.

Unless the game is designed for dual core from the ground up, I don't expect a significant improvement. It could still give a noticeable improvement though... who knows.


"OK, our game cannot run on 95% of the hardware out there, so we make it faster for the 0.01% with dual-core CPUs!"
 
It's good to see developers making use of new technology, and I suppose everyone whose pride is hurt by having to turn down a few settings in a game now has a good reason to pick up an X2. 😀
 
Originally posted by: StrangerGuy
Originally posted by: jiffylube1024
I think they're just saying this to save face so that people buy the game despite horrid performance on 95% of the hardware out there.

Unless the game is designed for dual core from the ground up, I don't expect a significant improvement. It could still give a noticeable improvement though... who knows.


"OK, our game cannot run on 95% of the hardware out there, so we make it faster for the 0.01% with dual-core CPUs!"

Very good point.

They should optimize the game for all hardware instead.
 
In this video, the person says that FEAR can run on DirectX 8 cards, so you won't need a super high-end system to run it.
 
Originally posted by: dopefishzzz
Originally posted by: StrangerGuy
Originally posted by: jiffylube1024
I think they're just saying this to save face so that people buy the game despite horrid performance on 95% of the hardware out there.

Unless the game is designed for dual core from the ground up, I don't expect a significant improvement. It could still give a noticeable improvement though... who knows.


"OK, our game cannot run on 95% of the hardware out there, so we make it faster for the 0.01% with dual-core CPUs!"

Very good point.

They should optimize the game for all hardware instead.

I agree. I'm getting tired of buying a new video card every six to eight months. And from now on it will be two(!) video cards every 6-8 months.

If I didn't play so many games, I wouldn't have gotten a job to pay for it all. 😀
 
Originally posted by: nitromullet
Originally posted by: apoppin
Sounds like marketing. 😛

it's gonna be "awhile" before the dual core optimizations will be useful [look at SM 3.0] . . . you need a game "built from the ground up" with it . . . expect another 3 years or so . . .

:roll:
I imagine that you're probably right, but they have to start somewhere. SM 3.0 cards have only been available since June of last year, and even then only from NVIDIA, so I don't think it's a fair comparison. This is a different story. Both Intel and AMD are pushing dual core, and since neither company's dual-core chips are clocked any faster than single-core chips that have been available for over a year, threading is currently the ONLY way to increase performance.

Edit: Doesn't the Unreal Engine 3 make use of multithreading, and isn't that supposed to be the UT2007 engine? We're in the second half of 2005; no matter how you slice it, that isn't 3 yrs from now.

the 3.0 and DC is not a "comparison" - M$ implemented 3.0 and NVIDIA led the marketing charge [as a feature ATI doesn't yet have]. . . anyway, AMD doesn't matter - Intel is the moving force behind dual core now and it IS marketing. . . . how ELSE are they gonna sell DC to gamers?

Yes, the Unreal 3 ENGINE allows for DC, but the game programmers will have to [learn how to use it and] implement it in each game . . . the "3 years" is my guesstimate as to when DC will be "mainstream" in games . . . watch for it first in next-gen consoles. 😉
 
(First time poster, long time lurker)

I think the plan for dual-core utilization in games will be AI, sound, and all other non-gfx tasks handled by the second core.

This could free up a bigger processing budget for games in the future.
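For illustration only, here's a minimal Python sketch of the split described above: the main thread does the rendering-side work while a worker thread handles the "non-gfx" jobs (AI, sound). Every function and variable name here is made up for the example; no real engine's code is implied.

```python
import threading
import queue

# Jobs for the "second core": the main thread enqueues non-gfx work,
# a worker thread drains the queue and reports results.
task_queue = queue.Queue()
results = queue.Queue()

def worker():
    # Run AI/sound jobs until a None sentinel signals shutdown.
    while True:
        job = task_queue.get()
        if job is None:
            break
        name, func = job
        results.put((name, func()))

def update_ai():
    # Stand-in for pathfinding, decision making, etc.
    return "ai done"

def mix_audio():
    # Stand-in for sound mixing.
    return "audio done"

t = threading.Thread(target=worker)
t.start()

# Main thread: queue this frame's non-gfx work, then "render" in parallel.
task_queue.put(("ai", update_ai))
task_queue.put(("audio", mix_audio))
frame = "frame rendered"  # rendering stays on the first core

task_queue.put(None)  # tell the worker to stop
t.join()

done = dict(results.get() for _ in range(2))
print(done["ai"], done["audio"], frame)
```

The sentinel-plus-join pattern keeps the shutdown deterministic; a real engine would keep the worker alive across frames rather than spawning it per frame.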
 
Originally posted by: CodeMonkey666
(First time poster, long time lurker)

I think the plan for dual-core utilization in games will be AI, sound, and all other non-gfx tasks handled by the second core.

This could free up a bigger processing budget for games in the future.

welcome to AT 😀
 