Lordy, Lordy let the fun begin!:biggrin:
cbunny, you can see in my sig I have a 2500K OC'd to 4.5GHz. It plays Crysis 3 nicely, as does the 8350 @ 4.6GHz and, quite frankly, so does the 8150 @ 4.2GHz (surprise, surprise!).

For the future of gaming, knowing that the PS4 has an 8-core AMD processor, and the Xbox 720 is likely to as well? Thinking more in line with future games being optimized for higher numbers of cores or threads. Vishera 2.0 and/or Steamroller.
It takes time and effort to code for more cores, neither of which is good for devs (I'd rather have them designing a great game than spending time and money trying to optimize for 8 slow cores).

What are you talking about? Are you a coder?! There's no difference at all (or very little) between code written for 2 cores and for 1,000 cores. Yes, there's a big difference between 1 core and more than 1, but there's no way to avoid that; multi-core is the future.
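For what it's worth, the kernel of truth in the post above is that once the work is expressed as independent tasks, the same code can run unchanged whether the machine has 2 cores or 1,000. A minimal Python sketch of that idea (simulate_entity and the numbers are made-up placeholders, not code from any real game):

```python
# Minimal sketch: task-based code that is identical regardless of core count.
# simulate_entity and the workload sizes are illustrative placeholders.
from concurrent.futures import ThreadPoolExecutor
import os

def simulate_entity(entity_id):
    # Stand-in for per-entity game work (AI, physics, pathfinding, ...).
    return entity_id * entity_id

def run_frame(worker_count):
    # Only this number differs between a 2-core and a 1,000-core machine;
    # the task code itself does not change.
    with ThreadPoolExecutor(max_workers=worker_count) as pool:
        return list(pool.map(simulate_entity, range(10_000)))

results = run_frame(os.cpu_count() or 2)
```

Whether it actually scales is a separate question.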
Please tell us about your software development experience so you can state that there's no to very little difficulty in scaling an application - especially a game - from 2 to 1,000 cores.
An example of your work showing this would be excellent.
Or you could just explain to us how you managed to break Amdahl's Law with that 1,000 core application.
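For reference, Amdahl's Law caps that kind of scaling: if a fraction p of the program can be parallelized, n cores give at most a speedup of 1 / ((1 - p) + p / n). A quick sketch with an assumed parallel fraction (the 90% figure is just an example, not a measurement of any real game):

```python
# Amdahl's Law: upper bound on speedup for a program whose parallel fraction is p.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Assumed 90% parallel fraction, purely for illustration.
for cores in (2, 4, 8, 1000):
    print(f"{cores:>4} cores -> {amdahl_speedup(0.90, cores):.1f}x max speedup")
# Even at 1,000 cores the ceiling is roughly 10x, nowhere near 1,000x.
```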
Snip
Five or six years from now, will you still be using the CPU you buy today? Is it worth taking the trade-off in performance for the next X number of years for something that MAY happen in the future?
Nope.
What you are describing is future-proofing. "If I get this now it will last longer / be better in the future because of X". In the 30 years of PC history it has never paid off.
Respectfully disagree on this one.
For the same price I could have had either the E8500 or the Q8200. At the time (2008 or so) everyone recommended the E8500 because of its single-threaded performance and how it would be a long time (if ever) until the quad was usable. Not to mention the limited overclocking potential of the Q8xxx series.
Five years later, which one can still play EVERY game at playable settings? Definitely not the dual core. Most games since 2010 have shown tangible, and sometimes massive, gains with a quad core; some require one just to be playable, even on the lowest settings. A high-clocked octa-core is going to be quite desirable in the near future. History repeats itself, over and over.

I also had the same exact conundrum between an Athlon 64 4000+ and the X2 4200+. Most stated that, for gaming, the 4000+ was much better for overclocking/cooling, and that I'd have a wasted core. Spending that extra little bit for another core, albeit a slower one, was by far the best decision, and that system is still quite fast to this day for everyday tasks.
The more cores the better, especially with a future of many-core systems.
I won't go so far as to respectfully disagree, but this does come down to different strokes for different folks.