
Is the PC hardware industry slowing down?

Kroze

Diamond Member
Seriously, I just picked up one of my six-month-old Maximum PC magazines and was reading the reviews of the Radeon 5970 and how to build a budget gaming PC for $647.

Fast forward 6 months and they haven't come out with a new graphics card that is any faster than the 5970 (you would think speed would double every 6 months), and yet the price is still the same.

The $647 budget gaming PC still costs the same as it did 6 months ago if you check the prices on Newegg. WTF.

Also, have we hit a processor speed bottleneck at around 4GHz? Not much innovation or speed improvement since the Intel quad-core CPUs. Now they're just adding more cores and cache. You'd figure 6 months is an eternity in technology terms and we'd have a CPU twice as fast as the 3-year-old Intel E8400.
 
6 months is not an eternity. But yes, I do feel things (at least in the CPU market) have slowed down a bit. The i7 was released in 2008, and today in 2010 the 2008 top-of-the-line i7 is still extremely competitive with current i7s.
 
We are starting to reach the limits of silicon.

Video cards are still the same price and we haven't gotten new ones yet because we don't need to. What game can't a 5970 run... let alone a lower 5850... at max settings at super high resolutions? There are none. So why make faster GPUs when what you have already is too fast? For PC graphics and technology to make a big jump tech-wise we need newer fabrication materials (graphene?) and we need the next generation of consoles. There's no PC game besides maybe Crysis (which isn't true now because of Crysis 2) that can't be ported to the consoles. Most PC games are console ports and use the Unreal 3 engine... so we are running 5-year-old software on 6-month-old super GPUs. We need the next wave of high-tech next-gen consoles to come out so that the ported games will be Unreal 4 tech and not Unreal 3 tech... make sense? Any of it? 😛
 
It just pisses me off that prices have stagnated and there's been little or no innovation in graphics cards and CPUs since 2008. Not only that, RAM prices have skyrocketed.
 
Moore's Law Fail

If you are saying that the OP's idea of Moore's Law is incorrect and that is the 'fail' aspect then you are correct.

If you are saying that the current situation does not properly reflect Moore's Law then you are incorrect.

Moore's Law is that transistor count, not speed, doubles every 18 months (approximately). The reality lines up more accurately with 24 months, I believe.
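For what it's worth, the two commonly quoted doubling periods diverge quite a bit over a typical two-year product cycle. A quick back-of-the-envelope sketch (the ~731M transistor figure for a 2008 Bloomfield i7 is used purely for illustration):

```python
# Sketch of Moore's Law as transistor-count doubling (not clock speed).
# The doubling period is the assumption under test: 18 vs 24 months.

def transistor_count(start_count, months, doubling_months):
    """Projected transistor count after `months`, doubling every `doubling_months`."""
    return start_count * 2 ** (months / doubling_months)

# A 2008 chip with ~731M transistors, projected 24 months out:
print(transistor_count(731e6, 24, 18))  # ~1.84 billion with 18-month doubling
print(transistor_count(731e6, 24, 24))  # ~1.46 billion with 24-month doubling
```

Either way, the law predicts more transistors per die, not a doubling of single-threaded speed, which is exactly why "more cores and cache" is what those transistors get spent on.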
 
While I wish things would speed up, I am happy that my "aging" Q6600 is still relevant and able to handle the latest games.
 
I'm happy with things as they are...I can run everything I want to with an i5 750 and a 4890, which means I save money. 😛
 
We are starting to reach the limits of silicon.

Video cards are still the same price and we haven't gotten new ones yet because we don't need to. What game can't a 5970 run... let alone a lower 5850... at max settings at super high resolutions? There are none. So why make faster GPUs when what you have already is too fast? For PC graphics and technology to make a big jump tech-wise we need newer fabrication materials (graphene?) and we need the next generation of consoles. There's no PC game besides maybe Crysis (which isn't true now because of Crysis 2) that can't be ported to the consoles. Most PC games are console ports and use the Unreal 3 engine... so we are running 5-year-old software on 6-month-old super GPUs. We need the next wave of high-tech next-gen consoles to come out so that the ported games will be Unreal 4 tech and not Unreal 3 tech... make sense? Any of it? 😛

Graphene is nowhere near ready to be used as a semiconducting device.
 
On the other hand, when's the last time we had some revolutionary graphics in a game? Crysis? Three years ago? I'm more concerned about that than the actual hardware.
 
We are starting to reach the limits of silicon.

Video cards are still the same price and we haven't gotten new ones yet because we don't need to. What game can't a 5970 run... let alone a lower 5850... at max settings at super high resolutions? There are none. So why make faster GPUs when what you have already is too fast? For PC graphics and technology to make a big jump tech-wise we need newer fabrication materials (graphene?) and we need the next generation of consoles. There's no PC game besides maybe Crysis (which isn't true now because of Crysis 2) that can't be ported to the consoles. Most PC games are console ports and use the Unreal 3 engine... so we are running 5-year-old software on 6-month-old super GPUs. We need the next wave of high-tech next-gen consoles to come out so that the ported games will be Unreal 4 tech and not Unreal 3 tech... make sense? Any of it? 😛

Exactly this. Since most (yes, most, not all, flamers) PC games are simply ports of console games, there is no reason for hardware manufacturers to rush to create new technologies when the market is still consuming the old technology at the same price it was months ago. It means their profit margins are that much higher without sinking a ton of resources into R&D. Economically it makes sense. Unfortunately for the enthusiast, it's a bit boring.
 
13-14 months is a pretty typical length for a graphics card cycle. Usually there's a refresh halfway through, but it's not like that ever brings a huge increase in performance or anything.
 
On the other hand, when's the last time we had some revolutionary graphics in a game? Crysis? Three years ago? I'm more concerned about that than the actual hardware.

I'm going with this. I feel like gaming has moved a lot more to the consoles with the latest generation, and that generation is quite old now. The GeForce 7 series and Radeon X800 series are ancient, but most games are still designed for that hardware.
 
Moore's Law says nothing about speed. Besides which, it's just an observation that has happened to hold for a long time.
 
Lack of new demanding software is stagnating hardware growth -- agreed. SSDs have been the biggest leap in HW in years, though.

I disagree. It is easy to write something that taxes the hardware beyond what it is currently capable of (x264 can easily bring an i7 down to a slow 1.5 fps, or lower with the right filters). I think it is more a problem of architectural design. Just about every corner that could be cut, every known optimization, etc. has already been implemented in hardware. Now they are stagnating, adding more and more cores in hopes that some magical enhancement will come along.
 
I disagree. It is easy to write something that taxes the hardware beyond what it is currently capable of (x264 can easily bring an i7 down to a slow 1.5 fps, or lower with the right filters). I think it is more a problem of architectural design. Just about every corner that could be cut, every known optimization, etc. has already been implemented in hardware. Now they are stagnating, adding more and more cores in hopes that some magical enhancement will come along.

The world is over!
 
I disagree. It is easy to write something that taxes the hardware beyond what it is currently capable of (x264 can easily bring an i7 down to a slow 1.5 fps, or lower with the right filters). I think it is more a problem of architectural design. Just about every corner that could be cut, every known optimization, etc. has already been implemented in hardware. Now they are stagnating, adding more and more cores in hopes that some magical enhancement will come along.

While it's true that more cores is a solution forced on us by limitations in lithography, it's not just some temporary hack. Multicore computing, along with properly threaded software, is a very legitimate way of greatly improving the processing power of modern computers. It will inevitably run up against some of the same miniaturization and heat dissipation barriers that stopped the GHz wars, but it is the way forward.
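To make the "properly threaded software" point concrete, here is a minimal sketch of splitting a CPU-bound job across cores. The workload (trial-division prime counting) and the chunk sizes are purely illustrative, not anything from the thread:

```python
# Sketch: an embarrassingly-parallel, CPU-bound job spread over all cores.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-heavy)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split [0, 100000) into four independent chunks, one task each.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

The catch, of course, is that this only works for workloads that divide cleanly; software that isn't written this way sees no benefit from extra cores, which is exactly the stagnation being complained about above.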
 
We are starting to reach the limits of silicon.

We have also reached the limits of what we need to accomplish with personal computers. Much of the business world still runs on P3- and P4-class machines; they only upgrade when one goes dead. The number of people who need faster machines is dwindling.
 