Has the law of diminishing returns set in over the last 10 years?

Wolverine607

Member
Apr 4, 2010
41
4
71
It seems computer hardware has gotten so powerful that people are upgrading less and less these days, and for good reason. Unless you need to run the latest games, do video editing, virtualization, and such, a high-end computer from 10 years ago, while slower, could still be used today for most basic things like web browsing, word processing, and spreadsheets.

Would you agree or disagree with the statements below?

*It seems as if a high-end computer from 2005 could still be used today in 2015 for many basic modern things, but of course not for everything, and it will feel slower, although not unbearably so.

*Whereas a high-end computer from 1995 would have been useless and non-functional for almost anything modern by 2005.


I still have Pentium 4 and Athlon rigs that work just fine at an acceptable speed for browsing modern websites. I do not think an original Pentium from 1995 with 8-16MB of RAM would have stood a chance browsing the 2005 web or running any then-modern software. Whereas much of today's software will run fine on a computer from 2005, or even older in many cases.
 
Last edited:

mfenn

Elite Member
Jan 17, 2010
22,400
5
71
www.mfenn.com
2005 is the era of the original dual-core Opterons (and their Athlon brethren) and the Pentium D. I don't think either of those is acceptable for browsing JS-heavy sites today. And the amount of RAM that typically came in those systems would be eaten up by modern web browsers.

However, I agree that there is less difference between a 2005 and a 2015 machine than between a 1995 and a 2005 machine. You're absolutely seeing progress slow in terms of pure performance gain per year as the technologies mature.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
2005 is the era of the original dual-core Opterons (and their Athlon brethren) and the Pentium D. I don't think either of those is acceptable for browsing JS-heavy sites today. And the amount of RAM that typically came in those systems would be eaten up by modern web browsers.

Actually, I have 2GB of RAM in the old X2, and it does a better job with webpages than my Nexus 7. It also runs a lot of modern software well enough.

OP, part of what happened was the rise of mobile computers, which kept the baseline for things like webpages low.
 

JackMDS

Elite Member
Super Moderator
Oct 25, 1999
29,471
387
126
Until the current decade, anyone who wanted to benefit from the Internet had to buy a PC or laptop.

Millions of people used to have Dells and would upgrade every few years as Dell moved from 256MB of RAM to 1GB and then to 2GB.

Every year, in the basements of NYC apartment and business buildings, there used to be thousands of two- or three-year-old Dell/HP computers waiting for the Sanitation Dept. Not any more.

When tablets and the like were introduced to the market, millions realized that there is no reason any more to buy a bulky PC.

As a result, the mainstream PC market started to shrink rather than grow as had been expected.

When there is no growing market for a product, the fast turnover of models is gone.

From two days ago - http://recode.net/2015/04/09/dont-look-now-but-the-pc-market-is-going-south-again/

Does anyone really think that more than 5% of users are aware of what CPU/GPU/HD/SSD is used in their computers?


Heck, many people I know are aware that they are using Windows or Apple (as they call it), but when asked they have no clue which version is installed on the device. Terms like Lion, Mountain Lion, XP, Vista, Win 7, and Win 8 are not in their repertoire of important things in life.




:cool:
 
Last edited:

BonzaiDuck

Lifer
Jun 30, 2004
15,722
1,454
126
Actually, I have 2GB of RAM in the old X2, and it does a better job with webpages than my Nexus 7. It also runs a lot of modern software well enough.

OP, part of what happened was the rise of mobile computers, which kept the baseline for things like webpages low.

No disagreement there. I just think that mainstream computer use, barring resource-intensive games, has been left behind as wants now exceed needs for mainstream users.

I finally replaced one of our oldest LGA-775 boxes with a Z77/i5-3570K hardware combination -- just to "be done" with it. It had been planned for some time. The crucial motivation arose because a faulty AV module was hogging clock cycles during e-mail downloads to Outlook. Now I discover the quick fix for that problem was simply reinstalling last year's AV/firewall program with this year's license key.

And as someone said, the "herd" has moved into mobile-device-territory, even though your desktop PC has not become obsolete.

So there's an interplay between Moore's Law, human requirements, and other factors. I think Moore's Law has been called into question: the microscopic traces on a 14nm processor are only about 28 atoms wide now. The limit may be fast approaching.
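The "28 atoms" figure is easy to sanity-check as back-of-envelope arithmetic; a minimal sketch, assuming roughly 0.5nm of lateral space per silicon atom (a rough illustrative figure, not a measured lattice constant):

```python
# Back-of-envelope check of the "about 28 atoms wide" claim for a 14nm trace.
# The ~0.5nm-per-atom width is an assumption for illustration only.
trace_width_nm = 14.0
approx_atom_width_nm = 0.5

atoms_across = trace_width_nm / approx_atom_width_nm
print(atoms_across)  # 28.0
```

Under that assumption the numbers line up, which is the point: there are only a few dozen atoms of headroom left to shrink.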

The biggest bottleneck had been the storage subsystem, and it has been demolished by solid state drives.

So at this stage, I'd tell anyone who complains about spending too much on the technology: "Stop being an enthusiast-hobbyist, then!" And if you ask me, now that only two LGA-775 systems are still running among the five in the household (one is a server with a Q6600 processor), the household is overpowered relative to its needs.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
I'm not so sure about 2005. Maybe around 2009, when Core 2s and Phenom II/Athlon IIs were the majority of the market. That's my threshold where CPUs had enough single-threaded performance and enough cores to provide "good enough" performance for general, everyday tasks. Any increase in performance beyond that was in the realm of "diminishing returns".
 

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0

Does anyone really think that more than 5% of users are aware of what CPU/GPU/HD/SSD is used in their computers?

That's because Intel doesn't tell people what's in their computers anymore. It's really weird that selling them that way works, but I guess you might as well if your chips are actually getting SLOWER.
 
Last edited:

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0
I don't have a super-fast internet connection. Does anyone know if a higher clock speed would matter if you had a gigabit line?
 

KentState

Diamond Member
Oct 19, 2001
8,397
393
126
I'm not so sure about 2005. Maybe around 2009, when Core 2s and Phenom II/Athlon IIs were the majority of the market. That's my threshold where CPUs had enough single-threaded performance and enough cores to provide "good enough" performance for general, everyday tasks. Any increase in performance beyond that was in the realm of "diminishing returns".

Core 2 machines were common in 2008, and the i7 launched in 2009. The i7-920 that I purchased is still going strong and can handle more than simple web browsing if you throw in a decent graphics card.
 

shabby

Diamond Member
Oct 9, 1999
5,779
40
91
I don't think I'll be changing my i5-2500K anytime soon. GPU and SSD, yes, but that's about it.
 

RaistlinZ

Diamond Member
Oct 15, 2001
7,629
10
91
I think it started with the i7-920 era. When overclocked, it's just as fast as anything AMD has to offer in 2015. People no longer upgrade because they "have to"; it's mostly "want to" these days.

My upgrade from i7-930 to i7-4770k was rather pointless, and from i7-4770k to i7-5820k even more so.


But it's still fun. ;)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
I still have Pentium 4 and Athlon rigs that work just fine at an acceptable speed for browsing modern websites.

Somehow, I doubt that. Maybe, though, if you use Adblock and NoScript and set Flash Player to click-to-activate. Or maybe don't install Flash Player at all.

To get to your other points: I think 2005 might be just a tad too early to consider PCs of that era able to run modern software and web sites. Maybe 2006-2007 (the Core 2 / Athlon AM2 X2 era), when we started to get multi-core as standard and also got an IPC jump over the Pentium 4 / D. It also meant that x64 extensions were standard in the CPU. Vista x64 wasn't so popular, but when Win7 was released in 2009, it was encouraging that so many OEMs were shipping the x64 version on their machines. Finally, the 64-bit era had arrived.

Ever since then, every new generation has brought basically single-digit percentage performance improvements on existing code. It has been met with a collective yawn as far as absolute performance gains go, although performance-per-watt gains have been fairly staggering.
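To put those single-digit gains in perspective, here is a minimal sketch of how they compound over several generations (the 7% per-generation figure and the six-generation span are hypothetical round numbers, not measured benchmarks):

```python
# Compound effect of a hypothetical 7% per-generation performance gain
# over six generations (roughly one generation per year).
per_gen_gain = 0.07
generations = 6

cumulative = (1 + per_gen_gain) ** generations
print(round(cumulative, 2))  # 1.5
```

Roughly a 1.5x speedup after six generations -- a far cry from the era when a single generation could nearly double performance, which is why each individual launch feels like a yawn.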

For non-gaming, non-video-editing tasks, I could even get by with my HP Stream 7 tablet with a Bay Trail-T quad-core CPU that comes in under 5W or so.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Maybe a high-end PC from 2005 would be fine for present-day use, but what about those who didn't have an Opteron 165 and 2GB of RAM, but rather a Sempron and 512MB? Or, those with laptops? Today you can get 20+ hours of battery life in some machines, and 6 hours is common. In 2005, 2 hours was pretty good. I don't think you could use a 2005 laptop in very many of the same ways you'd use a modern one.

Certainly the pace of single-threaded performance improvement has slowed, but even 2010-2015 saw huge changes in what you can expect in both desktop and mobile devices.
 

jji7skyline

Member
Mar 2, 2015
194
0
0
tbgforums.com
I upgraded my 2005-model computer to the max.

Original specs:
Celeron D (not dual core)
2GB DDR-400 RAM (was upgraded to max when I bought it)
VIA built-in graphics

Current specs:
Pentium 4 3.2GHz (the mobo supports up to a 3.4GHz Pentium Extreme Edition, but I couldn't find any of those, and the motherboard doesn't support overclocking)
2GB DDR-400 RAM (max supported by mobo)
NVIDIA GeForce FX 5200 (fastest AGP video card I had access to; can output 1080p)

It's okay for super-basic use, but it chokes on any multitasking or intensive websites. Even running more than a few tabs in Chrome makes it choke.

I highly doubt that anyone would use even a high-end computer from 2005 in 2015 as their main rig, no matter how basic their needs are.

Oh and also, this thing is LOUD!
 

xgsound

Golden Member
Jan 22, 2002
1,374
8
81
Actually, I have 2GB of RAM in the old X2, and it does a better job with webpages than my Nexus 7. It also runs a lot of modern software well enough.

OP, part of what happened was the rise of mobile computers, which kept the baseline for things like webpages low.

I waited until DDR2 for my X2. That was 2006, and now it runs Win7 on an SSD. That SSD sure perked things up, even on SATA2. The limiting factor was the Nvidia chipset, which I couldn't get a 64-bit OS working on, so I'm limited to 3.25GB. It does internet, minor photo work, and music fine at 28C to 39C.

Jim.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,722
1,454
126
Maybe a high-end PC from 2005 would be fine for present-day use, but what about those who didn't have an Opteron 165 and 2GB of RAM, but rather a Sempron and 512MB? Or, those with laptops? Today you can get 20+ hours of battery life in some machines, and 6 hours is common. In 2005, 2 hours was pretty good. I don't think you could use a 2005 laptop in very many of the same ways you'd use a modern one.

Certainly the pace of single-threaded performance improvement has slowed, but even 2010-2015 saw huge changes in what you can expect in both desktop and mobile devices.

I'm still patting myself on the back about a Gateway "executive" laptop I acquired last year and upgraded with 2x4GB SO-DIMMs, a 500GB SSD, and wireless-N. Your remarks about battery life give me pause, though. I'm about ready to order a $20 battery replacement.

Centrino Duo "C2D": I can see the CPU meter momentarily run up to 100% for a few things, but it otherwise seems super-fast for mainstream office apps.

Retired, if I stray from the house for errands and other tasks, I have no need to take a laptop with me. I might as well learn to use certain features on my iPhone.

I suppose I should start "looking" at some newer laptops, though. On the other hand, we're well served here with our desktops/workstations. I bought the old Gateway just so I could "keep up" with the technology and get familiar with mobile hardware. I can redeploy the SSD to another system, but I can't do anything with the SO-DIMMs (DDR2) or the $7 Intel NIC.

But . . . 20 hours of battery life -- that's a far cry from the 5 hours or less I can expect from the Gateway.
 

The Sauce

Diamond Member
Oct 31, 1999
4,739
34
91
My i5-2500K running at 4.7GHz is still going strong. It has been the best-value chip I have ever owned, and I've been building since the old 80286 was king. I'll probably keep it through another video card upgrade.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
My Core 2 Duo that I got in 2007 is still running and perfectly functional. It's been relegated to HTPC/media center duties, but it browses the web and plays back video just fine.
 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
How people use computing devices has changed greatly. There are lots of devices to access the Internet from: an Xbox, a Raspberry Pi, a desktop, a tablet, a laptop, a smartphone or phablet. Even a smart TV is a possible option for Netflix and Hulu.

I still have a socket 775 Core 2 Duo at home, and it runs pretty well on 2GB of DDR2-800 RAM. I only use it in my basement. I tried it with Windows 8, but it did not work that well, so I am still running 32-bit Vista on it.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Compare with the previous 10 years:

*It seems as if a high-end computer from 1995 could still be used today in 2005 for many basic modern things, but of course not for everything, and it will feel slower, although not unbearably so.

Considering you'd have had to be rich to get even 16MB of RAM back in 1995, I'd say there's no way that could be true. Even running Win95 you'd run out of memory very fast, and your web browsing experience would be terrible, for the sites that loaded at all.

So yes, things have clearly stagnated. But then again, compare mobile devices from 1995 with those from 2005: not much difference, compared to the last 10 years.
 

mvbighead

Diamond Member
Apr 20, 2009
3,793
1
81
At this point, there seem to be two things that help a device meet 99% of usage requirements:

1) Play HD video without issue
2) Lower power consumption

If a particular set of hardware meets those two criteria, it is generally a usable platform for 99% of what MOST people do.

Thing is, some of the first generation of 3.0GHz chips were heaters. They produced a LOT of heat.

Like others are saying, it seems to be around the Core Duo era that systems became able to accept 4GB of RAM, use relatively little power, and do everything the majority of people do. Whether an Athlon X2 or a Core Duo (and moreover a Core 2 Duo), that seemed to be the point of severely diminishing returns for general PC usage.

Now, when 4K resolution becomes the next big thing, I suspect those standards will have to increase. But with that work being offloaded to the GPU, a Core 2 Duo could possibly still suffice.

All in all, multiple cores were, IMO, the best system advancement out there, as you no longer had to worry about one single-threaded task (e.g. antivirus) knocking your PC down while you waited for a scan to finish.