is it my imagination or is business software way behind the hardware?

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
i'm not a gamer. i'm currently working on building myself a 939 3500+ machine.

even that will be way overkill for what i do on a day to day basis.

if you don't game, is there any reason at this point to get a dual core p4 or opteron system?

has hardware gone way ahead of software?

i can't remember a time when i've felt that hardware was so much ahead of software.

well, maybe when i was using DOS on my 486 machine. that was a bit overkill.
 

Koing

Elite Member
Super Moderator
Health and F
Oct 11, 2000
16,843
2
0
No.

The dual core P4 and Opteron systems will be for the people who do a lot of video editing / Photoshop. People who design stuff, basically. For everyone else it will 'just be nice'. Do I really need 1GB of RAM? No. 512MB is 'doable' but it sure as hell gets laggy after 5 days of no shutdowns :p.

Koing
 

BobDaMenkey

Diamond Member
Jan 27, 2005
3,057
2
0
There is no reason. Dual core and high-end stuff is, as always, for the super users who push the usage of their systems to the absolute max and find them lacking. Your average AOLer isn't going to be able to tell the difference between an 1800+ and a 3200+ for most applications anymore.
 

SagaLore

Elite Member
Dec 18, 2001
24,036
21
81
Well, business software really doesn't need any improvement. Once you have it where it needs to be, there's no need to "push the envelope" like you do with games and their lust for higher resolution graphics and complex AI algorithms.

Business software usually consists of some kind of database or flatfile, with nothing more than "add 4001 to 732.4" calculations. :p
 

PingSpike

Lifer
Feb 25, 2004
21,758
602
126
Beats the hell out of the 486 and early Pentium 1 era, when every app ran like dogsh|t.
 

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
Originally posted by: BobDaMenkey
There is no reason. Dual core and high-end stuff is, as always, for the super users who push the usage of their systems to the absolute max and find them lacking. Your average AOLer isn't going to be able to tell the difference between an 1800+ and a 3200+ for most applications anymore.

ya.

it feels like, given what's available to average users and how they actually use it, most systems these days are more overkill than ever before.

e.g. in the 386 days, Lotus 1-2-3 really pushed 386 CPUs.
in the 486 days, Windows and MS Word / Excel / Access pushed 486 CPUs, even for average users.

in the early Pentium days, Windows and multiple apps pushed them to their limits.

now mid to high end CPUs are overkill for even moderate power users.

sure, some of the high end CAD / graphics design shops will take advantage of the dual cores, but really, is there a mass market for this CPU?

why isn't voice recognition in every OS by now? the CPUs can definitely handle it.
 
May 31, 2001
15,326
2
0
Originally posted by: PlatinumGold
i'm not a gamer. i'm currently working on building myself a 939 3500+ machine.

even that will be way overkill for what i do on a day to day basis.

if you don't game, is there any reason at this point to get a dual core p4 or opteron system?

has hardware gone way ahead of software?

i can't remember a time when i've felt that hardware was so much ahead of software.

well, maybe when i was using DOS on my 486 machine. that was a bit overkill.


Post in the Distributed Computing forum; I am sure there are plenty of people who can help you figure out what to do with those extra processor cycles. ;)

I am running an Athlon64 3500+ and have a Sapphire X850 XT, and damn, the frames per second when I use Quicken are off the scale! ;)
 

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
Originally posted by: SagaLore
Well, business software really doesn't need any improvement. Once you have it where it needs to be, there's no need to "push the envelope" like you do with games and their lust for higher resolution graphics and complex AI algorithms.

Business software usually consists of some kind of database or flatfile, with nothing more than "add 4001 to 732.4" calculations. :p

you can improve business software with AI: voice recognition, facial recognition for personal settings, security, etc.

 

PricklyPete

Lifer
Sep 17, 2002
14,582
162
106
Ummm... it definitely depends on what software you are running.

There is plenty of software that still maxes out the capability of hardware in a work environment, from CAD to 3D rendering to simpler things like advanced reporting systems. Just because you don't need a brand new machine to run MS Word 2003 doesn't mean there aren't applications out there that do need one.

As for home software, video editing/encoding can definitely put a hurting on your machine, as can photo editing and multitasking (having 5 browser sessions open with your MP3 player playing, a Word document with your resume, and p0rn playing in the media player).

All I'm saying is there are valid reasons for buying the best you can get. You may not have a valid reason though... so get what you need. My parents and sisters use my old machines, and they all run much faster than anything they need, because all they do is go on the internet and use Word.

 

Demon-Xanth

Lifer
Feb 15, 2000
20,551
2
81
In my opinion, hardware development has become much more robust and streamlined. There are no chipsets that would be considered crap anymore. Compatibility problems are way down from where they used to be. The only parts I commonly see fail are fans, and those are usually only the sleeve bearing ones.

Software development, on the other hand, is a mixed bag. On one side, you have simple, powerful applications that work very well and are quite small. These are usually made by smaller companies or individuals. On the other, you have the bloated, buggy piece of crap software that somehow makes it onto every OEM PC on earth. It's usually hard to use, feels stripped down, and is slow.

Where I think the difference lies is that software development itself hasn't been scaling well. At the company I work for, we increased the number of programmers tenfold, but I don't see the level of the software going anywhere. A system with a 286 had a smoothly scrolling screen; they couldn't make a 1GHz P3 do that. It's like everyone knows how to design a city, but no one knows how to design a building anymore. I commonly see software trying to sit somewhere between art and science: when it comes to auditing the code, it's hard to understand because it's art, but it doesn't have to look good because it's science. Wrong standpoint.

Another problem I see: the testing phase is tedious and time consuming, and there's no glory in it. So NO ONE DOES IT. They want the glory of coming up with the idea, and once they've got the glory, they're done. This leads to the "ship it now, patch it later" ideology. That doesn't happen with hardware; patching hardware is neither easy nor cheap. Remember when Intel "patched" the i820's MTH? Yeah, that didn't go over smoothly, did it? One excuse I hear is that "software has millions of lines of code, it is going to have lots of bugs." Consider that a chipset easily has some 20 million transistors, a CPU around 80 million, GPUs are in the 150 million range, and RAM is in the billions, with tons of support circuitry on top. How many patches are issued for those each month?

Software development needs to work like hardware development if they want computing to make the next leap.
 

PingSpike

Lifer
Feb 25, 2004
21,758
602
126
Originally posted by: PlatinumGold
ya.

it feels like, given what's available to average users and how they actually use it, most systems these days are more overkill than ever before.

e.g. in the 386 days, Lotus 1-2-3 really pushed 386 CPUs.
in the 486 days, Windows and MS Word / Excel / Access pushed 486 CPUs, even for average users.

in the early Pentium days, Windows and multiple apps pushed them to their limits.

now mid to high end CPUs are overkill for even moderate power users.

sure, some of the high end CAD / graphics design shops will take advantage of the dual cores, but really, is there a mass market for this CPU?

why isn't voice recognition in every OS by now? the CPUs can definitely handle it.

Don't worry, there are still plenty of software development companies that try their best to use up all the extra processing power and memory by creating sloppy, inefficient apps. If Lotus could make an email client cripple a Pentium III, then I think they can do it with anything!

Originally posted by: PlatinumGold

you can improve business software with AI: voice recognition, facial recognition for personal settings, security, etc.

Eh... the vast majority of users don't even use more than a quarter of the features in Office... what makes you think anyone is going to use those new ones? Plus, IT has settled into its "needs" spot in the business world after its explosion in the late 90s. Businesses aren't going to pay for stuff just because it's cool, especially since the economy is kind of gimped right now. They don't want to buy new computers when the old ones are doing the job.

I think for the majority of users, business apps have reached a point where more features aren't really worth the extra cost. Users haven't even caught up to the old features yet. And you're going to run into an "if it ain't broke, don't fix it" mentality.

Plus, games drive the hardware market. Does your average Word user need an AGP card with 16MB of memory? A PCI add-on card with 2MB would do the job nicely.
 

Demon-Xanth

Lifer
Feb 15, 2000
20,551
2
81
Originally posted by: PingSpike
Plus, games drive the hardware market. Does your average Word user need an AGP card with 16MB of memory? A PCI add-on card with 2MB would do the job nicely.

You've never seen XP on an i8xx graphics system, have you? MS managed to make a real video card a requirement.
 

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
Originally posted by: Demon-Xanth
In my opinion, hardware development has become much more robust and streamlined. There are no chipsets that would be considered crap anymore. Compatibility problems are way down from where they used to be. The only parts I commonly see fail are fans, and those are usually only the sleeve bearing ones.

Software development, on the other hand, is a mixed bag. On one side, you have simple, powerful applications that work very well and are quite small. These are usually made by smaller companies or individuals. On the other, you have the bloated, buggy piece of crap software that somehow makes it onto every OEM PC on earth. It's usually hard to use, feels stripped down, and is slow.

Where I think the difference lies is that software development itself hasn't been scaling well. At the company I work for, we increased the number of programmers tenfold, but I don't see the level of the software going anywhere. A system with a 286 had a smoothly scrolling screen; they couldn't make a 1GHz P3 do that. It's like everyone knows how to design a city, but no one knows how to design a building anymore. I commonly see software trying to sit somewhere between art and science: when it comes to auditing the code, it's hard to understand because it's art, but it doesn't have to look good because it's science. Wrong standpoint.

Another problem I see: the testing phase is tedious and time consuming, and there's no glory in it. So NO ONE DOES IT. They want the glory of coming up with the idea, and once they've got the glory, they're done. This leads to the "ship it now, patch it later" ideology. That doesn't happen with hardware; patching hardware is neither easy nor cheap. Remember when Intel "patched" the i820's MTH? Yeah, that didn't go over smoothly, did it? One excuse I hear is that "software has millions of lines of code, it is going to have lots of bugs." Consider that a chipset easily has some 20 million transistors, a CPU around 80 million, GPUs are in the 150 million range, and RAM is in the billions, with tons of support circuitry on top. How many patches are issued for those each month?

Software development needs to work like hardware development if they want computing to make the next leap.

Programmers vs Coders.

most Indian IT guys i've met are very good coders but not very good programmers. i ran across a database program recently where, in the patient demographics table, they forgot to make the primary key (in this case the SS#) a required field.

 

vi edit

Elite Member
Super Moderator
Oct 28, 1999
62,484
8,345
126
Originally posted by: PlatinumGold
Programmers vs Coders.

most Indian IT guys i've met are very good coders but not very good programmers. i ran across a database program recently where, in the patient demographics table, they forgot to make the primary key (in this case the SS#) a required field.

Not to turn it into a completely different thread, but that is actually getting pretty common. Privacy laws are cracking down on the use of SSNs for identification tracking.
 

PingSpike

Lifer
Feb 25, 2004
21,758
602
126
Originally posted by: Demon-Xanth
In my opinion, hardware development has become much more robust and streamlined. There are no chipsets that would be considered crap anymore. Compatibility problems are way down from where they used to be. The only parts I commonly see fail are fans, and those are usually only the sleeve bearing ones.

Software development, on the other hand, is a mixed bag. On one side, you have simple, powerful applications that work very well and are quite small. These are usually made by smaller companies or individuals. On the other, you have the bloated, buggy piece of crap software that somehow makes it onto every OEM PC on earth. It's usually hard to use, feels stripped down, and is slow.

Where I think the difference lies is that software development itself hasn't been scaling well. At the company I work for, we increased the number of programmers tenfold, but I don't see the level of the software going anywhere. A system with a 286 had a smoothly scrolling screen; they couldn't make a 1GHz P3 do that. It's like everyone knows how to design a city, but no one knows how to design a building anymore. I commonly see software trying to sit somewhere between art and science: when it comes to auditing the code, it's hard to understand because it's art, but it doesn't have to look good because it's science. Wrong standpoint.

Another problem I see: the testing phase is tedious and time consuming, and there's no glory in it. So NO ONE DOES IT. They want the glory of coming up with the idea, and once they've got the glory, they're done. This leads to the "ship it now, patch it later" ideology. That doesn't happen with hardware; patching hardware is neither easy nor cheap. Remember when Intel "patched" the i820's MTH? Yeah, that didn't go over smoothly, did it? One excuse I hear is that "software has millions of lines of code, it is going to have lots of bugs." Consider that a chipset easily has some 20 million transistors, a CPU around 80 million, GPUs are in the 150 million range, and RAM is in the billions, with tons of support circuitry on top. How many patches are issued for those each month?

Software development needs to work like hardware development if they want computing to make the next leap.

I agree with that totally. I'm not really heavily involved in the software development field, but I do bug testing for a medical software company as a part-time position. They're a small company, but I'm always impressed that, despite the application being fairly simple in its design, they go pretty nuts with testing and constantly fix the small bugs that occasionally crop up. They have to, IMO; the software does partial diagnoses of patients... you don't want any errors in that.

I think the industry and the users are both partly to blame though. Instead of demanding streamlined, robust, and bug-free applications, they fall for the flash. And like you said, it's easy and fun to make a flashy new version or piece of software. It's boring, and largely invisible to the end user, when you spend countless hours debugging and testing to make sure the app is rock solid.
 

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
Originally posted by: vi_edit
Programmers vs Coders.

most Indian IT guys i've met are very good coders but not very good programmers. i ran across a database program recently where, in the patient demographics table, they forgot to make the primary key (in this case the SS#) a required field.

Not to turn it into a completely different thread, but that is actually getting pretty common. Privacy laws are cracking down on the use of SSNs for identification tracking.

fine, don't use the SS# as the primary key then; just make it another unique key, and create a primary key that is invisible to the user and generated by the database.

you cannot have a relational database without a unique id.
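
A minimal sketch of that pattern, using Python's built-in sqlite3 (the table and column names here are made up for illustration, not taken from the app in question):

import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database for the demo

# Surrogate primary key: generated by the database and never shown to the user.
# The SSN stays a plain UNIQUE column, so duplicates are still rejected, but
# the row's identity no longer depends on it.
conn.execute("""
    CREATE TABLE patient_demographics (
        patient_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        ssn        TEXT UNIQUE,                        -- unique, but not the key
        last_name  TEXT NOT NULL,
        first_name TEXT NOT NULL
    )
""")

conn.execute(
    "INSERT INTO patient_demographics (ssn, last_name, first_name) VALUES (?, ?, ?)",
    ("123-45-6789", "Doe", "Jane"),
)

# Other tables reference patient_id, never the SSN, so privacy rules about
# SSNs don't ripple through the rest of the schema.
print(conn.execute("SELECT patient_id, ssn FROM patient_demographics").fetchone())
# -> (1, '123-45-6789')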

 

PingSpike

Lifer
Feb 25, 2004
21,758
602
126
Originally posted by: Demon-Xanth
Originally posted by: PingSpike
Plus, games drive the hardware market. Does your average Word user need an AGP card with 16MB of memory? A PCI add-on card with 2MB would do the job nicely.

You've never seen XP on an i8xx graphics system, have you? MS managed to make a real video card a requirement.

I can't even find a motherboard with less than 8-16MB of onboard graphics anymore. But can't you just turn all the junk off anyway? I run my Win2k3 server on an old S3 ViRGE card I had lying around and it seems fine.
 

ggnl

Diamond Member
Jul 2, 2004
5,095
1
0
I bog down my computer at work all the time, but then I do a lot of work with some pretty massive spreadsheets.

Individual users may not need all that computing power, but it takes a lot of power to run the database for a large business.
 

PlatinumGold

Lifer
Aug 11, 2000
23,168
0
71
Originally posted by: ggnl
I bog down my computer at work all the time, but then I do a lot of work with some pretty massive spreadsheets.

Individual users may not need all that computing power, but it takes a lot of power to run the database for a large business.

and normally that type of functionality would be handled by fancy machines known as "SERVERS". ;)