Building Technology the Wrong Way

eternalone

Golden Member
Sep 10, 2008
1,500
2
81
Why are we still building giant video cards and hardware machines to run our games and software on? For example, Battlefield 3 needs a pretty new and modern video card to run decently. Shouldn't it be the opposite? Shouldn't programmers be making awesome games with spectacular graphics that can run on really old hardware? Wouldn't that be more ideal and environmentally friendly?

It seems to me that not enough effort is put into this; instead, economics trumps efficiency, and the urge to make a buck keeps us in this cycle of buying bigger and better hardware to run overbloated games. It's kind of a paradox, because the computer hardware business also funds innovation of better technology, but at the same time you would think there would have been some sort of technological breakthrough by now, computing and hardware's version of cold fusion, yet nothing. Maybe technology is being held back, I don't know.

The closest effort I've seen to the correct computing model, as I see it in the vision of a natural advanced society, would be the http://www.raspberrypi.org/ project, and that group is not-for-profit. So a not-for-profit organization is the only way for technology to evolve naturally and efficiently with minimal resources? It seems to me that capital is holding innovation back more than it's helping, mimicking all aspects of our wasteful, inefficient society as a whole.

It seems to me that big business does not want efficient computing because they would not be able to make any money off of it, and also because it would probably evolve on its own without the need for the newest and fastest hardware around. It seems innovation in technology is limited so that manufacturers can adjust their business model with limited gains to the consumer. If they could, I believe they would have us running our computers on coal, like they did all those years with trains, just to make a buck, and you as a consumer would probably not know the difference since it would be the social norm.

3chordcharlie

Diamond Member
Mar 30, 2004
9,859
1
81
Back in the '90s, 3dfx built a video card that could automatically run every game that was ever gonna be made.

But the big companies realized they couldn't make any money, so they invented 32-bit and got rid of the Voodoo2.

It was mostly a big conspiracy between all the big companies, and the man.

Number1

Diamond Member
Feb 24, 2006
7,881
549
126
More on topic: you're either really, really stupid or just trolling. Either way, FUCK YOU.

Bignate603

Lifer
Sep 5, 2000
13,897
1
0
I got about three lines in before I gave up. However, the answer is simply that people want new games and new things to look better than the previous ones. You can do some things to make the code more "efficient," so things look better with less processing power, but you can only do that so much before you start needing better hardware.
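A minimal sketch of what that kind of "efficiency" work looks like in practice; all names and numbers below are invented for illustration and not taken from any real engine:

```python
import math

# A toy example of trading memory for per-frame CPU work: bake an
# expensive lighting-falloff curve into a table at load time, so each
# frame does a cheap index lookup instead of redoing the math.
# (Function names and constants here are invented for illustration.)

FALLOFF_STEPS = 256

def light_naive(distance, radius):
    """Recompute the falloff curve for every sample, every frame."""
    t = min(1.0, distance / radius)
    return math.sqrt(1.0 - t * t)

# Precompute the same curve once at startup...
FALLOFF_TABLE = [
    math.sqrt(1.0 - (i / FALLOFF_STEPS) ** 2)
    for i in range(FALLOFF_STEPS + 1)
]

def light_fast(distance, radius):
    """...so a frame only pays for one table lookup per sample."""
    i = min(FALLOFF_STEPS, int(distance / radius * FALLOFF_STEPS))
    return FALLOFF_TABLE[i]

# The catch, per the post above: tricks like this buy a constant
# factor, not new capability. Past a point, better visuals need
# better hardware, not cleverer tables.
```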

Also, the hardware companies are continually making new hardware that can do more impressive things. Game makers are simply choosing to take advantage of those new abilities.

Pr0d1gy

Diamond Member
Jan 30, 2005
7,774
0
76
Look on the bright side, eternalone: in 5 or 10 years you'll be able to run BF3 on your phone. ^_^

fire400

Diamond Member
Nov 21, 2005
5,204
21
81
You want to be Steve Jobs Jr.?

Then go for it. Even he accepted capitalism and fan culture.

[image: apple-music-sept1.jpg]

Destiny

Platinum Member
Jul 6, 2010
2,270
1
0
The Raspberry Pi is all marketing hype. The founder of the foundation works for Broadcom and is pushing Broadcom's chips under the banner of charity (non-profit), supposedly to provide very cheap PCs for poor kids to learn programming...

However, people have started trying to use it for other applications, like HD video streaming or HD video playback, and have been met with disappointment.

You can get a more powerful ARM Cortex-A8 chipset and build one yourself for the same price... but also keep in mind that a chipset is only as good as the software development kit available to its developers...

tcG

Golden Member
Jul 31, 2006
1,202
18
81
I've long thought the way that the big hardware companies slowly dole out the new technology is suspect. What is stopping Intel from making much, much faster chips tomorrow other than them having an effective duopoly in the market?

The software side stinks, too. What with APIs like DirectX...

lxskllr

No Lifer
Nov 30, 2004
60,035
10,526
126
I've long thought the way that the big hardware companies slowly dole out the new technology is suspect. What is stopping Intel from making much, much faster chips tomorrow other than them having an effective duopoly in the market?

Engineering, and cost. It's more involved than going into the kitchen, saying "I'm gonna make the best omelet ever," and doing it.

tcG

Golden Member
Jul 31, 2006
1,202
18
81
So it's because it costs too much and therefore has to be done in steps? I dunno. I just don't buy it.

Don't you agree that there is a huge disincentive for the development of powerful, cheap, ubiquitous hardware, seeing as that would put the entire industry out of business? I see no reason, other than the planned obsolescence I'm talking about, that we shouldn't have real-time photorealistic 3D rendering on cheap hardware. Is there some sort of barrier in the physics of it, the same way we can't have free energy? I suppose such rendering would require very powerful information processing, but isn't that possible with sufficiently advanced engineering?

This is a lot of conjecture, I admit, and I'm not 100% on it, but I just don't think the slow technological advancement we're seeing represents the cutting edge.

I can see how the development of computer graphics may have been constrained in the past by the huge amount of research needed, since it's a big deal to build 2D or 3D graphics from scratch. But now we're at the point where 3D graphics are already very realistic, so it doesn't seem that revolutionary to just crank up the processing power (or make the code more efficient). It's a matter of degree, not a whole different concept needing to be invented.

mmntech

Lifer
Sep 20, 2007
17,501
12
0
I've long thought the way that the big hardware companies slowly dole out the new technology is suspect. What is stopping Intel from making much, much faster chips tomorrow other than them having an effective duopoly in the market?

The software side stinks, too. What with APIs like DirectX...

In the '90s and early 2000s, hardware performance improved very rapidly; people used to say computers were obsolete as soon as they left the factory. In the mid 2000s, they hit a "good enough" plateau: the point where low- and middle-end computers could handle what most people used them for without choking. Checking email, doing homework, surfing the web, watching videos, casual online gaming. Those computers started lasting longer without noticeable performance impacts.

There is still rapid hardware development. It's just taking place in the mobile world. Five years ago, if someone had shown me an iPad and said it would replace desktops and laptops, I'd have laughed. Mobile hardware was a joke at the time. Now I can actually edit video and photos on my phone. That was unheard of not too long ago.

tcG

Golden Member
Jul 31, 2006
1,202
18
81
Heck, we already see a form of planned obsolescence in the artificial market stratification for CPUs, for example. There's no manufacturing reason that two CPUs using largely the same materials and manufacturing process should be separated by a 10:1 price difference, other than the fact that there's no competition in the market to drive prices down closer to manufacturing cost.

Bignate603

Lifer
Sep 5, 2000
13,897
1
0
I've long thought the way that the big hardware companies slowly dole out the new technology is suspect. What is stopping Intel from making much, much faster chips tomorrow other than them having an effective duopoly in the market?

Because they're evolving an existing technology in slow steps? No one has come up with a truly disruptive technology for processing in a long time.


Heck, we already see a form of planned obsolescence in the artificial market stratification for CPUs, for example. There's no manufacturing reason that two CPUs using largely the same materials and manufacturing process should be separated by a 10:1 price difference, other than the fact that there's no competition in the market to drive prices down closer to manufacturing cost.

CPU makers do quite a bit of segmenting at different price points to try to maximize income. A big part of this is that setting up a new production line with an improved process costs an astronomical amount. Once the line is pumping out chips, the cost per chip is relatively low, but they need to recoup those setup and development costs. They do as much as they can to make sure there's a chip for every budget, even to the point of handicapping chips to run far below their true capability. The epitome of this was AMD's triple-core CPUs: some were quad-cores with one defective core, while in others the manufacturer had purposely disabled a working core to hit a lower price point.
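To put rough numbers on the binning argument, here's a back-of-the-envelope sketch; every figure in it (fab cost, yield, prices) is made up for illustration and is not a real AMD or Intel number:

```python
# A toy model of chip binning economics. All numbers are invented
# for illustration; these are not real AMD or Intel figures.

fab_setup_cost = 2_500_000_000  # one-time cost to stand up the line (assumed)
cost_per_die = 40               # marginal cost to produce one die (assumed)
dies_produced = 100_000_000     # lifetime output of the line (assumed)
defect_rate = 0.15              # share of quad-core dies with one bad core (assumed)

good_quads = dies_produced * (1 - defect_rate)
flawed_quads = dies_produced * defect_rate
total_cost = fab_setup_cost + dies_produced * cost_per_die

# Option A: scrap every die with a defective core.
profit_scrap = good_quads * 180 - total_cost

# Option B: disable the bad core, sell the die as a cheaper triple-core.
profit_binned = good_quads * 180 + flawed_quads * 120 - total_cost

print(f"scrap the defects: ${profit_scrap / 1e9:.2f}B profit")
print(f"bin as triple-core: ${profit_binned / 1e9:.2f}B profit")
```

Even with invented numbers, the shape of the result is the point: once the setup cost is sunk, selling a flawed die at a lower price point beats scrapping it.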

lxskllr

No Lifer
Nov 30, 2004
60,035
10,526
126
So it's because it costs too much and therefore has to be done in steps? I dunno. I just don't buy it.

I don't know much about chip design, but I do know it takes very expensive engineers, and very expensive plants to create them. I think they're starting to hit a wall with conventional design, so improvements are coming slower than they were in years past.

From a 2007 Ars article...

2007 has not been kind to AMD. The company saw its workstation market share slip, has taken on $2 billion of new debt, lost almost $1.2 billion over the past two quarters, has been unable to close the gap with Intel when it comes to CPU performance, and has been the subject of recent rumors that Barcelona will be delayed. AMD has been in cost-cutting mode for the past several months and, according to IDG News Service, is considering getting out of the fabrication business.

Currently, AMD operates two fabs: Fab 30 and Fab 36. Fab 30 is in the process of being fitted to handle 300mm production, and when the transition is complete, it will be rechristened Fab 38. It hasn't come cheaply, either—the chip maker has invested over $2.5 billion to expand its 300mm capabilities. AMD has also been talking up a new 45nm plant in Malta, NY, that would come online in 2009.

http://arstechnica.com/gadgets/2007/06/amd-considering-getting-out-of-fabrication-business/

If making chips were easy, AMD would have just done it, and kicked Intel's ass.

tcG

Golden Member
Jul 31, 2006
1,202
18
81
I don't know much about chip design, but I do know it takes very expensive engineers, and very expensive plants to create them. I think they're starting to hit a wall with conventional design, so improvements are coming slower than they were in years past.

I know that's the general feeling, but there's no reason I can see that it should be that way. What is the physical constraint on more raw processing power? Technology is amazing these days... you'd think we would have something.

If making chips were easy, AMD would have just done it, and kicked Intel's ass.

Why would they do that? If they made something truly revolutionary, they would enjoy a brief period of market dominance followed by total ruin. It's more profitable for both companies, which enjoy duopoly control, to have agreements which enable them to maintain control.

RelaxTheMind

Platinum Member
Oct 15, 2002
2,245
0
76
What the hell do you think console game programmers do?

The Xbox 360 and PS3 are ~7-year-old computers/technology. There's a big difference in graphics between the early games and the games of today.

IMHO, it's all marketing.

lxskllr

No Lifer
Nov 30, 2004
60,035
10,526
126
Why would they do that? If they made something truly revolutionary, they would enjoy a brief period of market dominance followed by total ruin. It's more profitable for both companies, which enjoy duopoly control, to have agreements which enable them to maintain control.

I don't know how things are now, but for a while AMD had one foot in the grave and the other on a banana peel. There was a real risk of them going under. Aside from a brief period during the P4 era, they've trailed Intel the whole time. Intel and AMD aren't equals by any stretch, and there's no agreement between them. AMD would love to put Intel in the ground. The only reason Intel wants AMD around is to stave off anti-monopoly suits. They aren't really competition.

Read up on chip design. I've seen stuff in passing but didn't pay real close attention; a lot of it is over my head, and I'm just not that interested. The gist of the situation, though, is that they're getting ahead by using hacks. Silicon is pretty much done as a material. They've crammed in as much as they're going to, and any improvements will be small refinements to the process.