Integrated video conspiracy theory...

Jul 1, 2000
10,274
2
0
Okay - given all the talk of SiS and Via developing (or re-developing) graphics divisions, I have come to a conclusion.

Graphics accelerators are about to become part of the core logic of every motherboard. This will ultimately erode the market for 3D accelerators, and I believe that the "3D card" will become largely a thing of the past - possibly within 5 years.

Don't believe me? Consider this...

1) VIA - Via is in talks to acquire ST Micro's Kyro technology. The Inquirer has an article about Via combining S3 tech with ST tech to produce next-generation accelerators. All of Via's current graphics products are integrated.
2) ATI - Has produced an integrated video chipset for the P4 (in cooperation with Intel). The video is reportedly far superior to nForce.
3) nVidia - Has produced the highly touted nForce chipset for AMD processors, and ultimately for the P4. nVidia just displayed a new version at CeBIT.
4) SiS - Is aggressively developing its graphics division. SiS believes in highly integrated cores and could add video as a means of providing value to OEMs. This is where all the money is...
5) Intel - Intel onboard video sucks... but they are working with ATi now on integrated video solutions for the P4.
6) AMD - Has been rumored to be leaving the chipset biz...

I know what many of you are thinking... onboard video sucks. It is slow, and well... it just sucks. This is true. But it has gotten remarkably better in the last year.

The reality is that with this many competitors operating in one area (core logic), chipset makers will need to provide more and more features to stay ahead of the game. At first, it is just adding features. Ultimately, competition will drive them to make better and better products. This drive could produce the same kind of frenzy that drives the GPU biz today. As onboard video improves, fewer and fewer video cards will be sold. This will result in 1) less volume, 2) higher prices, and 3) less innovation.

Before the GPU business goes away, there will be a price war for market share - this has already started to happen between ATi and nVidia. All the OEMs are lining up - some loyal to ATi, some to nVidia. Some have already perished (R.I.P. Elsa). The upgrade market has all but withered and died. People just accept that it is far better to buy a new machine than to upgrade the old one. Graphics are one of the only upgradeable things left on a PC, given the rapid obsolescence of CPU form factors and RAM types.

Miniaturization and convergence seem to be the name of the game now. We (or at least some of us) started using computers when we hooked the ol' C-64 up to the TV. Now, nearly 20 years later, we are beginning to look back to the TV as a viable display option.

At the end of the day, we will all own an xbox... or something like that.
 

bunker

Lifer
Apr 23, 2001
10,572
0
71
The only argument I can come up with against this is money. How often do people upgrade their motherboard? Unless the onboard GPU is easily replaced, I don't see the mfgs stopping production of add-in cards.

How else are you gonna get that shiny new graphics chipset unless you buy a new MB?
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
bunker, my thoughts exactly. 3D is in its infancy. Yes, more capable integrated chipsets are going to become increasingly common, but significant demand for something better will always exist. Micron proposed a GPU socket a while back but nothing came of it. Done correctly, an evolving standard would be nice for reducing redundancy and cost. It would prolly require a breakthrough in integrated memory, or a separate memory socket, to be worthwhile over chipset integration though, as system memory will prolly always lag.
 
Jul 1, 2000
10,274
2
0
Your arguments assume a need to upgrade. I submit there will be very few upgrades, at least to home PCs, in the future. Programmable DSPs are also part of that picture.

This might sound a bit overdramatic, but the xbox marks the beginning of the beginning of the end for the home PC. Microsoft has the money and the power to force a standard, and it is to their advantage to do so. By setting the standards for all of the hardware, Microsoft lowers its OS development costs significantly, as it could foreseeably eliminate incompatibilities. A lower-cost machine equals greater market penetration into previously untouched demographic groups.

Increased market penetration = money. Money = revenues. Revenues = increased dividends for investors. Happy investors = job security for corporate directors.

Not only that, major media and entertainment groups (notably Eisner's Disney) are backing one standard. This standard may retard the growth of technology and innovation, but it provides a cost-effective medium for them to broadcast to. With all of the money coming in from Disney, Time Warner, and Microsoft, there is more than enough marketing and entertainment power to fuel a new standard for home entertainment PCs.

Think about it -

Media groups would be thrilled to have a standard computer entertainment device to program for. It lowers production costs and reduces technical support costs. It could also enable some significant anti-piracy measures that would better protect their intellectual property.

Game developers would similarly benefit. It is very hard to make money on PC games (given the tech support issues), which is why the console business is as big as it is. There are many PC game releases, but not as many as there are for consoles.

Halo, for example, is being ported to PC. Grand Theft Auto III is being ported to PC.

It is all part of a much larger picture. It is all about money and power. The home PC will become a thing of the past, and it will likely be replaced with an xbox-style computer system. Homebuilt systems are not the future... they will soon become a relic of the past.

Good video is quite simply the last frontier preventing miniaturization. Video expansion cards are presently installed perpendicular to the motherboard, greatly increasing the size of the case required to house the system. Unless someone figures out a new bus interface (like the replaceable video processor modules that Micron was playing with), video will have to be integrated on board.

Look at Abit's new Max line of motherboards... they have onboard everything - except video. Onboard FireWire, USB 2.0, 10/100 LAN - no printer port, no serial ports. The RAID controller is onboard. The 6.1 sound is onboard. Everything is onboard.

Everything - except video.

The sound card business will be the next to go. Wait and see.

Video will take a while longer, but it will go too. We will be buying our Microsoft, AOL, Via, or other boxes at Target and other mass market retailers.
 

sandorski

No Lifer
Oct 10, 1999
70,677
6,250
126
A conspiracy, no. Going to happen, most likely.

Consider it this way: vidcards are currently 2-3x more expensive than mobos. Even if mobo prices double when using the next bleeding-edge vidchip, they'll likely still be cheaper to produce and sell than current video cards. Also consider that integrated solutions will not (unless some manufacturer is completely stupid) have any kind of compatibility issues with the rest of the onboard system. Putting together a completely integrated mobo will make system setup all the easier, eliminating most, but not all, hardware issues.
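
To put rough numbers on that (prices picked purely for illustration, not quotes): say a plain mobo runs $120 and a current high-end vidcard $300 - the 2.5x ratio. Double the mobo to $240 for an integrated bleeding-edge vidchip and it still undercuts the $300 card by itself, never mind the $420 total of buying the mobo and card separately.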

Cooling will also become much easier, requiring less active cooling and dramatically improving case airflow at the same time.

 

Tates

Elite Member
Super Moderator
Jun 25, 2000
9,079
10
81
The way I see it, we PC hobbyists have about 5-10 years left before things become so mainstream that building your own PC will not be practical or possible anymore. Enjoy it while it lasts.
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Indeed. But the difference betwixt video and all those other things is that the potential for improvement of 3D is limitless. The console business requires cycles of 2-3 years minimum. Meanwhile, the modularity of the PC motivates rapid improvement, because something better can always be made and anyone is free to give it a go. Closed designs and long product cycles may be a businessman's dream, but an investor is just as likely to favor the opportunity in open designs. As for DSPs, well, I remember the IBM Mwave and Chromatic Mpact, and I reckon dependence on software is the Achilles' heel when they're used for complicated tasks.
 

White Widow

Senior member
Jan 27, 2000
773
0
71
Where's the conspiracy? Of course as die sizes shrink it will be possible to integrate formerly disparate chipsets, but that's no surprise. You are 100% right that standards allow for greater efficiency and use of resources, but your logic is flawed. You assume that a standard means a cap on performance. As long as it is technically possible to produce improved hardware, regardless of the standard it operates within, people will want it. This is true for networking technology, sound technology, and everything else we associate with PC system components. The reason you see integrated audio and LAN is that it's easy to integrate these components at high quality. Relative to die space, the transistors needed to make these circuits are very few, and there's lots of technological headroom in these designs to absorb the long product cycles of integrated systems. This is not true with video.

Indeed, as many people have pointed out in this thread, the necessity of improving graphics, even in the short term, is evident in a way not true of audio and Ethernet. For most people, the everyday use of their system is not limited by Ethernet speed or the sound quality coming out of their PC. It *is* limited by graphics quality in a much more tangible way. As such, consumers will constantly demand improved graphics far more loudly than they demand faster LAN access or improved peripheral connectivity (there's far more bandwidth on your USB 2.0 bus than the keyboard, mouse, and digicam can use).
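
Some rough perspective figures (bus speeds from the USB specs; the usage numbers are ballpark): USB 2.0 hi-speed is 480 Mbit/s, a keyboard and mouse are low-speed devices sipping a fraction of their 1.5 Mbit/s, and even a digicam pulling a full-speed 12 Mbit/s transfer leaves over 95% of the bus idle. Nobody is starved for input bandwidth the way they're starved for frame rate.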

It makes sense to integrate these components because most people will not require any alternative. However, as long as a significant portion of the consumer base demands an improvement, integrating a component is a bad idea. Of course, as technology improves it becomes possible to put better-quality components in an integrated platform, but that doesn't mean you can sensibly extend this trend to conclude that EVERYTHING will become integrated. The only way this will be possible is if the cost savings from integration are so significant that people can buy complete systems and afford to throw them away when the next "new" technology comes out. This seems unlikely.

As Nvidia has UNDENIABLY demonstrated, the product cycle for video cards is very short. The economics of 100% integration and of video display devices simply do not match up. Video and CPU power are the two weakest links in the performance "chain" of PC computing. As such, there will continue to be constant pressure to improve these aspects of any system more than others (like Ethernet and sound). This pressure forces constant improvement and development, which precludes any wholesale flight toward integration.

Certainly, as technology continues to improve, the market dynamics will change. Third-rate, and even second-rate, add-in video devices will begin to disappear as the integrated options improve enough to address the needs of the consumers who demand those alternatives. However, as long as there is a market for "the best" (which, again, is most prominent with CPUs and video), integration will never completely dominate. As long as programmers can come up with interesting and engaging ways to use faster and better technology, there will always be a market for add-in devices of some kind. That is, by the way, why integrated CPUs will not just magically take over either: we will continue to demand new ones to do new things that the older ones cannot, on a shorter timeframe than integrated products allow.

In a nutshell: of course integration will occur, because it makes increasing sense economically at the low to middle end. Devices that do not inherently limit the usability of the platform will be the first to be integrated, because they can maintain their usefulness throughout the life of the system. However, it will never make sense at the high end, where the nature of demand is fundamentally different. To the extent that new software is developed that can use extra horsepower, completely integrated devices will continue to be inappropriate. And hey, if the technology changes and everyone can find a system they like with integrated everything that meets all their needs for the expected life of their system, then everyone wins anyway. But unless graphics chipsets become ultra-OVERPOWERED, there will always be a need for the opportunity to add in something new without replacing the whole system.

-A
 

The_Lurker

Golden Member
Feb 20, 2000
1,366
0
0
As it seems, this will most likely come true. Core logic and integration is the trend for almost everything - from appliances (toaster ovens, fridges) to cell phones (PDAs and cell phones combined) to toys. Even enthusiast products are heading somewhat toward integration. Many of us now use onboard LAN, onboard SCSI, or RAID cards. Are we all going to migrate to onboard video and onboard sound? Perhaps. Most likely.

There will, however, always be a market for pure hardware enthusiasts. The car market is quite mature, and the large majority go out and buy prebuilt cars - they don't need, or choose not, to buy their own engine, put in NOS fuel injection, big stereos, etc. - but there's still a market for it. That's the way things are: most of the PC market is made up of OEMs and integrated devices, but there will always be a small market for PC enthusiasts.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81
No, it won't happen anytime soon. First, the goal of integrated video today and in the near future will be cost reduction. You won't find cutting-edge GPUs soldered onto motherboards because they're too expensive (so is the memory, for that matter).

Second, video technology has a long, long way to go. Sure, some don't blink twice at integrated RAID controllers or 10/100 Ethernet, but those technologies have fully matured. Until each of us has our own Holodeck from Star Trek, the componentized platform we use today, where we can buy video cards separately and slap 'em in, will reign supreme.

There might be periods where video becomes integrated - namely, when innovation poops out. But when a lightbulb pops over some designer's head, off we go again into add-on components to get the job done.
 

dummy2001

Member
Dec 5, 2001
188
0
0
Great thread. First rule: anyone on this forum is a bad sample of what consumers want. Consumers want a 2GHz! (Celeron) and an 80GB! hard drive (5400 RPM) to play Bass Fishing 5 and check on their fantasy football. How many ordinary people, except gamers, really use the potential of their computers now? How many computers do you see advertised with 2GHz P4s and a 16MB ATI Rage or whatever, if not "4MB built-in AGP graphics"? Not only does integration lower costs, it also gets at the real barrier to commodification: reliability. Right now only one company sells reliable home computers, and they come in lots of pretty colors. Of course, Apple controls the hardware AND the OS. So once SiS and Via start selling integrated boxes, how long before they come installed with Linux and a proprietary front end? Right now it's Windows that does the integration of all the various pieces of hardware. Once the hardware is integrated, well, we'll all be renting our minutes of application use from MS over the internet by then anyway...
 

AA0

Golden Member
Sep 5, 2001
1,422
0
0
You need to think about why these companies are moving into integrated chipsets.
1st - Look at nForce: they charge quite a bit for the onboard features, but it's a value compared to buying them separately. Their profit margins are high, though - they can charge more at minimal cost.

2nd - Integrated video is a huge market. OEMs buy it like crazy; it accounts for 50% of chipset sales. And before nForce, who was in the business? Intel, which can easily be cleaned out, and S3? No competition. It's a massive, huge-profit-margin area with nobody to fight for it.

MS is jumping into the chipset business to save money; they know how much nVidia profits from the xbox.


As for an onboard replaceable video socket? Not going to happen. Memory running at 300MHz-plus isn't easy to put through a socket; it's just too fast. The memory timings that trip up so many motherboard companies will only be worse at 300+ MHz. There are enough unstable motherboards out there; I don't want unstable video cards too. It's the same reason we don't have replaceable chipsets: memory timings are crucial for each chip.
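
A back-of-envelope illustration of why (rough figures, not measured data): at 300 MHz, one clock period is about 3.3 ns. Signals in ordinary FR4 board traces propagate at very roughly 15 cm per ns, so a few extra centimeters of trace plus the impedance bump of a socket connector eat a meaningful chunk of a timing budget that's only a few nanoseconds to begin with - before you even account for skew between data lines.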
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
How about CPU/GPU integration? They are interdependent and require upgrading together to maximize performance anyway. Handling memory cost-effectively seems to be the only drawback currently.
 
Jul 1, 2000
10,274
2
0
It seems kind of logical that the CPU and the GPU will eventually share the same silicon. The ultimate goal will be a one-chip solution, where the CPU can handle damn near everything. Intel has already stated that this is a long-term goal.

I did not really mean that this was a conspiracy... I just wanted to discuss market trends.

I'll rename the thread. :)

I have a new theory - I think Creative is buying Permedia to diversify its holdings. The sound card market is drying up as onboard sound improves by leaps and bounds. Creative will need to get back into the video biz to sell more products in an increasingly competitive market.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
I think what this means is that more OEMs will skip the add-on video card in nearly all the computers they sell, because the new generation of onboard video will be able to satisfy a greater number of users and actually perform well enough for nearly all gaming/3D needs in the realm of 800x600 16-bit.

Death to the model 64!!!
 
Jul 1, 2000
10,274
2
0
Agreed - and this is part of the larger trend that will ultimately kill the PC hobbyist industry.

This thing will become standard equipment, like a VCR. Nobody (except for a few dorks that bought Heathkits) builds their own VCR.

Given that there will be one standard, all software will run uniformly from one unit to another. There will be no incentive to have a faster box, since the software runs the same.

This is the same reason nobody o/c's the xbox or the PS2 - what the hell would be the point?

By making us all equals, Microsoft could sell games and entertainment to the masses. From the ghetto to suburban purgatory - everyone equal in terms of their access to information. All loyal Microsoft customers.
 

Operandi

Diamond Member
Oct 9, 1999
5,508
0
0
<< This thing will become standard equipment, like a VCR. Nobody (except for a few dorks that bought Heathkits) builds their own VCR. >>

Comparing a PC to a VCR doesn't make sense. A VCR has fixed specifications for a reason: it was meant to do one thing and one thing only. Go outside those specifications and you have broken the standard - it's not a VCR anymore. A PC, by its very nature, is meant to be flexible. If you give the PC fixed standards, it's not a PC anymore but something else.
 

JellyBaby

Diamond Member
Apr 21, 2000
9,159
1
81


<< part of the larger trend that will ultimately kill the PC hobbyist industry >>

Pretty unlikely. As Operandi said, a personal computer is a general-purpose tool - and programmable, too. It will not become fixed, static, immutable in your lifetime. It will morph from time to time, but you'll always be able to expand or tinker with it to do new things.

Even many special-purpose devices still enjoy a strong hobby environment. Take speaker building: you'd think after all this time standardized speakers would have killed off that hobby, but it's still going strong.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Computers will not be turned into xboxes and stuff like that anytime soon. History shows that. For example, I read that when the railroad system was new, a long time ago, the people who owned canal businesses lobbied Congress to stop the "new" technology. Needless to say, we know what happened. ;)