Will the video card someday be non-existent???

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
I've come to believe the theory that someday integrated video will be every bit as powerful as the graphics cards that are out at the time. If I remember correctly, the video card was developed to handle 3D data that CPUs can't handle. So who's to say Intel/AMD can't spit out a 256-bit CPU with integrated 512-bit DDR graphics running at a true 400MHz bus (1600 quad pumped)? I mean, is it more a matter of no one having the balls to change the standard? Such a system wouldn't require a video card, hell, or even a sound card; the CPU would be able to handle the work of all these things simultaneously.

Just a thought I thought I'd throw out for discussion.

Obviously Microsoft would have to come out with Windows XP-512 ;-) but hey, it's all theoretical here
 

Belegost

Golden Member
Feb 20, 2001
1,807
19
81


<< If I remember correctly, the video card was developed to handle 3D data that CPUs can't handle. >>



You don't remember correctly. The video card was made to let computers output data to a CRT. The early MITS systems and the like had a little set of lights to output data, and could be connected to a printer/plotter to do the same thing (with the right printer card, of course). The video card was simply hardware that could interface with a CRT and produce an image.

I've owned computers going as far back as an IBM-issue 8088, and most of them have had video cards, and I wouldn't have dreamed of trying to run a 3D scene more complicated than a wireframe cube on any of them.



<< running at a true 400MHz bus (1200 quad pumped). >>



Do the math: 400 x 4 = 1600, using standard base-10 arithmetic in R.
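In rough numbers (a quick Python sketch of the arithmetic, taking the OP's hypothetical 400MHz quad-pumped bus as given; the 64-bit width is just my assumption for illustration, not any real product spec):

```python
# Back-of-the-envelope math for a hypothetical "true 400MHz, quad-pumped" bus.
# The 64-bit bus width is an assumption for illustration only.
base_clock_mhz = 400        # physical bus clock
transfers_per_clock = 4     # "quad pumped": four data transfers per clock
bus_width_bytes = 8         # assumed 64-bit wide bus

effective_mt_s = base_clock_mhz * transfers_per_clock   # 1600 MT/s, not 1200
peak_mb_s = effective_mt_s * bus_width_bytes            # 12800 MB/s peak

print(f"{base_clock_mhz} MHz x {transfers_per_clock} = {effective_mt_s} MT/s")
print(f"Peak bandwidth at 64-bit width: {peak_mb_s} MB/s")
```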

Truth be told, this is already reality: anyone can go buy an nForce motherboard and get built-in video that performs great.
The real reason that video cards exist is that there is a market for them: the programs run on a computer keep increasing in complexity, so the video cards keep increasing in complexity. Furthermore, the video card that is right for a gamer is significantly different from the video card that is best for video editing, which in turn is far different from the video card that is best for organizing spreadsheets. So it follows that video will remain a separate card as long as there is a significant difference in the abilities people need from their video cards.

I suppose if, somewhere in the future, a company creates a motherboard with built-in video that can play games, edit video, and render LightWave scenes with performance equal to any card dedicated to one of those areas, then we might see a significant trend toward that.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0
I envision the entire graphics system being included within the CPU core itself within the next decade or so. It's headed in that direction if you step back and look at the big picture. And this embedded system would be accessible via software.
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
Until video cards become a commodity product, they won't be integrated. I.e., if different people want different video cards, then they will not be integrated onto the CPU.
 

rimshaker

Senior member
Dec 7, 2001
722
0
0


<< Until video cards become a commodity product, they won't be integrated. I.e., if different people want different video cards, then they will not be integrated onto the CPU. >>



Looking at it from today's perspective, yeah, it doesn't seem likely. But mark my words... within this decade :)
 

MagnetStone

Junior Member
Apr 18, 2002
5
0
0
If they wanted to, they could put a graphics processor in with the main processor, but then it would be too little for some users and more than enough for others, so it's better to leave the choice in the user's hands. And when they do start to include one, there will definitely be the option of using another (better) one.

Because of demand and innovation, primary processors keep gaining multimedia processing power (MMX, SSE, 3DNow!, AltiVec, VIS, MAX, MDMX, SH4, etc.).

By adding video L1/L2/L3 cache and incorporating more graphics power, they could minimize the need for an external graphics processor/card. But then we would end up with a lot of valuable CPU cycles wasted fetching and waiting for video data.

That was mainly about OUTPUT, but now about INPUT...

This is where we are far behind, or KEPT behind, and not only in video but also in audio. Among the many reasons: less research, limited availability (and another might be... "it's too much power for regular people").

If that had been the chosen path (incorporating video/audio processing units), or if it will be, then we could use the computer's assistance to see and hear our environment: objects, people and their movements, the tiniest changes in facial expression, and so on, through its eyes (a CCD camera) and ears (a microphone), then process that data effectively and quickly and take the proper response (computer-driven planes/cars, personal assistants, etc.).

Of course, then someone will have to introduce a video/audio (recognition/tracking/tracing) processing unit, or an ISA (instruction set architecture), that is directly coupled to its video/audio input interface (CCD camera, microphone) and also to the main processor through something like a HyperTransport or massively parallel processing bus. If not on a parallel bus, then at least on a separate card with a video/audio recognition chip on a shared bus.

Yeah... yeah... I know... it's like fiction, but that is the dream for a lot of us. So, not impossible.
 

arcas

Platinum Member
Apr 10, 2001
2,155
2
0
I think what we'll see is the continued integration of components into the core chipsets and CPU. Further, we'll see the line separating functions traditionally relegated to the core chipset move, so that more of those functions are included in the CPU. We're already seeing this in the embedded/low-power world with CPUs like the Geode. We're seeing this integration occur at the mainstream chipset level too (i810/815, nForce, and at least one from VIA come to mind immediately). We're even seeing it happen at the very high end, where the AMD 'Hammer' will include some memory circuitry traditionally located on the northbridge chip. In time, I think we'll see video cards go away unless you need something very specialized.

It used to be that IDE cards were a must. Now IDE is a standard component on most every mainstream chipset out there. Anybody remember the dedicated hard drive compression cards they used to sell (I think they were called "Stacker")? Anybody still using one today in a modern system? Most likely not. Hardware data compression isn't necessary anymore. Remember when the only way to watch DVDs reliably on a computer was to have a dedicated MPEG2 card? Today, many video chips/cards implement features that off-load the most computationally-intensive parts of MPEG2 decoding. Dedicated MPEG2 decoding hardware is essentially obsolete. How much longer before the majority of video chips offer some sort of video encoding hardware? 3 years?

There continues to be a progression towards more and more integration. In time, as the specs of on-chip/chipset devices catch up to the specs of the stand-alone cards, the mainstream market will have less and less reason to buy dedicated cards. How long will it be? Good question. But it will happen.

(BTW, it's interesting to note that the i860 RISC chip of the mid/late 1980s was originally intended to be a video chip until someone realized that its floating-point capabilities were far beyond most existing CPUs and that there might be a bigger market for it as a number cruncher instead.)

 

rimshaker

Senior member
Dec 7, 2001
722
0
0


<< Anybody remember the dedicated hard drive compression cards they used to sell (I think they were called "Stacker")? >>



Ooh wow, man... :) I remember the card wasn't necessary, but it was an option to really speed up operations on stacked drives. Thanx for the flashback, I feel so young again! :D
 

Shalmanese

Platinum Member
Sep 29, 2000
2,157
0
0
As I said, unless Video Cards become a commodity, they will never be integrated into a CPU.

IDE controllers and MPEG-2 decoders really don't have any outstanding features that would make you pick one over the other, so they have been absorbed. As long as person X wants 100 fps in Ultra Quake Tournament 3000 and person Y only wants to do spreadsheet work in Excel 2010, video cards will still be an add-on.
 

Jerboy

Banned
Oct 27, 2001
5,190
0
0


<< I've come to believe the theory that someday integrated video will be every bit as powerful as the graphics cards that are out at the time. If I remember correctly, the video card was developed to handle 3D data that CPUs can't handle. So who's to say Intel/AMD can't spit out a 256-bit CPU with integrated 512-bit DDR graphics running at a true 400MHz bus (1200 quad pumped)? I mean, is it more a matter of no one having the balls to change the standard? Such a system wouldn't require a video card, hell, or even a sound card; the CPU would be able to handle the work of all these things simultaneously.

Just a thought I thought I'd throw out for discussion.

Obviously Microsoft would have to come out with Windows XP-512 ;-) but hey, it's all theoretical here
>>



You didn't remember correctly. With the exception of DVI-interfaced displays, the signal needs to be in analog form. A video card is basically a D/A converter with some buffering on the digital side to hold the information for the D/A converter to process. A modern video card contains all kinds of DSP/ASIC chips that do all kinds of 3D rendering in addition to the D/A converter.
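To put a rough number on that D/A job (a little Python sketch; the display mode is just an example I picked, and real modes add blanking intervals on top):

```python
# Rough estimate of how many pixels per second the RAMDAC has to push out as analog.
# 1024x768 at 85Hz is an illustrative mode, not a claim about any particular card.
width, height, refresh_hz = 1024, 768, 85

pixels_per_second = width * height * refresh_hz
print(f"~{pixels_per_second / 1e6:.0f} million pixels/s converted to analog, continuously")
# That steady, hard-real-time scan-out is why it lives in dedicated hardware
# rather than in a software loop on the CPU.
```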
 

Jackhamr60504

Member
Nov 12, 2001
96
0
0


<< As I said, unless Video Cards become a commodity, they will never be integrated into a CPU.

IDE controllers and MPEG-2 decoders really don't have any outstanding features that would make you pick one over the other, so they have been absorbed. As long as person X wants 100 fps in Ultra Quake Tournament 3000 and person Y only wants to do spreadsheet work in Excel 2010, video cards will still be an add-on.
>>



That is only true while the prices of card X and card Y differ. If the price of the X card were very near the price of the Y card, everyone would use the faster X card.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
I would believe in this.

Intel would buy Matrox, still keep them selling cards, but take the G400 2D core and integrate it onto Prescott or something.

Then have the CPU use an integrated 128-bit bus to a 32MB DDR333 module, under the IHS.

This would give the CPU *2D* capabilities.

Matrox 2D integrated onto your CPU core... how wonderful. Now we're back to the days of dedicated 3D accelerators. Anyone relish that era? A G200 paired up with SLI Voodoo2s? Ah... the good old days...

I think this would be win/win for everyone. Don't you? Imagine a mobo with VGA and DVI port.

AMD would be doomed..
 

mellondust

Senior member
Nov 20, 2001
562
0
0
My dad worked on a research team for Intel last year that studied combining the video and CPU in one. He wasn't allowed to say exactly what they came up with, other than that the ultimate conclusion was that it would be too hard to compete with the likes of NVIDIA and others and deliver something competitive. They decided that for now they would let the graphics industry continue to provide the cutting-edge graphics.
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
...Mellondust

Note, I said G400 *2D* core...

Now *that* would kick butt.

Remember the days of independent 3D and 2D graphics cards?

Well, now we would just have the 3D card. And 2D image quality wouldn't be tied to the video card anymore.

Which could be a blessing, or a curse....
 

PowerMacG5

Diamond Member
Apr 14, 2002
7,701
0
0
I highly doubt that video cards will become non-existent. Video cards give the user the option for expansion, just as a slotted/socketed CPU gives the user the option for expansion. If they were to completely integrate video into a motherboard and not use cards, then to get better graphics you would need to buy a new motherboard. Some of you are probably thinking, "Why don't they make a GPU socket on the motherboard, so instead of replacing the whole card all you do is replace the GPU like you do a CPU?" That might be a good idea, but again you have performance limits. With a video card, a company like ATI or NVIDIA can add different chips, transistors, etc. to the card, but if you use a socket on a motherboard you are limited to changing only the GPU. The video card is as complex as, if not more complex than, a CPU. A video card has to do different things than a CPU. The CPU of your computer is basically doing calculations, and that's it (I'm keeping it simple right now), just as the GPU is. The video card has many different components on it that get replaced as you upgrade the video card.
 

imgod2u

Senior member
Sep 16, 2000
993
0
0
For whatever you can build integrated, you can build a better separate card. With all the technology it takes to put the GPU and CPU together, you could make a better combination out of a dedicated CPU and a dedicated GPU. It would always be that way if the GPU and CPU were one chip. The only way I see it ever changing is if the market moved to mass-redundancy systems: having a single computer on one small chip (CPU, memory, HD, video, etc.), and then having those chips be able to link with each other on a large scale. That way, if you want a more powerful computer, you just add more chips. As technology improves and better chips come out, you can simply replace the old ones with new ones and not need to do anything to your computer as a whole.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
All I'm getting at is: what if there were a 512-bit 800MHz chip out there that had DDR650 on it like the Ti 500? Theoretically (and this is with a lot of theory), wouldn't it be able to do all of the video and audio operations with just integrated technology, and achieve the same performance?
 

imgod2u

Senior member
Sep 16, 2000
993
0
0


<< All I'm getting at is: what if there were a 512-bit 800MHz chip out there that had DDR650 on it like the Ti 500? Theoretically (and this is with a lot of theory), wouldn't it be able to do all of the video and audio operations with just integrated technology, and achieve the same performance? >>



You mean a CPU? The "bitness" of a CPU is much different from the "bitness" of a rendering pipeline. A rendering pipeline merely has to be wide enough to do operations on a certain number of bits, whether that's one 256-bit operation or four 32-bit operations. A 512-bit CPU would be totally different. CPUs are huge collections of logic circuits, registers and pipelines that don't just run data through a set number of operations; they actually step through instructions. They work completely differently from a GPU, which takes data and runs it through a fixed rendering pipeline (well, recently there has been a move toward programmability, but the instructions still have to be pre-programmed into the GPU). For the CPU to do the work of rendering, it has to run the rendering model (DX-like or OpenGL-like) in software, using its own logic instruction set, which, suffice it to say, is a lot slower than the native fixed pipeline that crunches data on a GPU.
Also, a video card contains far more than just a GPU. Its first and main purpose is to display the information the CPU sends it onto the screen. A RAMDAC is used to convert the digital image into an analog one to be displayed. In the case of digital monitors, the data still needs to be assembled and formed into an image. You need SOME circuitry to output the image. A CPU is a collection of logic circuits; it doesn't have any interface to the outside world aside from the chipset.
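To make the "run the rendering model in software" point concrete, here's a toy sketch (hypothetical code, not any real driver or API): every vertex transform becomes ordinary instructions churned through the CPU's general-purpose pipeline, where a GPU would push the same math through a dedicated hardware stage.

```python
# Toy software vertex transform: a 4x4 matrix applied to each vertex in a plain
# Python loop. On a CPU this is just general-purpose instructions (maybe a few
# SIMD ops); on a GPU the equivalent math is a fixed hardware pipeline stage.

def transform_vertex(m, v):
    """Multiply a 4x4 matrix (nested lists) by a 4-component vertex [x, y, z, w]."""
    return [sum(m[row][col] * v[col] for col in range(4)) for row in range(4)]

# Identity matrix and a couple of made-up vertices, purely for illustration.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
vertices = [[1.0, 2.0, 3.0, 1.0], [4.0, 5.0, 6.0, 1.0]]

transformed = [transform_vertex(identity, v) for v in vertices]
print(transformed)
```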
 

Mday

Lifer
Oct 14, 1999
18,647
1
81
Intel actually tried to do this by building video controllers into their chipsets. Think about a laptop: the cheap ones have their video controller built into the core logic chipset, i.e., Intel's. So basically, if you consider that, as well as the goal of system-on-a-chip, a logical conclusion is that yes, video cards will become obsolete.

However, considering the computing strength of GPUs, replacing GPUs with CPUs (going backwards) will require much more powerful CPUs. And of course the US govt will block the export of such systems, lol.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Well, actually, it wasn't Intel's idea; SiS were the first to do that. Loooooong ago, around 1997.

The reason why we still have separate graphics cards (and probably always will) is time to market. Graphics engines and RAM speeds evolve far faster than the much more complicated chipset cores ever will. Throughout the five years we've had integrated graphics, we've always seen previous-generation graphics cores pulled into chipsets. Pair that with a shared-memory architecture, and the outcome is that chipset-integrated graphics are always two or three steps behind the current discrete graphics boards. This is also why the latter won't go away.
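A rough back-of-the-envelope comparison shows how big the shared-memory penalty is (a Python sketch with ballpark 2002-era numbers I picked for illustration, not exact specs of any product):

```python
# Peak memory bandwidth: integrated graphics sharing system RAM vs. a discrete
# card with its own memory. Figures are ballpark 2002-era values, not exact specs.

def bandwidth_gb_s(clock_mhz, transfers_per_clock, bus_bits):
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

shared_sdram = bandwidth_gb_s(133, 1, 64)    # PC133 system RAM, shared with the CPU
local_ddr    = bandwidth_gb_s(250, 2, 128)   # 128-bit DDR on a discrete card

print(f"Shared system RAM : ~{shared_sdram:.1f} GB/s (and the CPU competes for it)")
print(f"Dedicated card RAM: ~{local_ddr:.1f} GB/s, all for the GPU")
```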

What has disappeared almost completely is the market for the most basic $20 graphics cards. Practically no one bothers to make those anymore.

regards, Peter
 

L3Guy

Senior member
Apr 19, 2001
282
0
0
IMHO, the driving force in GPU integration with a CPU will be bus bandwidth constraints.
At some future date, the bandwidth and latency of having the GPU off chip will cause it to have lower performance than an integrated CPU / GPU.
If it's feasible to manufacture the combined chip at a reasonable cost, then it will become standard.
I personally think we will see that sooner in the console market than the PC market.
The Xbox has only a 733MHz P3 and it rocks. Make the same combo on a 0.09 micron process, on one chip, and costs would drop and speed would jump.

Just my 2 cents.

Doug
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
... but still, until you manage to slap that future GPU onto a CPU die and make the two use the same RAM controller, one or two even more advanced generations of separate GPUs will have become available.

The communication bandwidth between CPU and graphics card hardly limits graphics performance - and that'll be improved by new bus systems like HyperTransport anyway. What really matters is the GPU's available local bandwidth during the render process - and you'll always have more of that when the GPU is a separate entity with its own private RAM.
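Rough numbers make the point (another quick sketch; the AGP 4x and local-DDR figures are ballpark values I'm assuming for illustration, not exact specs):

```python
# The CPU-to-graphics-card link vs. the GPU's own local memory bandwidth.
# Ballpark 2002-era figures, for illustration only.
agp_4x_gb_s    = 66e6 * 4 * 4 / 1e9      # 66MHz clock, 4x strobed, 32-bit wide: ~1.1 GB/s
local_ddr_gb_s = 250e6 * 2 * 16 / 1e9    # 128-bit DDR at 250MHz: ~8 GB/s on-card

print(f"AGP 4x link  : ~{agp_4x_gb_s:.1f} GB/s across the slot")
print(f"GPU local DDR: ~{local_ddr_gb_s:.1f} GB/s on the card")
# The render process hammers the local RAM, not the slot, so the slot is rarely
# the limiting factor.
```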

regards, Peter