The next (r)evolutionary phase in Video "cards"?

SagaLore

Elite Member
Dec 18, 2001
24,036
21
81
I hypothesize that there may come a day when video cards no longer exist. Think about this: the bandwidth is completely reliant on the bus. Other components on the motherboard are getting streamlined; now the memory controller is on the CPU, and what were two chips (northbridge/southbridge) is now one.

Instead of upgrading your video card, you upgrade the video CPU and video RAM. It's on the board! The video CPU and main CPU are bridged with an ultra-fast dedicated bus. The video RAM is completely removable and upgradeable via slots. The video connector is part of the motherboard along with the other connectors, never to be changed.

This way people can upgrade their video CPU, expand their RAM, put in faster RAM, etc.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
You're about the hundredth person I've seen "invent" this idea. It gets repeated every few weeks in here.

The biggest problem is that socketed processors and RAM are invariably slower than hardwired chips (the noise margins are never as good, and with soldered parts you can usually control the trace lengths much more tightly). If the pace of development of video cards starts to slow down, this might be more feasible.

Other problems:

Video cards change *really* fast. Doing something like this ties you to a particular type of 'video CPU' socket, and probably a particular kind of RAM. With an add-on video board, the manufacturer can build an entirely new GPU, or switch from DDR to GDDR3 (to whatever they're using next year) without you needing to upgrade your motherboard (as long as the sockets don't change; we've been using AGP for years, and PCIe is taking over and looks to be good for at least 2-3 years). Plus, good luck getting all the GPU makers to agree on the standards for this.
 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
I could see a low-end chip integrating the chipset/CPU/video on one die, but not a high-end chip; way too many transistors.
 

BEL6772

Senior member
Oct 26, 2004
225
0
0
Ummm, isn't this what ATI, Nvidia, and Intel are all doing already? Don't they all have graphics engines embedded in chipsets that are permanently attached to the motherboard? Most of these solutions are aimed at users who don't place a premium on 3D performance. As stated above, the pace at which high-end GPUs are evolving dictates that platforms targeted at the high-end market provide a slot for easy upgrades.
 

EarthwormJim

Diamond Member
Oct 15, 2003
3,239
0
76
Originally posted by: BEL6772
Ummm, isn't this what ATI, Nvidia, and Intel are all doing already? Don't they all have graphics engines embedded in chipsets that are permanently attached to the motherboard? Most of these solutions are aimed at users who don't place a premium on 3D performance. As stated above, the pace at which high-end GPUs are evolving dictates that platforms targeted at the high-end market provide a slot for easy upgrades.

I think the point he's trying to make is that the chipsets shouldn't be permanently attached to the motherboard, but should be upgradeable like CPUs and memory.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Er ... isn't the x16 PCIe slot a "fast dedicated" bus? Besides, it clearly makes more sense to tie the GPU and its ludicrous-speed RAM together on one board, and have the much more rarely used GPU-CPU communication travel through a frequency-inhibiting slot.

It's OK the way it is. It's not like all those companies have been missing something bleedingly obvious for 20 years now.
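The bandwidth asymmetry behind this argument can be made concrete with a rough back-of-envelope calculation. The figures below are approximate era-typical numbers, not specs quoted from the thread: PCIe 1.0 carries about 250 MB/s per lane per direction, while a 2004-era high-end card might pair a 256-bit bus with GDDR3 at roughly 1000 MHz effective.

```python
# Back-of-envelope comparison: slot bandwidth vs. on-card memory bandwidth.
# All figures are rough, era-appropriate estimates, not exact specs.

def bus_bandwidth_gb_s(width_bits, effective_clock_mhz):
    """Peak bandwidth of a parallel memory bus in GB/s:
    (width in bytes) * (effective clock in Hz)."""
    return width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# PCIe 1.0: ~250 MB/s per lane per direction, 16 lanes
pcie_x16 = 250e6 * 16 / 1e9

# Hypothetical high-end card of the period: 256-bit GDDR3, ~1000 MHz effective
gddr3_local = bus_bandwidth_gb_s(256, 1000)

print(f"PCIe x16 slot:    {pcie_x16:.1f} GB/s")
print(f"On-card GDDR3:    {gddr3_local:.1f} GB/s")
print(f"Local RAM is ~{gddr3_local / pcie_x16:.0f}x faster than the slot")
```

Under these assumptions the local memory bus moves roughly eight times more data than the slot, which is why keeping the GPU and its RAM on one board, with only the rarer GPU-CPU traffic crossing the slot, is the sensible partition.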
 

Calin

Diamond Member
Apr 9, 2001
3,112
0
0
CPU sockets have changed repeatedly since the Pentium era: Slot A, Socket A, Socket 754 and Socket 939 (and maybe Socket 940 for the early Athlon FX), plus Slot 1, Socket 423, Socket 478 and now Socket 775. That's eight changes in eight (or more like six) years. Upgrading a CPU was a "risky" business in the CPU world, and it would be in the VPU world too. Too much change for a valid upgrade path. Also, video cards have gone through several types of memory over the years: VRAM, SDRAM, DDR and now GDDR3. Not to mention wildly different speeds and wildly different data-path widths.
Also, as was mentioned, the slot/socket connections will create difficulties: longer trace distances, trace lengths that are harder to equalize, signal reflections at every connection point, extra tuning of memory access for each memory type, and so on and so on.
 

imported_kouch

Senior member
Sep 24, 2004
220
0
0
Or maybe when we reach DirectX 24 and an 8-core 30nm CPU, we could just dedicate a couple of the cores to DX and add DX instructions to the instruction set.
 

Vee

Senior member
Jun 18, 2004
689
0
0
Previous revolutions in video cards, ever since the 3dfx chipset was first integrated onto the video card, have all happened on board the video card. I'm very sure that will continue. No reason at all to get excited about bandwidth toward the motherboard.
 

FreemanHL2

Member
Dec 20, 2004
33
0
0
I have thought about this myself, but it is impossible. For a start, there is no reason you would integrate the card into the motherboard; it would take up more room than it does now.

Secondly, if you were able to upgrade the RAM on the card, it could be a potentially dangerous process...
For instance, putting 256MB of RAM onto a TNT2 would simply destroy the GPU. You would need to invent a new type of RAM for every new model of card to ensure that the card never operates beyond its bandwidth. This would lead to several problems, especially different companies creating RAM that isn't compatible with cards made by other manufacturers... not to mention a confusing array of RAM dedicated to specific GPUs... It could get so complicated that you start thinking, "I hope this never happens." :)
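The compatibility problem being described boils down to each GPU's memory controller only speaking certain RAM types, widths, and clock ranges. A minimal sketch of what a per-GPU compatibility table would look like (every name and figure here is invented for illustration, not taken from real datasheets):

```python
# Hypothetical sketch: socketed video RAM would need per-GPU compatibility
# rules, because each memory controller supports only specific RAM types
# and clock ranges. All entries below are made-up illustrative values.

GPU_SPECS = {
    # gpu name: (bus width in bits, max effective clock in MHz, RAM types)
    "TNT2-like":        (128, 200,  {"SDRAM"}),
    "GeForce6800-like": (256, 1200, {"GDDR3"}),
}

def module_compatible(gpu, ram_type, ram_clock_mhz):
    """A module only works if the GPU's controller speaks its type
    and can be driven at (or below) the module's rated clock."""
    width, max_clock, types = GPU_SPECS[gpu]
    return ram_type in types and ram_clock_mhz <= max_clock

print(module_compatible("TNT2-like", "GDDR3", 1000))         # wrong RAM type
print(module_compatible("GeForce6800-like", "GDDR3", 1000))  # fits the controller
```

The point of the sketch: every new GPU generation would add a new row with a new type/clock combination, so the "confusing array of RAM dedicated to specific GPUs" falls straight out of the table growing with each product cycle.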
 

The Land of Smeg

Junior Member
Jan 23, 2005
15
0
0
Next Revolutionary Phase in Video Cards: Horizontally Mounted Cards.

Allows for CPU-size heatsinks for the GPU... GPUs are the most rapidly developing components in a computer: a good CPU will stay modern for a good 3-5 years, while a good GPU will keep you modern for only 1-2 years. It's always the video card that you want to upgrade first, because it's the first to become obsolete.

Horizontally mounted video cards would allow more space for a large heatsink, to meet the growing heat dissipation requirements of a GPU. The card could also be better fixed onto the motherboard with more (and more evenly spaced) mounting points, so if you have a heavy heatsink the card won't be pulled or bent down by gravity.

And it will no doubt offer better airflow.