When will we see a true 512 Bit GPU?

sharkeeper

Lifer
Jan 13, 2001
The Matrox Parhelia is supposed to be 512-bit. We all know it has/had sub-Ti4600 performance where it counts.

What about ATi/nVidia? Anyone know?

-DAK-
 

DaveSimmons

Elite Member
Aug 12, 2001
Exactly. A 512-bit memory path may help performance if the GPU can take advantage of it, but "512-bitness" by itself is meaningless. It might make more sense to keep 256-bit and just up the memory clock more, or have ultrafast cache memory, or onboard SLI with 2 GPUs and 2 memory pools, or ....?
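
To put rough numbers on that (a back-of-envelope sketch; the clocks below are made-up examples, not real card specs), peak bandwidth is just bus width times effective memory clock, so a 256-bit bus at twice the clock matches a 512-bit bus exactly:

```python
# Back-of-envelope: peak bandwidth = (bus width in bytes) x effective clock.
# The clocks below are made-up examples, not real card specs.

def peak_bandwidth_gbs(bus_bits: int, effective_clock_mhz: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9

# A 256-bit bus at twice the effective clock matches a 512-bit bus exactly:
print(peak_bandwidth_gbs(256, 1000))  # 256-bit @ 1000 MHz effective -> 32.0 GB/s
print(peak_bandwidth_gbs(512, 500))   # 512-bit @  500 MHz effective -> 32.0 GB/s
```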
 

sharkeeper

Lifer
Jan 13, 2001
Originally posted by: DaveSimmons
Exactly. A 512-bit memory path may help performance if the GPU can take advantage of it, but "512-bitness" by itself is meaningless. It might make more sense to keep 256-bit and just up the memory clock more, or have ultrafast cache memory, or onboard SLI with 2 GPUs and 2 memory pools, or ....?

The cache is probably the most effective of all of them. Video card memory is already running at really high speeds - in some cases, over 500 MHz (actual)! Pretty soon, TMS engineering costs and effort will outstrip the cost of the memory itself!

Is it possible (outside of a manufacturing environment) to gauge GPU load? Sort of like a GPU task manager? Perhaps a form of HT for GPUs could help things out.

-DAK-
 

DaveSimmons

Elite Member
Aug 12, 2001
Originally posted by: sharkeeper
Is it possible (outside of a manufacturing environment) to gauge GPU load? Sort of like a GPU task manager? Perhaps a form of HT for GPUs could help things out.
Interesting question -- I suppose one easy way to guess whether a GPU is processor-limited or bandwidth-limited is to play with the clocks for each using an overclocking tool. Underclock the memory and see if framerates drop, overclock and see if they rise, and do the same with the GPU core speed while holding the memory clock constant.
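
As a rough sketch of that bookkeeping (all numbers hypothetical; a real test would use an overclocking tool and a benchmark run at each setting):

```python
# Hypothetical numbers throughout: estimate whether a card is core- or
# bandwidth-limited from how framerate scales with each clock.

def scaling_sensitivity(fps_base, fps_changed, clk_base, clk_changed):
    """Ratio of relative FPS change to relative clock change.
    Near 1.0: FPS tracks that clock closely (likely the bottleneck).
    Near 0.0: FPS barely moves (that clock isn't the limit)."""
    return ((fps_changed - fps_base) / fps_base) / ((clk_changed - clk_base) / clk_base)

# Underclock memory 300 -> 270 MHz, FPS drops 60 -> 55:
mem = scaling_sensitivity(60, 55, 300, 270)   # ~0.83, looks bandwidth-limited
# Underclock core 300 -> 270 MHz, FPS drops 60 -> 59:
core = scaling_sensitivity(60, 59, 300, 270)  # ~0.17, core has headroom
print(f"memory sensitivity: {mem:.2f}, core sensitivity: {core:.2f}")
```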

 

AgaBoogaBoo

Lifer
Feb 16, 2003
Originally posted by: DaveSimmons
Is it possible (outside of a manufacturing environment) to gauge GPU load? Sort of like a GPU task manager? Perhaps a form of HT for GPUs could help things out.
Interesting question -- I suppose one easy way to guess whether a GPU is processor-limited or bandwidth-limited is to play with the clocks for each using an overclocking tool. Underclock the memory and see if framerates drop, overclock and see if they rise, and do the same with the GPU core speed while holding the memory clock constant.

I wonder if dual-channel video cards will ever make their way into the market, because if you think about it, video cards today have things many motherboards didn't have years ago. Doing that would let them use older RAM modules while maintaining speed and cutting the cost of more expensive RAM, letting them focus more on the GPU itself rather than the memory...
 

Mingon

Diamond Member
Apr 2, 2000
Originally posted by: AgaBoogaBoo
I wonder if dual-channel video cards will ever make their way into the market, because if you think about it, video cards today have things many motherboards didn't have years ago. Doing that would let them use older RAM modules while maintaining speed and cutting the cost of more expensive RAM, letting them focus more on the GPU itself rather than the memory...

Most cards since the TNT have used 128-bit paths for memory - effectively dual channel (when compared to PCs) - and now we have 256-bit, which is quad channel. But even the GeForce 3 used 4 x 32-bit accesses.
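
To put numbers on that comparison (purely illustrative; a desktop PC memory channel is 64 bits wide):

```python
# Purely illustrative: a video card bus expressed as 64-bit "PC-style"
# channels, and the GeForce 3-style split into 32-bit slices.

PC_CHANNEL_BITS = 64

for bus_bits in (128, 256):
    channels = bus_bits // PC_CHANNEL_BITS
    print(f"{bus_bits}-bit bus = {channels} x 64-bit channels")

# GeForce 3 crossbar: the 128-bit bus split into independent 32-bit controllers
print(f"128-bit crossbar = {128 // 32} x 32-bit memory controllers")
```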
 

Lonyo

Lifer
Aug 10, 2002
In AA situations, cards are quite clearly bandwidth-limited (as tests I've seen somewhere have shown, I think), so increasing the memory bandwidth does provide a performance increase. This only applies in 1600x1200 8x AA-type situations, though.
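
A back-of-envelope sketch of why (assuming 32-bit colour plus 32-bit Z per sample and one write per sample per frame; real hardware with colour/Z compression does far better, so this is only illustrative):

```python
# Rough framebuffer traffic at 1600x1200 with multisample AA. Assumes
# 4 bytes colour + 4 bytes Z per sample, one write per sample per frame;
# ignores texture reads and colour/Z compression, so real numbers differ.

def framebuffer_gbs(width, height, samples, fps):
    bytes_per_sample = 4 + 4  # 32-bit colour + 32-bit depth
    return width * height * samples * bytes_per_sample * fps / 1e9

print(framebuffer_gbs(1600, 1200, 1, 60))  # no AA -> ~0.9 GB/s
print(framebuffer_gbs(1600, 1200, 8, 60))  # 8x AA -> ~7.4 GB/s
```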
 

BoomAM

Diamond Member
Sep 25, 2001
Originally posted by: sharkeeper
The Matrox Parhelia is supposed to be 512-bit. We all know it has/had sub-Ti4600 performance where it counts.

What about ATi/nVidia? Anyone know?

-DAK-
The Matrox Parhelia was and is a 512-bit GPU; however, it had sub-Ti4200 performance because they didn't clock it very high, didn't include any memory optimisations, and lacked any type of compression.
 

codehack2

Golden Member
Oct 11, 1999
Originally posted by: sharkeeper
The Matrox Parhelia is supposed to be 512-bit. We all know it has/had sub-Ti4600 performance where it counts.

What about ATi/nVidia? Anyone know?

-DAK-

"Bits" is such a suggestive term... what are you referring to? Vertex/pixel shader engines? The memory interface? I'd suggest reading pages 3 and 4 of Anand's Parhelia preview to find out how Matrox came up with the "512" in the Parhelia name... after a quick glance, you should soon realise that NV and ATI already have 512-bit GPUs out (a la NV3x and R3xx).

CH2
 

Pete

Diamond Member
Oct 10, 1999
Did you even read this thread, vc? Sure, some enterprising guy or gal in marketing will slap the term "1024 bit" on an upcoming GPU, but that'll be as meaningless as Apple's classification of their G4 and G5 towers as "supercomputers."
 

codehack2

Golden Member
Oct 11, 1999
Originally posted by: Pete
Did you even read this thread, vc? Sure, some enterprising guy or gal in marketing will slap the term "1024 bit" on an upcoming GPU, but that'll be as meaningless as Apple's classification of their G4 and G5 towers as "supercomputers."

Don't feed the trolls, Pete... VC is a floater that needs to be flushed... he's mass-spamming the video forum with stupidity, the likes of which I haven't seen around here in quite some time.

CH2