what's next for video cards?

hahher

Senior member
Jan 23, 2004
295
0
0
Aside from speed and memory increases, what's next? Will we just get higher versions of pixel & vertex shaders, or will there be another aspect that video cards will offload from the CPU? How will shaders be different from today's in 5 years?

Will pipelines just keep increasing until there's a 1:1 ratio with monitor resolution?
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
NVIDIA is just starting to move into hardware video encoding now; I would imagine in a few years it will be a nice feature. Hell, it might even be nice now, but it's not activated on NV40 to bench yet :p
 

hahher

Senior member
Jan 23, 2004
295
0
0
Originally posted by: Acanthus
NVIDIA is just starting to move into hardware video encoding now; I would imagine in a few years it will be a nice feature. Hell, it might even be nice now, but it's not activated on NV40 to bench yet :p

Hmm, I thought of that as more of a temporary thing, the same way there used to be the Hollywood decoder card for DVD, but now it's all done on the CPU.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
The CPU requirements for encoding HDTV-style video are very high... it can take days to encode a single movie.

While it is a "niche market" feature, I would think that offloading that CPU load will be a big deal in the future.

Hell, decoding high-resolution WMVs can require a CPU of up to 3.0GHz (or the PR-rating equivalent) today.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
"One of the coolest things that VS3.0 offers is something called instancing. This functionality can remove a lot of the overhead created by including multiple objects based on the same 3d model (these objects are called instances). Currently, the geometry for every model in the scene needs to be setup and sent to the GPU for rendering, but in the future developers can create as many instances of one model as they want from one vertex stream. These instances can be translated and manipulated by the vertex shader in order to add "individuality" to each instance of the model. To continue with our previous example, a developer can create a whole forest of trees from the vertex stream of one model. This takes pressure off of the CPU and the bus (less data is processed and sent to the GPU)." - AnandTech
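To put rough numbers on the savings the quote describes, here is a minimal sketch with made-up sizes for vertices and per-instance matrices; no real graphics API is modeled.

```python
# Sketch of the bandwidth savings from VS 3.0-style geometry instancing.
# All sizes are illustrative assumptions, not real driver figures.

VERTEX_SIZE = 32      # bytes per vertex (position + normal + UV, say)
TRANSFORM_SIZE = 64   # bytes per 4x4 per-instance transform matrix

def bytes_without_instancing(verts_per_model, num_instances):
    # Every instance's full geometry is set up and sent to the GPU.
    return verts_per_model * VERTEX_SIZE * num_instances

def bytes_with_instancing(verts_per_model, num_instances):
    # One master copy of the geometry, plus a small transform per instance.
    return verts_per_model * VERTEX_SIZE + TRANSFORM_SIZE * num_instances

# A "forest" of 1000 trees, each tree model having 5000 vertices:
naive = bytes_without_instancing(5000, 1000)   # 160,000,000 bytes
instanced = bytes_with_instancing(5000, 1000)  # 160,000 + 64,000 = 224,000 bytes
print(naive // instanced)                      # roughly 714x less data over the bus
```

The exact ratio depends entirely on the assumed sizes, but the shape of the win is the point: geometry cost is paid once, and each extra instance costs only a transform.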
 

PCTweaker5

Banned
Jun 5, 2003
2,810
0
0
Originally posted by: RussianSensation
"One of the coolest things that VS3.0 offers is something called instancing. This functionality can remove a lot of the overhead created by including multiple objects based on the same 3d model (these objects are called instances). Currently, the geometry for every model in the scene needs to be setup and sent to the GPU for rendering, but in the future developers can create as many instances of one model as they want from one vertex stream. These instances can be translated and manipulated by the vertex shader in order to add "individuality" to each instance of the model. To continue with our previous example, a developer can create a whole forest of trees from the vertex stream of one model. This takes pressure off of the CPU and the bus (less data is processed and sent to the GPU)." - AnandTech

It's things like this that I don't want from new video cards. I want the GPU to be able to process every single model without faking something, even if it looks as if it's real; that's just the way I am about it. To me that's like a 4-banger being as fast as a V8, and that really hurts.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
I think that eventually we will see actual CPUs built onto the cards, with their own memory controllers and everything, completely relieving the main motherboard CPU of any geometry processing. Video cards will, in a sense, become a little second computer inside your big computer.
 

PCTweaker5

Banned
Jun 5, 2003
2,810
0
0
That would be awesome. So you're pretty much saying that video cards will no longer be CPU limited? Well, for the most part, right?
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Eventually, they say, textures will be created on the fly... there won't be textures installed on your computer, exactly; it'll all be code, and the GPU will process the code and create the textures as it needs to display them.
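A toy sketch of that idea, assuming nothing about any real engine: the "texture" is just a function, and pixels only come into existence when something asks for them.

```python
# Procedural texturing in miniature: the pattern lives in code, not in an
# image file. All names here are illustrative, not any real API.

def checker_texel(x, y, scale=8):
    # The "texture" is this formula; nothing is stored on disk but the code.
    return 255 if ((x // scale) + (y // scale)) % 2 == 0 else 0

def generate_texture(width, height, texel_fn):
    # Materialize texels on demand, the way a GPU could while sampling.
    return [[texel_fn(x, y) for x in range(width)] for y in range(height)]

tex = generate_texture(16, 16, checker_texel)
# The full 16x16 checkerboard exists only after evaluation.
```

Swap `checker_texel` for a noise function and the same few lines of code stand in for megabytes of stored wood, marble, or cloud textures.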
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: PCTweaker5
Originally posted by: RussianSensation
"One of the coolest things that VS3.0 offers is something called instancing. This functionality can remove a lot of the overhead created by including multiple objects based on the same 3d model (these objects are called instances). Currently, the geometry for every model in the scene needs to be setup and sent to the GPU for rendering, but in the future developers can create as many instances of one model as they want from one vertex stream. These instances can be translated and manipulated by the vertex shader in order to add "individuality" to each instance of the model. To continue with our previous example, a developer can create a whole forest of trees from the vertex stream of one model. This takes pressure off of the CPU and the bus (less data is processed and sent to the GPU)." - AnandTech

It's things like this that I don't want from new video cards. I want the GPU to be able to process every single model without faking something, even if it looks as if it's real; that's just the way I am about it. To me that's like a 4-banger being as fast as a V8, and that really hurts.

You have no idea how computer graphics rendering works, do you? Using the same models over and over is already very common; this would make them look *better*, not worse, and provides bandwidth savings to boot.
 

PCTweaker5

Banned
Jun 5, 2003
2,810
0
0
I could care less about how graphics rendering works; I just buy the damn thing, slap it in, and start playing. I'm not trying to become one of you nerdy types; this is just a nice little hobby I have on the side, and I know enough to keep me going. Never did I say it would look worse; I just want things a certain way, but everyone can't have their way, so whatever they do, I will have to deal with it as long as it looks really good. I just really hate optimizations that sacrifice IQ for performance; again, I'm not saying that this would look bad.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: PCTweaker5

It's things like this that I don't want from new video cards. I want the GPU to be able to process every single model without faking something, even if it looks as if it's real; that's just the way I am about it. To me that's like a 4-banger being as fast as a V8, and that really hurts.
Originally posted by: PCTweaker5

...I could care less about how graphics rendering works...

So, what exactly is your point then? Either you care or you don't.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
PCTweaker, to echo Matthias' reply, geometry instancing allows for displaying more objects on-screen at the same IQ at the same speed. It's exactly "things like this" that you should want more of from future 3D cards. I can't think of an easy car analogy, but it's definitely not a V4 acting like a V8. AFAIK, geometry instancing is about gaining speed, not sacrificing IQ. In fact, IQ will be better, as you can have more models on-screen at the same frame-rate.

If you care less how PC graphics works, why are you posting in a "future of 3D tech" thread? Obviously we all want "bigger, better, faster, more," but this sort of thread is about how IHVs will achieve that.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: Pete
PCTweaker, to echo Matthias' reply, geometry instancing allows for displaying more objects on-screen at the same IQ at the same speed. It's exactly "things like this" that you should want more of from future 3D cards. I can't think of an easy car analogy, but it's definitely not a V4 acting like a V8. AFAIK, geometry instancing is about gaining speed, not sacrificing IQ. In fact, IQ will be better, as you can have more models on-screen at the same frame-rate.

If you care less how PC graphics works, why are you posting in a "future of 3D tech" thread? Obviously we all want "bigger, better, faster, more," but this sort of thread is about how IHVs will achieve that.

Exactly. Who cares if the job is done by geometry instancing? It's no more artificial than the colors of rendered pixels.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: PCTweaker5
I could care less about how graphics rendering works; I just buy the damn thing, slap it in, and start playing. I'm not trying to become one of you nerdy types; this is just a nice little hobby I have on the side, and I know enough to keep me going. Never did I say it would look worse; I just want things a certain way, but everyone can't have their way, so whatever they do, I will have to deal with it as long as it looks really good. I just really hate optimizations that sacrifice IQ for performance; again, I'm not saying that this would look bad.
You admit that you only know enough to get by, so you really shouldn't be commenting on geometry instancing, since you don't know what it is. Instancing really has nothing to do with IQ.

Currently, the CPU has a single master copy of every model, and, due to physics and the number of copies of a given model in the scene, it geometrically transforms the master model as needed for every instance and then turns those models into a triangle-strip stream for rendering. What instancing allows is for the CPU to just send the master copy of the model over to the GPU and have it do all the geometry transforms. Because the GPU is instruction-set specialized, has a super-long pipeline, and has so many execution units, it can do the job much faster than the CPU can.
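The split described above can be sketched as follows; the data structures and the translation-only "vertex shader" are illustrative stand-ins, not any real graphics API (real shaders apply full 4x4 matrices per instance).

```python
# Toy model of instancing: the CPU sends ONE master model plus a small
# per-instance transform; the GPU-side loop positions each instance.

def translate(vertex, offset):
    # Minimal per-instance transform: a translation only.
    return tuple(v + o for v, o in zip(vertex, offset))

# One triangle stands in for a whole tree model.
master_tree = [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.5, 0.5, 0.0)]

# Per-instance data is tiny compared to the geometry itself.
instance_offsets = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (4.0, 0.0, 1.0)]

# "Vertex shader" loop: the master geometry is reused for every instance,
# but it only ever had to cross the bus once.
forest = [
    [translate(v, offset) for v in master_tree]
    for offset in instance_offsets
]
print(len(forest))  # 3 trees from one 3-vertex master model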
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: keysplayr2003
I think that eventually we will see actual CPUs built onto the cards, with their own memory controllers and everything, completely relieving the main motherboard CPU of any geometry processing. Video cards will, in a sense, become a little second computer inside your big computer.
...how is that different from current cards? People have been saying that for ages, and guess what? It's happened. They aren't fully programmable yet (not counting NV's, which is a separate piece, and we have no way of knowing if it will be part of the next DX spec), but very close.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Heh, I noticed nobody quoted my "fairly" interesting comment on future onboard video CPUs, but everyone jumped on PCTweaker's comment and quoted it many times because you found it to be negative.
Are all you guys drawn to negativity? Is that all you're here for, to trounce on someone for saying the "wrong" thing? Are you all here just looking for a quarrel? WTF?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: keysplayr2003
Heh, I noticed nobody quoted my "fairly" interesting comment on future onboard video CPUs, but everyone jumped on PCTweaker's comment and quoted it many times because you found it to be negative.
Are all you guys drawn to negativity? Is that all you're here for, to trounce on someone for saying the "wrong" thing? Are you all here just looking for a quarrel? WTF?
I quoted your comment. However, it's a non-issue. We're about a year, maybe two, away from having fully programmable GPUs from both camps, both with hardware and/or drivers conforming to a DX and/or OpenGL standard. Will there be sockets? No. DIMM slots? No.

Anyway, yes, we are looking for a quarrel. :laugh::beer:
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Originally posted by: zephyrprime
What exactly does it mean to have full programmability?
To be similar to a CPU, where you can tell it to do whatever you want, instead of being locked into certain methods and features. If a standard can be set for them (DX10?) that they must all conform to, so that if X goes in, Y comes out within some very small margin of error, it could mean serious video offloading--even to the point of your 3D card fully handling DivX playback, for instance, provided the app gives the card the info it needs. Also, if they can do it smart enough, they might be able to make cards that could, with driver and/or firmware updates, have new graphics features backported to them.
...of course that last bit is probably a pipe dream :)
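The fixed-function vs. fully-programmable distinction can be caricatured like this; both "units" are toy Python functions with made-up names, not models of any real hardware.

```python
# Fixed-function: locked into a menu of built-in operations.
def fixed_function_unit(pixels, mode):
    if mode == "invert":
        return [255 - p for p in pixels]
    if mode == "darken":
        return [p // 2 for p in pixels]
    raise ValueError("unsupported mode: " + mode)

# Fully programmable: runs whatever per-pixel program the app supplies --
# the kind of flexibility that could let a card take on arbitrary work
# like video decoding, given the right data.
def programmable_unit(pixels, program):
    return [program(p) for p in pixels]

pixels = [0, 128, 255]
print(fixed_function_unit(pixels, "invert"))                  # [255, 127, 0]
print(programmable_unit(pixels, lambda p: min(255, p + 10)))  # [10, 138, 255]
```

The fixed unit can only ever do what shipped in silicon; the programmable one picks up new behavior from software, which is what would make driver-level "backporting" of features conceivable.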
 

PCTweaker5

Banned
Jun 5, 2003
2,810
0
0
You guys seem to only be comprehending the parts of my posts that piss you off. Like I have mentioned before, this is what *I* would like to see, not what the whole world should have to see. It's my opinion, and you guys have nothing better to do than to try and start sh*t. My comments aren't ignorant or pointless; they are merely comments reflecting my own personal thoughts on new technological "advances". So before you guys find something in this post that hurts you dearly and makes you feel the need to lash out, take a second to realize that it's just an opinion, to which everyone is entitled. Thank you!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In terms of pure hardware power, a lot is coming by 2014, as predicted by NVIDIA. If this performance outlook turns out to be true, my jaw will drop. 100GHz CPU frequency, 44GHz memory frequency... well, read the rest.
 

PCTweaker5

Banned
Jun 5, 2003
2,810
0
0
That is awesome, but it's too bad the world is gonna end in 2012; guess I won't have a family of my own.