New era of GPUs. AnandTech needs to do this.


ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Sigh, so you acknowledge that we can't emulate some rather important things in SW? Hey we're getting somewhere, great.

Oh and I'm still waiting for your link to the rather interesting definition of "BIOS" that you're using - your other links didn't help there.


Ahh, I said neither. Thanks for playing. No point in arguing with you on OoO because it doesn't matter when you have parallel computing power. OoO is great marketing hype though. I mean, yeah, let's run instructions down the pipeline when we need them, and if we are waiting on the RAM, let's run these instructions and stick the result out in the cache.

But none of that means a sack of beans when we can execute instructions across a bunch of small cores. Not really even instructions, but when we can crunch a bunch of really complicated math really, really fast. Normally we have to wait on such operations in the pipeline.
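To make the data-parallel idea above concrete, here is a plain-Python sketch: one small "kernel" applied independently to every element, which is exactly why many simple cores can split the work. All function names and the core count are made up for illustration; nothing here reflects how any real GPU executes.

```python
# Toy sketch of the data-parallel (SPMD) idea: the same small "kernel"
# runs on every element, with no element depending on another.

def kernel(x):
    # Stand-in for the per-element "really complicated math":
    # a quadratic followed by a scale, independent per element.
    return (x * x + 3.0 * x + 1.0) * 0.5

def scalar_loop(data):
    # CPU-style: walk the array one element at a time.
    return [kernel(x) for x in data]

def simulated_gpu(data, cores=8):
    # GPU-style: strip the data across "cores"; every chunk could run
    # simultaneously because the elements are independent.
    results = [None] * len(data)
    for c in range(cores):
        for j, x in enumerate(data[c::cores]):
            results[c + j * cores] = kernel(x)
    return results

data = [float(i) for i in range(32)]
assert scalar_loop(data) == simulated_gpu(data)
```

The point of the sketch: because `kernel` has no cross-element dependencies, the split across "cores" changes nothing about the result, only (on real hardware) the time.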

You know what, though? It's not really worth my time.

As to BIOS: do you know what that means?
 

Cattykit

Senior member
Nov 3, 2009
521
0
0
About ATI and OpenCL issue:

It's been pointed out that OpenCL, unlike CUDA, lacks proper development support. So far, all those NLE developers are going for CUDA because they can't do anything with OpenCL at this point.

The future looks very gloomy, but there's hope: Apple. As you probably know, Apple's Final Cut Pro is a very popular product, to the point that it can even be considered an industry standard. Last time I checked, Apple's going the OpenCL route, and I think they can be a major force that can change the current situation.

Until that becomes reality, there's no point in talking about how ATI can do this or that. For now, it's CUDA only, and the best from ATI is worse than the worst of nVidia in this area.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I see you have a Camaro in your avatar. Does a Ford 302 have a significant advantage over an LS1? The answer is no.

I am reading through the nVidia white papers, what little they are releasing. Not seeing any hugely significant differences.

Really, really bad metaphor.

And the keyword is "reading". Keep going. Ben rattled off 5 things off the top of his head without the whitepapers in front of him; just imagine what you might find by actually reading them AND hopefully understanding them.
But I have to agree with the OP's last sentence in his last post. Apple may be OpenCL's champion and will probably see it through, but for now, it's just CUDA.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Ahh, I said neither. Thanks for playing. No point in arguing with you on OoO because it doesn't matter when you have parallel computing power. OoO is great marketing hype though. I mean, yeah, let's run instructions down the pipeline when we need them, and if we are waiting on the RAM, let's run these instructions and stick the result out in the cache.
You didn't get that the context in which I mentioned OoO was exception handling, did you? And I really hope you see why OoO execution makes exception handling and debugging more problematic.
Also on a side note you completely missed the point of OoO execution.. or did you somehow get rid of ld/st instructions?


The point is: There are things that are better done in HW and there are things you just can't do in SW and some of those are rather important and can influence the performance heavily. That ranges from missing features (Ben and I already enumerated more than half a dozen, no need to repeat ourselves), to the architecture and the HW (nr. of registers, cache,..) itself.


As to BIOS: do you know what that means?
Yeah, basic input/output system, and it makes absolutely no sense in how you used it. Come on, at least admit that. Or just link to a definition of "BIOS ISA" in the context of a GPU.


@Cattykit: Yeah, that'd be something where Apple's influence could really help: pushing OpenCL, combined with a good development environment for both cards, yes please. Because vendor dependency may be no real problem for business and pro applications, but for consumers who don't buy a GPU to just run one program, it's critical.
And honestly, the most important thing isn't performance, but a good development environment. You can only use all that performance if you can write and maintain your programs reasonably well and handle the complexity. That's something where Nvidia is heavily investing, but there's still MUCH to do (just think of the compilers; I don't want to recompile my program for every different card and plaster the code with #ifdefs just to get performance).
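The per-card tuning problem described above might look something like this hypothetical sketch; the device names, work-group sizes, and unroll factors are all invented for illustration, not real specifications of any card:

```python
# Hedged sketch: without a compiler that auto-tunes per device, the
# programmer ends up special-casing each card by hand. This is the
# moral equivalent of #ifdef-per-card, hoisted to runtime.

TUNING = {
    # Invented device names and numbers, purely illustrative.
    "vendor_a_highend": {"work_group": 256, "unroll": 8},
    "vendor_b_highend": {"work_group": 64,  "unroll": 4},
}
DEFAULT = {"work_group": 32, "unroll": 1}  # safe but slow fallback

def pick_tuning(device_name):
    # Every new card means another entry, or you eat the slow default.
    return TUNING.get(device_name, DEFAULT)

assert pick_tuning("vendor_a_highend")["work_group"] == 256
assert pick_tuning("some_unknown_card") == DEFAULT
```

The complaint in the post is precisely that this table (or its `#ifdef` equivalent in kernel code) shouldn't be the programmer's job.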
 
Last edited:

extra

Golden Member
Dec 18, 1999
1,947
7
81
Yeah, I have a GTX 470 partially because I was interested in accelerating video editing from my 7D, but I haven't really done much video. For encoding, I've not found a great GPU encoder yet. Fast, yes, but not as good quality.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Yeah, I have a GTX 470 partially because I was interested in accelerating video editing from my 7D, but I haven't really done much video. For encoding, I've not found a great GPU encoder yet. Fast, yes, but not as good quality.

http://www.anandtech.com/show/2781

I'm not really up on encoding, but have you seen this article? Nvidia image quality looks good here.
 

sandorski

No Lifer
Oct 10, 1999
70,217
5,796
126
An in-depth article on non-gaming apps on GPUs, with comparisons, would certainly be interesting.

Whether to add test results to regular video card reviews is another question though, especially if such apps are very specialized and have limited appeal. For that you probably want some sort of suite that tests specific capabilities, which can then give indications as to how well a GPU might work with particular apps.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
You didn't get that the context in which I mentioned OoO was exception handling, did you? And I really hope you see why OoO execution makes exception handling and debugging more problematic.
Also on a side note you completely missed the point of OoO execution.. or did you somehow get rid of ld/st instructions?


The point is: There are things that are better done in HW and there are things you just can't do in SW and some of those are rather important and can influence the performance heavily. That ranges from missing features (Ben and I already enumerated more than half a dozen, no need to repeat ourselves), to the architecture and the HW (nr. of registers, cache,..) itself.



Yeah, basic input/output system, and it makes absolutely no sense in how you used it. Come on, at least admit that. Or just link to a definition of "BIOS ISA" in the context of a GPU.


@Cattykit: Yeah, that'd be something where Apple's influence could really help: pushing OpenCL, combined with a good development environment for both cards, yes please. Because vendor dependency may be no real problem for business and pro applications, but for consumers who don't buy a GPU to just run one program, it's critical.
And honestly, the most important thing isn't performance, but a good development environment. You can only use all that performance if you can write and maintain your programs reasonably well and handle the complexity. That's something where Nvidia is heavily investing, but there's still MUCH to do (just think of the compilers; I don't want to recompile my program for every different card and plaster the code with #ifdefs just to get performance).


Just because you're kind of pissing me off:

this is the last time I will reply to your posts on this subject matter.

Here, try wiki for BIOS. It creates the low-level address space/instruction input needed for communication with the host machine's operating system. I fully expect EFI to replace this soon enough, but widespread implementation is at least 3-5 years away for a complete shift to it.

http://en.wikipedia.org/wiki/BIOS

ISA is the Instruction Set Architecture. Essentially, it is the backbone of how you compile and build a kernel that will run on the target hardware.

http://developer.amd.com/gpu/ATIStr...een-Family_ISA_Instructions_and_Microcode.pdf
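To make the term concrete, an "ISA" can be modeled as a tiny interpreter: the ISA is the contract defining which instructions exist and what they do. This made-up three-instruction machine is purely illustrative; a real GPU ISA (like the AMD document linked above) defines hundreds of instructions plus registers, encodings, and memory operations.

```python
# Toy "ISA": three invented instructions over three registers.
# The dict of registers plays the role of the architectural state.

def run(program):
    regs = {"r0": 0, "r1": 0, "r2": 0}
    for op, *args in program:
        if op == "LOADI":    # LOADI rd, imm   -> rd = imm
            regs[args[0]] = args[1]
        elif op == "ADD":    # ADD rd, ra, rb  -> rd = ra + rb
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "MUL":    # MUL rd, ra, rb  -> rd = ra * rb
            regs[args[0]] = regs[args[1]] * regs[args[2]]
        else:
            raise ValueError("not in this ISA: " + op)
    return regs

out = run([("LOADI", "r0", 6), ("LOADI", "r1", 7), ("MUL", "r2", "r0", "r1")])
assert out["r2"] == 42
```

A compiler targeting this machine could only ever emit those three opcodes, which is the sense in which the ISA is "the backbone" of building a kernel for given hardware.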

Now, if you want to learn something, maybe reading over the white papers would enlighten you to the fact that AMD hardware does allow for OoO and a great many other features that might be driver ("intermediate level") limited with their current release. Maybe the driver support is just too immature (my suspicion).

Either way, Nvidia might have a substantially easier card to write code for. That's true because CUDA implements a lot of kernel functionality, so writing code is easier.

But to keep insisting that the AMD hardware is incapable bleeds of ignorance and malice.
 

LokutusofBorg

Golden Member
Mar 20, 2001
1,065
0
76
Sony Vegas 10 has added limited CUDA support. I've been using v9 Platinum for a couple of years or so and really like it. It's well under $100, so easily affordable for home video editors.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Here, try wiki for BIOS. It creates the low-level address space/instruction input needed for communication with the host machine's operating system. I fully expect EFI to replace this soon enough, but widespread implementation is at least 3-5 years away for a complete shift to it.
Yep, said that posts ago. But a GPU still doesn't need a BIOS, and a "BIOS ISA" was just made up by you.


But I'm interested where you read in that whitepaper that the architecture supports OoO execution, because every review so far (including Anand's) touted that part as completely new for GPUs. Also, you do know that you don't need ANY SW/driver support for OoO? That's the whole point of it (and of the discussion at large): some things just can't be implemented in SW, and OoO execution really is a prime example of that. So that's just nonsense.
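A toy model shows why OoO helps and why it is hard to fake in software: with a long-latency cache-miss load, an in-order pipeline stalls everything behind it, while an (idealized) OoO core lets independent instructions proceed under the miss. The latencies and the scheduling model here are invented for illustration; real hardware is vastly more complicated.

```python
# Each entry: (name, latency in "cycles", input registers, output register).
# Purely illustrative numbers; a real miss is hundreds of cycles.
PROGRAM = [
    ("load", 10, set(),   "r1"),   # cache miss: long latency
    ("add",   1, {"r1"},  "r2"),   # depends on the load
    ("mul",   1, set(),   "r3"),   # independent work
    ("mul",   1, set(),   "r4"),   # independent work
]

def in_order_cycles(prog):
    # In-order: each instruction waits for the previous one to finish.
    return sum(lat for _, lat, _, _ in prog)

def ooo_cycles(prog):
    # Idealized OoO: an instruction starts as soon as its inputs are
    # ready (unlimited execution units, perfect hardware scheduling).
    ready = {}   # register -> cycle its value becomes available
    finish = 0
    for _, lat, deps, dst in prog:
        start = max((ready[d] for d in deps), default=0)
        ready[dst] = start + lat
        finish = max(finish, ready[dst])
    return finish

assert in_order_cycles(PROGRAM) == 13
assert ooo_cycles(PROGRAM) == 11   # the independent muls hide under the miss
```

The hardware tracks readiness cycle by cycle as it executes; software only ever sees the program after the fact, which is the sense in which this can't be retrofitted by a driver.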
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
I think it'd be a nice review to see what GPU's can do these days for general tasks. They're not just pixel pushers any more.

I think we can see what AMD's approach is vs. Nvidia's. Nvidia is trying to make a general-purpose GPU, a part that does a lot more than just giving you good FPS in games. AMD appears to be taking a platform approach: CPU tasks will run on the CPU, their GPU will do the things it does best. Nvidia doesn't have a CPU. Intel doesn't have a GPU. This could be AMD's ace in the pocket if they could get it off the ground. Nvidia has done a lot of work in this area compared to AMD.
 

borisvodofsky

Diamond Member
Feb 12, 2010
3,606
0
0
You're wrong. It's about where the largest market is.

Photo editing, movie editing -> MUCH fewer gear-heads and more "hate it cuz it's a job."

Gamers: buy, buy, buy, cuz omfg I need to see more particles.

THAT's the difference.

It takes at least community college to be a graphic designer or video editor.

Gamers don't need shit besides the will to waste time... WHICH is congenital.
 

Ryan Smith

The New Boss
Staff member
Oct 22, 2005
537
117
116
www.anandtech.com
Hi all, your friendly neighborhood GPU editor here;

I just wanted to let you know that I've seen this thread and I do think it's an interesting idea. I can't really comment besides that - there's a lot of research that would be required before even deciding to go ahead on an article like this, and I'm currently up to my neck in video cards. We do want to expand our GPGPU coverage (and I'm heading to NVIDIA's GPU Tech Conference in 2 weeks to jumpstart that process) so this is definitely a good suggestion.:)

-Thanks
Ryan Smith
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Wow this thread has really brought out the pitchforks and stakes.
As an ATi fan I don't see any problem with a good article all about CUDA and what uses can be made of it with NVDA's latest cards.
It's one of their feature marketing points so it would be nice to know exactly what it can do.
There's no need for people to be claiming it would be biased against ATI; everyone knows it's something that NV cards provide. Why not list its strengths and show what it can do that might be of use to their owners?
If it makes such a big difference to programs that video card users may well be interested in then it might even persuade people to get an NVDA card in the future.
I don't see any harm in that.
After all if you have a need for programs that number crunch things like password cracking I believe ATi is very strong in this area.
A decent list of non-gaming software that can be accelerated by either team's GPUs sounds good to me.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Wow this thread has really brought out the pitchforks and stakes.
As an ATi fan I don't see any problem with a good article all about CUDA and what uses can be made of it with NVDA's latest cards.
It's one of their feature marketing points so it would be nice to know exactly what it can do.
There's no need for people to be claiming it would be biased against ATI; everyone knows it's something that NV cards provide. Why not list its strengths and show what it can do that might be of use to their owners?
If it makes such a big difference to programs that video card users may well be interested in then it might even persuade people to get an NVDA card in the future.
I don't see any harm in that.
After all if you have a need for programs that number crunch things like password cracking I believe ATi is very strong in this area.
A decent list of non-gaming software that can be accelerated by either team's GPUs sounds good to me.

You're right, there is no need to claim this. Especially now that this thread precedes any article Ryan may put together. It was a good idea to create this thread as now it serves as a reference that most people agree it would be a good source of data.
And this type of article applies to total non-gamers as well.
The biggies are:
Video editing and transcoding
Folding at Home, SETI and any form of GPGPU Distributed Computing.
Anti-Virus protection (Kaspersky)
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I would be keen to read such a thing, for one :)

I have my doubts about the future of GPGPU on mainstream desktop computing however.

IMO the proportion of people who video-edit/transcode is so minute as a percentage of computer users that it's almost no market at all (not quite as dramatic as that, mild hyperbole ;)).

It's easy to forget that the vast majority of computers in use in offices and homes use office/email and the internet, maybe some DVDs too. Most people who don't hang around AT don't give a fig about video-editing or computing, and couldn't care less even if a discrete chip made it faster ;)

I can't even begin to understand the anti-virus applications of GPGPU on home and office desktops. Even an i3 in most people's machines will sit almost entirely idle for most office/email/internet purposes, so what do you need to offload to a GPU?

As for SETI/Folding@home/distributed computing, it's laudable, but again, outside the rarefied air of computer forums, hardly anyone (as any meaningful % of the computer base in use) runs them or even knows what they are, I would wager. Even if they did, and they also knew that a discrete graphics card could make that faster, who buys computing gear to run those programs alone? Even somewhere like AT the % would be tiny, so a tiny % of a tiny % of desktop computer users... hardly the stuff marketing dreams are made of.

So interesting, yes. Of great value to people who do need it, no doubt. But mass market, nuh huh.

For the mass market, I just can't see current or future CPUs lacking the horsepower to do everything that 99% of desktop computer users need to do outside gaming (and even that will be encroached upon very quickly by intel and AMD with their on-die graphics).

As an alternate reality: GPGPU does develop an important place in mainstream computing; however, it's as a result of the dramatic increase in increasingly powerful integrated on-die GPUs, which are more than enough to leverage GPGPU apps for the vast, vast majority of computer users... again leaving discrete GPGPU cards out in the cold for those users...

That said, I see GPGPU having a bright future in commercial/military/scientific applications where they need/want 'off-the-shelf' number crunching brutality.

EDIT: just some stream of consciousness thoughts here ;)
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Dug, I cannot even begin to count the number of percentage statistics you just pulled from thin air.
(Not a single link to back them up.) How many people do you think create home movies from camcorders in the world? 7? Maybe 8?
I have literally dozens of old Hi-8 tapes I want to convert to my own DVD movies, but I don't have that kind of time. So I think that if anything could speed up that process, it would be a plus. Instead of waiting hours for final encoding, perhaps I could do it in real time or faster.
So you can say whatever percent of whatever userbase you want, but those are made-up numbers.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I think re-encoding videos will be a huge market. Everybody has shelves full of DVDs, and just like CDs, a lot of people are going to want to rip and convert them, making them suitable for playing on phones/tablets/media servers/etc.

That doesn't happen yet mainly because:
a) it's too hard to work out how to do it.
b) even if you have a very fast CPU, it takes hours and hours.

GPGPU is the solution to (b). Once (a) is also solved, so that it's as simple as just sticking a card in your PC, then everyone will be doing it.
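A back-of-envelope calculation for point (b); every rate here is an illustrative guess, not a benchmark of any card or encoder:

```python
# Hypothetical numbers: how long to re-encode a shelf of DVDs at a
# guessed CPU rate vs. a guessed 5x GPU-accelerated rate.
movies = 100              # DVDs on the shelf
minutes_each = 120
source_fps = 24
cpu_fps = 30              # invented software-encode throughput
gpu_fps = 150             # invented GPU-accelerated throughput

frames = movies * minutes_each * 60 * source_fps
cpu_hours = frames / cpu_fps / 3600
gpu_hours = frames / gpu_fps / 3600

assert round(cpu_hours) == 160
assert round(gpu_hours) == 32
```

Even with made-up rates, the shape of the argument is clear: at roughly real-time CPU encoding, a whole collection is weeks of machine time, and a several-fold GPU speedup is what turns the chore from "never" into "a few long evenings."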
 

WelshBloke

Lifer
Jan 12, 2005
31,372
9,264
136
Dug, I cannot even begin to count the number of percentage statistics you just pulled from thin air.
(Not a single link to back them up.) How many people do you think create home movies from camcorders in the world? 7? Maybe 8?
I have literally dozens of old Hi-8 tapes I want to convert to my own DVD movies, but I don't have that kind of time. So I think that if anything could speed up that process, it would be a plus. Instead of waiting hours for final encoding, perhaps I could do it in real time or faster.
So you can say whatever percent of whatever userbase you want, but those are made-up numbers.

Those shouldn't take too long to do. I do (occasionally) make HD home movies and don't really have any problems with speed.

It could be faster, but the bit that takes most of my time is actually arranging the clips I want and sorting out soundtracks etc.; the computer can get on with its thing (encoding/transcoding and whatnot) when I'm not there.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Keys:
The biggies are:
Video editing and transcoding
Folding at Home, SETI and any form of GPGPU Distributed Computing.
Anti-Virus protection (Kaspersky)
Dug:
I can't even begin to understand the anti-virus applications of GPGPU on home and office desktops. Even an i3 in most people's machines will sit almost entirely idle for most office/email/internet purposes, so what do you need to offload to a GPU?
I'd also like some info on how it pertains to antivirus software.
Microsoft Security Essentials just pounds 3rd-party AV programs into dust (for free) in the home user market.
Are you talking about some kind of industrial-strength AV usage where MSE is unsuitable?
Video encoding from hard media, like 8mm camera footage, VHS tapes, etc., to digital files suitable for hard drive storage is just begging for high-performance acceleration via our (mostly idle) high-end GPUs, as it's slow as hell even with a fast Core i7.
The biggest feature I can imagine would be an analog data stream capture component on the video card so a 3rd-party capture card isn't needed.
Put a good gaming GPU and a video capture function on a single video card and I'm ready to buy it... whatever the color.
 
Last edited:

WelshBloke

Lifer
Jan 12, 2005
31,372
9,264
136
I think re-encoding videos will be a huge market. Everybody has shelves full of DVDs, and just like CDs, a lot of people are going to want to rip and convert them, making them suitable for playing on phones/tablets/media servers/etc.

That doesn't happen yet mainly because:
a) it's too hard to work out how to do it.
b) even if you have a very fast CPU, it takes hours and hours.

GPGPU is the solution to (b). Once (a) is also solved, so that it's as simple as just sticking a card in your PC, then everyone will be doing it.

Here you go
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
gpgpu is the solution to (b), once (a) is also solved so it's as simple as just sticking it in your pc then everyone will be doing it.

Honestly, Badaboom handles (a) already. Pick the device you are encoding for, hit start (unless you encode for multiple different devices, you only have to pick once, so then it's just start the app and click the encode button). Expanding the number of supported devices a bit would be nice (i.e. add Droid X, N1, iPad, etc.). Honestly, nV should bundle this app with their cards at this point; I think it would promote GPGPU better than most of their marketing.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
I *thought* that Kaspersky was just using Nvidia hardware to help detect new viruses, or something like that. It was more of a research effort from a corporate standpoint. I didn't think it helped the PC user at all. Am I wrong on this? I can't imagine processing power is what limits a virus scan. All the processing power in the world won't make my hard drive spin faster than 7200 RPM.
 

WelshBloke

Lifer
Jan 12, 2005
31,372
9,264
136
Keys: Dug: I'd also like some info on how it pertains to antivirus software.
Microsoft Security Essentials just pounds 3rd-party AV programs into dust (for free) in the home user market.
Are you talking about some kind of industrial-strength AV usage where MSE is unsuitable?
Video encoding from hard media, like 8mm camera footage, VHS tapes, etc., to digital files suitable for hard drive storage is just begging for high-performance acceleration via our (mostly idle) high-end GPUs, as it's slow as hell even with a fast Core i7.
The biggest feature I can imagine would be an analog data stream capture component on the video card so a 3rd-party capture card isn't needed.
Put a good gaming GPU and a video capture function on a single video card and I'm ready to buy it... whatever the color.


Does that really need that much horsepower? The only time I've really dealt with non-digital video, I just plugged the camera into my DVD recorder and ripped to DVDs that way (I think; it was a while ago).
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
It would definitely be interesting to read a GPGPU article on AT showing us what tasks can be accomplished faster with it. I personally used to encode videos a lot, but that came to an end once I found RockPlayer for Android (it plays just about everything). Nonetheless, it would be nice to see what useful software is out there that supports and utilizes GPGPU.