ATI Technologies Vows to Bring Shader Model 3.0, Multi-GPU Technology

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It's one thing to vow it, but another matter entirely to actually do it.

nVidia is still struggling with their drivers, and they started this project when they purchased 3DFX. How long ago was that? If it's been 3 years, it will be at least that long before ATi can do it unless they've been secretly developing this already.

Once they have workstation cards with stable drivers based on this technology, I'll be jumping on board.

Just out of curiosity, why do workstation apps require special drivers, whereas complex games like Doom 3 don't? Can't the people who make AutoCAD and such program their apps to use the standard code paths?
 

OinkBoink

Senior member
Nov 25, 2003
700
0
71
Man, I don't like X-bit.


It just keeps making stuff up, like that stupid Inquirer.

AT is a million times better IMHO.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
nVidia is still struggling with their drivers, and they started this project when they purchased 3DFX.

No, they didn't. AGP had no proper support to allow for multi-GPU configurations. In order to get it to work properly you had to create a bridge chip (a la 3dfx) or use driver tricks that wouldn't always work (a la ATi). PCIe was needed in order to bring multi-GPU setups to consumers in a realistic fashion.

Just out of curiosity, why do workstation apps require special drivers, whereas complex games like Doom 3 don't? Can't the people who make AutoCAD and such program their apps to use the standard code paths?

No compromises. In gaming, compromising is a requirement and always has been. For higher-end applications (although AutoCAD isn't really high end, it is a workstation app), the API itself can frequently be a severely limiting factor. The ARB has only just ratified OpenGL 2.0, giving at least a basic level of open-standard shader support. Should Alias have waited to include shader acceleration in Maya until the ARB got around to doing their job? If they had, then another company (Discreet comes to mind) would simply have stepped up and used it as a feature the competition couldn't match.

For the non-viz-based apps (Pro/E and comparable), the geometry load tends to be significantly higher, and for those applications it makes sense to use driver workarounds that avoid redoing T&L calculations on every vertex each frame if it can be avoided. This type of optimization isn't something you want enabled all the time, since the overhead required to do it will hurt performance in most applications, but it can provide a significant performance boost for the particular applications it is meant to aid.
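To give a rough idea of what that kind of caching looks like from the application side, here's a minimal sketch using legacy OpenGL display lists. This is not ATI's or NVIDIA's actual driver code, and the function names are just made up for the example; it only illustrates the general idea of submitting static geometry once so the driver can cache it instead of pushing every vertex down the pipe every frame.

/* Illustrative sketch only: compile static geometry once into a display
 * list, then replay it each frame instead of re-sending every vertex. */
#include <GL/gl.h>

static GLuint part_list;

/* Hypothetical helper: build once, e.g. when the CAD model is loaded. */
void build_static_geometry(void)
{
    part_list = glGenLists(1);
    glNewList(part_list, GL_COMPILE);
    glBegin(GL_TRIANGLES);
    /* A real Pro/E-class model would have millions of vertices here. */
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();
    glEndList();
}

/* Hypothetical helper: called every frame, one call replays the cached
 * geometry. */
void draw_frame(void)
{
    glCallList(part_list);
}

A workstation driver can pull similar tricks behind the application's back for the specific apps it recognizes, which is exactly the kind of per-app optimization that would just add overhead if it were always on.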
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: TheSnowman
What do you find funny?


It's just the common trend these companies engage in:

1) Downplay a competitor's feature
2) Say they'll have it when the industry 'really' needs it
3) Leave us hanging with expectations of some revolutionary product

I am allowed to find it humorous, aren't I?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: BenSkywalker
nVidia is still struggling with their drivers, and they started this project when they purchased 3DFX.

No, they didn't. AGP had no proper support to allow for multi-GPU configurations. In order to get it to work properly you had to create a bridge chip (a la 3dfx) or use driver tricks that wouldn't always work (a la ATi). PCIe was needed in order to bring multi-GPU setups to consumers in a realistic fashion.
http://www.theinquirer.net/?article=19025

Yeah, it is the Inquirer, but apparently they have been working on the drivers for quite some time.
 

William23

Member
Aug 20, 2004
74
0
0
I actually see this as a good thing. If ATI develops a multi-GPU setup like Nvidia's, then there will be more competition and prices will come down... hopefully!!!
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: gururu
Originally posted by: TheSnowman
What do you find funny?


It's just the common trend these companies engage in:

1) Downplay a competitor's feature
2) Say they'll have it when the industry 'really' needs it
3) Leave us hanging with expectations of some revolutionary product

I am allowed to find it humorous, aren't I?

Sure you are, I was simply curious as to why. None of it seems funny to me, it just seems like common sense.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: BenSkywalker
nVidia is still struggling with their drivers, and they started this project when they purchased 3DFX.

No, they didn't. AGP had no proper support to allow for multi-GPU configurations. In order to get it to work properly you had to create a bridge chip (a la 3dfx) or use driver tricks that wouldn't always work (a la ATi). PCIe was needed in order to bring multi-GPU setups to consumers in a realistic fashion.

I read an article which said that they did. Where are you getting your information from? PCI-E and NV40 were in development a long time ago. NV40 was "released" last spring. It would have taken at least two years for them to develop the technology, and as I said, it was part of nVidia's plan from the beginning.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: gururu
Originally posted by: TheSnowman
What do you find funny?

It's just the common trend these companies engage in:

1) Downplay a competitor's feature
2) Say they'll have it when the industry 'really' needs it
3) Leave us hanging with expectations of some revolutionary product

I am allowed to find it humorous, aren't I?

At least ATi is adopting the technology basically immediately, as opposed to Intel's reaction to AMD64 and HyperTransport.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: gururu
Originally posted by: TheSnowman
What do you find funny?


It's just the common trend these companies engage in:

1) Downplay a competitor's feature
2) Say they'll have it when the industry 'really' needs it
3) Leave us hanging with expectations of some revolutionary product

I am allowed to find it humorous, aren't I?


I think it's BS when a company downplays or discredits another company's product, but finishes the same sentence with, "We will have a similar solution in the near future." It's like saying, "It sux, but we will have it soon."

 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: TheSnowman

Sure you are, I was simply curious as to why. None of it seems funny to me, it just seems like common sense.

I'm watching Star Trek 2 right now and one of the ship's Vulcans told Kirk, "Humor, it's a difficult concept. It's not logical."

I just thought that was funny as I read your response. I know it's common sense for the companies to issue such press releases, and I enjoy hearing that they are thinking about these issues.

 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: gururu
X-Bit

i thought this was funny.


You know, I like ATI... but even I can recognize dumb PR blah-blah when I see it.

Downplaying SM 3.0 and future features because "it's not needed now" is just plain dumb - from that point of view we'd all still be sitting on ISA cards with 2MB of RAM.

And... compared to 2.0/1.4 shaders, we *DO* already have support for SM 3.0 in quite a few games.

The other thing is whether it's worth spending $400 - $800 (shortage!!!) on a card which does NOT have these features... we all know that R420 (and probably even the next core, R480) is ultimately ONLY based on R300... i.e., since the Radeon 9700 we really have NOTHING too new feature-wise. They just can't go on releasing "new" cores which in essence are STILL only based on R300 with NO new chip/core design... I do NOT support that, of course.

ATI knows that; they would (OF COURSE) also rather have an R500 available NOW - but they failed with it... and all they can do is downplay and hope not to lose too much to Nvidia.

But... time does not stand still <-- hint, ATI!
 

sph1nx

Member
Sep 3, 2004
86
0
0
IMHO, ATI is falling a little behind. They are vowing to do things in their NEXT line of GPUs (God only knows when we will ACTUALLY see those). Meanwhile, Nvidia will be able to refine the features already available, as well as bring some new ones to the table. I just got my 6800 GT, and it is a SOLID card; it's my first Nvidia card, in fact, and I am very pleased. I like ATI, and I hope they can get their act together because they are falling behind. They are outselling Nvidia in integrated and low-end cards, but that doesn't take into account the 6600 and 6600 GT, which have yet to really hit the market. ATI will not be able to survive on PR for very long, at least not with the gamer community, so they had better get moving.
 

sph1nx

Member
Sep 3, 2004
86
0
0
Originally posted by: keysplayr2003
Originally posted by: gururu
Originally posted by: TheSnowman
What do you find funny?


It's just the common trend these companies engage in:

1) Downplay a competitor's feature
2) Say they'll have it when the industry 'really' needs it
3) Leave us hanging with expectations of some revolutionary product

I am allowed to find it humorous, aren't I?


I think it's BS when a company downplays or discredits another company's product, but finishes the same sentence with, "We will have a similar solution in the near future." It's like saying, "It sux, but we will have it soon."

LOL.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: keysplayr2003

I think it's BS when a company downplays or discredits another company's product, but finishes the same sentence with, "We will have a similar solution in the near future." It's like saying, "It sux, but we will have it soon."

Yet they never said any of it sucks, but rather that they simply focused their efforts towards other things.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I read an article which said that they did. Where are you getting your information from? PCI-E and NV40 were in development a long time ago. NV40 was "released" last spring. It would have taken at least two years for them to develop the technology, and as I said, it was part of nVidia's plan from the beginning.

Two years to develop the technology, and two and a half years if we believe what the Inquirer has to say; nV acquired what was left of 3dfx four years ago. This technology is actually quite unrelated to 3dfx's SLI; the only thing they have in common is that they both utilize multiple rasterizers (Alienware's and nV's "SLI" setups are fairly close, 3dfx's was quite different). In 2000 they were not working on the NV40's "SLI" drivers, no chance.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Originally posted by: keysplayr2003

I think it's BS when a company downplays or discredits another company's product, but finishes the same sentence with, "We will have a similar solution in the near future." It's like saying, "It sux, but we will have it soon."

Yet they never said any of it sucks, but rather that they simply focused their efforts towards other things.



Here ya go. You're right, they used everything BUT the word suck.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
Originally posted by: keysplayr2003

Here ya go. You're right, they used everything BUT the word suck.

Keys, you never said that ATI said SLI sucks. You said their responses were "like saying it sux," so YOU are right.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Since when is pointing out very real situations where problems might arise akin to saying something outright sucks?
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Apparently SLI capabilities were implemented as early as the GeForce 3 core. They have been working on it for quite some time, although I don't know how extensive their testing was.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: TheSnowman
Since when is pointing out very real situations where problems might arise akin to saying something outright sucks?

They also said, IIRC, that they were not planning to implement SLI in their future products because of these shortcomings. Their recent reversal suggests that they were either lying, or that it will be a very long time before they adopt this technology.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: TheSnowman
Since when is pointing out very real situations where problems might arise akin to saying something outright sucks?

Since ATi's comments didn't actually say one good thing about Nvidia's SLI, only the potential issues and problems that could arise or are currently happening. There are pros and cons to Nvidia's SLI, yet ATI chose to address only the cons. They freakishly praised Alienware's.