ATi's next generation release?

Zenoth

Diamond Member
Jan 29, 2005
Hi.

Simply put: would a release-date estimate for the next core of around late May / early June be realistic?

Or would it be more like late July / early August?

I'd appreciate...hum...other estimates. Thanks.
 

n yusef

Platinum Member
Feb 20, 2005
It's supposed to be released in May, so everyone besides Anand (they'll have him review it) and the CEO of ATI will have to wait 'til Christmas.
 

Matthias99

Diamond Member
Oct 7, 2003
Originally posted by: wayne2456
I think ati will release another video card after xbox 2. I think:confused:

This seems likely, given that the XBox2 (or whatever MS decides to call it) is using a variant of the R5XX architecture.

I wouldn't expect cards before late summer/fall, but of course it's all just speculation at this point.
 

Zenoth

Diamond Member
Jan 29, 2005
Wait...

You guys are telling me that ATi is, as I type this, probably creating its next-generation GPU (if not already completed), and that they'll release just a few copies to known reviewers... but won't release it to the public and general consumers around the same time?

For example, sending out "review copies"... let's say sometime in May... but only making them available to us around September or even later?
 

TourGuide

Golden Member
Aug 19, 2000
Originally posted by: Zenoth
Wait...

You guys are telling me that ATi is, as I type this, probably creating its next-generation GPU (if not already completed), and that they'll release just a few copies to known reviewers... but won't release it to the public and general consumers around the same time?

For example, sending out "review copies"... let's say sometime in May... but only making them available to us around September or even later?

This has been their pattern, and it has ticked a lot of people off. They release a card to reviewers - we hear how FAST it is - and then we can't find it at retail. Most of us feel this is BS as far as business practice goes.
 

Gamingphreek

Lifer
Mar 31, 2003
We have about a thread a day discussing ATI's or NVIDIA's next-gen architecture; please use the search function.

-Kevin
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: Zenoth
Wait...

You guys are telling me that ATi is, as I type this, probably creating its next-generation GPU (if not already completed).


I'm getting tired of saying this:

Both ATi and NVidia start planning GPUs almost 3 years before they are released. ATi and Nvidia are currently starting to draw out their plans for Spring 2008 products.

GPU design and manufacturing are not simple. I laugh so hard when I see these idiots saying they think NV is going to wait and see how strong R520 is, and then build something better to release 6 months later. True ignorance.

Anyway, yes, ATi is currently wrapping up R520 - and they're a good way into constructing R620, just as NV is nearly done with NV50 (G70, or whatever they're calling it) and well on its way with NV60 (G80, etc.).
 

Fenuxx

Senior member
Dec 3, 2004
Originally posted by: TourGuide
This has been their pattern, and it has ticked a lot of people off. They release a card to reviewers - we hear how FAST it is - and then we can't find it at retail. Most of us feel this is BS as far as business practice goes.

Yeah, if you want proof, just look at the X800s and the 6800s. They were "officially" released in April or May last year, but they weren't available to the public until around September. Even then, if you wanted a "flagship" version (e.g. the XT-PE or the Ultra), you usually had to pay upwards of $750-850. I expect this will, unfortunately, continue for the near future.

On a side note, it's finally good to see that ATI/NVIDIA won't be releasing "bumped" versions of chips just to reclaim the performance crown. NVIDIA won't be releasing a re-spin of NV40, as its "6-month refresh" is SLI. I would like to see this trend continue, as it is pointless to keep going the way things have been going, with new products that yield a less-than-10% performance increase. I hope ATI/NV back off a bit from their endless pursuit of the proverbial performance crown because, IMHO, it's pointless. Graphics technology has matured, and there really isn't a need for this behavior anymore. Interesting to think about, isn't it? :)
 

Insomniak

Banned
Sep 11, 2003
Graphics technology ain't matured yet. The next generation will be very close, as Unreal Engine 3 demonstrates, but I don't think we'll get to the "you can render anything you can imagine photoreal" point until the generation after that - say, around 2009-2010.

We are very, very close though.

The main things that developers need to focus on now in graphical terms are animation, physics, and level of detail. Half-Life 2 made excellent strides with the first two (and the upcoming PhysX chip from Ageia has me interested), but LOD is going to be difficult. Advances are being made there - witness normal mapping and virtual displacement mapping - but it's not just about more polygons - it's about noticing things at 2 feet that you didn't at 10 feet, etc.
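To make the normal mapping part concrete, here's a rough CPU-side sketch of the idea (illustrative C++ only - in a real game this runs per pixel in a shader, and none of these names come from any actual engine): the texture stores a detail normal in tangent space, and you rotate it into world space with the per-vertex TBN basis before lighting.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    Vec3 n = { v.x / len, v.y / len, v.z / len };
    return n;
}

static float dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// A normal-map texel stores a surface detail normal in tangent space,
// encoded as RGB in [0, 255].  Decode it back to a unit vector in [-1, 1].
static Vec3 decodeNormal(unsigned char r, unsigned char g, unsigned char b)
{
    Vec3 n = { r / 127.5f - 1.0f, g / 127.5f - 1.0f, b / 127.5f - 1.0f };
    return normalize(n);
}

// Rotate the tangent-space normal into world space using the per-vertex
// tangent/bitangent/normal (TBN) basis, then apply a simple Lambert term.
// This is what lets a flat, low-poly surface catch light as if it had the
// fine geometric detail that was baked into the map.
static float lambert(const Vec3& nTan, const Vec3& T, const Vec3& B, const Vec3& N,
                     const Vec3& lightDirWorld)
{
    Vec3 nWorld = {
        nTan.x * T.x + nTan.y * B.x + nTan.z * N.x,
        nTan.x * T.y + nTan.y * B.y + nTan.z * N.y,
        nTan.x * T.z + nTan.y * B.z + nTan.z * N.z,
    };
    float d = dot(normalize(nWorld), normalize(lightDirWorld));
    return d > 0.0f ? d : 0.0f;   // clamp: no negative light
}
```

The whole trick is that the lighting detail comes from a texture lookup instead of extra polygons, which is exactly why it helps the "2 feet vs 10 feet" problem without exploding the triangle count.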

Also, we need to work on strategic culling - Far Cry had huge problems with LOD over distance. Trees and grass suffered from polygon knocking and pop-in in a big way, even at the highest settings.
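For the pop-in problem specifically, the usual trick is to pick the LOD from distance with a bit of hysteresis, so an object sitting right on a threshold doesn't flicker back and forth between meshes. A minimal sketch (illustrative C++; the distances and band width are made-up numbers, not taken from Far Cry or any engine):

```cpp
#include <cstddef>

// Pick a mesh LOD from camera distance.  The hysteresis band makes the
// switch distance depend on which LOD is currently shown, so an object
// sitting right on a threshold doesn't visibly "pop" back and forth.
// The distances and band width are made-up illustrative values.
const std::size_t kLodCount = 4;                        // 0 = full detail
const float kSwitchDist[kLodCount - 1] = { 15.0f, 40.0f, 120.0f };

std::size_t selectLod(float distance, std::size_t currentLod,
                      float hysteresis = 2.0f)
{
    std::size_t lod = 0;
    for (std::size_t i = 0; i + 1 < kLodCount; ++i) {
        float edge = kSwitchDist[i];
        if (i + 1 > currentLod)
            edge += hysteresis;   // moving away: resist dropping detail early
        else
            edge -= hysteresis;   // moving closer: resist restoring detail early
        if (distance > edge)
            lod = i + 1;
    }
    return lod;                   // higher index = coarser mesh (or impostor)
}
```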

I think AA and AF are about as advanced as they need to be - let's focus on efficiency there. IQ is about as solid as it needs to be for these filters.
 

Fenuxx

Senior member
Dec 3, 2004
What I meant about maturing is that graphics technology isn't changing as dramatically as it was years ago. We aren't having the huge changeovers that were DX7 to DX8.x, and DX8.x to DX9. There isn't going to be another DX version until Longhorn, which will be arriving in mid-to-late 2006 or early 2007, and that means no SIGNIFICANT advances in graphics tech until then. Sure, there is the inevitable new feature or new technology, as well as speed increases, but the BASIC way things are done (i.e. shaders) hasn't changed since the first DX9 chips such as R300 and NV30. Of course, the architectures have improved, and in NVIDIA's case been completely revamped. But there are still basically the same limits on how long and complex shaders can be (though they have increased with the advent of SM3.0 and SM2.0a/2.0b). As I said, GPUs will of course get faster, and some kind of new technology will emerge, but as far as maturation is concerned, developers will learn to take better advantage of the technology that is already there; there won't be anything significantly new that they will have to learn.

Make sense?
 

Insomniak

Banned
Sep 11, 2003
Originally posted by: geforcetony
What I meant about maturing is that graphics technology isn't changing as dramatically as it was years ago. We aren't having the huge changeovers that were DX7 to DX8.x, and DX8.x to DX9. There isn't going to be another DX version until Longhorn, which will be arriving in mid-to-late 2006 or early 2007, and that means no SIGNIFICANT advances in graphics tech until then. Sure, there is the inevitable new feature or new technology, as well as speed increases, but the BASIC way things are done (i.e. shaders) hasn't changed since the first DX9 chips such as R300 and NV30. Of course, the architectures have improved, and in NVIDIA's case been completely revamped. But there are still basically the same limits on how long and complex shaders can be (though they have increased with the advent of SM3.0 and SM2.0a/2.0b). As I said, GPUs will of course get faster, and some kind of new technology will emerge, but as far as maturation is concerned, developers will learn to take better advantage of the technology that is already there; there won't be anything significantly new that they will have to learn.

Make sense?



I don't know if I buy that. We're just now getting games that are really leveraging the DX8 feature set - they have a few nice whiz-bang features they pull from DX9 (HL2's water, or Doom 3's heat shimmer/glass distortion, for example) but games today are largely last generation stuff.

The features show up in hardware way before the games that actually use them do - usually two generations before. Cubic environment mapping is a good example: support for it was introduced on the GeForce 2, but no game really leveraged it until the GeForce 4 Ti/Radeon 9700 era, when Unreal Tournament 2003 was released.
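For anyone wondering what cubic environment mapping actually does under the hood, here's a bare-bones sketch of the math (illustrative C++; the hardware and API do all of this for you per pixel): reflect the view vector about the surface normal, then the largest component of the reflection vector picks which of the six cube faces to sample.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// R = I - 2*(I.N)*N : reflect the eye-to-surface vector I about the unit
// surface normal N.  R is the direction used to look up the environment.
Vec3 reflect(const Vec3& I, const Vec3& N)
{
    float d = 2.0f * (I.x * N.x + I.y * N.y + I.z * N.z);
    Vec3 r = { I.x - d * N.x, I.y - d * N.y, I.z - d * N.z };
    return r;
}

enum CubeFace { POS_X, NEG_X, POS_Y, NEG_Y, POS_Z, NEG_Z };

// The cube face is chosen by the reflection vector's largest-magnitude axis;
// the other two components (divided by that axis) become the 2D coordinates
// on that face.  Hardware does this per pixel when you sample a cube map.
CubeFace selectFace(const Vec3& r)
{
    float ax = std::fabs(r.x), ay = std::fabs(r.y), az = std::fabs(r.z);
    if (ax >= ay && ax >= az) return r.x >= 0.0f ? POS_X : NEG_X;
    if (ay >= az)             return r.y >= 0.0f ? POS_Y : NEG_Y;
    return r.z >= 0.0f ? POS_Z : NEG_Z;
}
```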

Unreal Engine 3 is the first engine that truly uses the grunt DirectX 9 can provide...and surprise, it's targeted for R6XX and NV6X (G7X?) - two generations after the first DX9 chips.

I dunno about you, but I see a hell of a difference between today's titles and Unreal Engine 3.


Plus, the shaders in use now are extremely primitive and limited because of... survey says... processor power! In order to make shaders hundreds of instructions long (which will allow really cool and useful stuff), we need more grunt from our silicon.
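Just to put rough numbers on the "grunt" part, here's a back-of-envelope calculation (every figure is an illustrative assumption, not a spec or benchmark of any actual chip) of what a 300-instruction pixel shader would demand at a common resolution and frame rate:

```cpp
#include <cstdio>

// Back-of-envelope: how much raw throughput a "hundreds of instructions
// per pixel" shader would demand.  Every number here is an illustrative
// assumption, not a spec or benchmark of any actual chip.
int main()
{
    const double pixels        = 1280.0 * 1024.0;  // one frame at 1280x1024
    const double overdraw      = 2.0;              // each pixel shaded ~twice
    const double fps           = 60.0;
    const double instrPerPixel = 300.0;            // a "long" pixel shader

    double perSecond = pixels * overdraw * fps * instrPerPixel;
    std::printf("~%.1f billion shader instructions per second\n",
                perSecond / 1e9);                  // prints ~47.2 billion
    return 0;
}
```

Tens of billions of shader instructions per second for a single effect is far more than today's pixel pipelines can sustain, which is exactly why long shaders have to wait for beefier silicon.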
 

Zenoth

Diamond Member
Jan 29, 2005
Sure thing, all this makes sense. I understand the why and the how now. There were many things I didn't know, and you guys have enlightened me.

It's just the "when" that's a bit frustrating.

Reviewing products months before their true release, just to build hype. Oh well...

Looks like I'm going to upgrade much sooner, with currently available products. I don't feel like waiting until this summer for new products, only to have my mind blown and my jaw dropped, and then have to wait months more before the new generation is actually available.