
G70 would be launched on June 8

Originally posted by: Acanthus
Originally posted by: apoppin
Originally posted by: ronnn
The guy who buys 2 6800 ultras does not really have money issues and if noise is important will spend a fortune to deal with that. So when the next gen is released, he will be the first to buy. That way he will really have next gen performance. Only one way now to stay on top of the curve - spend lots. Big money now only keeps you on top until the next better thing comes along, stating otherwise is fud. Buy midrange if you want dollars for fps and upgrade more frequently. But back to the topic, it will be interesting to see what sm 3.0 with 3dc brings to the table.

One thing everyone seems to be missing . . . when he sells his pair of 6800Us, he will get 'top dollar' for a used card . . . as opposed to holding on to those 6800gts for an extra year to squeeze every last ounce of performance out of them.

what really bugs me is that a $350 console owner will soon have better gfx than a top-of-the-line sli'd g70 PC. 😛
:shocked:

SLI'd G70s will likely slaughter a console if games are released that can push them.

Remember the PS3 is a single G70, and the Xbox 2 is a single R500.
Are you not keeping up?

The Xbox360 and especially the PS3 are gonna SLAUGHTER sli'd G70s. 😛
:shocked:

We will not see the PC "catch-up" for TWO years. 😉
:roll:
___________________________________

Originally posted by: KruptosAngelos
Originally posted by: apoppin
first of all, the r520 will be the big disappointment, as M$ will not allow ATI to release the "full version" of its Xbox360 GPU. . . .
:thumbsdown:

g70 will be less of a disappointment, but still not "all" the PS3 GPU will be. 😉

But at least ATI fans will finally get SM 3.0 and 'SLI'
:roll:

Rumors had it that the g70 would be twice an ultra, right? Rumors had it that the 520 would be 3x an x800. I wouldn't really be disappointed if it were 1.5x; that's still a 50% increase. Considering I already run all new games with all the eye candy smooth as a baby's bottom, that's more than enough.

ATI's move to MVP is purely marketing, so ATI fanboys will jump on the bandwagon; it's still pointless. And I think it's retarded that this generation of cards didn't have 3.0, but it may have been a business move to increase sales of the 520, you think?

Rumours? 😛

ATi's move to "sli" is necessary to "keep-up" with nVidia. 😉
 
The PC will overtake it as soon as the G70 comes out, as i think the pc gpu will be more powerful; if not, the next cards will overtake consoles, no probs.

I am with that other user, in that i never will own a console and hate them; i have played on them though, and i know 2 pals that will get the xbox360 and PS3.
 
Originally posted by: apoppin
Originally posted by: Acanthus
Originally posted by: apoppin
Originally posted by: ronnn
The guy who buys 2 6800 ultras does not really have money issues and if noise is important will spend a fortune to deal with that. So when the next gen is released, he will be the first to buy. That way he will really have next gen performance. Only one way now to stay on top of the curve - spend lots. Big money now only keeps you on top until the next better thing comes along, stating otherwise is fud. Buy midrange if you want dollars for fps and upgrade more frequently. But back to the topic, it will be interesting to see what sm 3.0 with 3dc brings to the table.

One thing everyone seems to be missing . . . when he sells his pair of 6800Us, he will get 'top dollar' for a used card . . . as opposed to holding on to those 6800gts for an extra year to squeeze every last ounce of performance out of them.

what really bugs me is that a $350 console owner will soon have better gfx than a top-of-the-line sli'd g70 PC. 😛
:shocked:

SLI'd G70s will likely slaughter a console if games are released that can push them.

Remember the PS3 is a single G70, and the Xbox 2 is a single R500.
Are you not keeping up?

The Xbox360 and especially the PS3 are gonna SLAUGHTER sli'd G70s. 😛
:shocked:

We will not see the PC "catch-up" for TWO years. 😉
:roll:
___________________________________

Originally posted by: KruptosAngelos
Originally posted by: apoppin
first of all, the r520 will be the big disappointment, as M$ will not allow ATI to release the "full version" of its Xbox360 GPU. . . .
:thumbsdown:

g70 will be less of a disappointment, but still not "all" the PS3 GPU will be. 😉

But at least ATI fans will finally get SM 3.0 and 'SLI'
:roll:

Rumors had it that the g70 would be twice an ultra, right? Rumors had it that the 520 would be 3x an x800. I wouldn't really be disappointed if it were 1.5x; that's still a 50% increase. Considering I already run all new games with all the eye candy smooth as a baby's bottom, that's more than enough.

ATI's move to MVP is purely marketing, so ATI fanboys will jump on the bandwagon; it's still pointless. And I think it's retarded that this generation of cards didn't have 3.0, but it may have been a business move to increase sales of the 520, you think?

Rumours? 😛

ATi's move to "sli" is necessary to "keep-up" with nVidia. 😉

Ok, first of all, it would be a disappointment if the r520 was only 1.5x as fast as an x850xt. If Nvidia can make a 6800u 2x as fast as a 9800xt, I'd sure as hell expect Ati to make the r520 2x as fast as a 6800u, using a single card. All you who say you don't need it will be saying the opposite when next year's games come out and you want to play with all the eye candy, including HDR WITH AA.

As for SM3, it's already been debated enough, but those who want full eye candy will likely buy high-end cards, so by the time Unreal3 comes out and the majority of new games use sm3, you'll likely buy an r520 or g70, even if you already had a 6800gt, or if your x800xt had sm3. They made a cost-cutting decision not to use sm3 this gen, and as long as they hurry up and release the r520, it won't be that big of a deal. If you really want sm3 that bad right now, just stick with Nv and stop arguing.

AMR is what Ati has to do in order to keep up with Nvidia, unless they figure out a way to cram all that power into a single gpu. Dual r520s will probably be overkill for anything this year, and probably the first half of next year, but if you want the best, it's gonna cost you; there's no way around it. So yeah, you technically wouldn't "need it", but Ati needs it to compete with Nvidia for the fastest 3d graphics HW.
 
Originally posted by: humey
The PC will overtake it as soon as the G70 comes out, as i think the pc gpu will be more powerful; if not, the next cards will overtake consoles, no probs.

I am with that other user, in that i never will own a console and hate them; i have played on them though, and i know 2 pals that will get the xbox360 and PS3.

nonsense.

the r520 is just a dumbed down version of the Xbox GPU and the g70 is likewise castrated.

Get this thru your heads . . .
the next gen of consoles will destroy PC gaming (graphicswise) for TWO years . . . MAYBE r620 will "catch-up" HW in '06 . . . but certainly not PC game SW until Unreal3 - or later - more likely with DX10.
 
Originally posted by: apoppin
Originally posted by: humey
The PC will overtake it as soon as the G70 comes out, as i think the pc gpu will be more powerful; if not, the next cards will overtake consoles, no probs.

I am with that other user, in that i never will own a console and hate them; i have played on them though, and i know 2 pals that will get the xbox360 and PS3.

nonsense.

the r520 is just a dumbed down version of the Xbox GPU and the g70 is likewise castrated.

Get this thru your heads . . .
the next gen of consoles will destroy PC gaming (graphicswise) for TWO years . . . MAYBE r620 will "catch-up" HW in '06 . . . but certainly not PC game SW until Unreal3 - or later - more likely with DX10.

I severely doubt that the G70-based RSX processor in the PS3 will far outclass a PC G70, let alone two of them (SLI'd).

As far as bandwidth goes, the PS3 is monstrous. And the Cell processor is an enormous step.
The only thing that will keep the PS3 ahead of PCs is the slow pace of improvement in CPUs. Doesn't the Cell processor have either six or eight cores, albeit far simplified ones? Until PC CPUs become multi-cored (more than two cores), I think the PS3 will have the advantage. Not because of the GPUs, but the CPUs.

 
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: humey
The PC will overtake it as soon as the G70 comes out, as i think the pc gpu will be more powerful; if not, the next cards will overtake consoles, no probs.

I am with that other user, in that i never will own a console and hate them; i have played on them though, and i know 2 pals that will get the xbox360 and PS3.

nonsense.

the r520 is just a dumbed down version of the Xbox GPU and the g70 is likewise castrated.

Get this thru your heads . . .
the next gen of consoles will destroy PC gaming (graphicswise) for TWO years . . . MAYBE r620 will "catch-up" HW in '06 . . . but certainly not PC game SW until Unreal3 - or later - more likely with DX10.

I severely doubt that the G70-based RSX processor in the PS3 will far outclass a PC G70, let alone two of them (SLI'd).

As far as bandwidth goes, the PS3 is monstrous. And the Cell processor is an enormous step.
The only thing that will keep the PS3 ahead of PCs is the slow pace of improvement in CPUs. Doesn't the Cell processor have either six or eight cores, albeit far simplified ones? Until PC CPUs become multi-cored (more than two cores), I think the PS3 will have the advantage. Not because of the GPUs, but the CPUs.

sure it will . . . if you were M$ [or Sony], would you let ATi [or nVidia] spend YOUR research money to develop a GPU for your console and then turn around and use the SAME [or better] GPU in THEIR PC graphics cards?

i know i'd sue 😛
:roll:

the Xbox360/PS3 will DESTROY pc gaming for 2 more years . . . . DX9 is fully utilized in NextGen consoles, while only 15-25% of PC games in the next year will fully utilize it.

At the very LEAST gaming SW for the PC is WAY behind. do not expect that to change until [at least Unreal3] and really until DX10 (add a year or 2 AFTER that release to see games actually USING dx10 features).

'06 may be my first console since Genesis. 😛
:shocked:

the PC won't be able to compete till '07. 😛
[i DON'T like it . . . it's just the way it "is"]
:thumbsdown:

 
Remember that in a console both the cpu and gpu have to be low power and low heat. In a pc they can get away with sticking huge coolers on both, so they can use more power and run them faster.
The big advantage consoles have is there is only 1 spec - i.e. you can optimise a console game much better than a pc one, as you don't have to worry about users with slower cpus, graphics, less memory, etc. I expect that alone will help them keep up with pcs for a year.
 
Just make a supercomputer compatible with WindowsXP and games, with a video card the size of a PC full-tower chassis, able to play at 2048x1536 with ultra-high custom textures and full AA and AF at 1000FPS in Doom III, Half-Life 2 and Painkiller all running simultaneously in multiplayer.

Then SLI the video card and shut the mouths of Everyone and All.
 
So appopin seems to be saying that ATI and Nvidia will only half-ass their next-gen launch...it sounds plausible, but I sincerely hope that's not what ends up happening.
 
apoppin, i DOUBT it and would bet money on it not being true.

Consoles suck; ok, these seem powerful compared to previous ones, but in 1 year max a new PC will be night and day from even today's PCs.

The consoles are sold at a loss and money made on games.
 
Originally posted by: sbuckler
Remember that in a console both the cpu and gpu have to be low power and low heat. In a pc they can get away with sticking huge coolers on both, so they can use more power and run them faster.
The big advantage consoles have is there is only 1 spec - i.e. you can optimise a console game much better than a pc one, as you don't have to worry about users with slower cpus, graphics, less memory, etc. I expect that alone will help them keep up with pcs for a year.

Also, since developers for consoles only have to deal with one hw setup, they can continue to push the limits of the console as they get more familiar with its capabilities. For example, GT2 had better graphics than GT1 on the PSX, and GT4 looks better than GT3 on the same PS2 hardware. This is possible because a game on a console has full control of the hardware; there's no stuff like windoze to deal with. So, a PC will need faster hardware just to run a game as well as a console with slightly slower hardware.
 
Originally posted by: Emultra
Just make a supercomputer compatible with WindowsXP and games, with a video card the size of a PC full-tower chassis, able to play at 2048x1536 with ultra-high custom textures and full AA and AF at 1000FPS in Doom III, Half-Life 2 and Painkiller all running simultaneously in multiplayer.

Then SLI the video card and shut the mouths of Everyone and All.

You realize that most of that is almost already possible, right? 😉
 
Of course PC's need better stuff, but Xbox 3 and PS4 won't be out for half a decade; meanwhile, we get better stuff during all that time.

So at its best, it would seem that consoles rule the hardware arena for 1 year out of 5 in every cycle. The leap at the beginning of a new console generation can never be high enough to sustain a lead for years.

Indeed, no hardware leap can?


Ronin: I don't have liquid nitrogen/hydrogen cooling, but there's a plant in my town that sells the stuff, they've got truckloads of canisters. Hmm...😉
 
SLI'd G70s will likely slaughter a console if games are released that can push them.

Remember the PS3 is a single G70, and the Xbox 2 is a single R500.

Haven't seen this brought up yet, but I may have missed it - the G70 PC part that is going to be hitting this year will be a 110nm part, while the PS3's RSX will be a 90nm part (Sony is licensing the RSX tech; they are fabbing it at their own multi-billion-dollar foundries with no risk to nVidia). Obviously nVidia doesn't like the prospects of the yields they can reach with the transition to 90nm. On that build process, pushing over 300 million transistors doesn't sound too reasonable for a consumer part. More than likely, something is going to come up missing out of the G70, and more than likely it will be shader hardware. Now, given the architecture of the PS3 and the fact that we know what Cell is capable of, more than likely the RSX design is going to be focusing on pixel shader ops, as vertex ops can very easily be offloaded to Cell (that 35GB/sec bus will come in real handy there).

Since nVidia isn't using unified hardware for their shading model, they have way more fillrate than they are going to use, and given the build process combined with Cell's incredible vertex-op performance, it would seem that nVidia would be focusing their hardware resources on pixel shaders, and I wouldn't be shocked to see the RSX have a rather enormous performance edge over the first top-tier G70 part. Oh yeah, the RSX should also be clocked a decent amount higher than the G70.
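A quick back-of-envelope check on the 110nm vs 90nm argument above (the ~300M transistor figure is this thread's rumor, not an official spec, and the scaling model here is the idealized "density goes with the inverse square of the feature size" assumption — real-world shrinks do worse):

```python
# Idealized die-shrink arithmetic for the 110nm -> 90nm transition
# discussed above. Assumption: transistor density scales with the
# inverse square of the feature size (an upper bound, not foundry data).

def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal density improvement going from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

gain = density_gain(110, 90)   # ~1.49x ideal density at 90nm
print(f"110nm -> 90nm ideal density gain: {gain:.2f}x")

# A ~300M-transistor design that is a stretch at 110nm would occupy
# only about 1/1.49 of that die area at 90nm, which is why the 90nm
# RSX could plausibly keep hardware the 110nm G70 has to drop.
print(f"Relative die area at 90nm: {1 / gain:.2f}")
```

Under that ideal scaling, Sony's 90nm process buys roughly a third less die area for the same transistor budget, which is the crux of the poster's yield argument.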
 
Originally posted by: Emultra
Of course PC's need better stuff, but Xbox 3 and PS4 won't be out for half a decade; meanwhile, we get better stuff during all that time.

So at its best, it would seem that consoles rule the hardware arena for 1 year out of 5 in every cycle. The leap at the beginning of a new console generation can never be high enough to sustain a lead for years.

Indeed, no hardware leap can?


Ronin: I don't have liquid nitrogen/hydrogen cooling, but there's a plant in my town that sells the stuff, they've got truckloads of canisters. Hmm...😉

lol.

I wasn't referring to LN or any other extreme cooling. I come close to those numbers on an FX55 with 2x 6800Us at 2048x1536. I play HL2 @ 4x/8x with a 100FPS average (VSS with max settings still puts me over 100FPS, but that's not gaming performance in my opinion).

I'll run some actual tests probably next week (leaving for E3 tomorrow morning and won't be back until late Friday, and don't feel like messing around with it this weekend), and provide some personal results.
 
Originally posted by: BenSkywalker
SLI'd G70s will likely slaughter a console if games are released that can push them.

Remember the PS3 is a single G70, and the Xbox 2 is a single R500.

Haven't seen this brought up yet, but I may have missed it - the G70 PC part that is going to be hitting this year will be a 110nm part, while the PS3's RSX will be a 90nm part (Sony is licensing the RSX tech; they are fabbing it at their own multi-billion-dollar foundries with no risk to nVidia). Obviously nVidia doesn't like the prospects of the yields they can reach with the transition to 90nm. On that build process, pushing over 300 million transistors doesn't sound too reasonable for a consumer part. More than likely, something is going to come up missing out of the G70, and more than likely it will be shader hardware. Now, given the architecture of the PS3 and the fact that we know what Cell is capable of, more than likely the RSX design is going to be focusing on pixel shader ops, as vertex ops can very easily be offloaded to Cell (that 35GB/sec bus will come in real handy there).

Since nVidia isn't using unified hardware for their shading model, they have way more fillrate than they are going to use, and given the build process combined with Cell's incredible vertex-op performance, it would seem that nVidia would be focusing their hardware resources on pixel shaders, and I wouldn't be shocked to see the RSX have a rather enormous performance edge over the first top-tier G70 part. Oh yeah, the RSX should also be clocked a decent amount higher than the G70.

Consider the original jump: from 135M transistors on the 5950 to 222 million for the 6800 series. A 300M transistor count on a desktop GPU isn't all that far-fetched, and nVidia has shown they can easily make that kind of jump, wouldn't you say? Obviously, the primary concern would be finding a proper way to dissipate all the heat being generated without going the route of the 5800, but it's still feasible, in my opinion.
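The transistor counts in the reply above actually support its point: the rumored 300M figure implies a smaller relative jump than the one nVidia already made. A quick sanity check (all counts are the rough figures quoted in this thread, not official die specs):

```python
# Generational transistor-count jumps, using the rough numbers
# quoted in the thread (not official specifications).

nv38_5950 = 135e6   # GeForce FX 5950, per the post above
nv40_6800 = 222e6   # GeForce 6800 series, per the post above
g70_rumor = 300e6   # rumored G70 transistor count

prev_jump = nv40_6800 / nv38_5950   # the jump nVidia already shipped
next_jump = g70_rumor / nv40_6800   # the jump 300M would require

print(f"5950 -> 6800: {prev_jump:.2f}x")        # ~1.64x
print(f"6800 -> rumored G70: {next_jump:.2f}x") # ~1.35x
```

So the 5950-to-6800 generation grew about 64%, while hitting 300M would need only about 35% more on top of the 6800 — the smaller of the two steps.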
 
I think it'll take a couple of years for PCs to catch up. The hardware specs should be comparable within a year, but the hardware in a console is specially designed to work together. A PC, even with similar specifications, would probably run worse than a console, because the console is a dedicated gaming machine.
 
I'll stick with my PC thanks. Until my 3 year old asks us for one, there will be no console in our house. But, I'm sure the craze will hit him sooner or later. Unless I get him hooked on strictly PC gaming. Yeah, parents are supposed to "influence" their children right? 😉
 
ROFL.

I'm torn. I never bought a PS2 (but there's one in the house from a friend who didn't want his anymore), and I didn't get an XBox until last Christmas. But based on the screenshots just for GT, it might be one of those times where I don't wait.

However, a console will never replace my PC experience. 🙂
 
Originally posted by: keysplayr2003
I'll stick with my PC thanks. Until my 3 year old asks us for one, there will be no console in our house. But, I'm sure the craze will hit him sooner or later. Unless I get him hooked on strictly PC gaming. Yeah, parents are supposed to "influence" their children right? 😉

Consoles are good for when you have a few friends over and want to play a game. Unless you paid thousands for a massive computer monitor, it's really difficult to huddle 4 friends around a computer to play a sports game or something. It's relatively easy with a console... and will be even easier now with wireless controllers standard.

*EDIT* Consoles are also good for careless kids... bust a console... $300 for a whole new one... bust a PC... you're lookin at $1000-2000 for a whole new one. On the other hand... if one part of a PC goes bad, you can just replace that instead of the entire thing. To be honest... I'd rather have a kid sit down at a $300 console with a Pepsi than sit down at a $1000-2000 PC with a Pepsi. 🙂
 
Man, Ronin, your rig is about as good as it gets right now, isn't it? You must have a lot of money or be sponsored. 🙂
And you're going to E3 too...in defence of my lower (but satisfactory) position in the hierarchy, what do you have to look forward to, eh, what with having everything? 😉

And the PS3 is still a year away...the lumbering giant will come creeping forth.
 
Originally posted by: Rollo
Originally posted by: KruptosAngelos
Because you don't really get that performance. 2 6800U's in SLI is not natively supported in games; it requires drivers to make it work. Even then, it doesn't work well, and has overhead.

People really need to stop saying this, as it's never been true. Here's how it works, Kruptos:
1. Last night I installed Hexen2, a game from 1996. Oh noes! No profile!
2. I go to the nVidia driver control panel, use the drop-down to set non-profiled games to AFR. Oh noes! It doesn't work, distortion abounds.
3. I go to the driver control panel, switch to SFR SLI with the drop-down, play my game fine.
Pretty tough, eh? Took me a whole two minutes.


Fact is, SLI isn't necessary and doesn't help. You get 100fps, I get 90; it doesn't matter. The refresh rate is 85, so we both see the same thing. SLI offers nothing noticeable in-game. Do you really want to spend $600 extra for Futuremark scores? Oh wait, does it even work with that?

People really need to stop saying this, because anyone with eyes who can read can see single cards are pretty lame compared to high end SLI.

UT2004
Gee, 113fps for an X800XT PE vs 151fps for my 6800GT SLI.

Doom3
37fps vs. 68fps isn't a small difference either- I play at this setting, you don't.

Far Cry
36fps vs 53fps or 60fps. Again, I can play this setting easily, you can't.

Battlefield Viet Nam
61fps vs 83 or 94fps.

that's what I've been trying to tell him all along... looks to be another diehard ATI fanboy.
 
Originally posted by: keysplayr2003
Originally posted by: KruptosAngelos
Originally posted by: apoppin
first of all, the r520 will be the big disappointment, as M$ will not allow ATI to release the "full version" of its Xbox360 GPU. . . .
:thumbsdown:

g70 will be less of a disappointment, but still not "all" the PS3 GPU will be. 😉

But at least ATI fans will finally get SM 3.0 and 'SLI'
:roll:

Rumors had it that the g70 would be twice an ultra, right? Rumors had it that the 520 would be 3x an x800. I wouldn't really be disappointed if it were 1.5x; that's still a 50% increase. Considering I already run all new games with all the eye candy smooth as a baby's bottom, that's more than enough.

ATI's move to MVP is purely marketing, so ATI fanboys will jump on the bandwagon; it's still pointless. And I think it's retarded that this generation of cards didn't have 3.0, but it may have been a business move to increase sales of the 520, you think?


Not to be rude or anything, but will you be getting to the actual point you're trying to make anytime soon? And also, why do you feel you need to make this point and drive it home?

he's a fanboy, that's why
 
Originally posted by: apoppin
ATi's move to "sli" is necessary to "keep-up" with nVidia. 😉

Are you kidding me? Nvidia sells 30% fewer units per quarter; why would ATI need to "keep up" with anyone? Nvidia can't catch up 😛
 