
7800GTX 512MB to run at 550MHz/1800MHz

Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
I think we could see an R580 in early 2006, and if the 512 GTX puts enough pressure on ATI, they might release the R580 sooner than expected. Otherwise, ATI would just milk the R520 for all it's worth for as long as possible.

That would be a good thing. An R580 getting to market sooner rather than later, that is. Both sides keeping up the pressure. But that memory clock rumor, holy hell, that's insane for stock speeds. And a 120MHz increase on the core is very impressive indeed. Do you happen to have a link to the presumed R580 specs so I/we can get a refresher? 😀

The commonly accepted theory at B3D is 48 pixel shaders, 16 ROPs, 16 TMUs, and 16 Z/stencil units. This is based on leaked internal ATI slides that also described the current X1K cards several months before launch and have turned out to be accurate so far.

Yeah, keep dreaming :roll:, assuming they aren't referring to the ALUs and the pixel pipes together.

However, I maintain my previous stance: ATI would be shooting itself in the foot to release a flagship product (the X1800 XT) and then trump it with another release the next month. You guys are all saying wait for the R580... sh!t, I'm STILL waiting for the R520!!

As for your comment on a 700MHz or 1000MHz clock: first off, it's not going to happen. Second, imagine the power consumption if it did. It would be UNREAL. In benchmarks, the X1800 XT already consumes more power than any previous GPU, 55 W more than the G70. Ramp the clock even further to that extreme and we're talking about a chip so hot and so power-hungry that they'll be lucky if anyone has the guts to run it.

-Kevin

You think NV can make a 110nm chip run at 550MHz, but ATI can't make a 90nm chip run at 700MHz? Power consumption is not an issue for people who buy high-end cards; how much power do you think SLI'd GTXs will consume? Also, the 512 GTX will draw more power than the 256 GTX because of the extra memory, so you might be surprised which card draws more when this thing launches.
 
Originally posted by: munky
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
I think we could see an R580 in early 2006, and if the 512 GTX puts enough pressure on ATI, they might release the R580 sooner than expected. Otherwise, ATI would just milk the R520 for all it's worth for as long as possible.

That would be a good thing. An R580 getting to market sooner rather than later, that is. Both sides keeping up the pressure. But that memory clock rumor, holy hell, that's insane for stock speeds. And a 120MHz increase on the core is very impressive indeed. Do you happen to have a link to the presumed R580 specs so I/we can get a refresher? 😀

The commonly accepted theory at B3D is 48 pixel shaders, 16 ROPs, 16 TMUs, and 16 Z/stencil units. This is based on leaked internal ATI slides that also described the current X1K cards several months before launch and have turned out to be accurate so far.

Yeah, keep dreaming :roll:, assuming they aren't referring to the ALUs and the pixel pipes together.

However, I maintain my previous stance: ATI would be shooting itself in the foot to release a flagship product (the X1800 XT) and then trump it with another release the next month. You guys are all saying wait for the R580... sh!t, I'm STILL waiting for the R520!!

As for your comment on a 700MHz or 1000MHz clock: first off, it's not going to happen. Second, imagine the power consumption if it did. It would be UNREAL. In benchmarks, the X1800 XT already consumes more power than any previous GPU, 55 W more than the G70. Ramp the clock even further to that extreme and we're talking about a chip so hot and so power-hungry that they'll be lucky if anyone has the guts to run it.

-Kevin

You think NV can make a 110nm chip run at 550MHz, but ATI can't make a 90nm chip run at 700MHz? Power consumption is not an issue for people who buy high-end cards; how much power do you think SLI'd GTXs will consume? Also, the 512 GTX will draw more power than the 256 GTX because of the extra memory, so you might be surprised which card draws more when this thing launches.

Considering at least one vendor is selling a 490MHz part, I bet NVIDIA can bin a 550MHz part. But I'll be surprised if NVIDIA's stock clock for this 512MB version is 550MHz. I'd bet 480MHz is a fair guess, with EVGA selling an uber edition at 550MHz.


 
Originally posted by: Pabster
Originally posted by: munky
Should they still be scared if an XT can be OC'd to 700MHz? I'm thinking it should have no problem reaching 700MHz on air, given that it went past 1GHz on LN2.

Yep, because ATi needs the higher clock to be competitive.

GTX @ 550 and XT @ 700 would be pretty equal.

Drivers would then tell the story...and perhaps the faster RAM of the 512MB GTX part.


I don't think this is true. Think about the math here. As it is, the GTX and XT are pretty equal in core performance. Memory performance is where the XT takes the lead... look at AA, for example. It's more than well known that the XT takes much LESS of a performance hit with AA enabled than a GTX.

Whether we can attribute that to the specialized memory bus of the R520 or simply to the higher memory clock, I don't know.

Now... at 550MHz, we're looking at around an average increase of 100MHz across 24 pipes. 700MHz for an XT is a 75MHz increase on 16 pipes. 1800MHz memory should obliterate the GTX's performance drop with AA enabled...

Just my speculation.
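The pipes-times-clock comparison above can be spelled out. A quick sketch (my own back-of-envelope arithmetic, treating pipelines × core clock as a crude fill-rate proxy that ignores all architectural differences):

```python
# Theoretical pixel fill rate = pipelines x core clock (in Mpixels/s).
# A crude proxy only; real performance depends on the architecture.
gtx_512 = 24 * 550   # rumored 512MB GTX: 24 pipes at 550MHz
xt_oc   = 16 * 700   # overclocked X1800 XT: 16 pipes at 700MHz

print(gtx_512)  # 13200
print(xt_oc)    # 11200
```

By this crude measure the 550MHz GTX keeps a lead even against a 700MHz XT, which is why the memory bandwidth (and AA behavior) ends up being the interesting variable.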
 
Originally posted by: whitepig
I think the 512MB 7800GTX is fake.

The link on HKEPC has been deleted...


www.HKEPC.com



Odd, to say the least... it makes me really suspicious of the validity of the rumors. For now I'm calling it BS until proven otherwise. HKEPC listed MyDrivers.com as a source, and MyDrivers.com showed HKEPC as the primary source, with the picture of the Leadtek PX7800GTX Extreme and "7800 GTX 512MB".
 
Originally posted by: flynnsk
Originally posted by: whitepig
I think the 512MB 7800GTX is fake.

The link on HKEPC has been deleted...


www.HKEPC.com



Odd, to say the least... it makes me really suspicious of the validity of the rumors. For now I'm calling it BS until proven otherwise. HKEPC listed MyDrivers.com as a source, and MyDrivers.com showed HKEPC as the primary source, with the picture of the Leadtek PX7800GTX Extreme and "7800 GTX 512MB".

Maybe they saw Rollo's post, thought he was PR, and the Inquirer made something out of it. That would be very funny. : )
 
My guess is someone came across the Samsung memory module on their (Samsung's) site, then recalled that NV placed a large order with Samsung a while back, and assumed that the memory they ordered must be the new 1.1ns modules. Next, Leadtek puts up pics of its 512MB workstation graphics card.

Someone somewhere draws the conclusion that the Leadtek in question MUST be the new reference 7800 GTX 512MB and that the 1.1ns memory on Samsung's page would be used. Problem is, the Leadtek card is listed with 33.6GB/s of bandwidth. (Double-checking.)
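The 33.6GB/s figure can be sanity-checked against the rumored memory. A rough sketch (my own arithmetic, assuming a 256-bit memory bus; not from any vendor spec):

```python
BUS_BYTES = 256 // 8  # 256-bit memory bus = 32 bytes per transfer

def bandwidth_gb_s(effective_mt_s):
    """Peak bandwidth in GB/s for a given effective data rate (MT/s)."""
    return BUS_BYTES * effective_mt_s / 1000

# The effective rate implied by the Leadtek listing's 33.6 GB/s:
implied = 33.6 * 1000 / BUS_BYTES
print(round(implied))        # 1050 MT/s, i.e. 525MHz GDDR3
print(bandwidth_gb_s(1800))  # 57.6 GB/s -- what 1800 MT/s (1.1ns) chips would give
```

The mismatch is the point: a card listed at 33.6GB/s cannot be carrying 1800MT/s memory on a 256-bit bus, which supports the idea that the pictured Leadtek is not the rumored 512MB GTX.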
 
Originally posted by: flynnsk
My guess is someone came across the Samsung memory module on their (Samsung's) site, then recalled that NV placed a large order with Samsung a while back, and assumed that the memory they ordered must be the new 1.1ns modules. Next, Leadtek puts up pics of its 512MB workstation graphics card.

Someone somewhere draws the conclusion that the Leadtek in question MUST be the new reference 7800 GTX 512MB and that the 1.1ns memory on Samsung's page would be used. Problem is, the Leadtek card is listed with 33.6GB/s of bandwidth. (Double-checking.)

That's quite a guess indeed. But that's all it is.

 
Originally posted by: munky
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
I think we could see an R580 in early 2006, and if the 512 GTX puts enough pressure on ATI, they might release the R580 sooner than expected. Otherwise, ATI would just milk the R520 for all it's worth for as long as possible.

That would be a good thing. An R580 getting to market sooner rather than later, that is. Both sides keeping up the pressure. But that memory clock rumor, holy hell, that's insane for stock speeds. And a 120MHz increase on the core is very impressive indeed. Do you happen to have a link to the presumed R580 specs so I/we can get a refresher? 😀

The commonly accepted theory at B3D is 48 pixel shaders, 16 ROPs, 16 TMUs, and 16 Z/stencil units. This is based on leaked internal ATI slides that also described the current X1K cards several months before launch and have turned out to be accurate so far.

Yeah, keep dreaming :roll:, assuming they aren't referring to the ALUs and the pixel pipes together.

However, I maintain my previous stance: ATI would be shooting itself in the foot to release a flagship product (the X1800 XT) and then trump it with another release the next month. You guys are all saying wait for the R580... sh!t, I'm STILL waiting for the R520!!

As for your comment on a 700MHz or 1000MHz clock: first off, it's not going to happen. Second, imagine the power consumption if it did. It would be UNREAL. In benchmarks, the X1800 XT already consumes more power than any previous GPU, 55 W more than the G70. Ramp the clock even further to that extreme and we're talking about a chip so hot and so power-hungry that they'll be lucky if anyone has the guts to run it.

-Kevin

You think NV can make a 110nm chip run at 550MHz, but ATI can't make a 90nm chip run at 700MHz? Power consumption is not an issue for people who buy high-end cards; how much power do you think SLI'd GTXs will consume? Also, the 512 GTX will draw more power than the 256 GTX because of the extra memory, so you might be surprised which card draws more when this thing launches.

Well:

1. ATI is already running at very high clock speeds; pushing them even further probably isn't the best idea, especially as far as you're suggesting.

2. As for the SLI'd GTXs, a lot less than you would think. I don't think I saw them break 300 W even at full load.

3. An extra 256MB of GDDR3 RAM will not increase power consumption by 50 W. If you factor in a possible increase in clock speeds, yes, it will probably jump, but remember NVIDIA also has 24 pixel pipelines that need power. ATI is consuming that amount of power without those pipes.

Don't get me wrong, I was impressed with ATI's architecture, especially the memory subsystem; however, power-wise, it leaves A LOT to be desired.

-Kevin
 
Originally posted by: keysplayr2003
Originally posted by: flynnsk
My guess is someone came across the Samsung memory module on their (Samsung's) site, then recalled that NV placed a large order with Samsung a while back, and assumed that the memory they ordered must be the new 1.1ns modules. Next, Leadtek puts up pics of its 512MB workstation graphics card.

Someone somewhere draws the conclusion that the Leadtek in question MUST be the new reference 7800 GTX 512MB and that the 1.1ns memory on Samsung's page would be used. Problem is, the Leadtek card is listed with 33.6GB/s of bandwidth. (Double-checking.)

That's quite a guess indeed. But that's all it is.


Yes it is, but then again it's just as much a guess as any other "rumor". Let's see: the story starts with HKEPC, MyDrivers.com links to it, HKEPC links to MyDrivers, and numerous sites pass links around until, of course, L'INQ picks it up. Once that happens it's suddenly "true", because of how accurate L'INQ has always been in the past.

Ya, OK. Oh, but wait, let me guess: you're under NDA, so you can't deny that either. lol, nothing better than brandboys.

Oh, joy, what's this:

http://www.samsung.com/Products/Semicon...on/product_list.aspx?family_cd=GME1002
Funny how the supposed memory in question (K4J55323QG) is ONLY available up to 800MHz (1.2ns), with 900MHz in production but not available for orders.

I guess NV must be Samsung's number-one priority, seeing as they've somehow been able to acquire a large amount of memory that isn't even available outside of customer sampling.
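The cycle-time figures on Samsung's page translate to clocks straightforwardly. A small sketch (my own conversion; vendors rate parts in bins below the theoretical maximum, which is why 1.2ns parts are sold as 800MHz):

```python
def max_clock_mhz(cycle_time_ns):
    """Theoretical maximum clock for a given DRAM cycle time."""
    return 1000 / cycle_time_ns  # 1 ns cycle time <=> 1000MHz

print(round(max_clock_mhz(1.2)))  # 833 -> sold as 800MHz parts
print(round(max_clock_mhz(1.1)))  # 909 -> would be the rumored 900MHz parts
```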
 
Originally posted by: Gamingphreek

2. As for the SLI'd GTXs, a lot less than you would think. I don't think I saw them break 300 W even at full load.

Wrong. Why do you say these things without checking your facts first? SLI'd 7800GTs and GTXs use more power at load than an X1800 XT... significantly more, in fact. 30% more for the SLI GTX, according to the numbers here. If you have a link that shows otherwise, I'd love to see it.
 
What gives, M0RPH? The latest report is a 550/1600 version and a 600/1800 version.

The X1800 just got ROYALLY PWNED. I didn't say ATI, I said X1800.
 
Originally posted by: flynnsk
My guess is someone came across the Samsung memory module on their (Samsung's) site, then recalled that NV placed a large order with Samsung a while back, and assumed that the memory they ordered must be the new 1.1ns modules. Next, Leadtek puts up pics of its 512MB workstation graphics card.

Someone somewhere draws the conclusion that the Leadtek in question MUST be the new reference 7800 GTX 512MB and that the 1.1ns memory on Samsung's page would be used. Problem is, the Leadtek card is listed with 33.6GB/s of bandwidth. (Double-checking.)

I think your guess is pretty good. I have a feeling more than a few of their stories are nothing but educated guesses based on some rumors or tidbits of evidence found around the net. They pass them off as insider news stories and then just hope their guess turns out to be right.
 
We already knew the picture on HKEPC was not of the 7800 GTX 512MB, and that it was of a Quadro 4500. You can (I guess now it should be "could") see the additional hardware at the top left-hand side: the 14-pin header for NVIDIA G-Sync/SDI option boards.

AT Forum Post
 
Originally posted by: CPlusPlusGeek
What gives, M0RPH? The latest report is a 550/1600 version and a 600/1800 version.

The X1800 just got ROYALLY PWNED. I didn't say ATI, I said X1800.

What are you talking about? I quoted Gamingphreek and we're talking about power usage.
 
Originally posted by: M0RPH
Originally posted by: Gamingphreek

2. As for the SLI'd GTXs, a lot less than you would think. I don't think I saw them break 300 W even at full load.

Wrong. Why do you say these things without checking your facts first? SLI'd 7800GTs and GTXs use more power at load than an X1800 XT... significantly more, in fact. 30% more for the SLI GTX, according to the numbers here. If you have a link that shows otherwise, I'd love to see it.

I believe that is total system power. Also note that they are using different motherboards for the comparison.

-Kevin
 
An X1800 XT running at 700MHz is not gonna use more power at load than 7800 GTX SLI; I believe that's what you were trying to claim. Show some numbers to support it. You think bumping the core to 700MHz would increase power usage over 30%? Not hardly.
 
Nice edit there :roll:

Link

And that's even a bit high, seeing as the 6800U consumes much more power than the GTX at load.

You think bumping the core to 700MHz would increase power usage over 30%

Well, that plus bumping the memory speeds would push it quite high. Add on the supposed "48 pixel shaders" and I'd be willing to bet that would account for a whole hell of a lot more power.

-Kevin
 
Originally posted by: munky
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: munky
I think we could see an R580 in early 2006, and if the 512 GTX puts enough pressure on ATI, they might release the R580 sooner than expected. Otherwise, ATI would just milk the R520 for all it's worth for as long as possible.

That would be a good thing. An R580 getting to market sooner rather than later, that is. Both sides keeping up the pressure. But that memory clock rumor, holy hell, that's insane for stock speeds. And a 120MHz increase on the core is very impressive indeed. Do you happen to have a link to the presumed R580 specs so I/we can get a refresher? 😀

The commonly accepted theory at B3D is 48 pixel shaders, 16 ROPs, 16 TMUs, and 16 Z/stencil units. This is based on leaked internal ATI slides that also described the current X1K cards several months before launch and have turned out to be accurate so far.

Yeah, keep dreaming :roll:, assuming they aren't referring to the ALUs and the pixel pipes together.

However, I maintain my previous stance: ATI would be shooting itself in the foot to release a flagship product (the X1800 XT) and then trump it with another release the next month. You guys are all saying wait for the R580... sh!t, I'm STILL waiting for the R520!!

As for your comment on a 700MHz or 1000MHz clock: first off, it's not going to happen. Second, imagine the power consumption if it did. It would be UNREAL. In benchmarks, the X1800 XT already consumes more power than any previous GPU, 55 W more than the G70. Ramp the clock even further to that extreme and we're talking about a chip so hot and so power-hungry that they'll be lucky if anyone has the guts to run it.

-Kevin

You think NV can make a 110nm chip run at 550MHz, but ATI can't make a 90nm chip run at 700MHz? Power consumption is not an issue for people who buy high-end cards; how much power do you think SLI'd GTXs will consume? Also, the 512 GTX will draw more power than the 256 GTX because of the extra memory, so you might be surprised which card draws more when this thing launches.

Actually, yes, when you consider the following:

90nm × 1.22 ≈ 110nm

550MHz × 1.22 ≈ 671MHz

Considering that 110nm is a mature process at this point and ATI's 90nm is still being refined, I think it's safe to say that NVIDIA could attain 550MHz before ATI could hit 700MHz. Of course, this assumes both GPUs share the same design, which they do not.

But the fact is, shrinking from 110nm to 90nm is only a marginal die shrink.
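The ratio argument above, spelled out (back-of-envelope only; clock speed does not actually scale linearly with process node, so treat the numbers as illustrative, not predictive):

```python
# Naive linear scaling: if clock were inversely proportional to node size,
# a 550MHz chip at 110nm would correspond to ~671MHz at 90nm.
node_ratio = round(110 / 90, 2)   # ~1.22
print(node_ratio)                 # 1.22
print(round(550 * node_ratio))    # 671 -- the figure quoted above
```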

 
I have no doubt nVidia could mass-produce 110nm GPUs with 550MHz clock frequency. They've had plenty of time to refine the process and improve yields.
 