200 3DMarks in it. Looks like they are equal clock for clock, which is pretty amazing from a technical standpoint. Looks like a 75 MHz OC will bring a 7970 to parity. Although the GTX looks to have a better feature set. Not so excited about a 670 anymore.
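A back-of-the-envelope check of that overclock claim, assuming performance scales linearly with core clock (optimistic) and using the reference HD 7970 clock of 925 MHz. The base score below is a placeholder, not a real benchmark result:

```python
# Rough check: does a 75 MHz overclock plausibly close a 200-point gap?
# Assumes linear scaling with core clock, which is the best case.
ref_clock = 925            # MHz, stock HD 7970
oc_clock = ref_clock + 75  # the proposed overclock

headroom = oc_clock / ref_clock - 1
print(f"Clock increase: {headroom * 100:.1f}%")  # about 8.1%

gap_points = 200           # the gap quoted above
base_score = 2800          # hypothetical 7970 score, NOT a real result
deficit = gap_points / base_score
print(f"Score deficit: {deficit * 100:.1f}%")    # about 7.1% at this score
```

Under those assumptions an ~8% clock bump more than covers a ~7% deficit, so the parity claim is at least arithmetically plausible.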
Power viruses notwithstanding, do you have proof of a 480 drawing more than 250 watts continuously?
The GTX 480 draws about 40 more watts than the 7970; let's get back to reality.
The reality of the situation is that this is even worse than that... This is a GTX 480 that draws more power and is slower than the 5870. Can you imagine that?
Because that is what the 7970 is shaping up to be compared to the 680: slower, and it draws more power.
Yes, thanks ballatheFeared, for your lesson; however, I know what a core and shader clock are. That is what a hotclock is. A 1.1 GHz core shows as a 2.2 GHz shader in that program because the program reads the video card's core speed from the I2C bus and outputs twice that measurement into the "Shader" box, since the uncore and core domains in Fermi are 1:2.
There is no speed measurement derived from the ALU domain; the program just reports xxx MHz times two for Nvidia. amirite?
He's saying don't look at GPU-Z for shader clocks. Hot-clock is there.
I'm confused then, because GPU-Z is showing a hot clocked shader speed.
Maybe I'll just leave this part of the conversation because I'm pretty much lost at this point /:|\
Is he saying he doesn't believe Kepler has a 1:2 hotclock, and that it's different? 1:1.5 or something? I dunno why I'm having such a hard time figuring out what he's actually saying.
He's saying that GPU-Z calculates the hot clock as 2 x the core clock. There is NO actual independent verification of shader clocks. The program simply assumes a 2:1 ratio for Nvidia, just as it assumes a 1:1 ratio for AMD.
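A minimal sketch of the derived-ratio approach being described here. GPU-Z's actual source isn't public, so the names and structure below are purely illustrative:

```python
# Sketch of a derived shader clock: the tool measures one core clock and
# multiplies it by a fixed, vendor-specific ratio rather than measuring
# the shader (ALU) domain independently. Illustrative only, not GPU-Z code.

SHADER_RATIO = {
    "nvidia_fermi": 2.0,   # hot-clocked shader domain, assumed 2:1
    "amd": 1.0,            # unified clock, assumed 1:1
}

def derived_shader_clock(core_mhz: float, arch: str) -> float:
    """Return the displayed shader clock: measured core clock times an assumed ratio."""
    return core_mhz * SHADER_RATIO[arch]

print(derived_shader_clock(1100, "nvidia_fermi"))  # 2200.0 - looks measured, isn't
```

The point of the thread is exactly this: the "Shader" box is an arithmetic product of the core reading, so it can never disagree with the assumed ratio.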
In benchmark runs, the 680 is about 5% stronger than the 7970 clock for clock.
However, in Battlefield 3 the 680 is slower by 15%.
And at 1600p instead of 1080p, the 680 is estimated to lose by about 20%.
In other words, the 680 is super in 3DMark11, but games will have to wait for the driver.
Also note: why run 8x AA? With AA off, I'm afraid it doesn't even lead the 580.
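If the translation is right, the "clock for clock" claim is just score divided by core clock. A sketch of that normalization, using the reference clocks (1006 MHz GTX 680, 925 MHz HD 7970) but placeholder scores, since no real numbers are given:

```python
# Clock-for-clock comparison: normalize each score by its core clock.
# The scores below are placeholders, NOT real benchmark results.
def per_mhz(score: float, clock_mhz: float) -> float:
    """Points produced per MHz of core clock."""
    return score / clock_mhz

gtx680 = per_mhz(3312, 1006)   # hypothetical score at the 680's 1006 MHz base
hd7970 = per_mhz(2900, 925)    # hypothetical score at the 7970's 925 MHz
print(f"680 leads clock-for-clock by {(gtx680 / hd7970 - 1) * 100:.1f}%")
```

With these made-up scores the ratio works out to the quoted 5%; the real figure depends entirely on the actual scores.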
The problem is that no game will actually load up a GPU the way FurMark can, continuously. Some may spike past normal power draw from time to time, and a limiter would cause problems in those games, but there is no game that sustains what FurMark and other tests like it do.
It's an unrealistic test and sets a bad precedent. For example, a 480 will draw upwards of 270 watts in FurMark; wow, that's scary, since the 5870 will only draw 190ish. Man, that 480 is an awful card! The flip side of this is that the 480 only draws about 430 watts in Heaven, while the 5870 still draws about 190ish, yet the 480 can nearly double the performance of the 5870 with tessellation, so which one looks inefficient and awful now?
So it goes back to the problem of tests like FurMark creating unrealistic results without actually providing any meaningful data. The only thing FurMark-type programs are good for is testing your cooling solution; any other information obtained through them is worthless.
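The limiting being argued about here generally works by watching sustained board power and stepping clocks down when it stays over the cap. A simplified sketch under that assumption; this is illustrative, not any vendor's actual algorithm:

```python
# Simplified board power limiter: track a moving average of power draw
# and cut clocks while the average sits above the limit. Spiky game loads
# pass through; a FurMark-style sustained load trips the throttle.
from collections import deque

class PowerLimiter:
    def __init__(self, limit_watts: float, window: int = 8):
        self.limit = limit_watts
        self.samples = deque(maxlen=window)  # recent power readings

    def update(self, watts: float, clock_mhz: float) -> float:
        """Feed one power sample; return the (possibly throttled) clock."""
        self.samples.append(watts)
        avg = sum(self.samples) / len(self.samples)
        if avg > self.limit:
            return clock_mhz * 0.9   # drop clocks ~10% until the average falls
        return clock_mhz

limiter = PowerLimiter(limit_watts=250)
clock = 700.0
for draw in [240, 245, 270, 275, 280]:   # sustained FurMark-style ramp
    clock = limiter.update(draw, clock)
print(round(clock, 1))  # 510.3 after three throttle steps
```

A short spike in the sample window barely moves the average, which is why this style of limiter can target power-virus workloads without hurting games.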
Chiphell is not an english website. Translated, original is in cantonese. http://www.chiphell.com/thread-387232-1-1.html
Not sure if the translation is correct, but he does mention something about 3dmark11 being "super" and games needing a new driver. This line in particular:
Translation could be wrong, I dunno, but his comment about games needing a new driver is curious. Does anyone here speak/read cantonese? HE also states that availability will be the 27th or 28th.
FurMark is above average, but there are certainly games that will load a GPU more than it does, continuously.
Except it doesn't. It creates extremely realistic and accurate data, and that is what scares them. This is why they limit just those programs instead of everything.
Also I think you meant to say "480 only draws about 130 watts in Heaven" otherwise the context doesn't make sense
lol ok then! I guess that's why AMD has had them blocked in their drivers for years now too! It all makes sense, Nvidia and AMD are conspiring against the greatest program ever!!!!!!!!!!!!!!!!!!oneoneoneone1
I guess that no review site will have a valid review when the NDA lifts.
And he said that the price will be lower than the 7970's. OBR said the same for the Czech market.
It seems that nVidia blocked the driver so that there will be no real leaks until the end of the NDA.
Pretty sure AMD does not throttle FurMark, at least in my experience. When the 480s were initially released, they didn't either.
When running FurMark to check stability, my power supply's fan actually becomes audible, so yeah, I would agree it increases power consumption. It's no different than running Prime95, though; the system should be able to handle the load. It would not surprise me if some people crash when running FurMark because it pushes their PSU beyond its capability.