GF100 Previews from Digital Experience

Page 6

x3sphere

Senior member
Jul 22, 2009
722
24
81
www.exophase.com
There are also rumors floating around suggesting that Fermi runs hot, very hot. If that's the case, I wonder what the overclocking headroom on this card is like. If Fermi is 36% faster but barely gives you any room to OC on air, then that lessens the value proposition right there; as we all know, the 5850/5870 overclock VERY well thanks to ATI being conservative on power consumption and keeping voltages down.

That said, the 5870 runs rather hot when you hit 1GHz speeds. If Nvidia just went crazy on the clocks to pull ahead of ATI, then count me out. With most games being console ports these days, I'm willing to sacrifice a little performance for a cooler-running card that doesn't heat up my entire case like a furnace.
 
Last edited:

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
... but it's clear that 2 power cables are hooked to the 8-pin.

I have seen, and have owned, PSUs that have a single PCIe cable with two plugs, much like how regular Molex power lines may have plugs that daisy-chain out to support multiple devices. Almost all PSUs use the same 12V rail for both PCIe plugs. To cut costs, they just run the wires out another six to eight inches and terminate them in a second set of PCIe plugs, rather than running separate, longer cables all the way back into the PSU box.

Kinda hard to see, but here is an example of a PCIe cable that doesn't terminate but extends out.

**EDIT**
Keys has a much better picture/example which I didn't notice at first.

That in all 3 cases they used PSUs with such cables.

Your cable has no rubber sleeving around the 6+2 that goes out.
The cables in the other pictures clearly show sleeving on all of them, which would indicate that the power goes in.

That would make sense, since the initial test systems nVidia is using for demo/testing purposes are likely to be the same type of configuration. These are still demos of engineering samples, so they want them to be as problem-free as possible. It just makes sense for all the demo rigs to be 100% the same hardware.

The rubber around the cables is just cosmetic and getting to the point of nitpicking. It is the cable type and how the cable is designed that is important, and that debunks the idea that the Fermi engineering sample has 3 power cables going into it.
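
For what it's worth, here's the back-of-the-envelope power math behind the connector count (a rough sketch; the 75W slot, 75W 6-pin, and 150W 8-pin figures are the PCIe spec limits, and the little helper below is just for illustration):

```python
# PCIe spec power limits in watts: the slot itself plus each auxiliary connector.
LIMITS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    """Maximum board power by spec: slot power plus every auxiliary plug."""
    return LIMITS["slot"] + sum(LIMITS[c] for c in aux_connectors)

print(max_board_power(["8-pin", "6-pin"]))           # 300 W - what the clear pics suggest
print(max_board_power(["8-pin", "8-pin", "6-pin"]))  # 450 W - the "3 power cables" theory
```

An 8-pin plus 6-pin board tops out at 300W by spec, so a third cable would only matter if the card were meant to draw well past that.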
 
Last edited:

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, at any rate, it does look better for Fermi right now. I tell you, I am ready to spank Bob. He came over and got the PC with the Clarkdale, which isn't mine, but it also had my 5870 in it. I have never benched the 5870, and now that I see Fermi on the way I want benchies so I can do a release-driver comparison between them.

I have got to say the news from the show is really good. I really liked that AnandTech report on Tegra 2. I so want to see it go up against Imagination's SGX540.

2009 was boring as all hell. 2010 looks like it's going to rock.

I read, I thought in this thread but couldn't find it, about all these contracts that NV supposedly has with Apple and what not, and how Tegra 2 was going to kill it in the smartphone market and what not. If someone could point me to the right topic, I would very much like to post this in it. Then I want a few people from that hot debate to stand up and come forward when I find that OLD OLD topic, so I can slap some people around. Here's the link; I just need someone to put it in the topic thread where Tegra 2 is being discussed. I don't want to derail this thread, as it's just too amusing to derail.

http://www.slashgear.com/imagination-technologies-powervr-sgx545-to-be-apple-islates-gpu-0869124/
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
That in all 3 cases they used PSUs with such cables.

Your cable has no rubber sleeving around the 6+2 that goes out.
The cables in the other pictures clearly show sleeving on all of them, which would indicate that the power goes in.

And why on earth would you think my PSU is identical to the ones used in the Fermi pics?

Do I have to have rubber around my 6+2? Is that some sort of IEEE standard?

What are you fishing for, dude? Give!!! What do you think is happening that is so strange? I'm dying to know now. LOL.

I think that you think:

1. Fermi has, and needs, two 8-pin and one 6-pin PCI-e power connectors.
2. Regardless of the "clear" pics shown, you still have doubts. Did you take a look at the solder points? Can you tell me how many there are?
Do you see 16 when there are only 14?
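
(For anyone following along, I read the 16-vs-14 as a count of connector solder points on the PCB; a quick sketch of that arithmetic, with that reading as my assumption:)

```python
# Assumed interpretation: counting solder points per power connector footprint.
pins = {"6-pin": 6, "8-pin": 8}

print(pins["8-pin"] + pins["6-pin"])   # 14 solder points: one 8-pin plus one 6-pin
print(pins["8-pin"] + pins["8-pin"])   # 16 solder points: two 8-pin connectors
```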
 
Last edited:

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
5850/5870 overclock VERY well thanks to ATI being conservative on power consumption and keeping voltages down.

That said, the 5870 runs rather hot when you hit 1GHz speeds. If Nvidia just went crazy on the clocks to pull ahead of ATI, then count me out.

Both companies are going to keep their flagship cards under a relative threshold when finalizing clock speed and voltage. They have to account for chip leakage across the bin, aging, poor ventilation, dust-filled HSFs, all that sort of stuff. If they don't back it down some, they risk countless RMAs due to overheating and artifacting within months.

I bet Cypress cores hit 1000MHz in the labs all day long, but they settled on 850MHz for the 5870 because it's practical and made sense on the stock cooler with a sensible amount of voltage. 850MHz was also a mark that could be hit by the qualifying percentage of chips needed to be released under the 5870 SKU.

My guess is that the average GF100 GTX 380 chip will ship at around 650-700MHz core, but that it could hit around 850-900MHz in the lab with ease. As you said, Nvidia will probably push the final clocks as high as possible to best compete with Cypress, but they won't go overboard with it. Otherwise they risk releasing a finalized product with a high fault rate due to overheating.

Considering all the cards of recent generations, I like the 5850. That is a card you can immediately overclock right into the top 1% of video cards; it overclocks ~30% on the core on almost all samples. Another good card was the GTX 260, able to take the GT200/b from 575MHz up almost 30% as well, making it an upper-echelon product.
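
To put rough numbers on that headroom (a quick sketch; the GTX 260's 575MHz stock clock is from above, while the 5850's 725MHz stock clock and the target clocks are my assumptions for illustration):

```python
def headroom_pct(stock_mhz, target_mhz):
    """Percent gain from the stock clock to an overclocked target clock."""
    return (target_mhz - stock_mhz) / stock_mhz * 100

print(f"HD 5850: {headroom_pct(725, 940):.0f}%")   # assumed 725MHz stock -> ~940MHz, ~30%
print(f"GTX 260: {headroom_pct(575, 745):.0f}%")   # 575MHz stock -> ~745MHz, ~30% as well
# Same math applied to the GF100 guess above: ~675MHz shipping vs ~875MHz in the lab
print(f"GF100 guess: {headroom_pct(675, 875):.0f}%")
```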
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
You do realize you need to buy a monitor to use a video card, don't you?

And you do realize that when talking about 'making 3-6 monitor gaming affordable to the masses' in a conversation regarding Nvidia and AMD, we'd be talking about the parts they control, not the cost of the electricity to run three monitors, system memory, hard drives, the desk you need to hold multiple monitors, etc., don't you? The total cost of the platform certainly matters, but AMD made it possible and even affordable to game on 3 monitors with their part. Three 24" 1080p monitors, a 5850, and a Mini DisplayPort to DVI adapter look like they can be had for about $1000.

When talking about AMD's Radeon bringing gaming on 3-6 monitors first, we're not comparing the Radeon to pricey non-gaming cards, or to Matrox cards that you cannot play any modern game on or that can't run three monitors at once at higher than 1680x1050, right? That is what Nvidia looks set to follow suit with: the ability to game on more than two high-resolution monitors.
 
Last edited:

T2k

Golden Member
Feb 24, 2004
1,665
5
81
I read, I thought in this thread but couldn't find it, about all these contracts that NV supposedly has with Apple and what not, and how Tegra 2 was going to kill it in the smartphone market and what not. If someone could point me to the right topic, I would very much like to post this in it. Then I want a few people from that hot debate to stand up and come forward when I find that OLD OLD topic, so I can slap some people around. Here's the link; I just need someone to put it in the topic thread where Tegra 2 is being discussed. I don't want to derail this thread, as it's just too amusing to derail.

http://www.slashgear.com/imagination-technologies-powervr-sgx545-to-be-apple-islates-gpu-0869124/

Here, same guy: http://www.guildwarsguru.com/forum/rahjas-freeeeeeee-t10420384.html?

Dunno, he doesn't strike me as authentic at all; much more like a tool set up to spread misinformation.
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
That video was pretty slick. Real-time rendered videos are getting closer to Pixar-quality animation.

That's not a hard thing to do; Pixar movies are always puppet shows, not real-world imitations.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Quite the contrary. If the card has extra pixel-pushing power to spare, you can be sure they're going to try to leverage PhysX and other taxing, perhaps even exclusive, features MORE to sell this beast. All of which is fine with me as long as it helps improve visual quality.

Since when is breaking anti-aliasing on a competitor's card a good thing? I can understand PhysX and stuff like that, but adding too much vendor IP to a game will certainly reduce its sales, since it won't run the way it's supposed to on nearly half of PCs. That kind of attitude will kill the PC gaming market. Adding vendor-exclusive features to a game is nice, as long as it doesn't break basic functionality like anti-aliasing and basic game effects.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Since when is breaking anti-aliasing on a competitor's card a good thing? I can understand PhysX and stuff like that, but adding too much vendor IP to a game will certainly reduce its sales, since it won't run the way it's supposed to on nearly half of PCs. That kind of attitude will kill the PC gaming market. Adding vendor-exclusive features to a game is nice, as long as it doesn't break basic functionality like anti-aliasing and basic game effects.

:rolleyes:

Batman: AA was one of the best-selling games last year and even a candidate for GotY.

Probably thanks to all the work NVIDIA put into it.

AA is not basic functionality in a UE3 game.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
:rolleyes:

Batman: AA was one of the best-selling games last year and even a candidate for GotY.

Probably thanks to all the work NVIDIA put into it.

AA is not basic functionality in a UE3 game.

DiRT 2 was one of the best games of last year and got best racing game from many places (racing games never get GotY, not even Gran Turismo, judging by the selection given on the Wikipedia GotY page).

Probably all thanks to the work AMD put into it.

DX11 is not basic functionality in an EGO game.


Wait wait wait. Oh yeah! Consoles don't use DX11.
But then, they don't use hardware PhysX either... and the Xbox certainly doesn't care about NV's AA methods.
Oh, and DiRT 2 on PC got "best graphics" from VGChartz, with the Readers' Choice being... Batman: AA.

Point being: Batman: AA wasn't a great game because of anything NV did; it was a great game because it was a great game.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
:rolleyes:

Batman: AA was one of the best-selling games last year and even a candidate for GotY.

Probably thanks to all the work NVIDIA put into it.

AA is not basic functionality in a UE3 game.

Tell that to Mirror's Edge, which uses the UE3 engine and has an anti-aliasing option in the control panel. ;)

The game is a best seller not because of nVidia; it's because the game is indeed good. I enjoyed the PhysX effects a bit, but they're still glitchy like all available PhysX implementations. What nVidia did was just put in IP crap to break ATi's anti-aliasing implementation, which worked fine when I changed my DeviceID :p, thank you!! Try again.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
How is production of GF100 confirmed? Because Jen-Hsun said so? It's the same guy who showed a card held together with wood screws and said "this puppy is Fermi"...

You think he is lying? The SEC would make an example of him, along with all the lawsuits shareholders would bring, for giving false production information.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Tell that to Mirror's Edge, which uses the UE3 engine and has an anti-aliasing option in the control panel. ;)

The game is a best seller not because of nVidia; it's because the game is indeed good. I enjoyed the PhysX effects a bit, but they're still glitchy like all available PhysX implementations. What nVidia did was just put in IP crap to break ATi's anti-aliasing implementation, which worked fine when I changed my DeviceID :p, thank you!! Try again.

Um, isn't the story that the dev didn't develop an AA routine, but Nvidia wrote the code for the AA implementation? The dev put the device check in to stop ATI cards from using Nvidia-written code, most likely to cut down on QA. Without Nvidia you wouldn't even be able to hack your way into using AA in that title.
 

at80eighty

Senior member
Jun 28, 2004
458
5
81
Um, isn't the story that the dev didn't develop an AA routine, but Nvidia wrote the code for the AA implementation? The dev put the device check in to stop ATI cards from using Nvidia-written code, most likely to cut down on QA. Without Nvidia you wouldn't even be able to hack your way into using AA in that title.

Apparently not.

http://forums.anandtech.com/showthread.php?t=2039124

Read the linked article; one just needs to change the Vendor_ID and it works, apparently.

Hey evolucion8, how DO you spoof the Vendor_ID anyway? Apparently my google-fu sucks.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
How will you live up to your words? Just out of curiosity.

GTX 280's TDP is 236W, BTW.


It's on a 40nm process and thus more energy efficient; add two and a half years of development time and I'd say it'll be around 230W under load.

I suspect it will also be very energy efficient at idle, with all the advances going around.

Anyway, theoretically, and still dependent on final clocks, it should end up about 75% faster than the GTX 285; you do the math on how much of a difference that is versus the 5870.
 

Schmide

Diamond Member
Mar 7, 2002
5,745
1,036
126
Anyway, theoretically, and still dependent on final clocks, it should end up about 75% faster than the GTX 285; you do the math on how much of a difference that is versus the 5870.

By Vantage numbers:

The GTX 285 scores 6341, and 6341 * 1.75 = 11097.

Since the 5870 scores 8028:

(11097 - 8028) / 8028 = 38% faster
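
Or, the same estimate in a couple of lines of Python, using the Vantage scores quoted above:

```python
gtx285 = 6341               # Vantage score quoted above
hd5870 = 8028               # Vantage score quoted above
gf100_est = gtx285 * 1.75   # the claimed "75% faster than GTX 285"

lead = (gf100_est - hd5870) / hd5870 * 100
print(f"Estimated GF100 score: {gf100_est:.0f}")  # ~11097
print(f"Lead over the 5870: {lead:.0f}%")         # ~38%
```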
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
It's on a 40nm process and thus more energy efficient; add two and a half years of development time and I'd say it'll be around 230W under load.

I suspect it will also be very energy efficient at idle, with all the advances going around.

Anyway, theoretically, and still dependent on final clocks, it should end up about 75% faster than the GTX 285; you do the math on how much of a difference that is versus the 5870.

It doesn't matter what fabrication process it uses, 40nm or not; it depends on the actual die size and architectural efficiency. And the die is big, so it will produce a good amount of heat compared to other cards.

But all this is pointless: as long as it runs stable and lasts, who gives a fuck if it runs hot. And obviously Nvidia isn't going to release a GeForce card whose TDP can't be handled by at least some consumer PSUs.
 
Last edited:

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
http://www.geeks3d.com/20100108/ces-2010-nvidia-gf100-fermi-tri-sli-and-physx-rocket-sled-demo/

More awesome video of the Supersonic Sled demo. The Nvidia rep explains how the demo works in good detail. Most of the objects are PhysX objects that make the sled actually move and interact with the scene. Pretty cool.

There's also what look to be FPS numbers near the bottom. I wonder if someone managed to take a glance at them to see how fast it was running.

Nice catch.
It all looks great, but if Nvidia misses their tight window of opportunity, ATI might be ready and waiting for them with a refresh of the 5800 series.