GTX 480 Unigine and 3D Vision Surround Demo (GF100)

TJ Tom

Member
Feb 1, 2010
26
0
0
Cool video, finally something from nVidia!

Based on the graph in the video you can see the GTX 480 will compete with an OCed HD 5870 rather than an HD 5970, like a lot of fanboys said :) The GTX 480 does seem to pwn the HD 5870 in tessellation, though.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Cool video, finally something from nVidia!

Based on the graph in the video you can see the GTX 480 will compete with an OCed HD 5870 rather than an HD 5970, like a lot of fanboys said :) The GTX 480 does seem to pwn the HD 5870 in tessellation, though.

The assumed advantage in tessellation really appears to be only because the Fermi card can throw the rest of its shaders at it, whereas ATI has dedicated hardware.

I would be shocked and floored if this advantage holds in a real game, where the GPU is doing many other tasks. I've been surprised before, though.
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Tessellation is just a marketing gimmick to me. I think ATI will beat Nvidia at it hands down with its next-gen cards, and in the meantime the 5870 still very much dominates, I think, in real-world performance.

Thanks for the link. The 3D Vision Surround technology is once again a copy of ATI's Eyefinity technology, but honestly, how many users are going to buy those shades to play a game? It won't work.

Nvidia should have been able to deliver a very powerful GPU but they couldn't, and instead focused on things like PhysX and tessellation to just add more 'junk in the trunk' to the card. It's mostly marketing, and I can't wait to see what ATI has in store. (ATI's engineers are probably LOLing at this right now, smiling and knowing who the leader is :) )
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
The assumed advantage in tessellation really appears to be only because the Fermi card can throw the rest of its shaders at it, whereas ATI has dedicated hardware.

I would be shocked and floored if this advantage holds in a real game, where the GPU is doing many other tasks. I've been surprised before, though.

What you describe sounds an awful lot like when they both went to unified shaders: maximize performance and make sure that, e.g., the vertex pipeline didn't stall the pixel pipeline, so you got the most from your hardware at all times.

Sounds like NVIDIA kept this philosophy, while AMD took a step back... but only time will tell.
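To put that unified-vs-dedicated argument in concrete terms, here's a toy model. All the numbers are made up for illustration (they are not real GF100 or Cypress specs); it just shows why a unified pool wastes fewer cycles when the workload mix shifts:

```python
# Toy model: fixed split of shader resources between vertex and pixel
# work vs. one unified pool that load-balances. Work is in arbitrary
# "units of work per frame"; none of these numbers are real GPU specs.

def frame_time_fixed(vertex_work, pixel_work, vertex_units, pixel_units):
    # Each stage runs only on its own dedicated hardware, so the frame
    # finishes when the slower (bottlenecked) stage finishes.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def frame_time_unified(vertex_work, pixel_work, total_units):
    # A unified pool can assign every unit to whatever work remains,
    # so only the total amount of work matters.
    return (vertex_work + pixel_work) / total_units

# A vertex-heavy frame on 16 dedicated vertex + 48 dedicated pixel
# units vs. the same 64 units in one unified pool:
fixed = frame_time_fixed(32, 32, 16, 48)    # 2.0 - pixel units sit idle
unified = frame_time_unified(32, 32, 64)    # 1.0 - nothing idles
print(fixed, unified)
```

The dedicated-hardware approach only wins when the workload mix matches the hardware split, which is exactly the stall scenario described above.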
 

TJ Tom

Member
Feb 1, 2010
26
0
0
Ain't it a little soon for AMD's next-generation cards? The HD 5xxx series isn't even available in decent quantities here in Europe.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Well, the full benchmark shows what many people were fearing and scratching their heads about: when tessellation isn't being heavily utilized, the 5870 and GTX 480 are really, really close performance-wise.

Personally I'm disappointed with this. I *hope* the underlying technology in Fermi is sound and that it just needs to be tweaked some more and respun (much like going from the FX 5800 to the FX 5900) to realize its full potential; otherwise I think Nvidia's diverging strategy (HPC) is going to dramatically slow down competition in the consumer GPU space. I wanted a compelling reason to upgrade my GTX 260 216, but I don't think there is one right now. Here's hoping the fall refreshes from both companies create better competition in the consumer space.
 
Last edited:

waffleironhead

Diamond Member
Aug 10, 2005
7,122
622
136
I'm interested in a comparison of the 480 and 470 to see what the loss of SMs does to performance.
 

Apocalypse23

Golden Member
Jul 14, 2003
1,467
1
0
Well, the full benchmark shows what many people were fearing and scratching their heads about: when tessellation isn't being heavily utilized, the 5870 and GTX 480 are really, really close performance-wise.

Personally I'm disappointed with this. I *hope* the underlying technology in Fermi is sound and that it just needs to be tweaked some more and respun (much like going from the FX 5800 to the FX 5900) to realize its full potential; otherwise I think Nvidia's diverging strategy (HPC) is going to dramatically slow down competition in the consumer GPU space. I wanted a compelling reason to upgrade my GTX 260 216, but I don't think there is one right now. Here's hoping the fall refreshes from both companies create better competition in the consumer space.

Agreed. I also think that 3D gaming in general has always been about Direct3D, the core of graphics technology, and that's the way it should be. Tessellation and incorporating PhysX simply won't drive users away from ATI products. Some of the reasons include the lack of DX11 titles, the lack of titles with a considerable amount of PhysX and tessellation, and the needless pressure on developers to incorporate these new technologies. I'm sure developers worldwide are already having a tough time keeping up with DX11 technology alone.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Well, the full benchmark shows what many people were fearing and scratching their heads about: when tessellation isn't being heavily utilized, the 5870 and GTX 480 are really, really close performance-wise.

Personally I'm disappointed with this. I *hope* the underlying technology in Fermi is sound and that it just needs to be tweaked some more and respun (much like going from the FX 5800 to the FX 5900) to realize its full potential; otherwise I think Nvidia's diverging strategy (HPC) is going to dramatically slow down competition in the consumer GPU space. I wanted a compelling reason to upgrade my GTX 260 216, but I don't think there is one right now. Here's hoping the fall refreshes from both companies create better competition in the consumer space.

So you are saying an 85-100% bump in speed (from your GTX 260) is no reason to upgrade?
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
So you are saying an 85-100% bump in speed (from your GTX 260) is no reason to upgrade?

Well, for anyone playing at 1080p or below (and many do), it isn't.

Most folks already get the performance they need out of last-gen tech. The only real reason to upgrade would be to gain access to more features, and the only one that really stands out this generation is DX11... which it seems won't be useful for a while, and even when it is, it won't be very playable on this latest gen.
 

ScorcherDarkly

Senior member
Aug 7, 2009
450
0
0
You can only get that kind of improvement by spending well over $300. Is it worth it then?

Worth is entirely dependent on how happy you are with your current situation and/or how much money you have to burn. Obviously that will differ from person to person.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
It was already linked somewhere, but this heise article should be rather interesting: http://www.heise.de/newsticker/meld...ung-der-GeForce-GTX-470-enthuellt-946411.html

For the minority ;) who can't read German fluently:
Unigine 4x AA (fps):
470: 29
5870: 27
5850: 22

Unigine 8x AA (fps):
470: 20
5870: 23
5850: 19


Though they could always tweak the memory and such a little; at the moment they're using 1255 MHz for the shader cores and 1.6 GHz GDDR5 (read-write clock).
They mention that the clock is slower than the ATI counterpart's, but AFAIK ATI uses a 1.2 GHz clock, though I'm not sure what "read-write clock" means. If they're talking about the actual data rate, 1.6 GHz sounds way, way too low, so I'm confused...
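For what it's worth, in GDDR5 the write clock (WCK, presumably heise's "Read-Write-Clock") runs at twice the command clock, and data moves on both WCK edges, so the effective per-pin rate is 2x WCK, or 4x the command clock. A quick sketch with heise's 1.6 GHz figure; the GTX 470's 320-bit bus width is my assumption, not from the article:

```python
# GDDR5 clock arithmetic. WCK runs at 2x the command clock and data is
# transferred on both edges of WCK, so:
#   effective data rate (Gbps/pin) = 2 * WCK = 4 * command clock

def gddr5_data_rate_gbps(wck_ghz):
    # Data on both edges of the write clock -> double data rate.
    return 2 * wck_ghz

def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # Total bandwidth = per-pin rate times bus width in bytes.
    return data_rate_gbps * bus_width_bits / 8

# heise's 1.6 GHz "Read-Write-Clock" on an assumed 320-bit GTX 470 bus:
print(gddr5_data_rate_gbps(1.6))     # 3.2 Gbps per pin
print(bandwidth_gb_s(3.2, 320))      # 128.0 GB/s

# vs. the HD 5870: 1.2 GHz command clock -> 4.8 Gbps on a 256-bit bus:
print(bandwidth_gb_s(4 * 1.2, 256))  # 153.6 GB/s
```

So ATI's "1.2 GHz" is the command clock, and 1.6 GHz isn't the data rate itself: as a WCK figure it works out to 3.2 Gbps per pin.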
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
So you are saying an 85-100% bump in speed (from your GTX 260) is no reason to upgrade?

According to Anand's reviews of the games I play most often right now, the biggest performance delta between the 5870 and the GTX 260 216 at 1920x1200 is about 45% (Crysis Warhead). It's definitely a big jump in performance over my GTX 260 216, but I don't think it's worth the $390+ price of admission, and I don't think Nvidia's cards are going to drive prices down anytime soon. Plus, I'd be losing PhysX in Batman (playing it right now), and I plan on getting Metro 2033 at some point this year. So the cost of the performance improvement, plus losing PhysX in the few games I use it for, isn't worth it to me today.

EDIT: My math was different from yours. I was looking at the performance difference from the standpoint of the 5870; from the perspective of my card, the difference is 68% for Crysis Warhead. It's definitely a BIG jump, but since I don't game at 2560x1600, and turning off AA gives my GTX 260 nearly the same scores as the 5870 with 4x AA on, paying $400 to get AA for free isn't worth it to me at this time.
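Just to show where the two different percentages come from: the same pair of scores gives different numbers depending on which card you treat as the baseline. Quick sketch with made-up fps figures (not Anand's actual results):

```python
# The same two results read as different percentages depending on which
# card is the baseline. The fps numbers here are illustrative only.

def percent_faster(new, old):
    # How much faster (or, if negative, slower) 'new' is relative to 'old'.
    return (new - old) / old * 100

gtx260_fps, hd5870_fps = 25.0, 42.0   # made-up Warhead numbers

print(percent_faster(hd5870_fps, gtx260_fps))  # 68.0: "5870 is 68% faster"
print(percent_faster(gtx260_fps, hd5870_fps))  # ~-40.5: "260 is ~40% slower"
```

Dividing by the slower card always produces the bigger-looking percentage, which is why review-quote math so often disagrees.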
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
You can only get that kind of improvement by spending well over $300. Is it worth it then?

Ask the people who bought a 5850/70. :rolleyes:

I agree, it's not really necessary to buy a high-end card unless you need absolute max settings at 1920x1080 or higher in a few choice games.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
According to Anand's reviews of the games I play most often right now, the biggest performance delta between the 5870 and the GTX 260 216 at 1920x1200 is about 45% (Crysis Warhead). It's definitely a big jump in performance over my GTX 260 216, but I don't think it's worth the $390+ price of admission, and I don't think Nvidia's cards are going to drive prices down anytime soon. Plus, I'd be losing PhysX in Batman (playing it right now), and I plan on getting Metro 2033 at some point this year. So the cost of the performance improvement, plus losing PhysX in the few games I use it for, isn't worth it to me today.

EDIT: My math was different from yours. I was looking at the performance difference from the standpoint of the 5870; from the perspective of my card, the difference is 68% for Crysis Warhead. It's definitely a BIG jump, but since I don't game at 2560x1600, and turning off AA gives my GTX 260 nearly the same scores as the 5870 with 4x AA on, paying $400 to get AA for free isn't worth it to me at this time.

So what you're really saying is that there's no reason for you to upgrade to any next-generation card, because the software (games) hasn't outgrown your GTX 260 at your current resolution?

That I could buy.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Well, for anyone playing at 1080p or below (and many do), it isn't.

Most folks already get the performance they need out of last-gen tech. The only real reason to upgrade would be to gain access to more features, and the only one that really stands out this generation is DX11... which it seems won't be useful for a while, and even when it is, it won't be very playable on this latest gen.

So the 2 or 3 million people (last I heard) who bought a 5xxx series card are just buying for e-peen reasons unless they have a 30-inch monitor? Or is it the $3 a month they'll save on their electric bill? :D

So maybe that's the reason Nvidia doesn't seem to be in any rush to get their new cards out. :thumbsdown:
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
So the 2 or 3 million people (last I heard) who bought a 5xxx series card are just buying for e-peen reasons unless they have a 30-inch monitor? Or is it the $3 a month they'll save on their electric bill? :D

So maybe that's the reason Nvidia doesn't seem to be in any rush to get their new cards out. :thumbsdown:
I am in no way commenting on what people should or should not buy, just that it's entirely valid not to want to dive into DX11 based on what we've seen so far.

It's entirely likely that the majority of nice-looking DX11 games of tomorrow will require the highest-end systems of this generation, and that's not something the average person cares to buy.
 

Wag

Diamond Member
Jul 21, 2000
8,288
8
81
Wait... is there a 1080p version of this video on YouTube? Why post a 1080p demo in 720p?

Also - we finally got a good look at the card - how big is it? Is it the same size as the GTX 295?
 
Last edited:

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Thanks for the link. The 3D Vision Surround technology is once again a copy of ATI's Eyefinity technology

nV has supported the same tech as Eyefinity on its Quadro line for years, and Matrox supported it before either of them (although I'm not sure whether Matrox's setup works as a simulation of one giant screen the way the Quadro solution does). If anything, ATI was copying the others when it comes to Eyefinity. Personally, I'll take a single 46" display loaded up with AA over any of the bezel-filled solutions.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
The assumed advantage in tessellation really appears to be only because the Fermi card can throw the rest of its shaders at it, whereas ATI has dedicated hardware.

I would be shocked and floored if this advantage holds in a real game, where the GPU is doing many other tasks. I've been surprised before, though.

Yeah, I want to see benchmarks in real games, where other things are going on and the card has to use its shaders for other tasks too, before I say anything.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
So the 2 or 3 million people (last I heard) who bought a 5xxx series card are just buying for e-peen reasons unless they have a 30-inch monitor? Or is it the $3 a month they'll save on their electric bill? :D

So maybe that's the reason Nvidia doesn't seem to be in any rush to get their new cards out. :thumbsdown:

That assumes everyone who bought a 5xxx series card already had something like a GTX 260 or an equivalent card.

Many people don't upgrade every single generation - skipping one is quite common - and you have to consider new builds as well.

Most modern games are quite playable on a last-gen GPU.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
So what you're really saying is that there's no reason for you to upgrade to any next-generation card, because the software (games) hasn't outgrown your GTX 260 at your current resolution?

That I could buy.

Yes, I guess that's what I'm saying.

If DX11 were more prevalent right now (say, half of all PC releases having enhanced DX11 features), that would be just as influential to me as the performance increase and, in my mind, would justify upgrading. Maybe by this fall or winter 2010/2011.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Most modern games are quite playable on a last-gen GPU.

Playable, sure. NV recently introduced its mainstream performance card, the GT 240, which is quite a bit slower than the 8800 GT many gamers have been packing since late 2007/early 2008. That 8800 GT still plays all modern games at lower resolutions, with no AA, or at reduced settings.

So you could say most modern games are quite playable on a mainstream GPU from two generations ago.

But playable and awesome are two different things. NV and ATI have both convinced me to be fine with simply "playable" until fall 2010 (I'm the type to buy high end and skip a generation), but that may not be the case for everyone.