Nvidia to AMD: Bring On Antilles


ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
It sure sounds like this will be the closest AMD and Nvidia have been at the high end since the X800/6800 GT. Fun times.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Sorry, but I'd sooner believe the stock tip a plastered bum gave me at 7:00 this morning than this "article."
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Quite aside from that, why would *anyone* care how 'big and hot' it was if it walked the walk? Surely that was the argument from a certain side of the game ever since Fermi popped out? ;)
To nVidia people it won't matter at all, except that they may wonder why it's all of a sudden acceptable to the other side.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
To nVidia people it won't matter at all, except that they may wonder why it's all of a sudden acceptable to the other side.

Nobody minds excess power consumption if there's excess performance to match. Doesn't matter which side you prefer; the fact is the 480 was horribly inefficient, and it rightfully got flamed for it. NVIDIA fixed those problems with the 580, and you now have a part that, while still big on power consumption, offers substantial performance gains that justify the power figures.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
Delay it so you can lose sales, especially around Xmas? Dumb idea. Setting clocks is kinda pointless since end consumers can OC it themselves, which is what Nvidia would be doing anyway. It probably just isn't ready.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Please list for us just how AMD matches Nvidia in features, but only if they add single-card surround to their feature set. I'd love to hear this.

First rule about features - ask yourself "so what?".

Features without benefits are useless.

One of my features is I weigh 196 pounds. "So what?".
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
First rule about features - ask yourself "so what?".

Features without benefits are useless.

One of my features is I weigh 196 pounds. "So what?".

That sounds like a personal thing, though. Not all features are relevant to everyone.

Applying your specific needs to the whole market isn't the way to look at things.


Someone might think 196 pounds of Phynaz glory is sexy as hell, someone else might want something different, others might be indifferent.
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
Perhaps more importantly, have we decided how we define 'big and hot'? Maybe Fermi in its initial forms, or the legendary FX 5800? ;)

The NV30/5800 Ultra is neither hot nor big compared to current GPUs. It was noisy because the cooling wasn't as advanced or refined back then. It was also the first dual-slot video card, which was quite controversial in 2003 but is the norm these days. It didn't use that much power either, requiring only a single Molex connector on top of the little power the AGP slot could deliver.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Don't you want voices from different points of view?
I want to hear your point of view! What are your thoughts on AMD vs. Nvidia drivers/features? Also important: how well does each card handle soap water?
 

Ares1214

Senior member
Sep 12, 2010
268
0
0
As far as I know, AMD has more "features". That's counting CPUs, of course, which he must be talking about :hmm: Otherwise, in the graphics department:

CUDA: ...APP? Win Nvidia
PhysX: ...nothing. Win Nvidia
Surround (SLI needed): Eyefinity (Crossfire NOT needed). Win AMD

As far as major features go, Nvidia has that one, with better drivers to boot. Until recently, AMD clearly had the better, or at least more efficient, cards; however, that's starting to change. I don't use CUDA, don't like PhysX, and hate Eyefinity, so all of them are non-factors to me anyway.
 

Jdawg84

Senior member
Feb 6, 2010
256
0
0
As far as I know, AMD has more "features". That's counting CPUs, of course, which he must be talking about :hmm: Otherwise, in the graphics department:

CUDA: ...APP? Win Nvidia
PhysX: ...nothing. Win Nvidia
Surround (SLI needed): Eyefinity (Crossfire NOT needed). Win AMD

As far as major features go, Nvidia has that one, with better drivers to boot. Until recently, AMD clearly had the better, or at least more efficient, cards; however, that's starting to change. I don't use CUDA, don't like PhysX, and hate Eyefinity, so all of them are non-factors to me anyway.

IMO, while nice, Eyefinity is a complete waste of money. I'm curious how many users will actually go out and drop 900 dollars on a tri-monitor setup.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
IMO, while nice, Eyefinity is a complete waste of money. I'm curious how many users will actually go out and drop 900 dollars on a tri-monitor setup.

And then narrow it down even further to how many of those with 900 dollars' worth of monitors use only one card to drive them.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
And then narrow it down even further to how many of those with 900 dollars' worth of monitors use only one card to drive them.

Raises hand...

I already have 2 monitors. Going to get a 3rd one after I get my 6850. Yes, I do prefer turning down settings to run on 3 monitors (I also play racing games a lot, which is why it's worth it to me).

Not arguing who has more features, just saying.

EDIT: They aren't $300 monitors. Well, not over there they wouldn't be.
 

Jdawg84

Senior member
Feb 6, 2010
256
0
0
I'd love to see what BC2 looks like on 3 of my 27.5" LCDs. Would be AWESOME.

Problem is cost, as my monitors run over 300 dollars each.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
[Image: 1681511.jpg]

[Image: 235841r85wowur3l3lnn5r.jpg.thumb.jpg]

[Image: asusmars2large_thumb.jpg]

[Image: NVIDIA-Dual-GTX-470-at-Computex-2.jpg]
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Top two are the front/back of a reference 3-display dual-GF110 card (580s or 570s, or undervolted/underclocked 580s); the 3rd is Asus' prototype dual-GF100 375s; the 4th is a dual GF100-275 (470s), from Gigabyte maybe?