New 6950 with only 1GB of memory coming, cheaper! (FZ)


Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Maybe I worded it wrong. I said:
"go out and spend $850 and buy 2 extra monitors, a DisplayPort adapter, a $300 card for those framerates."

That's 2 monitors, a DisplayPort adapter and a $300 6950 2GB... for $850.
My point was you should buy two 2GB 6950s and get some real framerates, not spend $850 on the above items to play some games at barely 30fps with Eyefinity and one 6950 2GB.

Better? :)

Barely 30fps?
I see 50+ for Far Cry 2 and ~38 for Dirt 2.
And that 50+ for Far Cry is Ultra with 4xAA.
The "barely 30" in F1 2010 is, like people have said, with 4xAA.
Dirt 2, again max settings.

If a single card can manage 2 of these 3 games at 5760x1080 at above 30fps average on maximum settings with AA, then surely a single card is actually enough.
Also 3 monitors can be used for more than just gaming, while 2 GPUs typically can't, whether AMD or NV. I've been using 3 monitors in non-gaming contexts for the last 3+ years quite happily.
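For scale: 5760x1080 is exactly three 1920x1080 panels side by side, so the card is pushing 3x the pixels of a single 1080p screen every frame. A quick Python sanity check:

Code:
single = 1920 * 1080    # 2,073,600 pixels on one 1080p panel
triple = 5760 * 1080    # 6,220,800 pixels across the Eyefinity group
print(triple / single)  # 3.0 -- exactly triple the pixels to fill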
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Again, people use weaker cards to run multiple monitors; there's nothing wrong with running Eyefinity on an HD 6950...

I see your point but....

Do you see someone spending money on 3 matching monitors, a DisplayPort adapter and a single 6950 2GB just to get barely-OK performance?
Remember, these are enthusiasts spending a lot of money to get a better/different experience.

Why half-step on your video cards and settle for just-OK performance at lower details? After spending hundreds of dollars on Eyefinity monitors and a DisplayPort adapter, why then buy a card that might only just get the job done?

Eyefinity/Surround = Crossfire/SLI

(the next paragraph is my opinion, no flaming allowed :))
I'll tell you the ultimate bang-for-buck surround setup: go buy 2 GTX 460 2GB cards ($440 on Newegg) and overclock them. That's the way it's meant to be played, sorry for the pun. That will smoke any single card out there.
If AMD made 2GB 6850s, I'd recommend those too, but I don't think they do.

See my point?
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Eyefinity and NV Surround, as cool as they may be to see videos of and hear about, are simply and completely impractical IMO. I'd rather (and I think I speak for the majority) have one big screen than 3 independent smaller screens with bezels. The extra power and display cords, let alone the space, that 3 PC monitors need are an obstacle in themselves, and bezels just add to it. Even if bezels could be measured in low single-digit millimeters, that would only remove one of the issues.

And as for people saying 1 card can drive multiple monitors adequately in a number of games: that's generally true mostly of older games, and people usually don't spend $300+ upgrading their GPUs just so they can play an older game. They do it for current/upcoming games.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
And as for people saying 1 card can drive multiple monitors adequately in a number of games: that's generally true mostly of older games, and people usually don't spend $300+ upgrading their GPUs just so they can play an older game. They do it for current/upcoming games.

Well, once again, you said it better than me. :)
Thanks.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Multi-monitor makes just about everything computer-related better, be it gaming, web surfing, etc. The DisplayPort adapter is like $30; it's hardly what I would call an "obstacle", and I don't get why it's being used as a decision factor here. Neither are the bezels, as no alternative will offer the same FOV that 3 displays can (a giant screen/projector is just that: a blown-up image of the same single-display resolution).
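If you want to put a number on that FOV point, here's a rough Python sketch. It assumes three flat panels at the same viewing distance and a Hor+ game (the image plane simply gets three times wider); real setups usually angle the side panels, which changes the numbers:

Code:
import math

def triple_wide_hfov(single_hfov_deg):
    # Widen the image plane 3x at the same viewing distance (flat Hor+ scaling).
    half = math.radians(single_hfov_deg / 2)
    return math.degrees(2 * math.atan(3 * math.tan(half)))

print(round(triple_wide_hfov(90)))  # a 90-degree horizontal FOV becomes ~143 degrees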
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The DisplayPort adapter is like $30, it's hardly what I would call an "obstacle"

Who said that? We were talking about single cards, memory amounts and their performance with Surround/Eyefinity vs. dual cards.
Feel free to jump in.:)
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
My 6950 is working out great so far for my triple-screen setup. I almost started to feel an itch for a 4th U2410, since I know the card can support it.

Right now I'm using 2 mini-DP -> DP cables and one DVI cable for the three.



Also, Hap, you're the one who brought up the adapter right on this very page.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Eyefinity and NV Surround, as cool as they may be to see videos of and hear about, are simply and completely impractical IMO. I'd rather (and I think I speak for the majority) have one big screen than 3 independent smaller screens with bezels. The extra power and display cords, let alone the space, that 3 PC monitors need are an obstacle in themselves, and bezels just add to it. Even if bezels could be measured in low single-digit millimeters, that would only remove one of the issues.

And as for people saying 1 card can drive multiple monitors adequately in a number of games: that's generally true mostly of older games, and people usually don't spend $300+ upgrading their GPUs just so they can play an older game. They do it for current/upcoming games.

I have no idea why you feel so strongly about something I've never heard you say you have long-term experience with. I felt cheated at first when I went Eyefinity, but it's proven its worth over time for multitasking. As an example, you can get a 6950 and steamroll any modern game on the center monitor--there is NOTHING forcing someone to use all three monitors for gaming; they can crank up the fps by going center-monitor-only for games that run better that way. You then have the option of going Eyefinity if you so desire for less-demanding games, and for multitasking such as browsing in one window, writing a document in another, and watching TV in the third. By the way, Fallout: New Vegas is not an old game; it just came out, and even a NON-overclocked 6850 can drive it at 5040x1050. And as long as current-gen consoles rule the roost, don't hold your breath waiting for hugely demanding PC video games to come down the pike.

The desk space and extra power draw are potential issues for some, though the power-draw argument is shaky: once you get the hang of it, you are SO much more productive multitasking on three monitors that you finish the same tasks in less time. Less time = you can turn off your computer sooner = less power draw than you might expect. Furthermore, Eyefinity has hotkeys that let you switch between display configurations within seconds. I am typing this on a single center monitor right now to save power, but within seconds I could switch to the tri-monitor view for multitasking. Lastly, those 30" LCD monitors draw a ton of power too, and they have no magic switch to cut usage to 1/3 the way Eyefinity does, where I can drop to center-monitor-only with a hotkey stroke for times when I don't need all that screen space.

I find the complaint about two extra power/display cords out the back completely irrelevant to most people. That's really stretching.

Bezels (esp. with bezel correction) are not an issue for long-term users, any more than the A-pillars on your car "obstruct" your view. Your mind ignores bezels after a while, sort of like how your mind--not your eyes--edits out the "blind spot" where the optic nerve connects to the retina.

Lastly, I already talked about 1 huge screen vs. 3 smaller screens: 1 huge screen is better in some cases, like Photoshop, but 3 screens are hard to beat for extended FOV in games and for multitasking. (Ideally you would have 3 huge screens and get the best of both worlds.) The reason Blizzard banned anything but PPP-Eyefinity was that the extra FOV was deemed a competitive advantage and they didn't want that. Thankfully more forward-looking companies allow it. WoW, LOTRO, TF2, you name it: those extra degrees of FOV would be hard or impractical to replicate on a single huge screen. What are you going to do, hack a game so it only uses the middle 1/3 of your screen to match the FOV you'd get with tri-monitors? Good luck. In the meantime, I can play TF2 and hardly ever get backstabbed thanks to my huge FOV and surround headset. (I used to be hard to backstab because of my paranoid twirling around every several seconds, but now it's even harder.)

Anyway, single-GPU tri-monitor is a nice bonus but doesn't make or break most purchases--sort of like PhysX, but better, since PhysX is really only usable in games right now, whereas Eyefinity/Surround is useful outside of games too.

Since when was MORE choice and MORE options a BAD thing?

Try it (for an extended period of time) before you knock it, k?

P.S. For those debating cost, it really depends. My total cost of going Eyefinity--buying 3 monitors for ~$360, selling one monitor for $270, buying a DP->VGA adapter, and going from no video card to a new 6850--was about $290. Had I waited a while, I could have gotten some 23" IPS Dells for not THAT much more. As it is, I will probably shell out a few hundred more (net) to upgrade to 3x 1080p IPS LED later this year. That's not unreasonable considering a single GTX 480 would have cost $500 and then started its inevitable depreciation; monitors don't depreciate the way GPUs do. I feel it's better to invest in three decent monitors and a good midrange/upper-midrange GPU that can drive them in older games (and kick ass on the center monitor alone in the heavy games) than to keep pouring money down the drain on ultra-high-end GPUs, which have horrible bang for the buck in the first place and then plummet in value. The monitors will last years and give great multitasking ability and FOV in many games, plus great performance if forced into single-monitor gaming, and upgrading a midrange GPU every year or so hurts financially less than doing the same with an ultra-high-end GPU. But that's just my opinion. Hey, it's your money; do what you want with it.
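If anyone wants to check my math, here's the arithmetic (the adapter is the ~$30 figure mentioned earlier in the thread; the 6850 price is an approximation):

Code:
monitors   = 360   # three monitors, as stated
sold_old   = -270  # the monitor sold off
dp_adapter = 30    # DP->VGA adapter, using the ~$30 figure from earlier in the thread
hd6850     = 170   # approximate street price of a 6850 at the time
print(monitors + sold_old + dp_adapter + hd6850)  # 290 -- the ~$290 total above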
 
Last edited:

nyker96

Diamond Member
Apr 19, 2005
5,630
2
81
I think it's a smart move by amd. 6950 1gb is more confusing than just calling it a 6930, though I'm sure that they are saving 6930 for harvested gpus.

I have to disagree; I think it's a terrible combination. Anyone running such a powerful GPU will be doing so at very high res with a ton of AA/AF on, possibly on 3 monitors. At that point 1GB becomes the limiting factor, bottlenecking the GPU's performance. The parts need to be matched: a fast GPU needs enough memory to run at the intended res. Cutting the memory might make the card cheaper, but anyone buying it and running at high res, multi-monitor or (god forbid) Crossfire will find out soon enough that it's money not well saved.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I have to disagree, I think it's a terrible combination. anyone running such a powerful gpu

This is the part people don't understand: the 6950 IS NOT THAT FAST.
It's only 5% faster than a 5870 1GB. So, just like a 5870, it will not use more than 1GB of video RAM EXCEPT at 2560x1600 and Eyefinity.

The 6950 1GB is a great idea for 1920x1080!!!
If you have a 2560x1600 monitor or Eyefinity, get a 2GB version, or get two 2GB versions.
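A very rough sketch of why (Python; the assumptions--RGBA8 color plus 24/8 depth/stencil per sample, MSAA multiplying both, three buffered render targets--are simplifications, and textures come on top of this):

Code:
def framebuffer_mb(width, height, msaa=1, frames=3):
    # Color (4 bytes) + depth/stencil (4 bytes) per sample; MSAA multiplies samples.
    # 'frames' roughly models double/triple-buffered render targets.
    bytes_per_pixel = (4 + 4) * msaa
    return width * height * bytes_per_pixel * frames / 1024**2

for name, w, h in [("1920x1080", 1920, 1080),
                   ("2560x1600", 2560, 1600),
                   ("5760x1080", 5760, 1080)]:
    print(name, round(framebuffer_mb(w, h, msaa=4)), "MB at 4xAA")
# ~190 MB at 1080p vs. ~375 MB at 2560x1600 and ~570 MB at 5760x1080:
# at the big resolutions the buffers alone eat over a third to half of 1GB.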
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I have to disagree; I think it's a terrible combination. Anyone running such a powerful GPU will be doing so at very high res with a ton of AA/AF on, possibly on 3 monitors. At that point 1GB becomes the limiting factor, bottlenecking the GPU's performance. The parts need to be matched: a fast GPU needs enough memory to run at the intended res. Cutting the memory might make the card cheaper, but anyone buying it and running at high res, multi-monitor or (god forbid) Crossfire will find out soon enough that it's money not well saved.
Probably well over 90% of the people buying a 6950 will see zero benefit from a 2GB over a 1GB model. Those running above 1920x1200 are the only ones who should really be concerned with having more than 1GB on the 6950.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
In the end this will turn out like the GTS 250 1GB vs. 512MB, the 4850 512MB vs. 1GB, and the 4870 512MB vs. 1GB. Those cards didn't see much benefit from the extra 512MB of RAM at their debut, but the 1GB versions have since shown more consistent performance and are much better for multi-GPU setups. Even the Sapphire 4850 X2, which came in two VRAM flavors, is a good example if you compare benches now against benches from back then.

The 6900 series will end up just like what history has shown us, if we look back at the current state of old cards: these cards don't need 2GB of RAM, but the RAM does help, and you really can't argue against that.

Looking at current Crossfire/SLI numbers, the next generation of high-end cards will definitely need more than 1GB of RAM.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
This was in no way, shape or form inspired by the fact that the 470 and 480 both had more than 1GB of VRAM? If marketing is happening here, it's simply a response to the move that Nvidia made first: adding VRAM where none was needed (on your view).

Agreed. The GTX 470 is slower than an HD 5870 and it has 1.3GB of RAM. :\

$850 for 2 monitors is extreme. I know if I wanted to pick up 2 more of mine it would be $358. Hmm... I might just do that.

So you spent $350 on a motherboard, but your monitors are only $179 a piece? You should really buy nicer monitors next time around instead, as they will last you 5+ years!
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
In the end this will turn out like the GTS 250 1GB vs. 512MB, the 4850 512MB vs. 1GB, and the 4870 512MB vs. 1GB. Those cards didn't see much benefit from the extra 512MB of RAM at their debut, but the 1GB versions have since shown more consistent performance and are much better for multi-GPU setups. Even the Sapphire 4850 X2, which came in two VRAM flavors, is a good example if you compare benches now against benches from back then.

The 6900 series will end up just like what history has shown us, if we look back at the current state of old cards: these cards don't need 2GB of RAM, but the RAM does help, and you really can't argue against that.

Looking at current Crossfire/SLI numbers, the next generation of high-end cards will definitely need more than 1GB of RAM.

This is a nice post, buddy.

Do you think the 6950 will NEED the 2GB of RAM before the end of the year, when we get the 7xxx series or GTX 6xx series?

It seems like a good choice if you're going to Crossfire two 6950 2GB cards down the road, just like the 4850 X2 did use the extra memory.
I don't see a single 6950 2GB being powerful enough to give good framerates in modern games down the line, just like a single 4850 1GB wasn't, even though the memory helped. It might prolong the life of the card some, but most people will just upgrade at the end of the year and never use the 2GB of memory on the 6950. That's my take on it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Do you think the 6950 will NEED the 2GB of RAM before the end of the year, when we get the 7xxx series or GTX 6xx series?

If that were true, it would make 99.5% of all videocards on the market obsolete in 2011--hardly something game developers want to do on the PC. I could see Crysis 2 having some "Ultra" settings that need 2GB of RAM, though. I have a feeling Ultra tessellation will be a killer before anything else comes into play in that game :sneaky:
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
I wonder about:

a) Nvidia/AMD's costs with respect to adding more onboard memory versus
b) What they charge the public.

Is one of the motivations for a thread like this to be a wakeup call to the vendors and say:

"Look, you're adding memory that is never used and charging us out the wazoo for it. Very few games use that much memory and only at pretty crazy settings. So you don't need to have so much memory on your cards, and you can pass on the cost savings to us too! Kind regards, customers."

Both companies do it, so are we pointing out a type of gouging here? Charging us for a racing stripe?

See, the 5970 did benefit in Eyefinity setups from the added memory in the 4GB version. Then we might say, "Well, very few people do that, so why bother?" Then: "Very few cards were made, and you could buy the 2GB variant if you wanted to; it was cheaper." Perhaps one solution will be for Nvidia/AMD to offer lesser-memory versions of cards? So a 580 with 1GB? A 6970 with 1GB? Logistical concerns here, perhaps?

If we're going to go this route, we might head further down the chain and see whether lower-tier cards can actually play at resolutions that use as much memory as they're equipped with, and so forth. Perhaps we should.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
^^Making 2GB the reference design makes it cheaper than last gen, where there was, IIRC, a $100 premium for the 2GB Eyefinity versions of the 5870. Look at what enthusiasts had to pay for 4GB (2GB per GPU) 5970s! I don't think the premium is anywhere near that for the 6900 cards, because they come standard with 2GB of RAM. The same should especially hold for the 6990 when it arrives: it's not going to be a $1000+ card this time for a 4GB dual-GPU card.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
whatever AA that the game uses.

What's labeled 16x AA in the nVidia control panel is actually 4x multi-sampling with 12 coverage samples. 16x CSAA offers more quality than 4x multi-sampling but has a smaller memory footprint than 8x multi-sampling.
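Back-of-the-envelope numbers on that footprint difference (Python; the per-sample byte costs--4 bytes color + 4 bytes depth/stencil per fully stored sample, ~1 byte per coverage-only sample--are assumptions, but the proportions are the point):

Code:
def msaa_bytes_per_pixel(samples):
    # Every MSAA sample stores full color (4B) plus depth/stencil (4B).
    return samples * (4 + 4)

def csaa_bytes_per_pixel(stored, coverage):
    # Coverage-only samples store just coverage information (~1 byte assumed).
    return msaa_bytes_per_pixel(stored) + (coverage - stored) * 1

print("8x MSAA: ", msaa_bytes_per_pixel(8), "bytes/pixel")      # 64
print("16x CSAA:", csaa_bytes_per_pixel(4, 16), "bytes/pixel")  # 44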