New 6950 with only 1GB of memory coming, cheaper! (FZ)


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
At 1920x1080 8AA/16AF, HD5870 is 1% faster than an HD6950 on average.
http://www.computerbase.de/artikel/...6950/26/#abschnitt_performancerating_mit_aaaf

So for a single-GPU setup, the extra 1GB of RAM on the 6950 below 1920x1200 seems to be pretty much useless at the moment. By the time mainstream games start to use more than 1GB of RAM at 1920x1080, the 6950 will be too slow anyway.

If Crysis 2 brings the graphics level up a couple of notches from Crysis 1, you can bet it will take another 2-3 years before any video card can max that game out. By then, the 6950 will be ancient history.

I have to applaud AMD for releasing 2GB cards. Now gamers think you need 2GB of VRAM for 1920x1080... but benchmarks clearly show otherwise unless you are gaming at 2560x1600 with 4AA. Brilliant marketing move.
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
At 1920x1080 8AA/16AF, HD5870 is 1% faster than an HD6950 on average.
http://www.computerbase.de/artikel/...6950/26/#abschnitt_performancerating_mit_aaaf

So for a single-GPU setup, the extra 1GB of RAM on the 6950 below 1920x1200 seems to be pretty much useless at the moment. By the time mainstream games start to use more than 1GB of RAM at 1920x1080, the 6950 will be too slow anyway.

If Crysis 2 brings the graphics level up a couple of notches from Crysis 1, you can bet it will take another 2-3 years before any video card can max that game out. By then, the 6950 will be ancient history.

I have to applaud AMD for releasing 2GB cards. Now gamers think you need 2GB of VRAM for 1920x1080... but benchmarks clearly show otherwise unless you are gaming at 2560x1600 with 4AA. Brilliant marketing move.

Beautiful post, once again... :thumbsup:
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
At 1920x1080 8AA/16AF, HD5870 is 1% faster than an HD6950 on average.
http://www.computerbase.de/artikel/...6950/26/#abschnitt_performancerating_mit_aaaf

So for a single GPU setup, the extra 1GB of ram on the 6950 below 1920x1200 seems to be pretty much useless at the moment. By the time mainstream games start to use > 1GBs of Ram at 1920x1080, 6950 will be too slow anyways.

If Crysis 2 brings the graphics level a couple notches up from Crysis 1, you can bet it will take another 2-3 years before any videocard will be able to max that game out. By that time, 6950 will be ancient history.

I have to applaud AMD for releasing 2GB cards. Now gamers think you need 2GBs of Vram for 1920x1080....but benchmarks clearly show otherwise unless you are gaming at 2560x1600 4AA. Brilliant marketing move.
I agree with everything you said except the last paragraph. I believe AMD equipped it with 2GB of RAM because these cards can be used for Eyefinity, and I strongly believe that at those higher resolutions the extra RAM would make a difference. You can't just market something and not provide the hardware to back it up.

Now as for the 1GB variants coming out, they should make an interesting entrance, as I don't really know where they would be placed. The performance gap between the 6870, 6950, and 6970 is already so small.
 

flexcore

Member
Jul 4, 2010
193
0
0
I'm glad I waited to pick up a card. I was looking at the 6870 and the 460/470. With the 6950 1GB and the 560 coming out around the same time, I should be able to pick up a very nice card for 1080p at a sweet price.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I agree with everything you said except the last paragraph. I believe AMD equipped it with 2GB of RAM because these cards can be used for Eyefinity, and I strongly believe that at those higher resolutions the extra RAM would make a difference. You can't just market something and not provide the hardware to back it up.

Try to run a modern game with high details in Eyefinity on one 6950 2GB and it will be a slide show.
If the GTX 560 2GB could do Surround in single-card form, I'd say the same thing.

These 2GB cards are for 2560x1600 in single-card form, or Eyefinity/Surround in CrossFire/SLI.
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
I'm not sure why Nvidia plays with different video memory amounts. The 580 should have 2GB, and the 570/560 should have 1.5GB. I'd much rather spend the money on more memory than settle for a 1GB card these days.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'm not sure why Nvidia plays with different video memory amounts. The 580 should have 2GB, and the 570/560 should have 1.5GB. I'd much rather spend the money on more memory than settle for a 1GB card these days.

It's because of the 320-bit/384-bit memory buses they have.
They have two choices: 1.5GB or 3GB for the GTX 580, and 1.25GB or 2.5GB for the GTX 570.
The GTX 460/560 have a 256-bit bus, so their choices are 1GB or 2GB.

Something like that. :)
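The bus-width arithmetic here can be sketched out (my own illustration, not from the thread): each GDDR5 chip presents a 32-bit interface, so the bus width fixes the chip count, and total VRAM is the chip count times the per-chip density.

```python
# Rough sketch (assumptions: one 32-bit GDDR5 chip per channel, and the
# 128 MB / 256 MB chip densities that were common at the time).

def vram_options(bus_width_bits, chip_densities_mb=(128, 256)):
    """Possible VRAM totals (MB) for a given memory bus width."""
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * d for d in chip_densities_mb]

print(vram_options(384))  # GTX 580: 12 chips -> [1536, 3072] (1.5 GB or 3 GB)
print(vram_options(320))  # GTX 570: 10 chips -> [1280, 2560] (1.25 GB or 2.5 GB)
print(vram_options(256))  # GTX 460/560: 8 chips -> [1024, 2048] (1 GB or 2 GB)
```

Under those assumptions, a round 2GB simply isn't reachable on a 384-bit bus with uniformly sized chips, which is why the 580 ships with 1.5GB.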
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Try to run a modern game with high details in Eyefinity on one 6950 2GB and it will be a slide show.
If the GTX 560 2GB could do Surround in single-card form, I'd say the same thing.

These 2GB cards are for 2560x1600 in single-card form, or Eyefinity/Surround in CrossFire/SLI.
I remember blastingcap used a 6850 for Eyefinity, so it's definitely possible, especially with older games. Also, you can CrossFire and still benefit from the 2GB of RAM.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
So, what should the price be for the 1GB versions? I agree that for a large majority of people the 1GB will be every bit as good as the 2GB. To me that means you don't charge much less for the 1GB versions. $20.00 less? $25.00? More than that and the 2GB version stops selling.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I agree with everything you said except the last paragraph. I believe AMD equipped it with 2GB of RAM because these cards can be used for Eyefinity, and I strongly believe that at those higher resolutions the extra RAM would make a difference.

Yes, you are definitely right. For Eyefinity (i.e., 5760x1080 and so on), 2GB comes in handy. After all, that's 52% more pixels than 2560x1600!
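The 52% figure checks out; a quick back-of-the-envelope calculation (my own illustration, not from the post):

```python
# Compare a 3x1 Eyefinity wall of 1080p panels to a single 2560x1600 display.
eyefinity = 3 * 1920 * 1080   # 5760x1080 = 6,220,800 pixels
single_30 = 2560 * 1600       # 4,096,000 pixels

increase = (eyefinity / single_30 - 1) * 100
print(f"{increase:.0f}% more pixels")  # prints "52% more pixels"
```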
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
So, what should the price be for the 1GB versions? I agree that for a large majority of people the 1GB will be every bit as good as the 2GB. To me that means you don't charge much less for the 1GB versions. $20.00 less? $25.00? More than that and the 2GB version stops selling.

I think $20-30 less is usually about right, going by past cards.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Yes, you are definitely right. For Eyefinity (i.e., 5760x1080 and so on), 2GB comes in handy. After all, that's 52% more pixels than 2560x1600!
:thumbsup: I think these 1GB cards will be exactly what people with single monitors need, and hopefully they cut the price down to make them more attractive.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
:thumbsup: I think these 1GB cards will be exactly what people with single monitors need, and hopefully they cut the price down to make them more attractive.

:thumbsup: This is the perfect storm for a price war: two cards with about equal performance, at about equal prices, launching at about the same time. :cool:
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
We always look at nVidia cards compared to AMD cards, and vice versa, and understandably so. But how do these 1GB cards fit into AMD's own lineup? There's not a lot of space, both price- and performance-wise, to squeeze two more cards in between the 6870 -> 6950 -> 6970.

It seems that rather than actually competing with one another for the best cards, they are trying to outmaneuver each other in the marketplace.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I remember blastingcap used a 6850 for Eyefinity so it's definitely possible especially with older games. Also you can crossfire and still benefit from the 2 GB ram.

My attention is elsewhere now. I have bought something like $2000+ of high-end kitchenware in the last few months and hardly game anymore. (My gf is a foodie. And kitchenware holds its value a hell of a lot better than tech, that's for sure, so I am willing to spend more on quality parts.)

That said, yes, I am still running 3 x 22" tri-monitor, but almost all of the time in Extended Desktop mode. Three displays kicks ass and is much more useful than a single larger display for web-browsing, as the minimizing/maximizing is easy. But a single bigger display kicks ass for other applications such as photo-editing. Either setup is probably better than most dual-monitor setups, because it's nice to have either a single huge monitor with easy-to-use split screen (thank you Windows 7) or a centered monitor and two wing monitors, rather than whatever the heck people do with dual-monitor setups (split it down the middle? have one monitor centered and the other growing off the side of it like a tumor?).

Obviously the solution is to get 3 x 30" displays to get the best of both worlds, but I am not willing to spend that kind of money right now. :)

Anyway, to address the questions, yes, many games (TF2, Oblivion, Left 4 Dead series, Fallout 3/NV, LOTRO, WoW, etc.) run great on even a 6850 1GB at 5040x1050. A non-overclocked 6850 can run Fallout New Vegas at that resolution for instance, with overclocking allowing for even smoother framerates. A 6950 1GB would do even better.

5760x1080 is potentially a problem: 17.6% more pixels to push. But it's doable, and if you run into VRAM problems, just turn down AA a bit and leave textures on high (or turn textures to medium and boost AA higher, though I think most of the time that works worse than the opposite). AA eats VRAM like candy. Going down to 2x MSAA or, gasp, no MSAA isn't the end of the world, especially when you trade it for deeper immersion and peripheral vision that can translate to a performance edge in multiplayer games, since you have an expanded field of vision. Worse comes to worst, turn off the wing monitors and game on the center monitor at great framerates for those games that you can't run Eyefinity on. I can run Crysis like a dream on my center monitor.
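The 17.6% figure is easy to verify (my own arithmetic, not from the post): it's the jump from three 1680x1050 panels to three 1920x1080 panels.

```python
# 3x 1680x1050 (the 5040x1050 setup) vs. 3x 1920x1080 (5760x1080).
old_setup = 3 * 1680 * 1050   # 5,292,000 pixels
new_setup = 3 * 1920 * 1080   # 6,220,800 pixels

print(f"{(new_setup / old_setup - 1) * 100:.1f}% more pixels to push")  # "17.6% ..."
```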

IMHO, the whole 2GB futureproofing thing is ridiculous for anything less than 3 x 1920 x 1200 so long as you don't insist on having high AA. Futureproofing does not work well with quickly-depreciating assets such as video cards. Buy only what you need for the next 12 months. Heck, I don't even game much anymore, so I am considering downgrading to a 5770 until the 22nm GPUs come out (gaming on the center monitor, tri-monitor for websurfing). An oc'd 5770 still kicks ass at 1680x1050.

I probably sound like a shill for Eyefinity/Surround or something, but seriously, it's awesome. I had a 24" 1920x1200 Dell Ultrasharp and could have spent $$$ getting a GTX480 or 5870 or something to try to max out Metro 2033 or Crysis or something. Instead, I sold the Ultrasharp, got 3 supposedly crappy 22" Acers and a mid-high end video card, and have not regretted it since. I massively boosted my computing experience outside of games with tri-monitor and had a blast with my most-played games (TF2 back then, Fallout NV and L4D2 now). I will sell the monitors later this year and upgrade, probably to 3 x 1080p IPS LED monitors to save energy and for photoshop work, but for me, there is no pressing need to get the highest-end GPUs until the new consoles come out and push PC graphics requirements up again.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
My attention is elsewhere now. I have bought something like $2000+ of high-end kitchenware in the last few months and hardly game anymore. (My gf is a foodie. And kitchenware holds its value a hell of a lot better than tech, that's for sure, so I am willing to spend more on quality parts.)

That said, yes, I am still running 3 x 22" tri-monitor, but almost all of the time in Extended Desktop mode. Three displays kicks ass and is much more useful than a single larger display for web-browsing, as the minimizing/maximizing is easy. But a single bigger display kicks ass for other applications such as photo-editing. Either setup is probably better than most dual-monitor setups, because it's nice to have either a single huge monitor with easy-to-use split screen (thank you Windows 7) or a centered monitor and two wing monitors, rather than whatever the heck people do with dual-monitor setups (split it down the middle? have one monitor centered and the other growing off the side of it like a tumor?).

Obviously the solution is to get 3 x 30" displays to get the best of both worlds, but I am not willing to spend that kind of money right now. :)

Anyway, to address the questions, yes, many games (TF2, Oblivion, Left 4 Dead series, Fallout 3/NV, LOTRO, WoW, etc.) run great on even a 6850 1GB at 5040x1050. A non-overclocked 6850 can run Fallout New Vegas at that resolution for instance, with overclocking allowing for even smoother framerates. A 6950 1GB would do even better.

5760x1080 is potentially a problem: 17.6% more pixels to push. But it's doable, and if you run into VRAM problems, just turn down AA a bit and leave textures on high (or turn textures to medium and boost AA higher, though I think most of the time that works worse than the opposite). AA eats VRAM like candy. Going down to 2x MSAA or, gasp, no MSAA isn't the end of the world, especially when you trade it for deeper immersion and peripheral vision that can translate to a performance edge in multiplayer games, since you have an expanded field of vision. Worse comes to worst, turn off the wing monitors and game on the center monitor at great framerates for those games that you can't run Eyefinity on. I can run Crysis like a dream on my center monitor.

IMHO, the whole 2GB futureproofing thing is ridiculous for anything less than 3 x 1920 x 1200 so long as you don't insist on having high AA. Futureproofing does not work well with quickly-depreciating assets such as video cards. Buy only what you need for the next 12 months. Heck, I don't even game much anymore, so I am considering downgrading to a 5770 until the 22nm GPUs come out (gaming on the center monitor, tri-monitor for websurfing). An oc'd 5770 still kicks ass at 1680x1050.

I probably sound like a shill for Eyefinity/Surround or something, but seriously, it's awesome. I had a 24" 1920x1200 Dell Ultrasharp and could have spent $$$ getting a GTX480 or 5870 or something to try to max out Metro 2033 or Crysis or something. Instead, I sold the Ultrasharp, got 3 supposedly crappy 22" Acers and a mid-high end video card, and have not regretted it since. I massively boosted my computing experience outside of games with tri-monitor and had a blast with my most-played games (TF2 back then, Fallout NV and L4D2 now). I will sell the monitors later this year and upgrade, probably to 3 x 1080p IPS LED monitors to save energy and for photoshop work, but for me, there is no pressing need to get the highest-end GPUs until the new consoles come out and push PC graphics requirements up again.
Thanks for the input, blastingcap!
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
I have to applaud AMD for releasing 2GB cards. Now gamers think you need 2GB of VRAM for 1920x1080... but benchmarks clearly show otherwise unless you are gaming at 2560x1600 with 4AA. Brilliant marketing move.

This was in no way, shape, or form inspired by the fact that the 470 and 480 both had more than 1GB of VRAM? If marketing is happening here, it's simply a response to the move that Nvidia made first - adding VRAM where none was needed (in your view).
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126

So you link me a racing game at 25 to 31 fps and tell me that's OK?
Are you kidding?

Maybe you'd better take a look at the rest of that review and tell me if you want to go out and spend $850 on 2 extra monitors, a DisplayPort adapter, and a $300 card for those framerates. Anyone who has $850 to toy with this stuff, I would hope, would be smart enough to buy 2 cards.

It does seem, however, to support 3 lower-resolution monitors like blastingcap has, but just barely, and it still wouldn't work with Crysis, Stalker, Metro, and the other demanding games.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
So you link me a racing game at 25 to 31 fps and tell me that's OK?
Are you kidding?

Maybe you'd better take a look at the rest of that review and tell me if you want to go out and spend $850 on 2 extra monitors, a DisplayPort adapter, and a $300 card for those framerates. Anyone who has $850 to toy with this stuff, I would hope, would be smart enough to buy 2 cards.

It does seem, however, to support 3 lower-resolution monitors like blastingcap has, but just barely, and it still wouldn't work with Crysis, Stalker, Metro, and the other demanding games.

I own the game; I know what's playable and what isn't. Without AA that game runs fine at that res. Did you look at the other games in the list?

$850 for 2 monitors? Really?
 

Castiel

Golden Member
Dec 31, 2010
1,772
1
0
$850 for 2 monitors is extreme. I know if I wanted to pick up 2 more of mine it would be $358. Hmm... I might just do that.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
$850 for 2 monitors is extreme. I know if I wanted to pick up 2 more of mine it would be $358. Hmm... I might just do that.

Maybe I worded it wrong. I said:
"go out and spend $850 and buy 2 extra monitors, a DisplayPort adapter, a $300 card for those framerates."

That's 2 monitors, a DisplayPort adapter, and a $300 6950 2GB... for $850.
My point was you should buy two 6950 2GBs and get some real framerates, not spend $850 on the above items to play some games at barely 30fps with Eyefinity and one 6950 2GB.

Better? :)
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Maybe I worded it wrong. I said:
"go out and spend $850 and buy 2 extra monitors, a DisplayPort adapter, a $300 card for those framerates."

That's 2 monitors, a DisplayPort adapter, and a $300 6950 2GB... for $850.
My point was you should buy two 6950 2GBs and get some real framerates, not spend $850 on the above items to play some games at barely 30fps with Eyefinity and one 6950 2GB.

Better? :)
Again, people use weaker cards to run multiple monitors; there's nothing wrong with running Eyefinity on an HD 6950...