This probably should have been called 280SE or XL
Cramming it between R280 & R290 is just stupid
Then why the hell didn't they call it R380? If it's a new GCN chip, redesigned with all the efficiency focus and all.
Cramming it between R280 & R290 is just stupid.
Just as a side note, am I the only one annoyed that they no longer allow the AIBs to customize the output options? Maybe they have to account for it somehow in drivers, or the AIBs were breaking things?
It's just the reference PCB. Too early for custom designs, if there will be any.
It says AMD on the back.
Considering it doesn't say AMD on the PCB and these are custom coolers, it doesn't look like there is actually going to be a reference design.
Exactly what I was thinking when I said it should be called a 280SE/XL.
This is most likely going to be slower than Tahiti (one never knows, it could be faster).
And Silverforce is starting to sound disenchanted.
I'm not a loyal fan of any company; I only buy whatever gives the best bang for the buck.
Putting 2GB of VRAM on such a powerful card in late 2014 is plain WRONG. I don't care which side does it.
It says AMD on the back.
2GB is fine if it's cheap enough; these cards don't have the power to make the most of 4GB anyway.
In "lazy" ports that don't stream/compress textures, like Titanfall, it won't be enough for the highest setting, but those games don't look that good anyway, so what does it matter?
More games will attempt to use the extra memory, since the consoles have a lot more than 2GB available and are targeting 1080p max.
2GB is the same amount of memory all the 6970s (and some 6950s) had back in 2011; you can still make use of extra memory to avoid stutters and other problems, even if it's not the fastest GPU.
Consoles as such don't have a lot more than 2GB. They have ~5GB to be shared. It's actually unlikely for consoles to use over 2GB of VRAM; they already struggle hard enough at 720-1080p to reach the desired FPS with lowered settings.
If they have 5GB to be shared (physical 8GB of 256-bit GDDR5 on the PS4), that's a lot more, and games will be designed to take advantage of the memory. "Lowered" settings don't necessarily mean low RAM usage, and there is also less memory wasted compared to a PC with Windows.
With just 2GB I think this card will be obsolete sooner than it should be, like the 1GB 7850.
But sure, for now most games are fine with 2GB, though we can already see at least two newer games that were clearly built to get some (questionable) benefit from 3GB and more.
Remember, those 5GB are shared. It's about equal to a PC with 4GB of system memory and a 2GB card.