Several R9 285's pictured (VideoCardz)

3DVagabond

Lifer
Aug 10, 2009
Sapphire-Radeon-R9-285-DualX-VideoCardz-2.jpg

XFX-Radeon-R9-285-VideoCardz.jpg-2.jpg

HIS-Radeon-R9-285-Mini-VideoCardz-2.jpg

Radeon-R9-285-CrossFire-XDMA-VideoCardz.jpg


So, there are boxes; it can't be too far off then. The last pic shows no CrossFire connector, like Hawaii.
SOURCE
 
Feb 19, 2009
Feels like it's going backwards. If it's faster than the 7970 GHz, having 2GB of VRAM is a letdown. That's especially the case since it's a new GCN part; in CrossFire it could really be good bang for the buck if it weren't limited by VRAM.
 
Feb 19, 2009
Then why the hell didn't they call it the R380? If it's a new GCN chip, redesigned with all the efficiency focus and all.

Cramming it between the R280 and R290 is just stupid.
 

Skurge

Diamond Member
Aug 17, 2009
I don't mind, tbh; it's still 28nm and probably isn't that much different from Hawaii. If it's slower than Tahiti, then the name is dumb.
 

3DVagabond

Lifer
Aug 10, 2009
Then why the hell didn't they call it the R380? If it's a new GCN chip, redesigned with all the efficiency focus and all.

Cramming it between the R280 and R290 is just stupid.

I believe Tonga is Pitcairn's replacement. So it will probably end up a 370.

Just as a side note, am I the only one annoyed that they no longer allow the AIBs to customize the output options? Maybe they have to account for it somehow in the drivers, or the AIBs were breaking things?
 

Keysplayr

Elite Member
Jan 16, 2003
Then why the hell didn't they call it the R380? If it's a new GCN chip, redesigned with all the efficiency focus and all.

Cramming it between the R280 and R290 is just stupid.

Exactly what I was thinking when I said it should be called a 280SE/XL.

This is most likely going to be slower than Tahiti (one never knows, it could be faster).

And Silverforce is starting to sound disenchanted. :eek:
 

ShintaiDK

Lifer
Apr 22, 2012
Just as a side note, am I the only one annoyed that they no longer allow the AIBs to customize the output options? Maybe they have to account for it somehow in the drivers, or the AIBs were breaking things?

It's just the reference PCB. Too early for custom designs, if there will be any.
 

Techhog

Platinum Member
Sep 11, 2013
With the updated CFX, if these had 4GB of VRAM they'd be ideal budget CF cards. But nope, 2GB. :/
 
Feb 19, 2009
Exactly what I was thinking when I said it should be called a 280SE/XL.

This is most likely going to be slower than Tahiti (one never knows, it could be faster).

And Silverforce is starting to sound disenchanted. :eek:

I'm not a loyal fan of any company; I only buy whatever gives the best bang for the buck.

Putting 2GB of VRAM on such a powerful card in late 2014 is plain WRONG. I don't care which side does it.
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
If you game at 1080p, I'm sure it will offer great performance. A 4GB offering needs to be made, but it may nip at the heels of the R9 290 in price. Efficiency will be key for these cards; if they aren't efficient, they'll be stuck between a rock and a hard place. I would love to sell my R9 290X and try out 2x 285 4GB cards just to get the experience of two cards. I just don't want all that power and heat from 2x 290X GPUs. It was bad enough on the 7970.
 

ShintaiDK

Lifer
Apr 22, 2012
I'm not a loyal fan of any company; I only buy whatever gives the best bang for the buck.

Putting 2GB of VRAM on such a powerful card in late 2014 is plain WRONG. I don't care which side does it.

I am still using 2GB today at 2560x1440, and I really don't see the issue with it. Also, they could have improved compression etc. to reduce the memory needed.
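The compression point is plausible: lossless delta-style color compression can shrink framebuffer traffic without changing the image. The sketch below is a toy illustration of the general idea (storing each pixel as a delta from its neighbor, which is cheap on smooth gradients); it is NOT AMD's actual algorithm.

```python
# Toy illustration of delta encoding, the general idea behind lossless
# framebuffer color compression. NOT AMD's actual algorithm, just a
# sketch of why smooth image regions compress well.

def delta_encode(row):
    """Keep the first value, then store each pixel as a delta from its left neighbor."""
    return [row[0]] + [b - a for a, b in zip(row, row[1:])]

def delta_decode(encoded):
    """Invert delta_encode by running-summing the deltas."""
    row = [encoded[0]]
    for d in encoded[1:]:
        row.append(row[-1] + d)
    return row

# A smooth gradient: every delta is tiny, so it would need far fewer
# bits to store than the raw values.
gradient = list(range(100, 120))
deltas = delta_encode(gradient)
assert delta_decode(deltas) == gradient      # lossless round trip
assert all(abs(d) <= 1 for d in deltas[1:])  # small deltas -> cheap to store
```

The win is bandwidth and effective capacity, not a different image: the decoded row is bit-identical to the original.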
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
It says AMD on the back.


FWIW, PowerColor stamped AMD on the back of their non-reference PCBs, and MSI stamped AMD's logo on the front to make it look like a reference PCB, which it really was not.

Take a look at Lavaheadache's non-reference 290 post; there is also a picture on OCN of a "reference" MSI 7970 with a lighter brown PCB that is clearly not reference. I guess it is cheaper to make a clone of the reference PCB in-house than to purchase them from AMD. My non-reference PCB PowerColor 290X died after three months. My RMA replacement was an official AMD PCB from the early releases, and it works much better.
 

Flapdrol1337

Golden Member
May 21, 2014
2GB is fine if it's cheap enough; these cards don't have the power to make the most of 4GB anyway.

In "lazy" ports that don't stream/compress textures, like Titanfall, it won't be enough for the highest setting, but those games don't look that good anyway, so what does it matter?
 

SPBHM

Diamond Member
Sep 12, 2012
2GB is fine if it's cheap enough; these cards don't have the power to make the most of 4GB anyway.

In "lazy" ports that don't stream/compress textures, like Titanfall, it won't be enough for the highest setting, but those games don't look that good anyway, so what does it matter?

More games will attempt to use the extra memory, since the consoles have a lot more than 2GB available and are targeting 1080p max.

2GB is the same amount of memory all the 6970s (and some 6950s) had back in 2011.

You can still make use of extra memory to avoid stutters and other problems, even if it's not the fastest GPU.
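For scale, the raw buffer arithmetic behind this 2GB debate is easy to check (a back-of-the-envelope sketch only; real games add textures, shadow maps, render targets, and driver overhead on top):

```python
# Back-of-the-envelope framebuffer math. Illustrative only: real games
# add textures, extra render targets, and driver overhead on top.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """MiB used by the color buffers (triple buffering by default)."""
    return width * height * bytes_per_pixel * buffers / 2**20

# Plain color buffers are small even at 1440p with triple buffering...
print(round(framebuffer_mb(2560, 1440), 1))  # 42.2
# ...so the 2GB-vs-4GB argument is really about textures and assets,
# not the display buffers themselves.
```

That the framebuffers themselves fit in tens of MiB is exactly why console-sized asset budgets, not resolution, are what threaten a 2GB card.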
 

ShintaiDK

Lifer
Apr 22, 2012
More games will attempt to use the extra memory, since the consoles have a lot more than 2GB available and are targeting 1080p max.

2GB is the same amount of memory all the 6970s (and some 6950s) had back in 2011.

You can still make use of extra memory to avoid stutters and other problems, even if it's not the fastest GPU.

Consoles as such don't have a lot more than 2GB. They have ~5GB to be shared, and it's actually unlikely for consoles to use over 2GB of VRAM. They already struggle hard enough at 720-1080p to reach the desired FPS with lowered settings.
 

SPBHM

Diamond Member
Sep 12, 2012
Consoles as such don't have a lot more than 2GB. They have ~5GB to be shared, and it's actually unlikely for consoles to use over 2GB of VRAM. They already struggle hard enough at 720-1080p to reach the desired FPS with lowered settings.

If they have 5GB to be shared (8GB of physical 256-bit GDDR5 on the PS4), that's a lot more, and games will be designed to take advantage of the memory. "Lowered" settings don't necessarily mean low RAM usage, and there is also less memory wasted compared to a PC running Windows.

With just 2GB, I think this card will be obsolete sooner than it should be, like the 7850 1GB. Sure, for now most games are great with 2GB, but we can already see at least two newer games that were clearly built to get some (questionable) benefit from 3GB and more.
 

Attic

Diamond Member
Jan 9, 2010
So we had the 7970, then the 7970 GHz Edition.

Then the 280X...

And now the 285? In late 2014, with 2GB of VRAM? Sell it for $150 or get the heck out of town.

Huge letdown if true. They should call it the 275, or just call it WTF.

They should have just lowered prices on the 280X and 280.
 

ShintaiDK

Lifer
Apr 22, 2012
If they have 5GB to be shared (8GB of physical 256-bit GDDR5 on the PS4), that's a lot more, and games will be designed to take advantage of the memory. "Lowered" settings don't necessarily mean low RAM usage, and there is also less memory wasted compared to a PC running Windows.

With just 2GB, I think this card will be obsolete sooner than it should be, like the 7850 1GB. Sure, for now most games are great with 2GB, but we can already see at least two newer games that were clearly built to get some (questionable) benefit from 3GB and more.

Remember, those 5GB are shared. It's about equal to a PC with 4GB of system memory and a 2GB card.

Also, you exclude any method AMD may use to reduce memory demand.

Not to mention the hype of actually needing more memory. As I said, I have 2GB today, and it works perfectly fine for me at 2560x1440 in any game I throw at it. And I don't see consoles pushing that amount forward; if anything, it would be the PC platform itself. Just compare the 2GB vs. 4GB GTX 770: it's hard to find any reason to pay the premium for the 4GB card.

It seems the focus is more on a number than the actual need.
 

SPBHM

Diamond Member
Sep 12, 2012
Remember, those 5GB are shared. It's about equal to a PC with 4GB of system memory and a 2GB card.

No, it's not. The PC cannot efficiently use those slow 4GB as video RAM, and, as I said, a Windows PC wastes more memory than a fixed-spec, custom-OS platform.

Also, the PS4 has 8GB of physical memory; even if game devs can currently access only a portion of it, that doesn't mean it won't change. On the old consoles, OS updates allowed games to use more memory over time. And those old consoles only had 512MB (256+256 on the PS3, more like a PC), yet could easily run games a PC would need 3GB total (RAM+VRAM) to match.
 
Feb 19, 2009
2GB is fine if it's cheap enough; these cards don't have the power to make the most of 4GB anyway.

In "lazy" ports that don't stream/compress textures, like Titanfall, it won't be enough for the highest setting, but those games don't look that good anyway, so what does it matter?

Two of them in CF would certainly have the longevity and power, if only they weren't 2GB.

Then there are also the ultra-HD texture mods for games, which a 7970 GHz can handle with ease at common resolutions thanks to its 3GB of VRAM.

Frankly, there's no excuse unless it's a 7870 replacement, and if that's the case, why slot it in as the 285, above the 280?

Not to mention there are plenty of 7950/7970 cards on eBay for dirt cheap; if they want users on older hardware to upgrade, putting 2GB on it is not attractive compared to the other options.
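The texture-mod point is where 2GB actually bites. A rough sketch of why (illustrative numbers only: this assumes uncompressed RGBA8 textures, while real games use block compression that cuts these sizes by 4-8x):

```python
# Rough VRAM cost of an uncompressed RGBA8 texture plus its full mip
# chain. Illustrative only: real games use block compression (BCn/DXT),
# which reduces these numbers by roughly 4-8x.

def texture_mb(size, bytes_per_texel=4):
    """MiB for a size x size texture including every mip level down to 1x1."""
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / 2**20

# One uncompressed 4096x4096 texture with mips is ~85 MiB (the mip
# chain adds about a third on top of the 64 MiB base level), so a few
# dozen of them from an ultra-HD texture pack blow straight past 2GB.
print(round(texture_mb(4096)))  # 85
```

Even with block compression shrinking each texture several-fold, high-resolution texture packs multiply the count of large textures, which is why the 3GB Tahiti cards handle them more comfortably.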
 