[Sweclockers] Radeon 380X coming late spring, almost 50% improvement over 290X


boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Wait, so if they made the R380X exactly the same as the water-cooled version, i.e. similar power to the R290X and offering 50% more performance, but gave it a reference air cooler, it would be good?

"it's another thing for a mainstream GPU to *require* an AIO solution"

ps. This isn't a mainstream GPU. It's not a 750ti or 960.

You are making it seem like whatever AMD does is fail, because it seems like this:

AMD : "hey guys, we've improved efficiency and performance by 50% and we also improved our reference cooler so it's quieter & cooler than some of the best open air designs that dump heat in your case..."

Naysayers: "nah bro, you should have gone air on it so it dumps that heat in the case instead and runs hotter & louder"

AMD : "umm, but there's gonna be cool custom air designs from our partners for those who want air"

Naysayers: "not good enough dudes, the fact you went water on reference makes it seem like a desperate move"...
hehe I know it is supposed to be a parody but this is pretty spot on :)
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
A CLC cooled card combines the strengths of both the blower and the open air cooler.

It exhausts hot air outside the case, and as a bonus the rad can be mounted on the back of the case to double as the exhaust fan, usually the top 120mm in the back. It delivers not only the temperatures of a decent open air cooler (see the Tri-X and Vapor-X coolers from Sapphire) but also their low noise levels.

To even think that a CLC cooled card is inferior to one with a blower or an open air cooler is insane, there's no logic in that statement!

Nobody is arguing that a CLC cooler is inferior in the grand scheme of things. In fact, from what I gather, everyone agrees it would be an awesome option.

Key word: option. As crazy as it may seem to some of the extreme techies here, a CLC is not the answer for every enthusiast, and is especially not the answer for the majority of the public. True, some in both categories will buy one, for all the reasons they are so well received: great cooling, typically less noise, and double duty as a case fan, so fewer fans in the case and less noise altogether.

But that does not make it the one cooler to rule them all. Plenty of people either cannot host such coolers, or do not want to deal with them. "Deal with them" you ask? Well, everyone has their own personal reasons for whatever they do in a PC case. I'm not making excuses up, that's simply the sentiment of people. Some want these things, some want the standard simplicity of air cooling.

One chief reason I can cite for people choosing air over CLC, even when their case and build decisions make it a possibility without sacrifice: they don't want water, period.
These systems are not fail-proof. Are there warranties? Sure. Is failure exceptionally rare? Basically. But water leaking is also not the only failure. These things WILL need to be topped up with water if you keep them long enough; they are not perfect barriers against the evaporation that happens in all liquid cooling systems over time. Maybe you won't keep any single CLC package long enough for that to be an issue, but for some, it may very well be a concern.

Next up as a concern: other failure modes. The pump can fail entirely, which takes the whole system down until you can replace the entire CLC package. With an ordinary fan, you simply replace the fan, and quite likely, thanks to the heatsink, you can keep on trucking at 2D base clocks until you replace it. Without the pump, that GPU isn't getting any cooling accomplished, period. It will be down for the count.

Another concern, short of outright pump failure, is abnormal pump noise. I have seen it reported that pump noise has been a nasty detractor from these CLC packages. Not necessarily common; in fact, it must be fairly uncommon considering the overall reviews, but it is common enough that it gets reported in forum communities like overclock.net.

There are enough issues that simply do not exist with air cooling setups, enough to make it not a walk-off home run when it comes to cooling. It is, quite matter-of-factly, not for everyone.

To reiterate, some simply cannot use such a system without making drastic changes. Many PC gamers and other rig builders do not make build decisions with the idea of adding any kind of water cooling later on. Others simply do not want any kind of CLC or other water kit, for fear of potential damage or of the other inconveniences. Even if the failure isn't with the CLC kit itself, a failed GPU (especially a third-party CLC on a normal GPU) or a failed CPU means more inconvenience diagnosing the issue than with standard air or even factory kit setups. For most enthusiasts who go this route, true, that isn't an issue due to overall knowledge, but not everyone wants extra hassle. Regardless of how anyone argues, it is an added hassle that does not exist in other setups. Minor to some, crossing the line for others.

Many here take part in true water cooling setups, and from what I've seen, some are biased to the point that they feel superior and think everyone should aspire to emulate their success. Water cooling is simply not what the average enthusiast wants to deal with. The water cooling route is exceptionally rare when you really boil down the numbers.

CLC kits are, however, gaining acceptance in all the computing crowds, so yes, people do want it. I make no argument to the contrary. I simply argue that not everyone wants it.

And ultimately, the "detractors" in this very thread have not argued for a mass ban of such cooling. They simply cried foul at the idea of this cooler being the reference cooler. Reference coolers are very popular when they are successful. When they suck, the aftermarket comes to the rescue. If the card maintains the same TDP as the current flagships, then AIBs COULD make custom coolers, but one has to ask: how strong will the appeal be? And how well will they truly compete with the reference cooler if it is a CLC kit?

If there are indeed options, and good options, everyone will be happy. Again, no one has actually suggested otherwise. Some have made no mention of "well, if there are options..." and simply complained about the *rumored* reference cooler, myself included. Take this to simply be arguing that the reference will likely be the most efficient and effective cooler design, because it most surely will be, and because of that, we fear what the AIB partners will do. Everyone assumes they will make other coolers, and everyone assumes the stated TDP figures and other facts are accurate because they are on charts. Nobody truly knows for sure, so we are all holding our breath and simply voicing our hopes that options remain available for everyone when it does come time to launch.


This isn't a matter of "well this is AMD's solution... ehh, it's not good until Nvidia does it!"... no, just no. Stop. It's childish, and embarrassing. People just want options, and don't want to be forced into a solution that may not work in their case, or that introduces more risks or potential issues than they are comfortable accepting.

FWIW, I also question the wisdom of placing radiator coolers as side panel and/or even front panel exhaust. Perhaps those builders desire negative pressure cooling, and perhaps they are even comfortable cleaning out their case more often. What that doesn't address is how much of a change that is to the standard airflow model the ATX standard is really built around, and that most PC cases are designed to accommodate. I thought about doing that myself, but it doesn't seem like it would be a good airflow solution. Every actual test I've seen of airflow and temperature modeling within PC cases demonstrates the best temperatures with the standard airflow model: a possible mix of front, bottom, and side intake, with possible top and rear exhausts. The one variation I have seen measured with good results is that, at least in some PC case designs, the top is turned into an intake while all others remain the same.

Shaking up the airflow, without careful planning, can introduce deadzones and turbulence that, in the end, results in less optimal cooling for random components, dependent upon the actual airflow situation. And perhaps the front panel is the only solution for some cases: in almost every example I've seen, that requires removing all of the drive cages, or one if it's a half and half cage assembly design, and that may not leave enough expansion potential depending upon the user and their storage preferences or requirements. Going with extensive 3.5" RAID, an optical drive, and additional SSD system drive(s) requires a fair bit of mounting space, which may very well render the idea of putting radiators in front an impossible option.

I speak from some personal hesitation and issues, as well as common concerns I've seen repeated on OCN.


TLDR:
not everyone wants the risks and potential issues associated with CLCs
not everyone can even host CLCs
we all simply wish that there are options, and not a required CLC;
additionally, the fear is that a reference CLC equals the best and most optimal solution, so how many AIBs will make other designs, and will they sell enough to justify continued expansion of non-reference designs. Will they even be sufficiently overclocked/overclockable compared to what the reference design can achieve?
 

garagisti

Senior member
Aug 7, 2007
592
7
81
snip...

TLDR:
not everyone wants the risks and potential issues associated with CLCs
not everyone can even host CLCs
we all simply wish that there are options, and not a required CLC;
additionally, the fear is that a reference CLC equals the best and most optimal solution, so how many AIBs will make other designs, and will they sell enough to justify continued expansion of non-reference designs. Will they even be sufficiently overclocked/overclockable compared to what the reference design can achieve?

For the bolded, it has been repeatedly mentioned that it is the reference design. How many people do you know who buy top-line GPUs with reference coolers only? Some Nvidia cards only come with reference cooling, and AMD's 290s picked up during the mining boom were more of an exception. Most people actively seek a custom-cooled card, and that has been the norm for donkey's years. Anyone who spends barely 10 minutes a day on review sites/forums etc. will know that much as well. All this fear mongering and FUD spreading is just that. Please don't try to paint it as a rational debate, as it wasn't started in that vein by those who are being answered. Facts regarding coolers and their abilities have been constantly ignored, and here we are 4-5 pages later, still having the same old debate.

For those who don't want an AIO card, they can always buy an air-cooled option, which will most likely come out. Heck, there are air-cooled 500W 290X2 beasties. If that could cool dual 290Xs, your measly 380X (which those arguing against weren't ever going to buy) will be fine on air. Just don't cheap out on a card: buy one with good cooling rather than comparing it with some Nvidia card that costs a lot more and whinging about how cheap you were, sorry, how bad the cooler is.
http://www.newegg.com/Product/Produc...82E16814131584
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,538
136
TLDR:
not everyone wants the risks and potential issues associated with CLCs
not everyone can even host CLCs
we all simply wish that there are options, and not a required CLC;
additionally, the fear is that a reference CLC equals the best and most optimal solution, so how many AIBs will make other designs, and will they sell enough to justify continued expansion of non-reference designs. Will they even be sufficiently overclocked/overclockable compared to what the reference design can achieve?

All very valid points I agree with, yes sir. I also favor air coolers because they have far fewer points of failure than a CLC; that's why I went with a huge heatsink for my 2500K instead of a CLC. But sometimes you've gotta make sacrifices in order to maximize performance, and the CLC cooler is just that at the end of the 28nm era. For some it'll be the perfect solution to the problem; for others there will be custom cards with aftermarket coolers, that's a given. Those who make and supply decent open air coolers for the custom 290/Xs will be more than happy to adapt and improve them for the 380X.

If you have a 120mm fan mount you can host a CLC. Even ITX cases don't have much of a problem with that requirement.

Now, if the CLC were to become the standard cooling solution top to bottom, then I'd agree that would be problematic. It'll most likely be on the absolute high end as an option; everyone gets to choose their cup of tea.



And nope, the double standard argument isn't childish, actually it's far from it. It's something that will never disappear, in no small part due to AMD's joke of a marketing division. They almost make it too easy for Nvidia to mock them and do as they please. Mark my words, if Nvidia had come up with this you wouldn't be seeing people ranting over and over about the same things, in no small part due to their Apple-like ability to market features. Complete opposites...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
yeah I am - bandwidth only matters for how much is needed for the given workload. If GM200 has enough to feed it at high resolutions, then the R380 having 10 times the bandwidth won't matter if it doesn't have the horsepower to use it.

HBM is great and I'm sure it will be a standard at some point, I just don't think that alone is going to mean AMD wins @ 4K.

I'm sure nVidia will improve 4K performance. As long as they lag behind though we'll continue to hear that 4K doesn't matter like we do with the 970 now. Same with DX11 when they didn't have DX11 cards and DX11.1 feature set, etc... If nVidia lags behind the standard line is "it doesn't matter anyway".

AMD has been designing for 4K since Tahiti. That's why they have the advantage at higher resolutions. Nothing wrong with being ahead of the curve and setting the pace.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
And how well did those AMD cards sell? Did it improve their standing in the market and turn a profit for AMD? Was it a marketing success? I already know the answer so no need for a mile long reply with 100 meaningless graphs from TPU. This can be the quietest and coolest operating GPU in the world with an awesome AIO but it won't gain mindshare or OEM wins and those are things AMD needs desperately. What would help AMD is coming up with a cool efficient card that works on air and can compete against GM200.

I'd wager that it outsold Titan-Z, it's direct competition. The one that nVidia ended up ashamed to even send out to review sites after the 295x2 dropped.
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
And nope, the double standard argument isn't childish, actually it's far from it. It's something that will never disappear, in no small part due to AMD's joke of a marketing division. They almost make it too easy for Nvidia to mock them and do as they please. Mark my words, if Nvidia had come up with this you wouldn't be seeing people ranting over and over about the same things, in no small part due to their Apple-like ability to market features. Complete opposites...

It really is though. I've been on these forums long enough to see the partisanship. Both sides will downplay anything the other brings to the table. However, just like in this thread, people claim "Well if Nvidia came up with the idea, you'd like it!" That is very childish, and does not add to the discussion at all. But I guess mature conversation is hard to come by these days...

For those who don't want an AIO card, they can always buy an air-cooled option, which will most likely come out. Heck, there are air-cooled 500W 290X2 beasties. If that could cool dual 290Xs, your measly 380X (which those arguing against weren't ever going to buy) will be fine on air. Just don't cheap out on a card: buy one with good cooling rather than comparing it with some Nvidia card that costs a lot more and whinging about how cheap you were, sorry, how bad the cooler is.
http://www.newegg.com/Product/Produc...82E16814131584

I really hope we won't have to rely on triple-slot air coolers like the one you've linked to...
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Did I really read that correctly!!?? Some people here are actually bitching about a better out-of-the-box cooling solution for a high end GPU? I've been hoping for something like this for years now and this is awesome, but I guess not everyone is happy about progress? :eek:

I can understand if there is fear of pricing getting out of hand, since it cannot be as cheap as air cooling, but since we are talking about AMD here it's not going to get out of hand like it would when Nvidia is eventually forced to make the same move. I can also understand if you have a chassis space issue, but that doesn't seem to be the case since russiansensation already showed that it can fit even in an ITX case. So what is the "real" problem? Your brain just doesn't compute and you are afraid of change? I think it's even sadder that there are so many people in this world who are afraid of any change and like to live in their safe little bubble.

I don't think it will be more expensive. These are off-the-rack coolers that they've made GPU mounts for. Nothing special. They sell them by the truckload for CPUs. You can get one right now for as little as $37.50 AR retail from Newegg, and I've seen them for as low as $30. Imagine what AMD will pay for them with a large OEM contract. Less than the cost of engineering the Titan cooler.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
It really is though. I've been on these forums long enough to see the partisanship. Both sides will downplay anything the other brings to the table. However, just like in this thread, people claim "Well if Nvidia came up with the idea, you'd like it!" That is very childish, and does not add to the discussion at all. But I guess mature conversation is hard to come by these days...



I really hope we won't have to rely on triple-slot air coolers like the one you've linked to...
What you failed to note is that particular card has two 290X cores with 8GB of RAM. Surely, 500W+ of TDP needs a certain cooling power, no? However, that was merely an example of what is possible. All this whinging going on in this thread is as I described, unwanted fear mongering and FUD spreading, nothing more, nothing less. AIB partners have always had some good to great solutions for top cards from either side. The 380X is under 10W more than the 290X, and it will be just fine, only better at launch, as an AIO water cooling solution is most decidedly the superior solution. Heck, I may even suggest that some may just be bitter that their new shiny cards don't have it, and possibly won't anytime soon. GM200 is not reported to be running on water, Pascal may come with an AIO... It is just the new standard in reference cooling. Whether one likes it or not is immaterial when it comes to evaluating an engineering solution. This is not Architecture 101 where you're supposed to make things pretty; it helps if they are, but function has always ruled over form in semiconductors.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
hopefully quad 380x can power 7680x1440 with "all" the eye candy delivering 60fps minimum.

if so? will take 4. :biggrin::biggrin::biggrin::biggrin:
 

garagisti

Senior member
Aug 7, 2007
592
7
81
hopefully quad 380x can power 7680x1440 with "all" the eye candy delivering 60fps minimum.

if so? will take 4. :biggrin::biggrin::biggrin::biggrin:
Which other card is expected to do so? Wait, not even GM200 will do so. Heck, we are barely getting to where a single card will do UHD at 60p. CF scales better, and the new memory should make it sweeter still, and you will need 2 or 3 of either the 380X or GM200. A single card at the resolution you ask of is not going to happen till maybe the 395X2 (just guessing the name, with 390 or whatever) or a Titan Z2 or something.

Heck, I already have a 4K screen, and I would love nothing more than playing some games at that resolution. I can play some older games easily on my old, old card, but I have upgraditis, and bad.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You should see some of the renamed cards announced first, then Fiji.
Fiji won't have the 380X naming.

I think whether it's called 380X or 390X is semantics. What matters is the performance and price of those chips relative to the 290X/980. I presume you are implying that Hawaii 290/290X will get re-badged like the 7950 V2/7970GHz became the R9 280/280X? If that's true, something in the rumours doesn't align, because if the top flagship AMD 300 chip is 50% faster than the 290X, even if AMD re-badges the 290/290X as 380/380X, how are they going to fill that major 50% gap with just 2 Fiji chips slotted above? Your post made things more confusing if anything.

I guess you are implying that the flagship AMD card 50% faster than 290X won't even show up until next year? I think the naming is less important than how much more performance will AMD bring on board this year and at what prices.

If AMD re-badges the 290/290X, I don't see those cards selling well, because even today a $250 290 and $300 290X aren't popular compared to the 970/980 cards, unless AMD drops them to $199 and $249?
 
Last edited:

xLegenday

Member
Nov 2, 2014
75
0
11
Let's say 30-40% is a more realistic number. Maybe 50% in a very specific setting/game.

The renames would be the 300 series and will offer a 10% average expected performance increase, so the gap won't be that big.

Think of it like this: AMD is a company short on cash.
If you had to decide -> use old tech (28nm) and develop an entire lineup of brand new GPUs for each segment,
OR -> focus on one key segment (high end) for the last of 28nm, test the new architecture... try to survive one extra year and meanwhile start focusing on 14/16nm FinFET for next year's new GPU lineup.
 

S.H.O.D.A.N.

Senior member
Mar 22, 2014
205
0
41
Building a new architecture for a single high end card on 28nm right before you switch nodes seems like a monumental waste of development costs. It's as if Nvidia decided to not make the 970 "because reasons" and instead went with just the 980.

Not buying that.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
I stand by the statement that if you need an AIO cooler on a stock card, you're doing it wrong. If to beat NV's performance AMD is forced to resort to a design consuming more power/generating more heat and thus needing an AIO, I absolutely think they've done it wrong. It's one thing to create a card that is an all out attempt to get a perf crown; it's another thing for a mainstream GPU to *require* an AIO solution - which is exactly what my comment was about.
You know, I don't ever recall seeing these types of arguments in the past when advancements were made in cooling solutions:

"OMG, you need copper instead of aluminum to cool that GPU! You're doing it wrong!"
"OMG, you need heatpipes to cool that GPU! You're doing it wrong!"
"OMG, you need a vapor chamber to cool that GPU! You're doing it wrong!"

Why were all these other technological advancements considered acceptable but suddenly an AIO cooler is doing it wrong?

My daughter's computer has a Corsair H50 AIO cooler and it works great. I have an H60 cooler sitting on the shelf awaiting my next build. Many people are using AIO CPU coolers now and manufacturers continue to churn out new models.

As AMD proved with the 295X2, AIO coolers are a great match for GPUs. And oddly enough, not one reviewer said, "OMG, AMD needs a water cooler for this! They're doing it all wrong!".
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
@ Creig,

1. People are resistant to big changes due to human nature. Going to heatpipes and then to vapor chambers was still air-cooling-type tech. I guess to some people going to an AIO CLC is such a dramatic change in cooling type that it's too much for them to accept it as superior. For the same reason the 300W TDP is hitting their psychological limit despite being just 10W more than a 290X.

2. As a car enthusiast I appreciate the supercharged V8 in the new Z06, the naturally aspirated 12-cylinder in the Ferrari F12, the twin-turbocharged engines in the McLaren 650S, Nissan GT-R or BMW M5, and also hybrids like the Porsche 918 or LaFerrari. I guess in the GPU world, people don't view things that way. It seems if you aren't following NV's market-leading strategies and design, you are doing it wrong. I can appreciate NV's approach of perf/watt + air cooling and AMD's HBM + hybrid WC as simply different engineering solutions to the same problem: getting more performance out of the limiting 28nm node. Making statements that OMG AMD needed hybrid WC just to stay in the game is just as absurd as saying OMG NV needed to improve perf/watt just to stay in the game. Nothing stops NV from combining an AIO CLC with Maxwell efficiency to smoke the flagship 300 series by 20-40%. In fact, I would love such a product in the marketplace.

@xLegenday,

I see what you mean, but it also means no chance to compete with GM200 in 2015 if their best chip is only 30-40% faster than a 290X, unless they price this card way out of GM200's price range, like $499-549.

Also, if you say the 300 series Hawaii rebadges will be at most 10% faster than a 290X, as in these would be rebadged 380/380X Hawaii, it sounds like picking up an after-market 290X for $225-250 would be better than waiting for the mid-range 300 series cards. I am following everything you are saying so far, but I also remember that when the 280/280X replaced the 7950/7970GHz, their price/performance was worse. Basically, for someone looking to get a mid-range card soon, there isn't much point in waiting for the 380/380X mid-range chips? The 300 series sounds a lot less exciting all of a sudden. Hmmm...

Building a new architecture for a single high end card on 28nm right before you switch nodes seems like a monumental waste of development costs. It's as if Nvidia decided to not make the 970 "because reasons" and instead went with just the 980.

Not buying that.

In fairness, it's going to be built on top of Tonga. I wouldn't exactly call it "a new architecture," but more of a revision 2 of Tonga, aka GCN 2.0 or w/e. We've seen GPU makers use the same underlying architecture with slight tweaks to geometry and shaders in the 5800 line --> 6800/6900 line.

I guess what strikes me as insane is AMD trying to sell 290 series cards as mid-range rebadged 300 cards, because if NV drops the 970's price to $249, those cards won't even move! I can't imagine AMD selling the 290/290X for $199/249 for another 1.5 years. Surely that can't be good for their margins. Today I understand why the 290/290X are so cheap -- inventory clearance leading up to the 300 series launch.

I mean I would be OK with 290/290X level of performance for the mid-range 300 cards, but then I would have liked a 50-75W reduction in power. I suppose if AMD prices the 380X = 290X + 10% at $299 and it's close to a 980, I guess that could work. The problem is that this strategy fails to take into account that NV could also refresh the 980 and drop its price! Surely AMD needs to anticipate GM204 getting faster and cheaper soon.
 
Last edited:

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I've got plenty of room in my case for even 2 of those CLCs with no difficulty whatsoever. This is like being asked: fast, quiet and cool, pick three. Sounds great to me. A reference open air cooler, or at least one available at launch and sent to reviewers, would be neat to have as well for people who want to pick two and not worry about layout so much.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
So much speculation without a hint of logic. AMD cannot keep rebranding the older GCN cards forever. Right now they are playing a losing game with throwaway prices and no margins. AMD needs a GPU stack which is competitive on perf/watt and perf/sq mm. That's how they can make money. Lisa Su said they are not cutting back on R&D or engineering. So I am sure they will bring a full 28nm GPU stack to market. I am sure they have at least 3 GPUs built from the ground up. Their names might change. But this is what I am guessing:

Bermuda - 4096 sp (HBM based product). Die size - 500 - 550 sq mm
Fiji - 2560 - 3072 sp (HBM based product). Die size - 350 - 400 sq mm
Treasure Island - 1536 sp (GDDR5 product). Die size - 250 sq mm

14nm FinFET will not arrive for GPUs before Q2 2016 (flagship GPU), and it will take till the end of 2016 to reach the very high volumes needed to completely replace 28nm GPUs. Apple, Qualcomm and others are hogging capacity at leading nodes, so GPU vendors like Nvidia/AMD are unlikely to get very high volume wafer allocations before H2 2016. AMD's R9 3xx should be the main GPU volume driver for 2015 and 2016.
 
Last edited:

S.H.O.D.A.N.

Senior member
Mar 22, 2014
205
0
41
In fairness, it's going to be built on top of Tonga. I wouldn't exactly call it "a new architecture," but more of a revision 2 of Tonga, aka GCN 2.0 or w/e. We've seen GPU makers use the same underlying architecture with slight tweaks to geometry and shaders in the 5800 line --> 6800/6900 line.

Which makes the idea that a potential 370 would be a re-badged 280 even more outlandish.
 

96Firebird

Diamond Member
Nov 8, 2010
5,741
340
126
Which makes the idea that a potential 370 would be a re-badged 280 even more outlandish.

I don't think they could do a straight re-brand without at least adding support for full gaming FreeSync capabilities. That is a major feature they are pushing this year, and if all of their gaming cards don't support it, that is a failure.

Also, it appears to me that the island used to cover up the name on the latest leak is Treasure Island (located in the Fiji Islands).

 
Last edited:

destrekor

Lifer
Nov 18, 2005
28,799
359
126
Which other card is expected to do so? Wait, not even GM200 will do so. Heck, we are barely getting to where a single card will do UHD at 60p. CF scales better, and the new memory should make it sweeter still, and you will need 2 or 3 of either the 380X or GM200. A single card at the resolution you ask of is not going to happen till maybe the 395X2 (just guessing the name, with 390 or whatever) or a Titan Z2 or something.

Heck, I already have a 4K screen, and I would love nothing more than playing some games at that resolution. I can play some older games easily on my old, old card, but I have upgraditis, and bad.

He did say "quad" as in quad-fire. I think everyone understands single cards aren't handling 5760x1080, 3840x2160, or 7680x1440 at max quality settings, not any time soon. By the time current-gen titles can be played that way on a new single GPU, the latest engines and optimized titles will have moved on... I think we may actually be a long while from ever having single-GPU cards handling those resolutions at max for then-current titles.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Which makes the idea that a potential 370 would be a re-badged 280 even more outlandish.

well said. :thumbsup: exactly how is AMD going to incorporate the latest features like FreeSync support, the new scaler which supports VSR at 4K res, the new memory bandwidth compression tech, ROP and tessellation performance improvements, and other architectural enhancements (which they are designing now for the high end R9 3xx flagship) into rebadged GPUs like Tahiti which are 3+ years old. D:

There is no logic in the statement that AMD will rebadge Hawaii. If that's the case, why did AMD waste its time on Tonga aka the R9 285? Tonga was a testbed chip which incorporated improvements to tessellation, ROPs and memory bandwidth efficiency. AMD has now laid the groundwork to improve the core shader architecture and allow performance scaling without bottlenecks. AMD has said they will make continuous improvements to GCN as they see a long life for it. That's not a surprise given that the current gen consoles will be around for at least 5 years (or more) and they sport GCN. Improving GCN while maintaining architectural compatibility with the consoles helps AMD exploit the most out of the latest games, as they are developed console-first and then ported to PC.

GCN 2.0 looks to be a more efficient version. There are talks of a tiled GCN architecture to improve efficiency and perf/sp.

https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.bitsandchips.it%2Fhardware%2F9-hardware%2F5159-gpu-fiji-un-grande-chip-monolitico-anche-per-amd&edit-text=
 

xLegenday

Member
Nov 2, 2014
75
0
11
@xLegenday,

I see what you mean, but it also means no chance to compete with GM200 in 2015 if their best chip is only 30-40% faster than a 290X, unless they price this card way out of GM200's price range, like $499-549.


Currently... Let's see when it's finalized.
They might up clocks, improve drivers... fine-tune clocks to better match Nvidia's offerings.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Currently... Let's see when it's finalized.
They might up clocks, improve drivers... fine-tune clocks to better match Nvidia's offerings.

Do you work for AMD or an AIB? How do you know that Fiji is 30-40% faster, and not 45-50%, compared to the 290X?

I don't get how a 550mm2 chip with 4096 SPs, 256 TMUs, and HBM is not 45% faster at high res, unless we are talking about CPU-limited benches?