[Various] Radeon Fury X and Radeon Fury coming

Status
Not open for further replies.

thehotsung8701A

Senior member
May 18, 2015
584
1
0
You do realize that this release date has been known for a while, right? It wasn't just picked right now; we've known it for a decent while... since BEFORE the 980 Ti launched.

Did you just magically expect this date to change and become today?

Some people said it would be revealed today at Computex. And yes, to those who went "really?":

YES, I did OVERREACT. I'm just so tired of waiting.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136

RaulF

Senior member
Jan 18, 2008
844
1
81

[Image: R9-390X-Leaked-635x396.jpg]


This might be true then. :wub:
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Did you make that slide? This sounds good; maybe we'll see some good overclocking headroom.
Together with the PCIe slot, the two 8-pin connectors theoretically allow a power consumption of up to 375 watts, but AMD's graphics chief Joe Macri told Golem.de that the Fiji graphics card should not require more energy than the Radeon R9 290X. The upcoming models should thus remain under 300 watts, possibly even significantly below that.
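For anyone doing the math at home, here's the connector arithmetic the article is leaning on, as a quick Python sketch (the 75 W slot and 150 W per 8-pin figures are the usual PCIe spec limits; the ~300 W ceiling is just Macri's claim as quoted, not a measurement):

```python
# A quick sketch of the connector math from the quoted article. These are the
# usual PCIe spec limits (75 W from the slot, 150 W per 8-pin), not measurements.
PCIE_SLOT_W = 75       # power available through the PCIe x16 slot
EIGHT_PIN_W = 150      # power per 8-pin PEG connector

theoretical_budget_w = PCIE_SLOT_W + 2 * EIGHT_PIN_W
print(f"Theoretical board power budget: {theoretical_budget_w} W")   # 375 W

# Macri's claim as quoted: actual draw stays at or below the R9 290X, i.e.
# under ~300 W, leaving roughly 75 W of the connector budget as headroom.
claimed_ceiling_w = 300
print(f"Headroom vs. connector budget: {theoretical_budget_w - claimed_ceiling_w} W")   # 75 W
```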
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Did you make that slide? This sounds good; maybe we'll see some good overclocking headroom.

Interesting wording: "more energy."

Depending on perspective, that doesn't mean Fiji is necessarily any more efficient than Hawaii. If you can draw 120 frames in 1 s and consume 480 J, that's not using any more energy than drawing 120 frames in 2 s and consuming 480 J. In the first case, though, you're at 480 W of power and 120 fps, while in the second you're at 240 W but 60 fps.
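To put those hypothetical numbers in one place, here's a quick Python sketch of the energy-vs-power arithmetic (the 480 J / 120-frame figures are the illustrative example above, not real measurements):

```python
# The hypothetical above in code: both scenarios use the same total energy
# (480 J for 120 frames, i.e. 4 J per frame), but average power and frame
# rate differ by 2x.
def power_and_fps(frames, energy_joules, seconds):
    """Average power (W) and frame rate (fps) for a fixed rendering workload."""
    return energy_joules / seconds, frames / seconds

fast_watts, fast_fps = power_and_fps(frames=120, energy_joules=480, seconds=1)
slow_watts, slow_fps = power_and_fps(frames=120, energy_joules=480, seconds=2)

print(fast_watts, fast_fps)   # 480.0 W at 120.0 fps
print(slow_watts, slow_fps)   # 240.0 W at  60.0 fps
# Same energy, and the same joules per frame, in both cases, yet the faster
# card pulls twice the wattage. "No more energy than the 290X" by itself says
# nothing about peak power draw.
```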

Edit: I'll wait for the reply, though. It sounds like that might just be a poor translation from English to German and back.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
There's literally no positive news coming from AMD, and it's very worrisome. Everything from the Fiji rumors, to 7-month-old drivers, to the Project Cars and TW3 debacle, to the news regarding their developer relations, to their market share numbers... it's ALL bad.

If indeed it's slower than a 980 Ti for "around" $600, I see no reason to even buy one.

-Not saving much, if any money
-Slower
-Less VRAM
-Far worse driver support for many new game releases.

I'm not seeing any positives here.

This is probably the result of cutting costs and laying people off so many times over the last few years; at some point such cuts materially damage R&D, driver teams, and support. The remaining employees (who don't leave for other companies and weren't laid off) are asked to do more work and that tends to slow things down.

I really hope AMD pulls a rabbit out of a hat so that it can continue to put competitive pressure on NV. Competition is good for consumers.
 

thehotsung8701A

Senior member
May 18, 2015
584
1
0
#FirstWorldProblems

If you think about it, if we were in the Third World (which I came from, btw), we wouldn't even worry about a GPU in the first place, and vice versa. Meaning that people in the Third World, if they switched places with us, would do the same. I've been in both worlds, and if you think Third World folks are any different, I'm living proof that we're all the same.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
I really hope AMD pulls a rabbit out of a hat so that it can continue to put competitive pressure on NV. Competition is good for consumers.
We all hope AMD pulls the rabbit out of the hat so they sell more hardware, not just force Nvidia to drop prices.
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
If you think about it, if we were in the Third World (which I came from, btw), we wouldn't even worry about a GPU in the first place, and vice versa. Meaning that people in the Third World, if they switched places with us, would do the same. I've been in both worlds, and if you think Third World folks are any different, I'm living proof that we're all the same.

Third World here, and anticipating it all the same. The difference is I might keep such a card for 5 years.
 

DDH

Member
May 30, 2015
168
168
111
Finally a clear shot of the chip without the paste smeared all over it....

In the article they say this though:

Sounds like it's slower to me?


Sounds like no one has any ideas, so let's just wait for 16 June, yeah? :thumbsup:
 

Udgnim

Diamond Member
Apr 16, 2008
3,683
124
106
WOW, AMD slides really are bottom of the barrel, eh? Hunting Titans? Who made that up? The PR guy's kids? ...LOL

it's something that was created a while ago and probably isn't from AMD

the video card in the background isn't the WCE we've physically seen
 
Feb 19, 2009
10,457
10
76
"However, AMD's Graphics CEO Joe Macri told Golem.de that Fiji graphics card should not require more energy than the Radeon R9 290X."

Also, per that German translation, Fiji is faster than both the Titan X and the 980 Ti.

What did ChipHell's original leak from last year say? Fiji uses less power than the R9 290X. o_O
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The original TITAN was priced that high in part because it was a middle ground between gaming and pro graphics that needed better DP.

If Fury X has better DP than 980 Ti, that may justify a price premium and milk the "I need better DP but won't pay for a Tesla/FirePro" crowd.

Tahiti offered full 1/3 DP but that never made it worth even as much as the GK104, never mind a premium. The whole DP argument was nothing more than forum members claiming it made the card worth more. Just like the 12GB now. nVidia never promoted Titan as any kind of "prosumer" card. Simply a gaming card, which is all it was.

While I'm sure there were a few people who took advantage of Titan's DP performance, the vast majority didn't. The vast majority won't ever need 12GB VRAM either. They'll just claim that's why there's a premium.

Then there's the case that the GeForce drivers aren't optimized or certified for pro apps anyway. I use Cinema 4D for modeling, but any issues with consumer cards won't be supported by either side. I did get one issue dealt with by AMD (soft shadows in C4D caused the program to crash): I told AMD and got the standard canned response that Radeon operation with pro apps isn't guaranteed (they did respond, though, contrary to what others claim). I replied that I realized that, but that they might find it was a common OpenGL issue if they looked at it. They agreed, and when they did they found a fix (it took 2 driver updates). I was lucky.

Bottom line is that people don't mind nVidia being a terrible value. AMD's customers are different. If AMD abandons the value proposition, they'll alienate their customer base. The 970 and 980 Ti offering good value for performance is, I think, directed at AMD's customers. It's too bad for them that they screwed the pooch so badly with the 970 specs and memory operations. Like their other gaffes, though, people will justify it. :shrug:
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That is what I expect as well. AMD needs to beat by those margins just to combat the marketing hype. There is a very good chance that the nvidia cards will still outsell, even if AMD hits those numbers.

I don't see any way AMD is going to hit those numbers. nVidia is not so incompetent as to offer performance that much lower than what AMD can provide.

IF AMD did manage that, I think you're giving nVidia too much credit. The bleeding would be massive.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Honestly, while I am not agreeing or disagreeing with what you say, the benchmark is the 980Ti not the TX anymore. A week ago, yes. Today, not so much.

It doesn't mean AMD can't also have a premium part. As long as they have a part that competes with the 980 Ti on perf/$, they'll be fine.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Good point. Assuming this is true, that probably is telling as to why AMD wanted gen1 HBM. They needed the power savings to allocate to compute...

NV had power consumption a little more under control, and could wait it out until gen2.

Considering AMD's first-hand intimacy with HBM, I doubt nVidia is ready to go HBM yet. AMD's just a generation ahead with the tech. It's not unusual for nVidia to adopt new tech more slowly than AMD.
 