Phynaz
Lifer
- Mar 13, 2006
And you base that claim on what info?
AMD saying they won't be profitable in the foreseeable future.
They don't have the volume to turn a profit on those chips.
And you base that claim on what info?
Just like FX "owns" perf/dollar to this day?
Fury X is inferior to the GeForce GTX 980 Ti and priced the same (actually more, since the 980 Ti is cheaper now). Try harder.
It's official... Nintendo signed with AMD. We have a dark generation incoming.
RIP Nintendo.
And MS needs to be wise and leave the dying console market. PC gaming with keyboard and mouse is the future.
And the Gamecube ended up being a flop. Also, it was ATI. ATI won with Nintendo Wii
Uh, Nintendo signed on with AMD chips (ATI) back in 2001. Everything since the Gamecube has had AMD graphics.
FTFY, see how that works?
The heart of the market is with mainstream cards like the Radeon R7 360, R7 370 or R9 380 -- which are excellent at their respective price points.
Didn't we hear the same story about the 200 series? What happened? Oh right, a market-share crash from 35% to 22.5% that still only goes one way fast, and that's downwards.
Honestly, given how aggressively priced the Radeon 200-series cards were, I was a bit surprised that AMD continued to bleed share like this despite the strong perf/$ ratios of those cards. NVIDIA is cleaning up in gaming notebooks, though, so if the mix is shifting from desktops to gaming-oriented laptops, that could help explain it. NVIDIA is also a more recognizable and trusted brand than AMD in gaming, which probably contributed as well.
And Gamecube ended to be a flop. Also it was ATI. ATI won with Nintendo Wii
Didn't we hear the same story about the 200 series? What happened? Oh right, a market-share crash from 35% to 22.5% that still only goes one way fast, and that's downwards.
I never personally felt that way about the 200 series. I always felt that most of the lineup was just too power-hungry and hot to be practical.
Apparently it goes for financials the way it goes for technicals once the usual experts step into a thread: to summarize, it's all authoritative statements to paper over the void in the argumentation. Most striking is that not a single relevant number ever appears, if there's a number at all.
Costs are discussed without anybody having the slightest clue about the actual component costs, and, is it any surprise, there are no numbers, not even one with a modest 200% error margin...
A lot of words.....to say nothing.
I never personally felt that way about the 200 series. I always felt that most of the lineup was just too power-hungry and hot to be practical.
And it's only gotten worse since.
You forgot power consumption, on purpose.
You forgot power consumption, on purpose.
You know that you are posting nonsense, but never mind, the show must go on...
It's obvious that in your case, by going Nvidia rather than AMD, you ended up paying the former the few bucks of energy savings, and as a consequence had to live in the cold to get that loss back...
