Learn to zen them or try to laugh. Works mostly.

Even when I log in just to activate my ignore list, the garbage still gets quoted for pages at a time. I can't win. :thumbsdown:
Suppose they have broad availability before nVidia? Does that put them ahead? It's easier to release something if you only have a 5 minute supply.
Bacon1 said: If Nvidia is so far ahead, why can't they keep even tiny amounts of underperforming, overheating "premium" 1080s in stock anywhere?
Yeah, because it goes above 150W; the RX 480 won't.
Interesting, what's your source on this?
It would be a huge (probably unrealistically optimistic given the sales in the previous generation) success for AMD to get half of this part of the market vs NV.
Even if they did manage that though, I think they'd still drop overall market share.
Nothing points to AMD regaining market share with their mainstream/low-end lineup.
Also, a card like the GTX 1070 will most likely be the single best-selling card overall, just like the GTX 970 was.
GP106, going by the rumoured specs, looks to be better than Polaris 10 in the key metrics. So all in all, the dream of regaining a big part of the lost market share and revenue seems to be just that, a dream.
If the 8-pin is designed to supply up to 150W of power, is it possible that the RX 480 will consume about 110W? It has a TDP of 150W.
It doesn't, because AMD didn't promise broad availability by the end of the month. They mentioned mid-2016, but the fact that they were teasing Polaris months before Pascal (January-March) and vague statements like 'several months advantage' made people believe that they were in fact ahead in the FinFET transition.
Are you trolling now? ShintaiDK doesn't seem to have problems with his FE unit running the latest drivers, and there are plenty of custom models that can handle boost clocks perfectly fine at lower temps/noise. Now please keep the NVIDIA discussion out.
Exactly. They increased the TDP of the 1070 to 150W from the 970's 145W, and the 1080 to 180W from the 980's 165W, so it's pretty obvious the 1060 will have at least a 120W TDP, just like the 960 did, and likely above 120W if we go by the 1070/1080. The 960 also had worse perf/W than the 980; Nvidia's lower-end chips usually have worse perf/W, but it's the opposite for AMD, whose smaller chips are more power-efficient than their big ones.
The GTX 1070 uses 161W apparently...
http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_review,8.html
...hence the PCIe + 8-pin for a 150W TDP :\
kraatus77 said: exactly, they increased the TDP of the 1070 to 150W from the 970's 145W, and the 1080 to 180W from the 980's 165W
Slaughterem said: Did Nvidia promise broad availability when they released their cards?

And again, we all know that the context of the 'several months advantage' was in reference to laptops and mainstream back-to-school products.
Maybe he's saying the real-world performance difference from the bandwidth is only 5-7%? If so, he didn't state it very clearly. That's all I can think of, as 5-7% is probably in the ballpark of the real-world performance gain from +25% memory bandwidth.
Going to have to wait for reviews, that sort of thing isn't public yet - anything you read is (wild) speculation.
Yet it still draws less power according to TPU.
No, but they still managed to launch Pascal a month before AMD's first FinFET GPUs and many people I know already own the cards.
Where is Polaris 11? Despite being the tiniest chip, it was a no-show at Computex. Yes, it should be inside Apple's designs, but there's no launch date yet.
You mean GeForce GTX 970/980 level of performance at lower prices? Looks like that's where the competition is going next as well.
But the 6-pin PCI-E connector is rated up to 75 watts, and that's not speculation.
Yes, but the RX 480 sips power from a single 6-pin PCI-E connector, not an 8-pin.
So... at max load it will consume about 100-110W.
Yeah, but 75W PCI-E base + 75W from the additional connector is not 110W. Where are you getting that from?
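The connector arithmetic being argued over above can be sketched quickly. This is a minimal illustration using the spec limits quoted in the thread (75W from the slot, 75W per 6-pin, 150W per 8-pin); the function name is made up for the example.

```python
# Spec'd PCI-E power-delivery limits, as quoted in the thread (watts)
PCIE_SLOT_W = 75    # motherboard slot
SIX_PIN_W = 75      # 6-pin auxiliary connector
EIGHT_PIN_W = 150   # 8-pin auxiliary connector

def board_budget(connectors):
    """Total spec'd power budget: slot plus each auxiliary connector."""
    return PCIE_SLOT_W + sum(connectors)

print(board_budget([SIX_PIN_W]))             # RX 480 style: slot + 6-pin -> 150
print(board_budget([EIGHT_PIN_W]))           # GTX 1070 style: slot + 8-pin -> 225
print(board_budget([SIX_PIN_W, SIX_PIN_W]))  # GTX 970 style: slot + 2x 6-pin -> 225
```

So a 6-pin card has a 150W spec budget, which is why an actual draw of 100-110W would leave comfortable headroom, and 110W is well under the budget, not derived from it.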
I'll just reply to this as I see it as the key.
Unreasoned rumour is a waste of time as far as I'm concerned. We have too many here who post the most ridiculous things to incite a nasty response.
As far as we know, Vega (either a Q4 2016 or Q1 2017 release depending on what leaks you believe) is still going to be on GloFo. Most of the heavy lifting on 14LPP was done by Samsung anyway; GloFo is basically tasked with implementation of an already engineered process.
It takes 90 to 120 days to complete a wafer on 14nm. Do you go the route of a paper launch in a high-volume mainstream segment, or do you stock up inventory to meet the expected demand? And yes, it appears they are months ahead in this segment, especially for laptops and OEM desktops, even if NV rushes out a 1060 card, which will not be able to compete with the 480.
This is a new architecture; do you think maybe the number of shaders per CU is different? Maybe 1 CU is not 4 x 16 ALUs? Maybe 36 CUs does not equal 2304 shaders. Maybe the patent they received gives them 2 x 16 ALUs plus 2 x 8, 2 x 4, 2 x 2, and 2 high-speed scalars per CU. Maybe it's 2160 shaders with 72 high-speed scalars capable of running 4 threads in 4 clock cycles, which would give them 2448 effective shaders?
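To make the speculation above concrete, here is the arithmetic behind both layouts. Everything here is the poster's guesswork, not a confirmed spec: the standard GCN layout (4 x 16-wide SIMDs per CU) is real, but the "patent" layout and its scalar throughput are hypothetical.

```python
# Standard GCN compute unit: 4 SIMDs, each 16 lanes wide
def gcn_shaders(cus, simds=4, lanes=16):
    return cus * simds * lanes

print(gcn_shaders(36))  # 36 CUs -> 2304 shaders, the usual assumption

# Hypothetical patent layout per CU: 2x16 + 2x8 + 2x4 + 2x2 ALUs
hypo_per_cu = 2*16 + 2*8 + 2*4 + 2*2  # = 60 ALUs
print(36 * hypo_per_cu)  # 2160 shaders

# Plus 2 high-speed scalars per CU (72 total), each allegedly running
# 4 threads in 4 clock cycles -> 72 * 4 = 288 extra effective lanes
print(36 * hypo_per_cu + 72 * 4)  # 2448 "effective" shaders
```

The numbers do line up with the post (2160 and 2448), but that only shows the arithmetic is self-consistent, not that the layout is real.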
You spec your power connectors based on maximum power draw, not TDP. Keep in mind an AMD representative declined to confirm the RX 480 having a 150W TDP, as the question was in NDA territory. That means that for them the term "Power" used in the slides may not have meant TDP, but rather Max Power (card design).
Moreover, for those of you who still expect the worst, ask yourselves the following: since they are both ~150W TDP, why does the 1070 come with an 8-pin connector, and why does the 970 come with 2x 6-pin connectors?
because both consume more than 150w.
can i have a cookie now?
I meant both cards consume more power than 150W. I know what they do.
Nope.
A single 8-pin gives the illusion of less power than 2x 6-pin, although they're rated equally. I guess the benefit is that it's probably cheaper.
