dacostafilipe
Senior member
- Oct 10, 2013
Show me any GPUs in that pic. There isn't even one, never mind two.
Going by the packaging, it looks like an AMD GPU. I'm not saying that it is, just that it looks like one.
As long as we're all speculating, what if AMD has discovered a way to bridge multiple GPUs in a way that is transparent to the game? What if this actually is a dual GPU, but it operates as seamlessly as one? That would certainly be a game changer.
If you look at the image, there are no GPUs shown at all. The "Dual GPU?" text has been Photoshopped onto the image. The actual slide just shows the controllers and the DRAM. It's FUD.
The higher-ups at AMD seem to have been making a lot of miscalculations. I don't know how many more the company can afford. I'm starting to get the feeling that AMD has pretty much nothing of substance to offer until 2016 - assuming they survive long enough to reach the promised land of 16nm FinFET.
That excuse doesn't work for most of AMD's lineup. Of all the GCN chips, only Tahiti and Hawaii had substantial DP performance (1/4 SP for Tahiti, 1/2 SP for Hawaii, though the latter was limited to 1/8 on Radeons). Cape Verde, Pitcairn, Bonaire, and Tonga all have very limited DP performance (1/16 SP). The AMD cards do all work better with OpenCL, but that's a driver optimization issue, not a hardware limitation. Maxwell actually increased compute performance relative to Kepler on some integer workloads (Scrypt mining).
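For a rough sense of what those ratios mean in absolute terms, here's a back-of-the-envelope FP64 throughput calculation. The shader counts and clocks below are approximate launch specs used purely for illustration (the 1/2-rate Hawaii line uses FirePro W9100-class numbers):

```python
# Peak FP64 GFLOPS = shaders * 2 (FMA) * clock (GHz) * DP ratio
def dp_gflops(shaders, clock_ghz, dp_ratio):
    return shaders * 2 * clock_ghz * dp_ratio

# Approximate launch specs, for illustration only
print(dp_gflops(2048, 0.925, 1/4))   # Tahiti / HD 7970            -> ~947 GFLOPS
print(dp_gflops(2816, 1.000, 1/8))   # Hawaii / R9 290X (Radeon)   -> ~704 GFLOPS
print(dp_gflops(2816, 0.930, 1/2))   # Hawaii / FirePro-class 1/2  -> ~2619 GFLOPS
print(dp_gflops(1792, 0.918, 1/16))  # Tonga / R9 285 at 1/16      -> ~206 GFLOPS
```

The gap between 1/2 or 1/4 rate and the 1/16 rate of the smaller chips is why the "compute cards sell the DP story" excuse only ever applied to Tahiti and Hawaii.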
I think something went very wrong with Tonga - it was supposed to compete with Maxwell on perf/watt, but it couldn't. Maybe it was initially designed for GloFo's 28nm SHP process, but that didn't work out for some reason or other and it got ported to TSMC instead. The fact that no fully enabled Tonga has been released to the consumer AIB market is really odd - surely yields on a mature process like 28nm can't be so poor that the high-end Retina iMac is sucking up all of the non-defective chips. Even if AMD wanted to hold off on it on consumer cards because of a large back stock of Tahiti (sunk cost fallacy), that doesn't explain why they used a castrated Tonga for FirePro W7100 instead of the full chip.
sometimes I wonder if RussianSensation is a paid bot-contributor just to generate comments to keep ATF alive
I am really leaning this way. But that means there really isn't a full brand new line up coming.
:thumbsup:
I think the state of CPUs for gaming today (i.e., a Core i7 920 @ 4.0-4.4GHz or a Core i5 2500K / i7 2600K @ 4.5-4.8GHz) means that the excitement in the CPU section of the forum is all but dead. That used to be one of my favourite sub-forums. Naturally, since CPU upgrades were exciting, so was the research for a motherboard and after-market CPU cooling - those 2 upgrades went hand-in-hand. Today a "bare-bones" X99 board like the ASRock X99 Extreme4 is packed with so many features and high-end components that you only need to research the specific extra features you want on top of a foundation that's already so good. In the past, older boards had horrible BIOSes, budget parts, and poor overclocking/instability. Those days are largely behind us. The Skylake delays and the X99 BW-E delay to Q1 2016 - which push Skylake-E even farther out - aren't making the situation better.
For CPU cooling, Corsair/NZXT AIO CLC or CM212+/Thermalright/Noctua fill in that space. Sure there are occasional super deals like Zalman CNPS14x for $10 but since CPU's power usage hasn't really increased much from Core i7 920 OC days, any solid high-end cooler such as Thermalright True Spirit 140/ Silver Arrow or Noctua NH-D14 can keep on trucking for many generations.
With SSDs, Crucial and the Samsung 850 Evo more or less dominate the price/performance categories while SanDisk and the Samsung 850 Pro dominate the high end. In the PCIe SSD space, we have Intel and Samsung, more or less. In the PSU space, basically anything from SeaSonic, Corsair, Antec, LEPA/Enermax, Rosewill or EVGA is rock solid for the most part.
Essentially the high quality products and bang-for-the-buck products are so obvious today that it basically requires very minimal research for PC enthusiasts who have been building for 10-20 years+.
That leaves custom water-cooling loops, GPU upgrades and monitor upgrades as the last 3 exciting areas left imho. Custom water-loops are very niche while 4K monitors are hardly affordable for the masses and we have a serious problem with FreeSync/GSync similar to HD-DVD vs. BluRay which is holding back many people (in addition to lack of high quality IPS panels and larger sized panels in this space as of now).
Getting back to GPUs, delays of huge games like GTA V, The Witcher 3, Batman AK, Project CARS and The Division, plus horribly optimized launches with sub-par graphics like Watch Dogs and AC Unity, have all weighed down the excitement of GPU upgrades. BF Hardline seems like a meh game overall, while the GTX 970/980 hardly raised the performance bar over the now-old R9 290/290X series. That leaves us with the $1K Titan X as the most exciting GPU out today. It's no wonder this is a pretty uneventful time in PC gaming for experienced builders. It probably would be a lot different if we had truly next-generation PC games like Star Citizen and TW3 out today, and if 4K monitors were offered in many varieties, sizes and price levels. That would have forced a lot more GPU upgrades.
Also, the 28nm node is heavily weighing down on this generation. It makes it too difficult to ignore that 16nm FinFET+ and 14nm are not that far away and are going to be a gargantuan leap.
"The foundry [TSMC] said its 16FF+ process will deliver a 10% performance uplift than competing nodes, while at the same time consuming 50% less power than its current 20nm node." ~ Source
^ That's against a 20nm node, imagine against a 28nm one?!
With HBM/HBM2 and 14nm/16nm, and reference AIO CLC, it's not out of the question that AMD/NV could in theory get GPUs 75-100% faster than Titan X out of the next generation if they could manage to build 550-600mm2 successors. A lot of gamers are now thinking that the longer we wait for R9 390X and the competitor's equivalent, the closer we are to 14nm/16nm generation. That makes this generation one of the least exciting to me in a long time. (I am assuming 14nm/16nm GPUs actually deliver but if all we get from September 2016 to September 2017 are mid-range next gen parts...ahem....then that would be seriously disappointing).
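To put very rough numbers behind that expectation, here's a quick sketch compounding TSMC's public node-to-node claims. The 20nm-vs-28nm figures (~1.9x density, ~25% power cut) are my own ballpark assumptions, not anything from the quoted source:

```python
# Rough compounding of node claims, 28nm -> 20nm -> 16FF+.
power_20_vs_28   = 0.75   # assumption: 20nm at ~75% of 28nm power, iso-perf
power_16_vs_20   = 0.50   # from the quoted 16FF+ claim
density_20_vs_28 = 1.9    # assumption: ~1.9x logic density over 28nm
density_16_vs_20 = 1.0    # 16FF+ reuses the 20nm BEOL, so little density gain

power_16_vs_28   = power_20_vs_28 * power_16_vs_20        # ~0.375
density_16_vs_28 = density_20_vs_28 * density_16_vs_20    # ~1.9x

print(f"~{(1 - power_16_vs_28) * 100:.0f}% less power vs 28nm at iso-performance")
print(f"~{density_16_vs_28:.1f}x the transistors in the same die area")
# A 550-600 mm^2 16FF+ die could therefore hold very roughly 2x the ~8B
# transistors of a Titan X, which is where the 75-100% guesses come from.
```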
Also, I think a lot of people are anxious to see what Windows 10 and DX12 games do to change the PC gaming landscape but again we likely won't start seeing the fruits of that until 2016 and even 2017.
Just my 2 cents.
I think what you meant is "a full new architecture" rather than a new line-up. AMD can easily improve R9 290/290X performance by 5-10% and drop power usage. At the same time, there are bound to be 2-3 SKUs based on R9 390X, as AMD will end up with dies that don't yield all 4096 SPs. Anyway, everyone knew a long time ago that the R9 390 series was not a new architecture (i.e., not post-GCN) but a continuous improvement of GCN. Most likely it could be called GCN 1.3, use the foundation of Tonga and improve even beyond that in perf/watt thanks to a more mature 28nm node. However, it would be totally wrong to equate the dramatic architectural moves of Fermi -> Kepler -> Maxwell with GCN 1.0 -> 1.1 -> 1.2 -> 1.3(?). Part of that is because GCN was always built from the ground up to be modular and a strong compute architecture; it was never meant to be a 2-3 year architecture only. When Eric Demers unveiled GCN, it became obvious from his presentation that AMD would use it for at least 5 years. The HD 7970 was unveiled on Dec 22, 2011, and GCN is going to be the foundation for the R9 300 series, which means by the end of this year the architecture will turn 4 years old.
"The foundry [TSMC] said its 16FF+ process will deliver a 10% performance uplift than competing nodes, while at the same time consuming 50% less power than its current 20nm node." ~ Source
^ That's against a 20nm node, imagine against a 28nm one?!
Unless you are referring to low power ARM chips, we might as well ignore these claims. All the great stuff 20nm offered over 28nm turned out to be completely meaningless for GPUs.
They are specifically talking about the low power ARM chips:
"He also mentioned that new Cortex-A72 designs on 16FF+ will offer a 3.5x performance increase over Cortex-A15 parts (presumably on 28nm silicon), while at the same time consuming 75% less power than the A15."
But as for me, I am not liking what I see.
"but since CPU's power usage hasn't really increased much from Core i7 920 OC days, any solid high-end cooler such as..."
Clearly you haven't overclocked an FX-8xxx to 4.5GHz and beyond!!!
"and monitor upgrades"
Yep. 32" 1440p @ 85Hz will be my next upgrade.
Every single full node jump in GPUs gave us a 50-100% increase in GPU performance when coupled with a brand new architecture. This time we have HBM/HBM 2.0 as well! Why would you think that going from 28nm -> 14nm/16nm + HBM would not give us a 50-100% increase in GPU performance at similar power levels? I think you are being too conservative. The last time NV had a full node jump + new architecture, the 780 Ti was 2X faster than a 580. AMD and NV will produce the largest single generational jump in memory bandwidth in their history, going from 320-336GB/sec to 650-700GB/sec+. Already this generation AMD will be moving to 1-1.25Ghz HBM1, and for 14nm GPUs, HBM 2.0 will get even faster!
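For anyone who wants to sanity-check those bandwidth numbers: HBM bandwidth is just stacks x bus width x pin speed. A quick sketch, where the stack counts and pin rates are the commonly quoted HBM1/HBM2 figures rather than confirmed specs of any unreleased card:

```python
# HBM bandwidth in GB/s = stacks * (bus width per stack, bits) * (pin rate, Gbps) / 8
def hbm_gbps(stacks, pin_rate_gbps, bus_width_per_stack=1024):
    return stacks * bus_width_per_stack * pin_rate_gbps / 8

print(hbm_gbps(4, 1.0))   # 4 stacks of HBM1 @ 1 Gbps/pin    -> 512 GB/s
print(hbm_gbps(4, 1.25))  # same config @ 1.25 Gbps/pin      -> 640 GB/s
print(hbm_gbps(4, 2.0))   # 4 stacks of HBM2 @ 2 Gbps/pin    -> 1024 GB/s

# versus GDDR5 today: 512-bit @ 5 Gbps (R9 290X)       = 320 GB/s
#                     384-bit @ 7 Gbps (780 Ti/Titan X) = 336 GB/s
```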
I know how node shrinks went before but here we are completely skipping 20nm. Has that ever happened before?
I don't know how things will go, but looking at Intel's 22nm and then 14nm, I am not confident about high-performance GPUs on 16nm. I really think it might not be what everyone expects.
I really hope I am wrong. Trust me, it is just this sick gut feeling, not anything I want to happen. It is just this terrible 'what if' I used to think about, and now it makes me sick because it has started to stick in my head.
What kind of a monitor is that? What about LG's 34" 3440x1440 models?
I was looking at them, and the actual size is 32.7". I don't know how they round that to 34" rather than 33". I am way less likely to purchase a $900 monitor that is really 32.7".
hehe. A $40 Thermalright True Spirit 140 is more than up to the task. I would have never bought an FX-8000/9000 series CPU for gaming to start with though since they lose in 95%+ of games to an overclocked i5 2500K, nevermind IVB & Haswell. Prices of the top air coolers have come down over the years since AIO CLC became more popular. As a result, you can now buy a top 3 air cooler that even beats most AIO CLCs for just $60. Chances are such a CPU cooler would easily last 5 years and survive 1-2 CPU upgrades if you wanted to.
That's what I meant: the excitement of researching the best CPU cooling, outside of custom water loops, just isn't there anymore. We are no longer seeing dramatic leaps in cooling performance. Once we reached the level of the Thermalright Silver Arrow/Noctua NH-D14 in 2011, it's been a 2-4C improvement at best for the best air coolers like the NH-D15 and the Phanteks PH-TC14PE.
I was looking at them and the actual size is 32.7" I don't know how they round it to 34" and not 33" I am way less likely to purchase a $900 monitor that is 32.7"
If that's true that's terrible, especially considering that measuring ultra wide aspects diagonally is already borderline misleading.
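A quick bit of geometry shows why diagonal-only comparisons mislead for ultrawides: at the same or even a smaller diagonal, a 16:9 panel ends up taller and with more screen area than a 21:9 one. Nothing below is specific to any LG model, it's just the aspect-ratio math:

```python
import math

# Width, height and area of a panel from its diagonal and aspect ratio.
def dimensions(diagonal_in, aspect_w, aspect_h):
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    width, height = aspect_w * scale, aspect_h * scale
    return width, height, width * height

# 3440x1440 is 43:18 (~21.5:9); 2560x1440 is 16:9.
for name, diag, aw, ah in [('34" ultrawide (3440x1440)', 34, 43, 18),
                           ('32" 16:9 (2560x1440)', 32, 16, 9)]:
    w, h, area = dimensions(diag, aw, ah)
    print(f"{name}: {w:.1f} x {h:.1f} in, ~{area:.0f} sq in")
# -> the 34" ultrawide is ~13.1" tall with ~412 sq in,
#    while the 32" 16:9 is ~15.7" tall with ~437 sq in.
```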