The all-important question: Does Raven Ridge support HDMI 2.0?
At least HP's laptop with the 2500U supports HDMI 2.0b:
http://store.hp.com/us/en/pdp/hp-envy-x360-convertible-laptop-15z-touch-1za07av-1
Come on, you know the "technical reason": bandwidth.
The technical reason is that you need DDR4-3200, which is already out of spec, just to have more bandwidth than the GT1030.
And I might add that the new cores probably need more bandwidth as well.
The other problem is TDP: Vega 8 on the 2200G is pretty much an integrated RX 550 (which a GT1030 trades blows with at half the bandwidth) with "Vega improvements", and the RX 550 is already a 50 W TDP part at 14 nm.
Aside from those limits (TDP and bandwidth), everything else looks good on the technical side, and that includes both the 2200G and 2400G.
Actually, it's the non-technical reasons that have me worried, and those come courtesy of AMD's slides; I'm not going to repeat them here. I see no reason for AMD to want to hide the APU's performance versus a GT1030/RX 550 if the APU is competitive.
Memory bandwidth on the 2400G with DDR4-3200 will not be that much of a problem.
For comparison, the GT1030 has 48.06 GB/s of memory bandwidth, and the 2400G with DDR4-3200 will have 51.2 GB/s.
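Those two figures fall straight out of peak-bandwidth arithmetic. A minimal sketch in Python, with the GT1030's 6008 MT/s effective GDDR5 rate taken as an assumption from its spec sheet:

```python
# Quick sanity check of the bandwidth figures above.

def bandwidth_gbs(transfers_per_sec, bus_width_bits, channels=1):
    """Peak bandwidth in GB/s: transfer rate x bus width (bytes) x channels."""
    return transfers_per_sec * (bus_width_bits / 8) * channels / 1e9

# 2400G: dual-channel DDR4-3200, 64-bit per channel
apu = bandwidth_gbs(3200e6, 64, channels=2)   # -> 51.2 GB/s

# GT1030: 64-bit GDDR5 at 6008 MT/s effective (assumed from the spec sheet)
gt1030 = bandwidth_gbs(6008e6, 64)            # -> ~48.06 GB/s

print(f"2400G + DDR4-3200: {apu:.2f} GB/s")
print(f"GT1030 (GDDR5):    {gt1030:.2f} GB/s")
```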
NVIDIA is a lot more efficient at using bandwidth than AMD. The AMD RX 550 is the competitor at this level, and it has 112 GB/s of bandwidth.
Especially if the team in question has a long history of doing just that. All else being equal, precedence is a legitimate factor.
It must not be comforting to have to move the goal posts all over the field to make your point possible.
And you inadvertently made my point in your last statement. If the vast majority of buyers don't care much about how strong an iGPU is, then it stands to reason that they care more about compute performance, and that is what was holding back sales of Bristol, Carrizo, Kaveri and Trinity. Ryzen APUs eliminate that deficit and give buyers the ability to have discrete-class graphics paired with strong CPU cores, without needing discrete graphics and the complexity of multiple drivers/software packages from different vendors. Buyers were attaching discrete GPUs to Intel APUs because they had no choice if they wanted anything but a failed gaming experience.
Just a reminder: the RX 550 is not Vega.
If Vega was so good, we would have seen midrange Vega on desktop. When I brought up just how weak Vega's performance was, to the point that Polaris would be similar anyway, I was yelled at as a troll; yet when you look at Vega 56, it's not far enough ahead of Polaris 10 to leave a large gap for midrange Vega. Vega just isn't powerful, period, and it's going up against Volta this year... Vega never really moved the needle on anything over Polaris.
This focus on the market for the gamer who can't afford a dGPU is a waste of time; this isn't where AMD will move the needle. That buyer was already Team AMD.
The main market for an RR breakthrough is non-gamers, and it is the CPU that will make the difference.
I know that I have recommended people away from AMD systems when they asked me for advice for years and I bet I wasn't the only one doing this. One of my friends just bought his second Intel system last year on my advice (previous was 8 years before that).
But I don't have to do that anymore. AMD is finally competitive again. There will now be AMD options to recommend.
Oh right, nobody will do that, will they? Because you looked into your crystal ball and said so?
You talk about moving goal posts, then say buyers will not care about the iGPU perf, but the compute performance?
Seriously? We expect the average buyer to do that?
Also, AMD APUs' iGPUs do need drivers and have driver complexity, what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...
The dream of people gaming on AMD's extremely weak igpus, in hopes they'll upgrade to a discrete AMD card later is just that, a sad sad dream that won't ever happen.
Volta will seal the deal as the comparison to a GTX 1030 will not even make sense when a GTX 2030 blows it away. It's never been a competitive strategy to go for the lowest of lowest end GPU performance and expect it to be relevant.
AMD APU performance won't even be relevant once Volta hits.
Again, what stops someone from buying a 2200G and putting a 2030 on it? Nothing. How is Raven Ridge a flawed product because of this revelation?
It's not flawed logic at all. Memory prices are hideously expensive top to bottom, and 3200 MHz DDR4 is not that much more expensive than 2667 MHz (2x4 GB). A 2400G with 3200 MHz RAM will likely give ballpark GT1030 performance, giving the AMD system the option of budget gaming from the get-go with little hassle, plus the ability to add a dGPU in the future or upgrade to a 7 nm CPU without building a new system.
It's not, but the point tential is making is that solely relying on a more-powerful-than-average iGPU is flawed logic. I think it will have its own niche uses, but beyond that I am very doubtful.
There's one more. The GT1030 system has 65W for the CPU, and 30W for the GPU.
Considering the classic NVIDIA cadence that they have followed like clockwork? Volta is somewhere in the next couple of months, probably in Q2.
When is the GT2030 launching? Because the Ryzen 2200G/2400G are launching in two weeks.
There is no need for AMD to design and produce low/midrange Vega, since Polaris is still competitive against NVIDIA's low/midrange offerings.
Linux/hackintosh, browser acceleration, x264/x265 both playback and encoding.
Really, I have no idea what you are talking about here. There is no compatibility problem with any AMD APU that I'm aware of, from Llano to BR.
If RR does get close to 1030 performance maybe it'd inspire nVidia to do something, but I think it would be more likely something based upon GP107 than Ampere.
The video engine is now extremely capable, supporting hardware-accelerated decoding of CODECs such as VP9 10-bpc and HEVC 10-bpc at frame-rates of up to 240 for 1080p, and 60 for 4K UHD. It can also encode H.265 8-bpc at frame-rates of up to 120 at 1080p, and 30 at 4K UHD. You finally get to use the display connectors on your socket AM4 motherboards, as the iGPU supports DisplayPort 1.4 and HDMI 2.0b, with resolutions of up to 3840 x 2160 @ 60 Hz with HDR, 1440p @ 144 Hz, and 1080p @ 240 Hz.
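For a rough sense of what those decode targets mean, here is a back-of-the-envelope sketch (assuming 4:2:0 chroma subsampling, i.e. 1.5 samples per pixel, at 10 bits per sample) of the raw pixel throughput the engine has to sustain. Notably, the 1080p240 and 4K60 targets work out to the same raw rate:

```python
# Back-of-the-envelope: uncompressed pixel throughput implied by the
# decode targets above, assuming 4:2:0 chroma subsampling (1.5 samples
# per pixel) at 10 bits per sample.

def raw_rate_gbits(width, height, fps, bits_per_sample=10, samples_per_pixel=1.5):
    """Uncompressed video data rate in Gbit/s."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

print(f"1080p @ 240 Hz: {raw_rate_gbits(1920, 1080, 240):.2f} Gbit/s")
print(f"4K UHD @ 60 Hz: {raw_rate_gbits(3840, 2160, 60):.2f} Gbit/s")
# Both come out to ~7.46 Gbit/s: 4x the pixels at 1/4 the frame rate.
```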
There won't be a GT 2030 this year.
GT 630 - 2012
GT 730 - 2014
GT 1030 - 2017
So based on this, the earliest launch would be next year.
GT 2030 is not going to launch this year. See my post above on this page. It won't be in a few months.
Also, you had mentioned Meltdown blowback. How deep are we digging here to find things you don't like about the other side? I mean, do you seriously expect people who are spending at the lowest end of the spectrum to care about a cloud-computing bug that doesn't affect them in any way?
I don't see how these extremely high expectations of AMD help anyone, other than to leave many, many users disgruntled when they don't come to fruition...
As I have explained before, memory bandwidth with DDR4-3200 will not be that much of a problem for the 2200G/2400G.
The performance increase from Bristol Ridge (A12-9800) to Raven Ridge (2200G/2400G) will be in the same area as from the GT730 (64-bit GDDR5) to the GT1030 (64-bit GDDR5).
It's not just that.