
News Intel GPUs - we've given up on B770, where's Celestial already

I am not a loyal customer (no Intel CPUs in my household), but I may purchase the top model just to own a small piece of history, and also to play with.
I may get one too. I have so many older games I want to test that I haven't seen on any list yet. But only this entry-level one; I don't care about the higher performance tiers at the moment.

Would you guys share how video encoding with AV1 performs? I'd be very interested in this.
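For anyone who picks one up and wants to try this, a rough sketch of what an AV1 hardware-encode test might look like through ffmpeg's Quick Sync (QSV) path — assuming an ffmpeg build with QSV/oneVPL support and Arc drivers installed; filenames and the quality value here are just placeholders:

```shell
# Hardware-accelerated AV1 encode on an Arc card via ffmpeg's av1_qsv encoder.
# Requires an ffmpeg build with QSV (oneVPL) enabled and working Arc drivers.
ffmpeg -hwaccel qsv -i input.mp4 \
       -c:v av1_qsv -preset medium -global_quality 28 \
       -c:a copy output_av1.mkv
```

`-global_quality` is the QSV encoders' quality knob (lower = better quality, larger file); comparing encode FPS and file size against `hevc_qsv` on the same clip would make for an interesting data point.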
 
Did Intel disclose the thickness and length of the first party A770? Curious if it will fit in a case such as the Dan A4-SFX.
 
75W board power, yet it sports an 8-pin power connector on the ASRock Challenger board? Not sure I'm happy about that.

Sounds like Intel is measuring their GPUs the same way they measure their CPUs, where a 125W 12700K will routinely use well over 200W.

I have seen plenty of small GPUs use an additional 6-pin, for a total of 150W. But using an 8-pin makes me question just how much it will actually draw.
 


Did you check some A380 reviews? You should. The Gunnir card has a TBP of slightly over 90W. The Challenger has a 10W lower GPU power limit, so its TBP should be lower.
 
A380 power can be tweaked, with an absolute max of 800W for PL4, and PL1 & PL2 at 298W & 395W respectively. Realistically it's sub-100W with Intel's own driver software; still, a 6-pin would be more appropriate.

 

It is entirely possible that the 8-pin connector is cheaper than the 6-pin due to production volume.
 
Good to hear. As of right now my only interest in Arc is for Plex, because of their excellent decoder/encoder support. I'm irked by Nvidia artificially limiting the number of transcode streams to only three, on top of skimping on memory on its cards. It would be nice to have real competition there. Still unsure how well Intel will fare on the gaming front, but more competition is always good.
 
Charlie at SA says that there was a 6-8 week delay due to lockdowns. The driver delays are due to the Russia/Ukraine war, because they had most of their shader compiler and performance optimization people there. He says they took most of those employees out of Russia and relocated them, but some stayed behind.
 
Well, the Newegg reviews are about what you'd expect for the A380. The guy testing on a 6th-gen system can suck it. No one cares about that edge lord. 😛

The review with the pictures feels like a compensated review, IMO. And providing a Time Spy score, of all things, really set off the Spidey sense. And what, you test with 3DMark but won't shell out $5 for it on a Steam sale?

 

One of them has a response from ASRock, and this line is a bit scary, and not just because of the grammar:
Please note: If the driver has been installed under OS, please DO NOT shutting down or restart immediately after booting, because Intel graphics driver might access the flash memory of graphics cards when entering OS.

I can see why most manufacturers bailed out of selling early cards. ASRock as a whole makes good stuff (one of my favorite motherboard manufacturers), but this poor release is going to come back on them, as their name is on the packaging.
 