Question RandomGaminginHD - Steve from UK takes a look at modern games @ 1440P with the RX 6700 10GB non-XT.

VirtualLarry

No Lifer
Aug 25, 2001
56,229
9,990
126

Cliffs: performs pretty well @ 1440P / high.

I would have liked to see a comparison with a 6700, 6700 XT, 5700, and 5700 XT. Along with pricing.
 
  • Like
Reactions: Tlh97 and Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
6,719
7,016
136
The background for that box shot leaves me with so many questions.

Anyhow, love the clean aesthetic on display from PowerColor, and the 6700 non-XT is weirdly the best-kept secret out there for a gamer on a "budget". So little fanfare for what is easily the best mid-range card of the gen in terms of price/perf (esp. after AMD's new OpenGL and DX11 drivers).
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
First professionally written review, I think:
Overall about 12-16% slower than the 6700XT and about 8% faster than the 6650XT.

Strange that it took this long to get reviews.
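A quick back-of-the-envelope check of those two percentages (treating the 6700 XT as a 100-point index; the quoted deltas are from the review summary above, the arithmetic is mine):

```python
# If the 6700 is ~12-16% slower than the 6700 XT and ~8% faster than the
# 6650 XT, what does that imply for the 6650 XT vs. the 6700 XT?
# Index: 6700 XT = 100.
xt_6700 = 100.0

# 12-16% slower than the 6700 XT puts the 6700 at roughly 84-88 points.
rx_6700_low = xt_6700 * (1 - 0.16)
rx_6700_high = xt_6700 * (1 - 0.12)

# The 6700 being 8% faster than the 6650 XT means 6650 XT = 6700 / 1.08.
xt_6650_low = rx_6700_low / 1.08
xt_6650_high = rx_6700_high / 1.08

print(f"RX 6700:    {rx_6700_low:.1f}-{rx_6700_high:.1f}")
print(f"RX 6650 XT: {xt_6650_low:.1f}-{xt_6650_high:.1f}")
```

So the two quoted deltas together place the 6650 XT at roughly 78-81% of a 6700 XT, which is consistent with where that card usually lands.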
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
Like 99% of reviewers won't review a product they aren't sampled on. The 6700 was probably just intended to be an OEM-only product, so AMD didn't bother.
Yes, I suspect it was because they hadn't been sampled. Guess there's no one doing GPU reviews like rtings then.
 

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
Same price as ARC A770 8GB on Newegg. Poor ARC. How long will it stay there before they have to reduce price further?
I didn't bother to show the perf/watt table from the review. Now that makes Arc look very poor indeed, especially the row where PCGH lock them all to 60 fps at 1080p:
                             RX 6700 (Sapphire Pulse)  RX 6700 XT  RX 6650 XT  RTX 3060  RTX 3070  Arc A770  Arc A750
Idle (desktop)                        8 W*                 7 W         6 W      11.5 W    10.5 W     47 W      42 W
Dual display (UHD + FHD)             30 W                 32 W        24 W      18.5 W    15 W       49 W      44 W
UHD YouTube video                    16 W                 33 W        26 W      17.5 W    17 W       55 W      50 W
Gaming (maximum)                    190 W                218 W       181 W     171 W     221 W      230 W     226 W
Control (WQHD + RT)                 188 W                210 W       178 W     170 W     218 W      226 W     223 W
Anno 1800 (Full HD)                 189 W                216 W       178 W     170 W     219 W      224 W     198 W
Frame lock @ 60 fps (FHD)**          67 W                 85 W        50 W      69 W      51 W      123 W     118 W
(Impressed with vBulletin's paste HTML table, didn't think that would work.)
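For anyone curious how a perf-per-watt ranking like PCGH's falls out of numbers like these, here's a minimal sketch. The wattages are the "Gaming (maximum)" figures from the table above, but the performance-index values are hypothetical placeholders (6700 XT = 100), not figures from the review:

```python
# Perf-per-watt sketch: gaming wattages from the PCGH table above;
# performance indices are made-up placeholders, NOT review data.
cards = {
    "RX 6700":    {"watts": 190, "perf_index": 86},
    "RX 6700 XT": {"watts": 218, "perf_index": 100},
    "RX 6650 XT": {"watts": 181, "perf_index": 79},
    "Arc A770":   {"watts": 230, "perf_index": 85},
}

def perf_per_watt(name: str) -> float:
    """Index points delivered per watt of gaming power draw."""
    card = cards[name]
    return card["perf_index"] / card["watts"]

# Rank best-to-worst by efficiency.
for name in sorted(cards, key=perf_per_watt, reverse=True):
    print(f"{name:11s} {perf_per_watt(name):.3f} points/W")
```

Even with a perf index close to the 6700's, the A770's much higher power draw drops it well down such a ranking, which is the point the table is making.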
 
  • Wow
Reactions: igor_kavinski

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,382
2,419
146
First professionally written review, I think:
Overall about 12-16% slower than the 6700XT and about 8% faster than the 6650XT.

Strange that it took this long to get reviews.
Looks like it does well at 1080p, but is more lacking at higher resolutions, probably due to the narrower memory bus.
 
Aug 16, 2021
134
96
61
Same price as ARC A770 8GB on Newegg. Poor ARC. How long will it stay there before they have to reduce price further?
I doubt that Intel really meant to make Arc truly competitive. At best I see it as a first, not-too-awful release to see where and how it fails, and to assess the hardware and software mistakes before rolling out a second- or third-gen product that is actually competitive. So the only goal for Arc gen 1 is to be a not-prohibitively-expensive test kit, and perhaps to get it to as many influencers as possible to spread the word about it. IMO it more or less achieves that. Bonus points for the nice box and good looks; it will make a nice shelf ornament later.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
I doubt that Intel really meant to make Arc truly competitive. At best I see it as a first, not-too-awful release to see where and how it fails, and to assess the hardware and software mistakes before rolling out a second- or third-gen product that is actually competitive. So the only goal for Arc gen 1 is to be a not-prohibitively-expensive test kit, and perhaps to get it to as many influencers as possible to spread the word about it. IMO it more or less achieves that. Bonus points for the nice box and good looks; it will make a nice shelf ornament later.
This is so weird.
Why do you need to release a product to see the hardware & software mistakes?
Intel doesn't have test labs?
Do you think the "influencers" are enhancing ARC standing among gamers?
 
  • Like
Reactions: Tlh97 and KompuKare
Aug 16, 2021
134
96
61
This is so weird.
Why do you need to release a product to see the hardware & software mistakes?
Intel doesn't have test labs?
Do you think the "influencers" are enhancing ARC standing among gamers?
Mostly because I can't see Arc competing well with anyone as it stands, and it has some weird limitations like mandatory ReBAR support. I can't see how it wouldn't be a beta test. It can't be used in older machines due to ReBAR, it has tons of problems with old games due to driver issues, it lacks the RT performance and upscaling tech to duke it out with nV, and if you just want cheap raster performance, Arc loses to AMD, making it a sort of non-buyable thing. Not to mention that their power efficiency isn't great either. I just can't see any point in buying Arc cards other than to test them out and have something that isn't Radeon or GeForce.

Regarding influencers, I absolutely think they do a lot. Especially LTT, who made so many videos about Arc, explained that it's a rough first-gen product, and did that whole very long video where Linus played a lot of games to see how well they run.

As for Intel, I can't read them anymore, as their behaviour seems too random. They have been working on discrete cards for over a decade, with many of them cancelled just before release. Some of them turned into Xeon Phi co-processors rather than GPUs. Meanwhile, some cancelled cards like Larrabee even had a functional prototype (on the hardware side; drivers may or may not exist) that LTT found. I honestly expected Arc to end up like the previous attempts, so why Arc was released now is a mystery to me, as is why Intel had so many prototypes that went nowhere, and why Intel either doesn't intend Arc to be the best or can't make it the best. Maybe they want to turn it into a niche product like an AV1 decoder/encoder card, maybe they don't intend to compete at the high end and hope to compete at sub-$300 budgets, or maybe they expected to make a much faster card, and when it ended up so "slow" and limited, they decided to lower prices.

They themselves have said little about Arc, but perhaps if I dug through their annual reports, I could find what they expect Arc to be. My own take is that Arc is two years too late: the cards were meant to launch and sell during the GPU shortages, and after the shortages they may have planned to release much more improved, robust second-gen products. Perhaps Intel is still recovering from the shitty management practices of a decade ago, when a CEO made Intel cut R&D and it went downhill.
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
Mostly because I can't see Arc competing well with anyone as it stands, and it has some weird limitations like mandatory ReBAR support. I can't see how it wouldn't be a beta test. It can't be used in older machines due to ReBAR, it has tons of problems with old games due to driver issues, it lacks the RT performance and upscaling tech to duke it out with nV, and if you just want cheap raster performance, Arc loses to AMD, making it a sort of non-buyable thing. Not to mention that their power efficiency isn't great either. I just can't see any point in buying Arc cards other than to test them out and have something that isn't Radeon or GeForce.

Regarding influencers, I absolutely think they do a lot. Especially LTT, who made so many videos about Arc, explained that it's a rough first-gen product, and did that whole very long video where Linus played a lot of games to see how well they run.

As for Intel, I can't read them anymore, as their behaviour seems too random. They have been working on discrete cards for over a decade, with many of them cancelled just before release. Some of them turned into Xeon Phi co-processors rather than GPUs. Meanwhile, some cancelled cards like Larrabee even had a functional prototype (on the hardware side; drivers may or may not exist) that LTT found. I honestly expected Arc to end up like the previous attempts, so why Arc was released now is a mystery to me, as is why Intel had so many prototypes that went nowhere, and why Intel either doesn't intend Arc to be the best or can't make it the best. Maybe they want to turn it into a niche product like an AV1 decoder/encoder card, maybe they don't intend to compete at the high end and hope to compete at sub-$300 budgets, or maybe they expected to make a much faster card, and when it ended up so "slow" and limited, they decided to lower prices.

They themselves have said little about Arc, but perhaps if I dug through their annual reports, I could find what they expect Arc to be. My own take is that Arc is two years too late: the cards were meant to launch and sell during the GPU shortages, and after the shortages they may have planned to release much more improved, robust second-gen products. Perhaps Intel is still recovering from the shitty management practices of a decade ago, when a CEO made Intel cut R&D and it went downhill.
I honestly think the entire purpose of the rushed, chaotic release is two words: Wall Street. Everything else is, as they say, putting lipstick on a pig.
 
  • Haha
Reactions: igor_kavinski
Aug 16, 2021
134
96
61
I honestly think the entire purpose of the rushed, chaotic release is two words: Wall Street. Everything else is, as they say, putting lipstick on a pig.
Not sure why; Wall Street hates Intel now and has hammered their valuation, mostly due to various blunders in recent history, with Arc being the latest. I fail to see how Arc is supposed to help Intel.
 

blckgrffn

Diamond Member
May 1, 2003
9,111
3,029
136
www.teamjuchems.com
I would definitely recommend this as a minimum spec for those wanting a card to "last", and certainly for anyone who wants something more than a 6600 and is flirting with the 6650. That 10GB of memory matches the Xbox's fast memory pool and the relatively popular 3080 10GB, so it should buy "high" texture settings for a long time and age a lot better than the 6650 XT.

Of course, I found a 6700 XT locally for $230 and sniped that, so... I like that even better, and I think hunting for a used 6700 XT is a move any value shopper should consider.