Discussion Radeon 6500XT and 6400


GodisanAtheist

Diamond Member
Nov 16, 2006
6,717
7,014
136
Just getting a thread up for this. Navi 24 will ride... Q1 2022... ish.


I figure the 6500XT lands in ~5500XT territory for ~$200.
 
  • Like
Reactions: Tlh97
Jul 27, 2020
15,749
9,815
106
Shows why Lisa was right in firing Raja (or making him quit, whichever is true). He might know how to make GPU computing work, but getting a GPU compute device to run games well is not something he has figured out after all these years.
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
Shows why Lisa was right in firing Raja (or making him quit, whichever is true). He might know how to make GPU computing work, but getting a GPU compute device to run games well is not something he has figured out after all these years.
I've been wondering over the past two years if he's much overrated. People with big ideas are not that rare; those with realizable ones are, and the people who can navigate the pitfalls to actually accomplish them are even rarer. Raja is definitely not in that last group; details matter.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Raja is probably one of those types that's a really talented engineer, but an awful manager. The industry is rife with people who were promoted past their competence.
 

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
Raja is probably one of those types that's a really talented engineer, but an awful manager. The industry is rife with people who were promoted past their competence.
Assuming Raja is really talented, contrast him with Keller in their "lust for power", and in who appears to be more effective. Raja would never accept a subordinate role.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
I disagree with all you guys bashing Raja.


Look what he did here. He went to Intel, and created a working graphics card from close to nothing.

Yea, it is inferior to all of its competitors.

But it works! With DX12! Kind of.


It is very impressive for a first effort. Yes, it is not the RTX 3070 we were promised.


I think that is the real problem with Raja: he overpromises on both capabilities and timelines.

But he did make a working GPU where there was nothing before. That is amazing in its own right.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
Correction: Second effort. This is DG2. Maybe third time will be the charm?
3rd time? No way.

More like the 5th time, maybe, if they are lucky. Intel just needs to hang in there for about a decade, take the hits, learn the drivers, and develop the talent. If Intel is not willing to lose a lot of money while they play catch-up, they are never going to achieve anything.
 

R81Z3N1

Member
Jul 15, 2017
77
24
81
You guys are a wealth of information. I think in a few weeks or months we will see some of the fallout of these decisions. It would be kind of hard not to talk about an Intel product that is struggling in the segment it was designed for.

I also think that some of these topics are almost interconnected; if you talk about GPUs and the economy, it's kind of hard not to think about cryptocurrency, especially if you're a gamer who wants a recent product.

Granted, we don't need to know the daily price of Bitcoin or discuss wallets. It's kind of like gas prices: if you drive to work you are impacted, and if you consider the impact that manufacturing electric cars has on carbon output, you're impacted as well. I think it's hard to have an honest discussion without briefly mentioning some of those things.

But if you analyze the current market, or do some research, it seems the opportunity to make profits is drastically in decline. If you're an Intel shareholder, I could see getting pretty upset; it seems they missed the boat.

R81Z3N1
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
That isn't the point. HW decode is more energy efficient than SW decode. With current insane energy prices, every little bit helps.

That plus the fact that even the most basic Celeron (G6900) from Intel has full HW support. It's just a bit disappointing, that's all. AV1 does stand to become very common.

First you must find AV1 videos; even on YouTube, AV1 videos are as rare as elephants in a ballet studio. :grinning:

For my real-world usage, I don't care about AV1. The fact that Renoir/Cezanne VCN does not support AV1 is completely irrelevant for most people. From my example with a 2-core/4-thread CPU, desktop CPU+iGPU power consumption during 4K/30 AV1 YouTube playback is around 50W.


The 1030 is not an upgrade over the latest AMD APUs; it is even a downgrade in many games vs. the 5600G/5700G.

The 1030 also has no Nvidia NVENC, which is a big minus, since every AMD APU has VCN. The Rembrandt APU has AV1 support, and on the AM5 socket every new CPU will have an iGPU/VCN with AV1 support.

 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Desktop CPU+iGPU power consumption during 4K/30 AV1 YouTube playback is around 50W.

You don't think it matters? Try paying 57¢ per kWh, and get back to me on that. Spot prices here have been above $1 per kWh during peaks, probably with worse to come due to the gas crisis in Germany.

If you can reduce that to 25W, that is literally double the playback time per kWh you pump in.
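
To make the doubling concrete, here is a quick sketch (nothing assumed beyond the two draw figures above):

```python
# Hours of video playback per kWh at a given system draw.
def hours_per_kwh(watts):
    return 1000 / watts  # 1 kWh = 1000 Wh

for watts in (50, 25):
    print(f"{watts} W -> {hours_per_kwh(watts):.0f} hours per kWh")
# 50 W -> 20 hours per kWh
# 25 W -> 40 hours per kWh
```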
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
You don't think it matters? Try paying 57¢ per kWh, and get back to me on that. Spot prices here have been above $1 per kWh during peaks, probably with worse to come due to the gas crisis in Germany.

If you can reduce that to 25W, that is literally double the playback time per kWh you pump in.

3 hour movie, at 50 watts, at 57 cents per kilowatt-hour: 9 cents
3 hour movie, at 25 watts, at 57 cents per kilowatt-hour: 4 cents
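
For anyone who wants to check the arithmetic, a quick sketch (the wattages and the 57¢ tariff are just the figures quoted above):

```python
# Cost of a 3-hour movie at a given system draw and electricity tariff.
def movie_cost(watts, hours=3, price_per_kwh=0.57):
    kwh = watts * hours / 1000  # watt-hours to kWh
    return kwh * price_per_kwh

print(f"50 W: ${movie_cost(50):.2f}")  # ~$0.09
print(f"25 W: ${movie_cost(25):.2f}")  # ~$0.04
```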

A cost difference of five cents is insignificant.



Thing is, I do not believe it is a difference between 50 watts and 25 watts. Those numbers are too convenient. I suspect if we put a wall meter on both systems, assuming they are in the same class (i.e. desktop vs. desktop with the same quality PSUs), the wattage would be very similar. For example, most displays consume 30-70 watts just to display the movie. Then there is the power the mainboard needs to run, the GPU power, the speakers, etc. Even with a hardware video engine, the CPU still needs to be awake and shuffling data around.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
A cost difference of five cents is insignificant.

50W = 20 hours of video per kWh. 25W = 40 hours of video.

If you, for example, watch 3 hours of whatever video/TV a day (about the Danish average), that is 2.325 kWh less on a monthly basis, and 27.9 kWh less on a yearly basis.

That is a little less than a third of what I've used all this month, to put things in perspective.
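
If you want to play with the assumptions, here is a small sketch (the 3 h/day and the 25 W delta are the figures above; a month is taken as 31 days, matching my numbers):

```python
# Energy saved per month and per year by halving playback draw from 50 W to 25 W.
watts_saved = 50 - 25   # W difference between the two systems
hours_per_day = 3       # roughly the Danish average
monthly_kwh = watts_saved * hours_per_day * 31 / 1000
yearly_kwh = monthly_kwh * 12

print(f"Monthly: {monthly_kwh:.3f} kWh")  # 2.325 kWh
print(f"Yearly:  {yearly_kwh:.1f} kWh")   # 27.9 kWh
```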

Most monitors today really shouldn't be using much more than 20W. My "TV" uses 8W, as measured with a power meter.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
Most monitors today really shouldn't be using much more than 20W. My "TV" uses 8W, as measured with a power meter.
Wow! That is amazing.



My LG C1 TV pulls 70 watts when it is on. It is not hooked up to the computer, though; it runs its own OS like all the other smart TVs.


My Samsung monitor pulls about 40 watts continuously.

My Acer is around 30 watts.


My computer box by itself pulls 25-45 watts when browsing the internet, which is less than the monitors I am looking at: 30 + 40 = 70 watts.

My 2.1 computer speakers are always pulling down a few watts, not sure how many.

The main sound system speakers are around 55 watts at idle, but that is likely because I am still using '80s tech from when I was a child.

Even my printer leeches 2-3 watts in idle.


I guess my power consumption is in a completely different ballpark from yours, and so is what I view as significant consumption.


(My electricity is around 15 cents per kilowatt-hour.)

All of these electric costs seem irrelevant to me; as soon as I plug my car in, they cease to be a significant component of my bill.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
I guess my power consumption is in a completely different ballpark from yours, and so is what I view as significant consumption.

I've always been slightly obsessed with efficiency. Am I glad about it right now? Being a bit weird has saved me a small fortune with the current climate.

There are people being forced from their homes due to energy prices right now. Electricity is bad enough, natural gas is almost unaffordable, and prices at the pump are through the roof.

All of these electric costs seem irrelevant to me; as soon as I plug my car in, they cease to be a significant component of my bill.

Multiply your current bill by five. That gives a reasonable approximation of what we've experienced.

This time last year, the raw electricity cost before taxes was 0.54 DKK per kWh. That's about 8¢ at the then-current exchange rate. Currently you'll be lucky if the cost is below 2 DKK (28¢ at the current exchange rate), and it often passes double that during peaks.
 
  • Like
Reactions: Tlh97 and Leeea

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
50W = 20 hours of video per kWh. 25W = 40 hours of video.

If you, for example, watch 3 hours of whatever video/TV a day (about the Danish average), that is 2.325 kWh less on a monthly basis, and 27.9 kWh less on a yearly basis.

The average electricity price in Denmark is ~$0.35 per kWh, from a quick Google search. That makes the cost difference a bit under $10 per year. Even doubling for the recent higher prices only makes it $20 per year. For most people that's just eating out for lunch one less time.
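
As a rough sanity check (reusing the ~27.9 kWh/year figure from the post above; the $0.35 tariff is just the Googled average):

```python
# Yearly cost of the 25 W playback difference at an assumed average tariff.
yearly_kwh = 27.9        # from the 3 h/day example above
usd_per_kwh = 0.35       # rough Danish average from a quick search
print(f"${yearly_kwh * usd_per_kwh:.2f} per year")  # ~$9.77
```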

If you really care about power use, just watch the video on a tablet. That will save enough that you could eat out an extra day each year.