
Question 6900 XT and 12 volt requirements

Mir96TA

Golden Member
Oct 21, 2002
1,913
33
91
Hi,
I can't seem to find the 12 volt requirement for this GPU.
I did see an 850 Watt PSU requirement, which I do not have.
I think the 6900 XT uses 350 watts, so at 12 volts that would be about 30 amps.
Now, I do have a 750 Watt PSU with 60 Amps; I think that should be good enough?
Other system load is as follows:
MB: B450 AORUS PRO WIFI
Memory: 4x DDR4, XMP profile, 1.35 Volt
CPU: Ryzen 2700 (65 W TDP), no OC
SSD: 1 NVMe
SATA: 1 SSD
SATA: 1 HDD, 7200 RPM
Cooling: AiO
Fans: 4 PWM
What do you all think?
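As a quick sanity check on the arithmetic above (a rough sketch; the 350 W figure is my estimate, not AMD's official spec):

```python
def amps_at_12v(watts):
    """Current drawn from the 12 V rail for a given power draw: I = P / V."""
    return watts / 12.0

# A 350 W GPU load works out to roughly 30 A on the 12 V rail.
print(round(amps_at_12v(350), 1))  # 29.2
```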
 

Shmee

Memory and Storage, Graphics Cards
Super Moderator
Sep 13, 2008
5,413
1,032
126
I am not sure exactly, but requirements for the 6900 XT may vary by model. Whether you are good really depends on the quality and age of your PSU, IMO. What is the make/model and how old is it?

On a side note, I suspect a 6900XT may be bottlenecked by your Ryzen 2700. Luckily, your board may support a 5800X, or at least a 3700X, with a BIOS update.
 

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
Power supply capability isn't just about steady-state load power. The newer GPUs from both AMD and Nvidia have awful transient spikes (Nvidia worse than AMD due to the Samsung 8nm node vs. TSMC 7nm). The stated rating is meant to provide sufficient headroom for stable steady-state operation, plus additional capability to deal with the transient spikes from load changes.

It really will come down to which 6900 XT (several AIB brands have higher power limits and clocks out of the box) and your PSU model, age, and quality.
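As a rough sketch of that headroom argument (the 2x transient factor and the 100 W "rest of system" figure are assumptions for illustration, not measured values):

```python
def draw_estimate(gpu_watts, cpu_watts, other_watts=100, transient_factor=2.0):
    """Steady-state vs. worst-case transient system draw (rough model)."""
    steady = gpu_watts + cpu_watts + other_watts
    transient = gpu_watts * transient_factor + cpu_watts + other_watts
    return steady, transient

# Assumed ~300 W GPU and 65 W CPU:
print(draw_estimate(gpu_watts=300, cpu_watts=65))  # (465, 765.0)
```

With these assumed numbers, steady-state draw leaves a 750 W unit plenty of margin, but a brief 2x GPU spike can momentarily exceed the label rating, which is why PSU quality and transient response matter more than the sticker wattage.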
 
  • Like
Reactions: Shmee

Mir96TA

Golden Member
Oct 21, 2002
1,913
33
91
Also, the GPU is going to drive a 4K display at 75 Hz
and a 2K display at 70 Hz.
So far I have not seen my CPU go over 70% usage.
I had to be mindful of CPU TDP and core count.
I can't go below 7 cores and I'd like to stay in the 65 W TDP bracket.
 

Shmee

Memory and Storage, Graphics Cards
Super Moderator
Sep 13, 2008
5,413
1,032
126
Sounds like it, if it is direct from AMD. In the case of your PSU, you are likely fine IMO. The gold supernovas from EVGA are pretty good, and it is pretty much new. If need be you can always decrease the power limit on the GPU as well.

I would still recommend a possible upgrade to a Ryzen 5000 CPU if possible, considering you have a 6900XT. You can always limit power consumption in BIOS/Ryzen Master if need be, though you should be fine as long as your AIO cooler is decent. Of course it will likely depend on the game played if there is any bottleneck, and at least your display is 4k, but only 75Hz, so it is not like you are chasing high FPS minimums.
 

Mir96TA

Golden Member
Oct 21, 2002
1,913
33
91
Yes, the GPU is made by AMD and I bought it from AMD.
AIB cards were a little too expensive for my willingness to purchase.
If I see my CPU start hitting a 90% duty cycle then I will go for a newer CPU.
Hopefully by that time CPUs will have V-Cache.
 

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
I briefly tested a reference 6900XT on an evga G3 550W and found no issues. Unfortunately that specific model of power supply is known for not triggering overcurrent protection when the power supply is overloaded, so it's a bad sample set.

I think your 750W is likely fine, so long as you don't get into any of the power modding like MorePowerTool to increase or remove the power limits.

RDNA 2 can draw a lot of power if you let it ;)

[Attached image: unknown-39.png]
 

Mir96TA

Golden Member
Oct 21, 2002
1,913
33
91
So it can use 50 amps.
Err..... I just learned: either I buy a new PSU or use what was left over by my son. :confused:
He is using my intended EVGA 750 Watt PSU.
Now I am left with this XFX PSU, which is 8 years old. :eek:
It is an XFX Pro 750W with 60 Amps on the 12 volt rail,
which is 80+ Bronze. :rolleyes:
I don't think XFX even sells PSUs any more. :tonguewink:
 

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
So it can use 50 amps.
Err..... I just learned: either I buy a new PSU or use what was left over by my son. :confused:
He is using my intended EVGA 750 Watt PSU.
Now I am left with this XFX PSU, which is 8 years old. :eek:
It is an XFX Pro 750W with 60 Amps on the 12 volt rail,
which is 80+ Bronze. :rolleyes:
I don't think XFX even sells PSUs any more. :tonguewink:
It can't use 50 amps unless you use MorePowerTool to remove power limits and heavily overclock it.

As for the XFX, if it's in working shape you'll just have to try it and see. Most likely if there's an issue it'll just be shutdowns or reboots while gaming or other heavy loads.

Although an older unit like that may not have the best protections, it's a tough situation. If you had that new evga 750W G+ I'd say to use it without reservation.
 
  • Like
Reactions: Shmee

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
Ran the 3DMark Fire Strike benchmark. It got 21200.
However I did see power usage go to 400 Watts, which is 33 Amps.
33 Amps is pretty easy for this PSU to handle.
I thought the 6900 XT was equal to a 3080 Ti!
Most likely it's being bottlenecked by your Ryzen 2700, as Shmee mentioned earlier.

Try running Time Spy, Time Spy Extreme, or Fire Strike Extreme to better isolate your GPU from the CPU bottleneck.
 
  • Like
Reactions: Leeea

kschendel

Member
Aug 1, 2018
120
56
71
CPUs don't bottleneck GPUs, at least not in a data-flow sense. (The data flow is the other direction, CPU to GPU.) It's possible that the benchmark is sufficiently CPU-heavy that the CPU becomes the limiting resource. That doesn't mean that the 6900 XT is somehow being made to run slower by the CPU, because that's not how it works. CPU and GPU are completely independent computing units.

If you are running a 75Hz monitor for gaming then you don't need any more CPU than will deliver 75+fps consistently in the games you play. That base rate has nothing to do with resolution and very little to do with (most) game graphics settings.
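Put in numbers (a minimal sketch of the frame-time budget; the 75 Hz figure is the OP's monitor refresh rate):

```python
def frame_budget_ms(refresh_hz):
    """Time the CPU has to prepare each frame to sustain the given refresh rate."""
    return 1000.0 / refresh_hz

# At 75 Hz the CPU must deliver a new frame roughly every 13.3 ms.
print(round(frame_budget_ms(75), 1))  # 13.3
```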
 

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
So I ran Time Spy and the highest CPU usage I saw was 67%, except in the CPU test. View attachment 50661
1) CPU can most definitely bottleneck GPU performance in gaming scenarios, depending on the game and resolution. In the case of Firestrike, you are absolutely CPU bound.

[Attached image: 1632521703089.png]

Even in a demanding game like RE VIII, at 1080P Zen+/Zen2 processors have significantly lower lows and averages than Intel 10/11 series and Zen 3 processors. This is just one example I picked at random, there are many, many, many more.

2) You can't rely on reported CPU utilization to indicate if you are CPU bound. The CPU might have idle cycles (that are reported as idle time, a.k.a. not load) waiting on cache, memory, or other communication that eventually leads to bottlenecking the GPU or it might be hammering a single core or set of cores, which won't report as 100% utilization (because loading one core is only reported as that fraction of the total utilization).

I implored you to run Time Spy, Time Spy Extreme, or Firestrike Extreme (or all 3!) because they are all at higher resolutions and much less likely to show CPU bottlenecking. You'll get a better read on if your 6900XT is performing correctly, especially at the 4k resolution of your primary display.
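To illustrate point 2 with a hypothetical 8-core/16-thread CPU: a game that fully pegs only a couple of threads shows a small overall utilization figure, even though those threads are the bottleneck.

```python
def reported_utilization(pegged_threads, total_threads):
    """Overall CPU % reported when only some threads are at 100%."""
    return pegged_threads / total_threads * 100

# Two threads pegged out of 16 reports as just 12.5% overall,
# yet those two threads can still starve the GPU of work.
print(reported_utilization(2, 16))  # 12.5
```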
 

Mir96TA

Golden Member
Oct 21, 2002
1,913
33
91
1) CPU can most definitely bottleneck GPU performance in gaming scenarios, depending on the game and resolution. In the case of Firestrike, you are absolutely CPU bound.

View attachment 50663

Even in a demanding game like RE VIII, at 1080P Zen+/Zen2 processors have significantly lower lows and averages than Intel 10/11 series and Zen 3 processors. This is just one example I picked at random, there are many, many, many more.

2) You can't rely on reported CPU utilization to indicate if you are CPU bound. The CPU might have idle cycles (that are reported as idle time, a.k.a. not load) waiting on cache, memory, or other communication that eventually leads to bottlenecking the GPU or it might be hammering a single core or set of cores, which won't report as 100% utilization (because loading one core is only reported as that fraction of the total utilization).
I am saying, I was hoping the 6900 XT would perform like a 3080 Ti.
What I noticed was performance below a 3080.
There was no visual clue which indicated the CPU was pegged at a 90% duty cycle or more.
I know if I slipped a 3080 into this setup the 3DMark score would go up.
 

Justinus

Platinum Member
Oct 10, 2005
2,749
863
136
I am saying, I was hoping the 6900 XT would perform like a 3080 Ti.
What I noticed was performance below a 3080.
There was no visual clue which indicated the CPU was pegged at a 90% duty cycle or more.
I know if I slipped a 3080 into this setup the 3DMark score would go up.
And I'm telling you to run a benchmark I listed that runs at a resolution higher than 1080p. Your system's gaming performance at 1080p quite literally doesn't matter because your primary display is 4K.

Run the 1440p/4k benchmarks I outlined and we can determine if your performance is as expected.
 

kschendel

Member
Aug 1, 2018
120
56
71
1) CPU can most definitely bottleneck GPU performance in gaming scenarios, depending on the game and resolution.
CPUs can absolutely be the rate-limiting part. I am wary of the term "bottleneck" because it implies an incorrect mental model, whereby either a) one imagines that the GPU runs slower if the CPU is slow, or b) data is moving from the GPU through the CPU and the CPU is the bottleneck. Neither is true. A slow CPU might cause the GPU to wait for the next frame, but once it arrives, the GPU renders it at its full speed independently of the CPU. And it's the CPU that generates each frame for the GPU to render and display on the monitor.
 
  • Like
Reactions: Leeea

Leeea

Golden Member
Apr 3, 2020
1,178
1,413
96
I was able to run a Vega 56 + AMD reference* 6900 XT simultaneously** on an EVGA 650 watt power supply with minimal issues.

You will not have a problem with your 750 watt EVGA.

*the exact same card you have
**I would ETH mine with one and game on the other. The Vega was set up to run 150 watts continuously.

I thought the 6900 XT was equal to a 3080 Ti!
Your CPU is the limiting factor.

And the RTX series is far harder on the CPU; anything RTX would yield even poorer performance than what you have.

A 2700 is a great CPU, but if you're trying to do well on benchmarks it is going to be your bottleneck in a big way.

I suggest ignoring that benchmark; it is not even a 4K bench. Use the card the way you planned. You will know soon enough if it meets your needs, or if you need to upgrade your CPU.

I started with a weak CPU also. I did eventually upgrade from a 4790K to a 5900X. My performance did increase noticeably. For example, the very poorly coded MechWarrior Online went from around 50 fps to 150 fps at 1440p. Other games saw less of an increase, but they all noticeably improved.

This is relevant to you because your 2700x is very close to my old 4790k in a lot of games:

( My biggest problem is I have been unable to get the RGB on the rx6900xt to sync with the RGB on my mainboard. The horrors, the horrors. )
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
5,506
1,297
136
CPUs can absolutely be the rate-limiting part. I am wary of the term "bottleneck" because it implies an incorrect mental model, whereby either a) one imagines that the GPU runs slower if the CPU is slow, or b) data is moving from the GPU through the CPU and the CPU is the bottleneck. Neither is true. A slow CPU might cause the GPU to wait for the next frame, but once it arrives, the GPU renders it at its full speed independently of the CPU. And it's the CPU that generates each frame for the GPU to render and display on the monitor.
You are the first person I have ever seen say this. And it does not make any sense.

If the CPU is unable to keep up with the workload, it will act as a bottleneck to the GPU, causing the GPU to perform poorly compared to what it is capable of.

It's along the same lines as having a water pipe with a pump at the end, with a literal bottleneck midway through the pipe in front of the pump. Sure, the pump may be able to output 100 GPM, but the flow is being cut in half by the bottleneck, so now it can only output 50 GPM.

The end result is the same.
 

Leeea

Golden Member
Apr 3, 2020
1,178
1,413
96
You are the first person I have ever seen say this. And it does not make any sense.
kschendel is technically correct.

But your analogy is also applicable, and the end result is the same.



The GPU will always wait until it has all* of the data to render the frame. It will then render that frame as fast as it possibly can while the CPU works on the next frame. If the GPU finishes rendering the current frame before the CPU finishes the next frame, it has to wait for the CPU to finish before it can start the next one.

A good CPU + well coded application will keep up with it and already have the data for the next frame transferred to the GPU, so it does not have to wait to start rendering the next frame.

However, with CPUs that are not able to keep up, like the OP's 2700, the GPU will sit there and do nothing while waiting for the data it needs to render the next frame.

*most of the data will already be on the GPU. Textures and models are both cached in GPU memory and do not need to be resent every time. The CPU just has to update the changes in the scene and models before submitting the frame to be rendered. A well-designed application will do this while the GPU is rendering the previous frame. However, if the GPU renders the frame before the CPU has finished updating the data for the next frame, the GPU has to wait for the CPU.

This is sometimes referred to as "double buffering", where the buffer holds the frame being rendered and the next frame being built. Triple buffering is when the buffer holds the frame being rendered, the next frame ready to go, and another frame being built.
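A toy model of that pipeline (assumed numbers for illustration): since the GPU starts a frame only once the CPU has finished preparing it, the effective frame rate is capped by the slower of the two stages.

```python
def effective_fps(cpu_fps, gpu_fps):
    """Frame rate when CPU prep and GPU render overlap but the slower stage gates output."""
    cpu_ms = 1000.0 / cpu_fps  # time for the CPU to build a frame's data
    gpu_ms = 1000.0 / gpu_fps  # time for the GPU to render a frame
    return 1000.0 / max(cpu_ms, gpu_ms)

# A CPU that can only prepare 60 frames/sec caps a GPU capable of 144.
print(round(effective_fps(cpu_fps=60, gpu_fps=144), 1))  # 60.0
```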
 
Last edited:
