Who got a 7900XTX or 7900XT?


videopho

Diamond Member
Apr 8, 2005
Is the hot spot temp the same as the junction temp?
It mostly reads in the 90s (°C) in game, while the GPU temp mostly reads in the 60s.
Would that be a concern over long gaming sessions?
Also note that I run everything at stock speeds.
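
For what it's worth, "hot spot" and "junction" are two names for the same reading: the hottest of the many temperature sensors scattered across the die, while the plain "GPU" temp is the single edge sensor, which is why it always reads much lower. If you want to see the raw sensors yourself, here's a minimal sketch; note the assumptions: it's Linux-only and relies on the amdgpu driver's hwmon interface (on Windows, HWiNFO or Adrenalin shows the same sensors):

```python
import glob
import os

# The amdgpu driver exposes die temperatures as hwmon sysfs nodes.
# temp1 is usually labeled "edge" (the reported "GPU" temp), temp2
# "junction" (the hot spot), temp3 "mem". Values are millidegrees C.
for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
    try:
        with open(os.path.join(hwmon, "name")) as f:
            if f.read().strip() != "amdgpu":
                continue
    except OSError:
        continue
    for label_path in sorted(glob.glob(os.path.join(hwmon, "temp*_label"))):
        with open(label_path) as f:
            label = f.read().strip()
        with open(label_path.replace("_label", "_input")) as f:
            millideg = int(f.read().strip())
        print(f"{label}: {millideg / 1000:.1f} C")
```

The junction/edge gap this prints is exactly the delta discussed in the posts below.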
 

videopho

Diamond Member
Apr 8, 2005
Hey guys and gals... the delta between my junction and GPU temps is around 40°C: high 90s and high 50s, respectively.
I've been reading that some AMD reference cards have had a history of extremely high junction temps, which sort of worries me.
Would the wide delta be a cause for concern?
 
  • Like
Reactions: Joe NYC

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
videopho said:
Hey guys and gals... the delta between my junction and GPU temps is around 40°C: high 90s and high 50s, respectively.
I've been reading that some AMD reference cards have had a history of extremely high junction temps, which sort of worries me.
Would the wide delta be a cause for concern?
It sounds to me like the hot spot shouldn't be that hot. I don't think my temps ever got that hot, and my card is air cooled. You might want to reach out to PowerColor and see what they say about those temps.
 

videopho

Diamond Member
Apr 8, 2005
Not all the games I play run the junction temp up that high, so this might just do the trick for me: I tuned the card to 95% clocks and 97% voltage, which brings the junction temp below 100°C, into the low 90s.
 

pj-

Senior member
May 5, 2015
AMD has a lot of kinks to work out in multi-monitor setups. One of my displays doesn't receive a signal on PC startup, and I have to power cycle it before it shows anything. Sometimes HDR will freak out on my TV when I'm playing a game and alt-tab out to Windows for something. Display numbering also seems random rather than persistent, so a game that launched on one display yesterday will launch on a different one today. And idle power consumption is still around 90 W.

None of those things happened on my 3090 with identical displays.
 

In2Photos

Golden Member
Mar 21, 2007
pj- said:
AMD has a lot of kinks to work out in multi-monitor setups. One of my displays doesn't receive a signal on PC startup, and I have to power cycle it before it shows anything. Sometimes HDR will freak out on my TV when I'm playing a game and alt-tab out to Windows for something. Display numbering also seems random rather than persistent, so a game that launched on one display yesterday will launch on a different one today. And idle power consumption is still around 90 W.

None of those things happened on my 3090 with identical displays.
I don't have any of those problems with my 6800XT and 3 displays.
 

videopho

Diamond Member
Apr 8, 2005
Despite a few quirks in tweaking games (I'm new to AMD graphics cards), I strongly believe the 7900 XTX is a keeper. It has let me play games such as COD BO and Vanguard without a hitch, whereas my previous NVIDIA card (a 3090 Ti) could never get through shader compilation without crashing to desktop. The sim games are even better, with massive frame rate increases. Adrenalin is one tuner that NVIDIA has never been able to match or offer its customers, and shame on it!
 

Leeea

Diamond Member
Apr 3, 2020
videopho said:
Hey guys and gals... the delta between my junction and GPU temps is around 40°C: high 90s and high 50s, respectively.
I've been reading that some AMD reference cards have had a history of extremely high junction temps, which sort of worries me.
Would the wide delta be a cause for concern?
That seems like a high delta, but if your junction temp is not going over 100°C, it seems like a non-issue.
 

pauldun170

Diamond Member
Sep 26, 2011
After several years of flawless service, my RTX 2060 has officially been retired and replaced with an ASRock Phantom 7900 XT, taking advantage of the drop to $799 plus The Last of Us bundle.
For the first time in a very long time, I now have the joys of troubleshooting game crashes.
Despite strong disagreements with RDR2 and The Witcher 3, I'm enjoying the quiet versus the (single-fan) 2060, which would get pretty loud under load.
 
  • Like
Reactions: Shmee and Joe NYC

pj-

Senior member
May 5, 2015
pauldun170 said:
After several years of flawless service, my RTX 2060 has officially been retired and replaced with an ASRock Phantom 7900 XT, taking advantage of the drop to $799 plus The Last of Us bundle.
For the first time in a very long time, I now have the joys of troubleshooting game crashes.
Despite strong disagreements with RDR2 and The Witcher 3, I'm enjoying the quiet versus the (single-fan) 2060, which would get pretty loud under load.

DX12 in The Witcher 3 is pretty busted with the 7900s. I put about 130 hours into it with my 7900 XTX, probably 90% of that in DX12 with RT. I had dozens of crashes in total, but after a while I figured out patterns that avoid some of them, and I enjoyed the look of RT enough to live with the purely random crashes every couple of hours. Save and switch to DX11 if:
- you need to use Igni or Aard;
- you're entering an area with portals;
- you're in a story mission with scripted events and you get one crash (it will almost certainly repeat when you try again); or
- there are particle effects on screen at the time of a crash and it's something you know you have to do again (e.g., a crash when a monster uses a certain ability).
 

pauldun170

Diamond Member
Sep 26, 2011
pj- said:
DX12 in The Witcher 3 is pretty busted with the 7900s. I put about 130 hours into it with my 7900 XTX, probably 90% of that in DX12 with RT. I had dozens of crashes in total, but after a while I figured out patterns that avoid some of them, and I enjoyed the look of RT enough to live with the purely random crashes every couple of hours. Save and switch to DX11 if:
- you need to use Igni or Aard;
- you're entering an area with portals;
- you're in a story mission with scripted events and you get one crash (it will almost certainly repeat when you try again); or
- there are particle effects on screen at the time of a crash and it's something you know you have to do again (e.g., a crash when a monster uses a certain ability).
Fortunately I finished The Witcher 3 a while back and only loaded it up to see what max-everything at 1440p would look like now that I have a card with horsepower.

With RDR2, I was just looking to benchmark it (I lost interest in that game and never finished the story). Switching to DX12 from Vulkan appeared to clear things up.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
videopho said:
Not all the games I play run the junction temp up that high, so this might just do the trick for me: I tuned the card to 95% clocks and 97% voltage, which brings the junction temp below 100°C, into the low 90s.
If you are in North America and the room the PC is in will be hotter in August than it is now, I would seriously consider an RMA.
 

hardhat

Senior member
Dec 4, 2011
So I purchased a Sapphire Pulse 7900 XT as an upgrade from an NVIDIA 1660. It has been interesting. Great performance, but two issues so far...
1. Any time I was watching a video in full screen, I would get slowdowns and playback issues after switching from windowed to full screen. I did some research and fixed it by adding a registry key (a sketch of the tweak is below this post); it seems completely fixed now. Here's a thread on the issue, which seems to be a problem caused primarily by Microsoft: Disabling Multi-Plane Overlay (MPO) fixed all desktop flickering/stuttering on my 6900XT : Amd (reddit.com)
2. I was getting crashes in Cyberpunk. I had disabled the initial launcher and startup movies with some launch options, and it turns out this kept the game from detecting the GPU change correctly. After I removed the launch options the issue was fixed, and I haven't had any more crashes.

I hope my issues are over now. This was by far my most expensive GPU purchase to date, and I would be very disappointed if I continue to have issues.
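
For anyone who'd rather script the MPO tweak from that Reddit thread than merge a .reg file, here's a hedged sketch in Python. Assumptions to note: Windows-only, must be run as Administrator, and the value name/location are the commonly circulated workaround; double-check them against the linked thread before applying, and delete the value to revert.

```python
import winreg

# The widely circulated MPO-disable tweak: OverlayTestMode = 5 under
# HKLM\SOFTWARE\Microsoft\Windows\Dwm tells DWM to stop using
# multi-plane overlays. Run as Administrator; delete the value to undo.
DWM_KEY = r"SOFTWARE\Microsoft\Windows\Dwm"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, DWM_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "OverlayTestMode", 0, winreg.REG_DWORD, 5)

print("MPO disabled; sign out or reboot for the change to take effect.")
```

The equivalent .reg file simply sets "OverlayTestMode"=dword:00000005 under that key.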
 

hardhat

Senior member
Dec 4, 2011
Question: Now that I have a GPU that can generate far more FPS than my display requires, are there any advantages to letting the GPU render frames above 60 fps? And what is the best way to limit FPS in the AMD software?
I have a 1080p 60 Hz monitor for now. I will probably move to a 1440p 120 Hz monitor later this year.
 

DrMrLordX

Lifer
Apr 27, 2000
hardhat said:
Question: Now that I have a GPU that can generate far more FPS than my display requires, are there any advantages to letting the GPU render frames above 60 fps? And what is the best way to limit FPS in the AMD software?
I have a 1080p 60 Hz monitor for now. I will probably move to a 1440p 120 Hz monitor later this year.

The best way is probably to use a frame limiter built into the game itself; most modern games have the option. Alternatively, you can turn on vsync and enjoy all the sweet input lag.
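
The distinction matters because a limiter and vsync cap frames differently: a limiter just sleeps off the unused part of each frame's time budget and then samples input fresh, while vsync blocks on the display's refresh and can queue up already-rendered (stale) frames. Here's a toy sketch of the pacing loop, to illustrate the idea only; render_frame is a hypothetical stand-in, and real limiters use higher-resolution timers:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # time budget per frame at the cap

def render_frame() -> None:
    """Hypothetical stand-in for the game's simulation + render work."""
    time.sleep(0.002)  # pretend the GPU finished the frame in 2 ms

start = time.perf_counter()
deadline = start
frames = 300
for _ in range(frames):
    render_frame()
    # Sleep off the unused frame budget instead of rendering again
    # immediately; the GPU idles (and draws less power) during the gap.
    deadline += FRAME_TIME
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    else:
        deadline = time.perf_counter()  # fell behind; resync the pacing

elapsed = time.perf_counter() - start
print(f"average fps: {frames / elapsed:.1f}")  # ~60.0
```

Capping this way also cuts GPU load and power in games that would otherwise render hundreds of frames nobody sees.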
 
  • Haha
Reactions: igor_kavinski

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
DrMrLordX said:
Yes. It is a joke.
I LOL'd because it's such an ambiguous answer. It leaves open whether you're saying you were cracking a joke, or that the latency penalty at a locked 60 is the joke.

If the latter: I use the anti-lag feature when I play at a locked 60, and I don't have any difficulty hitting my timing during boss battles in Wo Long, so it must be working?
 

DrMrLordX

Lifer
Apr 27, 2000
DAPUNISHER said:
I LOL'd because it's such an ambiguous answer. It leaves open whether you're saying you were cracking a joke, or that the latency penalty at a locked 60 is the joke.

If the latter: I use the anti-lag feature when I play at a locked 60, and I don't have any difficulty hitting my timing during boss battles in Wo Long, so it must be working?

Well, that's good to know. Personally, I haven't used vsync in I-don't-know-how-long, and with all the adaptive sync and G-Sync implementations out there now, it's hard to see why anyone would use traditional vsync anymore.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
DrMrLordX said:
Well, that's good to know. Personally, I haven't used vsync in I-don't-know-how-long, and with all the adaptive sync and G-Sync implementations out there now, it's hard to see why anyone would use traditional vsync anymore.
I play a lot on the TV in the living room. I'll get a gaming TV one of these days.
 

videopho

Diamond Member
Apr 8, 2005
I wonder whether the less powerful sibling, the 7900 XT, would be more worth it for me, since I have to undervolt and underclock to keep junction temps below 100°C on my 7900 XTX.
And what percentage of the 7900 XTX's overall performance would the 7900 XT deliver on a 5K monitor?