> Hey guys and gals... My junction and GPU temps are around the high 90s C and high 50s C respectively, a delta of about 40 C.
> I've been reading that some AMD reference cards have had a history of extremely high junction temps, which sort of worries me.
> Would the wide delta gap be a cause for concern?

It sounds to me like the hot spot shouldn't be that hot. I don't think my temps ever got that hot, and my card is air cooled. You might want to reach out to PowerColor and see what they say about those temps.
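For anyone unsure what the "delta" being asked about means: it's just the junction (hotspot) sensor reading minus the edge (GPU) reading. A throwaway sketch with the post's rough numbers (the values are illustrative, not measurements):

```python
# Illustrative numbers only, loosely taken from the post above; not measured data.
def junction_delta(edge_c: float, junction_c: float) -> float:
    """Hotspot (junction) minus edge temperature, in degrees C."""
    return junction_c - edge_c

edge = 58.0      # "high 50s" GPU edge reading
junction = 98.0  # "high 90s" junction / hotspot reading
print(junction_delta(edge, junction))  # -> 40.0
```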
> AMD has a lot of kinks to work out in multi-monitor setups. One of my displays doesn't receive a signal on PC startup, and I have to power cycle it before it shows anything. Sometimes HDR will freak out on my TV when I'm playing a game and alt-tab out to Windows for something. It also seems like display numbering is random and not persistent, so a game that launched on one display yesterday will launch on a different one today. Also, idle power consumption is still around 90 W.
> None of those things happened on my 3090 with identical displays.

I don't have any of those problems with my 6800 XT and three displays.
> Hey guys and gals... My junction and GPU temps are around the high 90s C and high 50s C respectively, a delta of about 40 C.
> I've been reading that some AMD reference cards have had a history of extremely high junction temps, which sort of worries me.
> Would the wide delta gap be a cause for concern?

That seems like a high delta, but,
After several years of flawless service, my RTX 2060 is officially retired, replaced with an ASRock Phantom 7900 XT; I took advantage of the drop to $799 plus The Last of Us bundle.
For the first time in a very long time, I now have the joys of troubleshooting crashing games.
Despite strong disagreements with RDR2 and The Witcher 3, I'm enjoying the quiet versus the (single-fan) 2060, which would get pretty loud under load.
Fortunately I finished The Witcher 3 a while back and only loaded it up to see what max-everything at 1440p would look like now that I have a card with horsepower.

DX12 in The Witcher 3 is pretty busted with 7900s. I put about 130 hours into it on my 7900 XTX, probably 90% of that in DX12 with RT. I had dozens of crashes in total, but after a while I figured out some patterns that avoid certain crashes, and I enjoyed the look of RT enough that I dealt with the purely random crashes every couple of hours. Save and switch to DX11 if:
- you need to use Igni or Aard;
- you're entering an area with portals;
- you're in a story mission with scripted events and you get one crash (it will almost certainly repeat when you try again); or
- there are particle effects on screen at the time of a crash and it's something you know you have to do again (e.g. a crash when a monster uses a certain ability).
> Not all games I play reach temps that high, only the games that run the junction temp up. This might just do the trick for me. I tuned it to a 95% underclock and a 97% undervolt, which brings the junction temp from above 100 C down to the low 90s.

If you are in North America, and the room the PC is in will be hotter in August than it is now, I would seriously consider an RMA.
> What is the best way to limit the FPS in the AMD software?

Chill your card: https://www.amd.com/en/technologies/radeon-software-chill
Question: Now that I have a GPU that can generate massive FPS, far beyond what my display requires, are there any advantages to letting the GPU render frames above 60 FPS? What is the best way to limit the FPS in the AMD software?
I have a 1080p 60 Hz monitor for now. I will probably go to a 1440p 120 Hz monitor later this year.
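For what it's worth, the idea behind any FPS limiter is just frame pacing: render a frame, then sleep until the next frame's deadline. A minimal sketch in Python (this is not how AMD's driver-level limiter or Radeon Chill is implemented internally, just the general technique):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms budget per frame at 60 Hz

def render() -> None:
    pass  # stand-in for the game's actual frame work

def run(frames: int) -> None:
    """Render `frames` frames, sleeping so we never exceed TARGET_FPS."""
    deadline = time.perf_counter()
    for _ in range(frames):
        render()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            # Idle instead of rendering frames a 60 Hz panel could never show.
            time.sleep(remaining)

run(3)
```

Capping at the display's refresh rate mostly saves power and heat, since frames above 60 FPS are never shown on a 60 Hz panel; uncapped rendering can still shave a little input latency, which is the main argument for leaving it unlimited.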
> Alternatively, you can turn on vsync and enjoy all the sweet input lag.

Is this meant to be a joke?
> Is this meant to be a joke?

Yes. It is a joke.

> Yes. It is a joke.

I LOL'd, because it's such an ambiguous answer. It leaves open whether you're saying you were cracking a joke, or whether the latency penalty at locked 60 is the joke.
If the latter: I use the anti-lag feature when I play at locked 60. I don't have any difficulty hitting my timing during boss battles in Wo Long, so it must be working?
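To put rough numbers on the "latency penalty at locked 60": with vsync, a finished frame waits for the next vblank, and double buffering can hold it for up to about two refresh intervals. A back-of-envelope calculation (illustrative bounds, not measurements of any particular game or driver):

```python
refresh_hz = 60
frame_ms = 1000 / refresh_hz  # one refresh interval: ~16.7 ms

# Rough double-buffered vsync bounds: a frame that finishes right before
# vblank adds ~0 ms; one that just misses it can wait up to ~2 intervals.
added_min_ms = 0 * frame_ms
added_max_ms = 2 * frame_ms
print(f"~{added_min_ms:.0f}-{added_max_ms:.0f} ms of extra latency at {refresh_hz} Hz")
# -> ~0-33 ms of extra latency at 60 Hz
```

That worst case of roughly two frames is why anti-lag and adaptive sync features exist: they try to keep the render queue short so a fresh frame is what gets displayed.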
> Well, that's good to know. Personally I haven't used vsync in I-don't-know-how-long, and with all the adaptive sync and G-Sync implementations out there now, it's hard to see why anyone would still use traditional vsync.

I play a lot on the TV in the living room. I'll get a gaming TV one of these days.

> I play a lot on the TV in the living room. I'll get a gaming TV one of these days.

What exactly is a gaming TV?