DrMrLordX
Lifer
- Apr 27, 2000
> ETH mining performance. It's pretty bad, so lucky us gamers. Only 60 MH/s.

Anyone want to trade me a 6800XT for my Radeon VII? It mines faster than that!!!

> Anyone want to trade me a 6800XT for my Radeon VII? It mines faster than that!!!

(lol)
> I have not read the Anandtech reviews of the Nvidia 3XXX and the Radeon 6XXX cards. If Anandtech has decided not to review GPUs which are not really available to the general public, that is commendable, but you should at least publish an editorial explaining why you are doing it.

There might be a reason for that.
@VirtualLarry You may be right about the bus; it is likely somewhat limiting for ETH. That said, optimized miners/settings can help a lot, as we have seen before with other cards. I am not saying it will be a better miner than Ampere, just that I expect some improvement.
Why do you guys think that it'll be true? I said mining performance would be mediocre because of the 256-bit bus. At 65 MH/s it's already taking advantage of the 128 MB cache, since 65 × 8 = 520 GB/s, which is higher than the card's memory bandwidth, and you have to knock off about 10% since it won't be 100% efficient.
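The arithmetic behind that claim can be sketched as a quick calculation. This is a minimal sketch assuming the classic Ethash access pattern (64 mix rounds of 128-byte DAG reads per hash, i.e. 8 KB per hash, which the post rounds to 8 GB/s per MH/s); the function name is my own, not anything from the thread.

```python
# Rough sanity check of the bandwidth argument above.
# Assumption: Ethash reads 64 mix rounds x 128 bytes = 8192 bytes
# of DAG data per hash, so required memory bandwidth scales
# linearly with hash rate.

DAG_BYTES_PER_HASH = 64 * 128  # 8192 bytes per Ethash hash

def required_bandwidth_gbps(hashrate_mhs: float) -> float:
    """GB/s of memory traffic needed for a given MH/s, assuming
    every DAG read misses cache and goes to memory."""
    return hashrate_mhs * 1e6 * DAG_BYTES_PER_HASH / 1e9

print(required_bandwidth_gbps(65))  # 532.48
```

At 65 MH/s that is roughly 530 GB/s of raw traffic, comfortably above a 256-bit GDDR6 card's memory bandwidth, which is why the post concludes the cache must already be absorbing a share of the reads.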
As for RT performance, a few sites are showing worse image quality, so even the lower performance is overstated. Tom's Hardware's results show the 6800's RT is blurry. I wouldn't use it at all if I were considering RT, as even Turing is superior once you account for the IQ.
It's a very good effort. Let's see RDNA3.
> Hardware unboxed RX6800 review. Makes 3070 look like entry level card.

Well, at the moment it is, if you only take current-gen hardware into account :>
> Hardware unboxed RX6800 review. Makes 3070 look like entry level card.

Maybe you should just change your profile photo to the AMD logo.
> But can you even game on an M1 mac?

One of the reviews I read had Tomb Raider going, but it's not what would be considered a good experience.
> But can you even game on an M1 mac?

Obviously not even close to a modern Windows PC, though likely better than on my 12-year-old Windows PC, so it won't be a step backwards for me. Codeweavers (Crossover) were already demonstrating Windows games running in emulation better than my old PC could handle them.
> But can you even game on an M1 mac?

Well, yes. Just not AAA titles in 4K with RT (gag).
> There might be a reason for that.

Poking every IHV on Twitter for the last couple of weeks...? I found it unprofessional for a site like AT.
> Maybe you should just change your profile photo to the AMD logo.

Maybe you should refrain from making it personal?
"superior" is a strong word for just 1% lower Cost per Frame at 1440pIf you changed the manufacturer name on the cards, you would be touting the superior "Cost per Frames" of the 3070, as was AMDs previous generation position.
> They are both good cards.

With just 8 GB of VRAM, the 3070 is not a good card for a $500 product in late 2020/early 2021.
> I've seen the reviews. Who the heck buys a 3080 to play at 1080p?

It's not as outrageous as you might think. I'm using strobing (v-sync on) on my monitor to remove blur, so wherever possible I try to use anywhere from 75 to 120 Hz refresh rate (pretty much as high as I can go), because 60 Hz is a bit flickery, dark, and laggy. And I need the fps to stay above the refresh rate, otherwise it stutters/judders.
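The "fps must stay above the refresh rate" point comes down to a frame-time budget: with strobing and v-sync, a frame that misses its refresh slot repeats and shows as a visible judder. A minimal sketch of the budgets involved (the function name is my own):

```python
# With v-sync/strobing, every frame must be rendered within one
# refresh interval; a miss means a repeated frame and visible judder.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 120):
    print(hz, round(frame_budget_ms(hz), 2))
# 60 -> 16.67 ms, 75 -> 13.33 ms, 120 -> 8.33 ms
```

This is why a 3080 at 1080p is defensible for strobed high-refresh play: at 120 Hz the GPU has only about 8.3 ms per frame, every frame, with no tolerance for dips.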
> There is variance in the reviews depending on scene, CPU, RAM, etc. There are titles where the 6800XT matches or even beats the 3090 at 4K (e.g. Dirt 5, AC Valhalla), so stating "3080 wins at 4K" is more of a narrative you want to believe than a fact.

Multi-game averages from multiple sites show it ahead. Pointing out a few specific titles is "creating a narrative you want to believe."
> Once PS5/XSX-focused cross-platform titles start pouring in, it won't look nice at all for Ampere. At least this time they have RT as a strong point, else it would have been Kepler vs GCN again.

Maybe, maybe not. AMD has had the consoles for a long time now, and you'd think you would already see amazing optimizations on current hardware. I bought AMD before on that premise and it didn't really materialize. We'll see; I'm evaluating it on how things stand now. Honestly, both are going to see improvements through driver optimizations. It's early days for the Ampere drivers as well.
> It's not as outrageous as you might think. I'm using strobing (v-sync on) on my monitor to remove blur, so wherever possible I try to use anywhere from 75 to 120 Hz refresh rate (pretty much as high as I can go), because 60 Hz is a bit flickery, dark, and laggy. And I need the fps to stay above the refresh rate, otherwise it stutters/judders.

Sure, I have a ULMB monitor as well and know all about it. Still, both tend to be more than fast enough at 1440p in the titles that matter for strobing displays to work well. I'm less concerned about single-player adventure-type games with it; adaptive sync is generally more important for those.
> Hardware unboxed RX6800 review. Makes 3070 look like entry level card.

And what the hell is happening with the 5700XT at 1440p? It is often as fast as a 2080 and sometimes even matches a 2080 Ti...
> And what the hell is happening with the 5700XT at 1440p? It is often as fast as a 2080 and sometimes even matches a 2080 Ti...

AMD's fine wine, once again.
> As has been said many times already.

But the AIB cards won't be the value that the reference ones are. They'll inflate the price.
Let's wait for the AIBs' inventory. AMD may simply have had a very, very small inventory of reference designs, and a much, much larger inventory of AIB GPUs. I don't really understand why we lose our minds about reference inventory. IF, and that is a big IF, the AIB inventory is as bad, that is the real problem.
> AMD's fine wine, once again. As we have seen in the past, AMD GPUs only become faster with new driver generations.

It's just the tide of new games getting the spotlight in reviews.
> Who the heck buys a 3080 to play at 1080p?

Gamers buying those monitors below.
> AMD's fine wine, once again. As we have seen in the past, AMD GPUs only become faster with new driver generations.

It's the new games that can take advantage of RDNA hardware.
> Heheheheh, look what happened once real reviews came out... Unlike those shown by AMD, these are different. Too bad that when I said this a month ago everyone downvoted me.

Oh, the great TPU, I didn't know they also did reviews of new hardware.
View attachment 34168