8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060Ti 8GB.
[benchmark chart: Horizon Forbidden West]
Resident Evil Village: the 3060TI/3070 tank at 4K and are slower than the 3060/6700XT when ray tracing:
[benchmark chart: Resident Evil Village]
Company Of Heroes: the 3060 has a higher minimum than the 3070TI:
[benchmark chart: Company Of Heroes]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

BFG10K

Lifer
Aug 14, 2000
First properly documented case of 12GB not enough.

Hogwarts Legacy 4K + RT, 4060Ti 16GB is faster than 4070.


This is exactly how the thread started about 8GB, with just one single game to begin with (Doom Eternal). And now we have ~25 games.

But...but...we were told 8GB has been "fixed" with patches. Maybe they also need to patch 12GB. o_O
 

Jaskalas

Lifer
Jun 23, 2004
First properly documented case of 12GB not enough.

Hogwarts Legacy 4K + RT, 4060Ti 16GB is faster than 4070.


This is exactly how the thread started about 8GB, with just one single game to begin with (Doom Eternal). And now we have ~25 games.

But...but...we were told 8GB has been "fixed" with patches. Maybe they also need to patch 12GB. o_O
To be fair, the 8gb thing is for 1080p, whereas your 12gb example is 4k. But no card with 12gb is actually meant for that resolution. :wink:
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
4070Ti can match the 3090Ti so I don't think saying it is not a 4k card is accurate.
The wink indicates he wasn't serious.

There is no real debate about VRAM to be had. From the start of PC gaming, hardware demands have constantly increased. That has not changed, and will not change for the foreseeable future. 8GB was flagship many years ago; now it is entry level, or it should be anyway (which is why we are looking at you, Nvidia). It isn't a debate, just irrational users stuck in the first stage of the Kübler-Ross model.

More vram is better for gaming until it reaches a point of diminishing returns. And that point definitely ain't 8GB.

"Hurr Durr they have to cater to us low vram owners we are the majority!" They will. Enjoy your last-gen console experience, possibly worse, in new AAA games, while the rest of us see what Nanite can really offer.
 

psolord

Platinum Member
Sep 16, 2009
First properly documented case of 12GB not enough.

Hogwarts Legacy 4K + RT, 4060Ti 16GB is faster than 4070.


This is exactly how the thread started about 8GB, with just one single game to begin with (Doom Eternal). And now we have ~25 games.

But...but...we were told 8GB has been "fixed" with patches. Maybe they also need to patch 12GB. o_O
You are being serious right now? I mean really serious? And THIS is how the 8GB thread started? And people are getting their pitchforks every time a post like this is posted?

You are showing 19fps vs 15fps, which are both USELESS, and you draw your conclusions from that? Both of these cards are NOT meant for these 4k/high/rt settings. Not even the 4070ti.

How about that 1440p result, which is 51 vs 36 and makes the 4070 42% faster, on usable settings? Not a peep about that, hmmm? And the saddest part is that you are missing the implications: the 4070 could do 4k/high/rt/dlss balanced at 60fps, while the 4060ti 16gb cannot.
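For what it's worth, a quick sketch of the arithmetic behind that claim (the fps figures are the ones quoted above; the DLSS per-axis scale factors are the commonly cited ones, not something from this thread, so treat them as an assumption):

```python
# Relative speed from the quoted 1440p result, plus a rough pixel-count comparison
# for 4K + DLSS Balanced. The 0.58 scale factor is the commonly cited per-axis value
# for Balanced mode and is an assumption here, not data from this thread.
fps_4070, fps_4060ti_16gb = 51, 36
print(f"4070 advantage at 1440p: {fps_4070 / fps_4060ti_16gb - 1:.0%}")  # ~42%

native_1440p = 2560 * 1440
balanced_4k = int(3840 * 0.58) * int(2160 * 0.58)  # internal render resolution at 4K Balanced
print(f"4K DLSS Balanced renders ~{balanced_4k / native_1440p:.0%} of native 1440p's pixels")
```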
 

BFG10K

Lifer
Aug 14, 2000
You are being serious right now? I mean really serious? And THIS is how the 8GB thread started? And people are getting their pitchforks every time a post like this is posted?
If you're expecting someone to click every link presented in this thread and read the contents aloud to you, that's not going to happen.

Likewise, filling the forum with screenshot masturbation doesn't change facts, no matter how badly you want it to.

I assume you don't own NV shares? And I also assume you're very happy with your $800 12GB card?

Why then, instead of basking in the warm glow of your GPU(tm), do you spend time trying to turn anonymous internet people from their wicked ways, like a crusading apostle fighting on behalf of the leather jacket messiah?
 

psolord

Platinum Member
Sep 16, 2009
Another Hardware Labs vid showing why $300-$400 8GB cards are a bad purchase. This time exclusively UE5 games.

So let's see

Fort Solis
1440p 6700xt=20fps 4060ti8GB=25fps
1080p 6700xt=44fps 4060ti8GB=54fps

Remnant II
1440p 6700xt=46fps 4060ti8GB=46fps
1080p 6700xt=70fps 4060ti8GB=74fps

Immortals of Aveum
1440p 6700xt=41fps 4060ti8GB=37fps
1080p 6700xt=72fps 4060ti8GB=70fps

So out of the three, we have one draw and one loss for each card. Actually, according to gamegpu.tech, the 4060ti is faster in Immortals of Aveum at both of these resolutions, but let's say AMD improved the drivers and surpassed the 4060ti.
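Purely as an illustration of how close these runs are, the same numbers expressed as percentage deltas (nothing new here, just the figures quoted above):

```python
# Relative difference of the 4060 Ti 8GB versus the 6700 XT, using the fps figures above.
results = {
    "Fort Solis 1440p": (20, 25),
    "Fort Solis 1080p": (44, 54),
    "Remnant II 1440p": (46, 46),
    "Remnant II 1080p": (70, 74),
    "Immortals of Aveum 1440p": (41, 37),
    "Immortals of Aveum 1080p": (72, 70),
}
for name, (fps_6700xt, fps_4060ti) in results.items():
    delta = (fps_4060ti - fps_6700xt) / fps_6700xt
    print(f"{name:26s} 4060 Ti 8GB vs 6700 XT: {delta:+.0%}")
```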

[screenshot: Immortals of Aveum PC performance benchmarks (gamegpu.tech)]

There is zero indication of anything having to do with VRAM for these UE5 games, but I am glad you chose your wording carefully and focused on the price and not the VRAM.

Also, the VRAM findings for all three show nothing about 8GB not being enough, at least for the resolutions these cards are meant for.

[screenshot: Fort Solis PC performance benchmarks]

[screenshot: Immortals of Aveum PC performance benchmarks]

[screenshot: Remnant II PC performance benchmarks]

And one more thing: these are the Remnant II benchmark results from gamegpu.

[screenshot: Remnant II PC performance benchmarks (gamegpu)]

As you can see, the rx6600 gets 41fps at 1080p. That's bad. However, with the correct settings you can play at 60fps easily. Case in point: my run of this game yesterday, on the ancient 2500k no less (non-monetized channel). So yes, settings are important.

 

psolord

Platinum Member
Sep 16, 2009
If you're expecting someone to click every link presented in this thread and read the contents aloud to you, that's not going to happen.
At the very least, I am expecting people to read and view the contents for THEMSELVES, because they are not even doing that. What they ARE doing is seeing a post against lower VRAM and rushing to like it, without even knowing wtf it is all about.

Likewise, filling the forum with screenshot masturbation doesn't change facts, no matter how badly you want it to.
So actually SHOWING data is screenshot masturbation to you? What exactly are the FACTS behind your

"First properly documented case of 12GB not enough."

15 vs 19fps at 4k/RT for cards that are not meant for that? Wow really? Is that your idea of PROPER documentation? You want a medal or something?

I assume you don't own NV shares? And I also assume you're very happy with your $800 12GB card?
No I don't own NV shares and yes I am super happy with the 4070ti. I can use it from 8W to 270W and I set it up however I want, because I CAN.


Why then, instead of basking in the warm glow of your GPU(tm), do you spend time trying to turn anonymous internet people from their wicked ways, like a crusading apostle fighting on behalf of the leather jacket messiah?
People that post stupid videos without a shred of real substance, while showing a fixation on the CEO of a tech company, are wicked and blinkered fanatics. I am not crusading FOR Nvidia or any other commercial entity. I am crusading for the PC and how people should view its gaming capabilities. That's why I put personal work behind my words.

Btw it's super funny calling me an apostle while you have a list of COMMANDMENTS in the OP!? Nooo, you are not going to set your settings correctly or you will buuuurrrnnn in heeelllll.
 

Thunder 57

Diamond Member
Aug 19, 2007
At the very least, I am expecting people to read and view the contents for THEMSELVES, because they are not even doing that. What they ARE doing is seeing a post against lower VRAM and rushing to like it, without even knowing wtf it is all about.


So actually SHOWING data is screenshot masturbation to you? What exactly are the FACTS behind your

"First properly documented case of 12GB not enough."

15 vs 19fps at 4k/RT for cards that are not meant for that? Wow really? Is that your idea of PROPER documentation? You want a medal or something?


No I don't own NV shares and yes I am super happy with the 4070ti. I can use it from 8W to 270W and I set it up however I want, because I CAN.



People that post stupid videos without a shred of real substance, while showing a fixation on the CEO of a tech company, are wicked and blinkered fanatics. I am not crusading FOR Nvidia or any other commercial entity. I am crusading for the PC and how people should view its gaming capabilities. That's why I put personal work behind my words.

Btw it's super funny calling me an apostle while you have a list of COMMANDMENTS in the OP!? Nooo, you are not going to set your settings correctly or you will buuuurrrnnn in heeelllll.

Dude, no one is going to go through all of your last posts. The fact that you worked so hard to put them together proves your agenda. Also, comparing last gen cards to current ones? LoL That's as far as I got.
 

Bigos

Member
Jun 2, 2019
I agree with one point of psolord - that comparing 15fps against 19fps is just not really useful. We would need a theoretical 4070/4070ti with >=16GB of VRAM to see if it would be "playable" (at least 30fps) to show that 12GB hinders this card. Assuming 4070Ti is +50% faster than 4060Ti in general, that could be enough. Regular 4070 would still be bad (Pokemon Scarlet/Violet level), but not as bad.
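A minimal back-of-envelope of that scaling argument, assuming the quoted 19 fps figure for the 4060 Ti 16GB and purely hypothetical generational uplifts (the +25%/+50% numbers are assumptions, not measurements):

```python
# Would a VRAM-unconstrained 4070 / 4070 Ti clear a 30 fps floor at these settings?
fps_4060ti_16gb = 19            # quoted Hogwarts Legacy 4K + RT result
playable_floor = 30             # threshold used in the post above
assumed_uplift = {"4070": 1.25, "4070 Ti": 1.50}   # hypothetical gaps over the 4060 Ti

for card, uplift in assumed_uplift.items():
    projected = fps_4060ti_16gb * uplift
    print(f"{card}: ~{projected:.1f} fps vs a {playable_floor} fps floor")
# 4070    -> ~23.8 fps (still well short, as the post says)
# 4070 Ti -> ~28.5 fps (borderline; a bit more than +50% would clear 30)
```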

This does not change the fact that 12GB of VRAM on an $800 card is very stupid.
 

Mopetar

Diamond Member
Jan 31, 2011
So let's see

Fort Solis
1440p 6700xt=20fps 4060ti8GB=25fps
1080p 6700xt=44fps 4060ti8GB=54fps

Remnant II
1440p 6700xt=46fps 4060ti8GB=46fps
1080p 6700xt=70fps 4060ti8GB=74fps

Immortals of Aveum
1440p 6700xt=41fps 4060ti8GB=37fps
1080p 6700xt=72fps 4060ti8GB=70fps

So out of the three, we have one draw and one loss for each card. Actually, according to gamegpu.tech, the 4060ti is faster in Immortals of Aveum at both of these resolutions, but let's say AMD improved the drivers and surpassed the 4060ti.

[screenshot: Immortals of Aveum PC performance benchmarks (gamegpu.tech)]

There is zero indication of anything having to do with VRAM for these UE5 games, but I am glad you chose your wording carefully and focused on the price and not the VRAM.

Also, the VRAM findings for all three show nothing about 8GB not being enough, at least for the resolutions these cards are meant for.

[screenshot: Fort Solis PC performance benchmarks]

[screenshot: Immortals of Aveum PC performance benchmarks]

[screenshot: Remnant II PC performance benchmarks]

And one more thing: these are the Remnant II benchmark results from gamegpu.

[screenshot: Remnant II PC performance benchmarks (gamegpu)]

As you can see, the rx6600 gets 41fps at 1080p. That's bad. However, with the correct settings you can play at 60fps easily. Case in point: my run of this game yesterday, on the ancient 2500k no less (non-monetized channel). So yes, settings are important.


I know you've been told this dozens of times already, but in case it's really the thirty-third time that is the charm, posting a single example does not disprove the thesis of this thread. Nor does any number of cherry-picked examples.

If I were to take your own line of argument to extremes I could find a game where 4 GB or even 2 GB of VRAM is sufficient and make the same facile argument as you are. Hell, I could pick something like Dwarf Fortress to demonstrate that GPUs are in fact outright useless.

Then I can post a bunch of charts and graphs to make my useless line of argumentation seem impressive. When no one is convinced by this inane display, I'll just do it again and again because clearly everyone else must be wrong for not having recognized the brilliance of my argument rather than there being some obvious flaw with it that's been pointed out over and over again. Of course I won't have time to read those since I'll be too busy bottling my own farts for future use.
 

BFG10K

Lifer
Aug 14, 2000
I agree with one point of psolord - that comparing 15fps against 19fps is just not really useful.
That isn't what anyone's specifically saying. If that's all someone focuses on, they're gonna have a bad time(tm).

The real issue here is:
  1. It's the first objective sign of cracks on 12GB, a harbinger of things to come. Much like Doom Eternal on 8GB which eventually opened the floodgates.
  2. A cheaper card should never be faster than a more expensive card from the same generation.
As for playable vs unplayable, we have many examples in this thread. Here's one from the OP, I really can't make it any clearer:


64FPS playable vs 28FPS unplayable.

But when you have someone arguing the equivalent of "I can't see any elephants in my backyard, here are 37 screenshots of my empty yard, therefore elephants don't exist!", there's either a serious lack of comprehension going on, or some kind of willful agenda.

We would need a theoretical 4070/4070ti with >=16GB of VRAM to see if it would be "playable" (at least 30fps) to show that 12GB hinders this card.
We already have multiple examples of this as well. Both versions of 4060TI have been tested where the sole difference is VRAM. Also both a professional and modded variant of the 3070 with 16GB have been tested against a stock 3070. They all prove the OP.

It's around about this time some people start giving us the 20 reasons "none of this is NV's fault".
 

nurturedhate

Golden Member
Aug 27, 2011
Looks like we got a Rollo 2.0 here, folks. Added to permanent ignore. Bye bye!

Wow, such a relief, no longer will I have to constantly scroll through his screenshot splooge.
It's not Rollo 2.0... It's Happy Medium telling us how the GTX 960 2gb is perfectly fine.

This isn't even the first, second or third time this thread has existed either. It's basically every new console cycle that we have this same conversation where the new minimum changes; then in come the corporate sponsors and the unfortunate cognitive dissonance brigade... again and again and again. Want to know the best part? We'll eventually have posts in a year or two from people telling us how glad they are they bought the higher-VRAM cards.

How do I know this? Here we are again, my old friend.
[attached image]
 

KompuKare

Golden Member
Jul 28, 2009
It's not Rollo 2.0... It's Happy Medium telling us how the GTX 960 2gb is perfectly fine.

This isn't even the first, second or third time this thread has existed either. It's basically every new console cycle that we have this same conversation where the new minimum changes; then in come the corporate sponsors and the unfortunate cognitive dissonance brigade... again and again and again. Want to know the best part? We'll eventually have posts in a year or two from people telling us how glad they are they bought the higher-VRAM cards.

How do I know this? Here we are again, my old friend.
[attached image]
Thing is, in a lot of forums the "is X enough VRAM" threads eventually get locked. Not because of "heated" discussions, but rather because those threads seem to attract posters who go out of their way to argue that... it's sort of hard to say what their argument is, but it almost always ends up with the thread being locked.

Now, I am not always sure what passes for entertainment on internet forums, but equally shutting down VRAM threads might even serve some agenda.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Wasn't Diablo 4 the first documented case of 12GB not being enough? Sometimes even 16GB is not enough. Or is D4 so bad that it doesn't matter?
While player retention and gameplay quality are generally considered teh suck, it's a massive financial success. The headline everywhere was that it made $666 million in the first 5 days.🤘 They all thought that was clever, I guess. So yeah, it matters.

As @nurturedhate, myself, and others have written: it's the same as it ever was. There is no debate, just people in the denial, anger, and bargaining stages. I am not certain the depression and acceptance stages ever happen, though. The denial is often so strong that it leads to delusion. I submit into evidence that they have already convinced themselves DLSS is better than native res. They will apply more aggressive upscaling or whatever else they need to do to avoid accepting reality.

The problem, IMO, is that by buying the crippled cards, they send Nvidia the message that they can continue overcharging for VRAM.
 

psolord

Platinum Member
Sep 16, 2009
Dude, no one is going to go through all of your last posts. The fact that you worked so hard to put them together proves your agenda. Also, comparing last gen cards to current ones? LoL That's as far as I got.
I did not post the 6700xt vs 4060ti video, mate, nor did I create this comparison. I just wrote down the results, because no one actually watched it and they are applauding a non-essential difference. The comparison is basically a draw. It could be the same with a 3060ti for all we know, since the 3060ti is so close to the 4060ti.

Looks like we got a Rollo 2.0 here, folks. Added to permanent ignore. Bye bye!

Wow, such a relief, no longer will I have to constantly scroll through his screenshot splooge.
Ignoring screenshots with data and then adding the poster to a block list....

The first guy in this thread who goes against his commandments by actually providing data and personal work, and he adds him to an ignore list. Lol, what a biased snowflake.
 

psolord

Platinum Member
Sep 16, 2009
I know you've been told this dozens of times already, but in case it's really the thirty-third time that is the charm, posting a single example does not disprove the thesis of this thread. Nor does any number of cherry-picked examples.
If you have told me a number of times already, while I've been posting examples that show something different from what the title of this thread says, then there must be quite a few examples I have posted already, in just a few days of contradictions. This thread has 25 games since it started. How are MY examples cherry-picked and not YOURS?

If I were to take your own line of argument to extremes I could find a game where 4 GB or even 2 GB of VRAM is sufficient and make the same facile argument as you are. Hell, I could pick something like Dwarf Fortress to demonstrate that GPUs are in fact outright useless.
I have not shown extremes though. You are the ones showing extremes, i.e. using cards with the wrong settings in order to FORCE them to provide a bad gaming experience. Like that 19 vs 15 fps example, or Deliver Us Mars at 1440/Ultra AND RT.

On the contrary, I talked about these three UE5 games, Immortals of Aveum, Fort Solis and Remnant 2, plus Layers of Fear (a lighter UE5 game), Lies of P, Starfield and Baldur's Gate 3, all of which have no VRAM problems for cards that are meant to be used at their respective resolutions. None of these are indie games, mate.

Then I can post a bunch of charts and graphs to make my useless line of argumentation seem impressive. When no one is convinced by this inane display, I'll just do it again and again because clearly everyone else must be wrong for not having recognized the brilliance of my argument rather than there being some obvious flaw with it that's been pointed out over and over again. Of course I won't have time to read those since I'll be too busy bottling my own farts for future use.
Why isn't anyone convinced? Easy. Because this is an AMD vs Nvidia thread in disguise and people in this forum hate Nvidia.

Can you tell me EXACTLY what part of the charts did not convince you? Because they prove what I say about the aforementioned games, and yet people choose to discard them. This is willful blinkering 101.
 

psolord

Platinum Member
Sep 16, 2009
As for playable vs unplayable, we have many examples in this thread. Here's one from the OP, I really can't make it any clearer:

64FPS playable vs 28FPS unplayable.
Finally, a good effin' example. And it's so sad that you cannot see me agreeing with you, since you have now blocked me.

I could still argue that I wouldn't get any of these cards for 4K, but it is a valid, usable example at the very least. Now if this happens for 51% of the games, the thread title will have some meaning.
We already have multiple examples of this as well. Both versions of 4060TI have been tested where the sole difference is VRAM. Also both a professional and modded variant of the 3070 with 16GB have been tested against a stock 3070. They all prove the OP.
In how many games? HOW MANY? I showed you the TechPowerUp review with only a 2% difference, but NO, what YOU say is correct. What percentage of these games run better on the 16GB card?

The real issue here is:
  1. It's the first objective sign of cracks on 12GB, a harbinger of things to come. Much like Doom Eternal on 8GB which eventually opened the floodgates.
  2. A cheaper card should never be faster than a more expensive card from the same generation.
1. 19 vs 15 fps is NOT an objective sign of cracks.

2. Correction: a cheaper card should never be faster than a more expensive card from the same generation, if the results are actually usable for gaming, meaning above 60fps. Showing 19 vs 15 fps, while the exact previous result shows +41% better performance for the more expensive card at usable settings, is pure bias.
 

Ranulf

Platinum Member
Jul 18, 2001
2. Correction: a cheaper card should never be faster than a more expensive card from the same generation, if the results are actually usable for gaming, meaning above 60fps.

Well, if we're going by 60fps... what is the motivation to move beyond a 2060/2060S/3060? Frame generation? It certainly isn't 8GB of ram at $400. This Nvidia slide is for 1080p at max settings.

[Nvidia slide: GeForce RTX 4060 gaming performance]
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Bar charts are another anachronism. If you are lucky, they are based on three 60-second runs. Most review runs are shorter than that. Otherwise it's canned benchmarks that are rarely as stressful as the actual gameplay.

Next: you have no idea if assets like textures are loading, popping in, are reduced in quality, etc. 1% and 0.1% lows only go so far in helping to explain how bad the hitching, stuttering, and freezing can be. All of these things can take far longer than a bench run lasts to rear their ugly heads in game.
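For anyone unfamiliar with how those figures are usually derived, here is a minimal sketch of one common way to turn a frame-time trace into 1% / 0.1% lows (methods vary between outlets; this is illustrative only):

```python
import numpy as np

def low_percentile_fps(frame_times_ms, worst_fraction=0.01):
    """Average the slowest fraction of frames and convert to fps (one common convention)."""
    ft = np.sort(np.asarray(frame_times_ms, dtype=float))[::-1]  # slowest frames first
    n = max(1, int(len(ft) * worst_fraction))                    # worst 1% (or 0.1%) of frames
    return 1000.0 / ft[:n].mean()

# Toy trace: mostly ~16.7 ms (60 fps) with a handful of long hitches
trace = [16.7] * 990 + [50.0] * 8 + [120.0] * 2
print(round(low_percentile_fps(trace, 0.01), 1))    # 1% low, dragged down by the hitches
print(round(low_percentile_fps(trace, 0.001), 1))   # 0.1% low, dominated by the worst frames
```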

Everyone has been slinging bar charts around for so long in discussions and debates that most just accept them. I don't.

In the Hardware Labs vid: testing right off the bat in Fort Solis, a $400 8GB card is failing to load textures. It looks like petroleum jelly smeared on the screen. Bar charts don't show that. I'll keep my own counsel on how to evaluate gaming performance, because it damned sure isn't the bullcrap now being shoveled in this thread.
 

psolord

Platinum Member
Sep 16, 2009
Well, if we're going by 60fps... what is the motivation to move beyond a 2060/2060S/3060? Frame generation? It certainly isn't 8GB of ram at $400. This Nvidia slide is for 1080p at max settings.

[Nvidia slide: GeForce RTX 4060 gaming performance]
I never used framegen as an argument here. I only once said that it could help some cases with CPU limits, nothing more. None of my own 8GB cards have framegen capabilities. For now at least; they may get it with FSR3, but I don't care. I think.


Speaking for myself, I moved from a 1070 to a 3060ti to a 4070ti. All of these were 2x performance upgrades. The rx6600 is a sidegrade for a backup system, but it is the most fun to test because it is a frankenPC. I still use my 3060ti for proper gaming. I continue my saves from the 4070ti (different house), but at 1080p, and sometimes I reduce the settings too, due to processing power 99% of the time and not VRAM. Also, to be clear, the 4070ti is a 4k/60/dlss card for me, not a straight 4k card.

Now what Nvidia is trying to show in this slide, aside from framegen, I am not sure, but if you want to see real differences between 8GB cards, you can see the UE5 slides I uploaded above. I am not posting them again, because if you post charts in this forum, apparently you have an agenda...

Speaking of UE5, the aforementioned UE5 games Immortals and Remnant run at a 720p base resolution for the 60fps performance mode on the consoles. You can see it in DF videos. I am not posting screenshots because I have an agenda. I don't know about Fort Solis, but I'm 100% sure it will be the same. In Layers of Fear, the consoles have a performance mode at 60fps with no RT, while the 8GB rx6600 can do both. What I am saying is, the extra VRAM ain't helping in UE5. For now at least. If things change, I will recant. I am not a pussy.
 
Mar 11, 2004
It's not Rollo 2.0... It's Happy Medium telling us how the GTX 960 2gb is perfectly fine.

This isn't even the first, second or third time this thread has existed either. It's basically every new console cycle that we have this same conversation where the new minimum changes; then in come the corporate sponsors and the unfortunate cognitive dissonance brigade... again and again and again. Want to know the best part? We'll eventually have posts in a year or two from people telling us how glad they are they bought the higher-VRAM cards.

How do I know this? Here we are again, my old friend.
[attached image]

Oh jeez, do they still post here?

Maybe all 3 are Rollo?

I forget which of the other old tech sites it was, but I recall scrolling to the comments of one of the articles and finding that poster openly gloating about how they had supposedly tricked this forum by trolling and getting away with it because the mods weren't willing to ban them over it. Funnily, though, they were also simultaneously whining about how biased this forum is against Nvidia, that the mods were unfair to them, and I forget what else. It was like a masterclass in an idiot thinking they were tricking everyone while deluding themselves more than anyone. They genuinely thought everyone disagreeing with them (over what they admitted was outright trolling) was trolling them, hence their justification for acting like that. It was... I'm not sure what word describes confirming something you already knew, with just a bit of enlightenment in seeing it fully revealed as fact because the idiot tried to brag about it while genuinely believing they were fooling anyone. Not since a recent former US President and the antics of literally mentally ill teenagers have I seen such behavior.