
Budget PC Gaming is dead! [Byte Size Tech]



Diamond Member
Jun 24, 2004
Maybe you're right; I didn't realize the consoles had become just as hard to find now. The eBay prices on them don't seem as bad as video cards, though.


Nov 27, 2001
To start off, I've never seen these two before, but man... the arrogance of "I'm right and you're wrong" is something that I have a hard time swallowing.

Anyway, as others have mentioned, I think their problem is that they define a card's or computer's usefulness by its capabilities in a single game. The guy then states that pointing out games like Shadow of the Tomb Raider isn't acceptable because the game came out two years ago. The thing is... you can't cherry pick a single game to make your point and then state that any other example isn't acceptable. To be fair, I understand that their point is to use a more modern game, because it better reflects the demands of today's games.

Here are a few reasons why that doesn't work:
  1. One game isn't a good metric. I'm not a huge fan of focusing solely on Cyberpunk, because not everyone wants to play Cyberpunk. An important aspect of shopping for a new graphics card isn't how well it does in all games, but rather how well it does in the games you want to play. Sure, performance in Cyberpunk may hint at future performance, but you can chase the elusive "future" goal for way too long.
  2. What's in a game? This relates to #1 in that I think it's important to understand why a game may not perform well. For example, if a game includes a new method of non-RT lighting that's likely to be widely adopted in the future, it may be worthwhile to pay attention to how cards perform in it. However, to my knowledge, Cyberpunk 2077 doesn't use any newer technologies. The game is simply very geometrically dense.
    1. Also, as a side note, I think we need to be wary of whether a game may not be well optimized yet; using it as a metric too early could misrepresent the usefulness of hardware.
  3. How often do we change resolution? I don't know about you, but I would say that monitors are probably one of the more static parts of my computer. I only recently changed one of my monitors due to an issue with a KVM, but even then, I stuck with the same size and resolution (27", 1440/QHD). (It's my second monitor too.) Part of budgeting for a computer is to understand your end goal. Do you want to play on 24", 1080p monitor(s), or large 32", 4K monitor(s)?


Diamond Member
Nov 16, 2006
I'll play devil's advocate:

The "point" of the video is to stir up fear that prices won't eventually come back down. As I said before, this has happened before, it's happening now, and it will happen again. Prices go up and hardware becomes scarce whenever something big comes out or there's some kind of shortage. Remember when bitcoin mining became a huge thing? You couldn't find any decent card for under $1,000. Remember when Crysis came out and everyone said you'd need a tri-SLI configuration just to max it?
- I think one pattern a lot of people, especially the "old timers," are picking up on is that the time between scarcity incidents is shortening. We went from "no scarcity" as GPUs were coming of age as a luxury/boutique good, to a huge boom and huge bust during the HD 7xxx/GTX 7xx series, to a huge boom and small bust with the RX 4xx/5xx/GTX 10xx series, to increased MSRPs with Turing (as a result of the prior boom), to whatever god-awful cluster**** we find ourselves in now. We cannot always rely on the historical trend of a "return to normal," and it's very possible that we are in the middle of a sea change in demand for hardware that the suppliers could not have anticipated and will take years to properly adjust to.

There's also another factor nobody has mentioned yet: covid. Due to the pandemic, many people lost their jobs, worked from home, or worked reduced hours across all facets of employment. I can't imagine that the GPU manufacturers kept their people working like normal through 2020, so that puts a strain on supply. With more people staying home and doing nothing (those with money), a lot of people have been jumping into PC gaming, which caused demand to skyrocket. High demand, low supply... you do the math. Videos like the one you embedded serve no purpose other than to give people cause to freak out and think the end of PC gaming is nigh simply because prices are high and some people are saying it's impossible to build a budget PC anymore.
- I agree with the core of your point, but the assumption is still "and then covid will disappear and everyone will go back to 2019 and pick up where they left off". There will be a huge, sustained shift to alternate work schedules and making the home the center of entertainment (as opposed to going out). And tariffs. And a growing global middle class. Consolidated suppliers (TSMC is kinda the only game in town for virtually all cutting edge silicon fabbing). Other stuff that I'm surely forgetting.

Covid is definitely hugely disruptive, no argument there, but it is also hitting the fast-forward button on societal changes that I don't expect to suddenly snap back. Additionally, while some 1st-world countries might get their poo sorted out in the next 6-9 months, there is a huge slice of humanity in India and the rest of Asia that has growing purchasing power but second-rate vaccine distribution that will continue to lag another 6 to 9 months behind.

Give the world time to rebound from the pandemic and get everything as close to normal as it once was and give companies like eBay and Amazon time to figure out how to deal with the scalpers (other than using bots to auto-inflate the scalper's prices). I agree that it really sucks that prices are so high right now because I had fully planned on having a new PC by December of last year, but that's life. I wish it was different, but waiting a little longer to build my PC won't kill me and by the time I get around to building it, I'll be happy I didn't spend $1,500 on a single GPU.
- Agreed with "that's life" and there is definitely a certain amount of rolling with the punches that has to happen on the part of consumers in this circumstance. Hard disagree on any sort of return to even close to normal though. It won't be shelter in place levels of isolation, but we're never going back to the way things used to be after this.

Anecdotally, my daughter's school here in the Bay Area has lost fully half its enrollment, and thanks to my wife being a member of the PTO, she knows those students' families are uprooting and moving out to the Sierra foothills, to towns around Tahoe and elsewhere. We may get a handful of rebounds as things settle down, but people aren't getting up and moving to remote locations because they expect things to snap back (and in a self-fulfilling kind of way, their moving means it won't). They fully expect to lean into the isolated lifestyle more.


Oct 10, 2017
Just me. Playing on a Predator Helios 300 laptop. Not exactly old tech, but not cutting edge either. Anyway, I don't see a lot of new games I want to play... happily playing "old" games like Witcher 2 and 3 (with some mods), Skyrim (modded), and lots of other "old" games (the Metros, Far Cry 4, Wolfenstein: The New Order, etc.). I'll probably be playing these games for a long time to come without needing to buy a new computer every year or so. My old laptop served me well for 10 years.