Any clues as to when a new generation of graphics cards will be out?


Rifter

Lifer
Oct 9, 1999
11,522
751
126
I'll be shocked if we have new high-end GPUs this year; I would expect the end of Q1 or early Q2 next year. Sure, AMD is saying fall this year, but BD was also supposed to have been out for two months now, and look how that's working out: we'll be lucky to see BD this year. On top of that, GF/TSMC isn't ready for 28nm yet. We might see a low-end 7xxx series, but it won't even beat the high-end 6xxx that's out now, so if you're waiting to upgrade, you'll be waiting until next year.
 

Chaosblade02

Senior member
Jul 21, 2011
304
0
0
Thanks to most games being made with the intention of being ported to console, we shouldn't need to upgrade every year to stay on top of gaming.

Until recently I was running most games on a three-year-old piece of eMachines junk with an archaic AMD dual-core at 2.1 GHz and a GeForce 7600 GS. The Witcher 2 killed my PC; I think I got 5 FPS, and that's when I realized I desperately needed to upgrade. That setup ran Dragon Age and Dragon Age 2 just fine.

Even at minimum PC settings, games still looked better than console graphics on my piece-of-crap eMachines.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
What, do you run games at 1024x600? The first Dragon Age should have majorly strained your PC at any respectable resolution; it's worse than game consoles. Dragon Age II is much the same in DirectX 9, and even worse in DirectX 11 but you can't even try that. The Witcher 2...you didn't have a hope in hell of running The Witcher 2.

Most games are made to be playable on console, so the minimum requirements often remain the same. But devs often throw in high-res textures and better effects settings on the PC version, so if you want to stay on top you really do need to upgrade.
 

Chaosblade02

Senior member
Jul 21, 2011
304
0
0

It was 1280x800 resolution, and my PC played Dragon Age and Dragon Age 2 at playable frame rates. There were no major lag issues except for a couple of fights, and even then it was playable.

My PC also ran Fallout: New Vegas on low settings, and I had no lag or slowdown issues with that game at all. Even in the dam fight, with 300 people running around, there was no lag.

Specs:

OS: Windows Vista Home Premium 32-bit SP2

eMachines model ET1161-07

Processor: AMD Athlon dual-core 4050e, 2.1 GHz

RAM: 3 GB DDR2

GPU: GeForce 7600 GS 512 MB (got it at Best Buy for about $120 when I bought the PC three years ago)

It did OK on most games up until The Witcher 2. My PC ran Crysis better than The Witcher 2, but Crysis was at borderline-unplayable frame rates while The Witcher 2 was a frame-rate slideshow.

And by minimum settings on those games, I mean everything that could be turned off for performance was turned off. It still looked better than console graphics.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,883
1,096
126
I was going to look at a 7870 or 7950, but I only game at 1680x1050 and my 5850 runs everything fine even at basically max settings.

Bring on the Xbox 720.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Speaking of game development stagnation, wouldn't it be easier to develop for high-end hardware and scale back accordingly depending on needs? Or are the console architectures so different that they can't do that? Or is there some other obvious reason I'm not seeing?

It's expensive to develop the art assets, models, maps, etc., not to mention the engine itself, to display revolutionary levels of detail. Then you run into the problem that no one (or only a tiny percentage of PC gamers) can play at those levels, and massive downscaling has to happen for the console users. It could almost be considered a chicken-and-egg problem.

Think of the manpower and dollars required to develop Quake (which was truly revolutionary) versus a big-budget game today. Granted, a lot of that is stupid marketing dollars.
 
Oct 4, 2004
10,515
6
81
I bought a Radeon 5850 18 months ago and I'm just really glad I don't 'need' to upgrade. I have a Dell U2311H (1920x1080) LCD and can run just about everything maxed out and looking great (I mostly play Bad Company 2 and Civilization V).

NFS Shift 2 was probably the first game I couldn't max out with silky-smooth performance. It was playable, but you could tell the frame rate dipped below 30 occasionally. Crysis 2 ran well until the 1.9 patch and the DX11/high-res textures hit.

I just run them at a lower 16:9 resolution and use GPU scaling. I honestly can't tell the difference between a native 1920x1080 render and one that has been upscaled from 1280x720 or 1600x900.
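
Since a couple of posts here come back to dropping the render resolution and letting the GPU scale up to the panel's native 1920x1080, here's a quick back-of-the-envelope sketch (plain Python, just arithmetic, not a benchmark) showing that 1280x720 and 1600x900 keep the same 16:9 aspect ratio as 1080p, so scaling doesn't stretch the image, and roughly how much per-pixel shading work each saves. Actual frame-rate gains depend on the game and where its bottleneck is.

```python
# Back-of-the-envelope numbers for rendering at a lower 16:9 resolution
# and letting the GPU scale the image up to a native 1920x1080 panel.
NATIVE = (1920, 1080)
RENDER_RESOLUTIONS = [(1280, 720), (1600, 900), (1920, 1080)]

native_pixels = NATIVE[0] * NATIVE[1]

for w, h in RENDER_RESOLUTIONS:
    pixels = w * h
    aspect = w / h                 # all three are 16:9 (~1.778), so no stretching
    load = pixels / native_pixels  # rough fraction of the per-pixel shading work
    print(f"{w}x{h}: {pixels:,} px, aspect {aspect:.3f}, ~{load:.0%} of 1080p pixel load")
```

Running this gives roughly 44% of the native pixel load for 1280x720 and 69% for 1600x900, which is why upscaled rendering feels so much smoother on an older card.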