Retail Sales of PC Games Drop 14% in 2008


LumbergTech

Diamond Member
Sep 15, 2005
3,622
1
0
PC gaming is doing fine....digital distribution..hello..

i bought more games in 2008 than i have in a long time
 

TheVrolok

Lifer
Dec 11, 2000
24,254
4,092
136
Originally posted by: LumbergTech
PC gaming is doing fine....digital distribution..hello..

i bought more games in 2008 than i have in a long time

This. ^

The economy is down, sales are down across the board. This is no surprise. I've also purchased more games this year than any other. They were mostly online digital purchases, though.
 

Eeezee

Diamond Member
Jul 23, 2005
9,922
0
76
Originally posted by: mindcycle
I'm not surprised by this at all, as most of you probably aren't either. I mean all we get anymore are shitty console ports (there are a few decent ones, I know) and DRM up the ass. I remember a few years ago when there were really good PC only games being released. What happened?

Here's a link to the article. http://www.shacknews.com/onearticle.x/56794

Actually, I'd blame this more on the deep recession than anything else. Every industry has been hit pretty hard, and only a 14% drop is actually pretty good.
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0
Originally posted by: ShawnD1
Originally posted by: ja1484
Originally posted by: ShawnD1
People are praising games released as alpha or beta versions (GTA 4) yet they complain about the declining quality of games. Here's a hint: stop buying shitty games. The developers are getting mixed messages here. When they see buggy games sell a million copies, they start to think that it's ok to sell a game that isn't finished.


This sort of attitude annoys me. Games were "less buggy" back in the day because they were far less complex.

No, they were less buggy because developers at least put some effort into it. A lot of today's games were obviously never tested before they were released. Look at how many people can't even run GTA 4. When I try to run it, it tells me I need to have Vista Service Pack 3. If they even tested this game ONE TIME they would have caught this. Unfortunately they didn't test it, they released an unfinished version of the game, and guys like me are left trying to fix their garbage just to get it working. Saint's Row 2 is even worse. The controls are so fucked that driving is impossible. Just tap the steering and your car is out of control. Did they test that game before releasing it? Obviously not.

It's Action 52 all over again.


First of all, you deserve whatever grief you get for attempting to play something like Saints Row 2.

Secondly, you just obliquely proved my point. They CAN'T test things as thoroughly as they used to because of the increased complexity of both the software and the PC environment. There are hojillions of hardware permutations available today, with hojillions of software permutations on top of them. This isn't 1994, when everyone was running a 486/Pentium and Windows 3.1 and five guys at id Software could perfect a sprite-based rendering engine within a reasonable time period. Modern engines have complex 3D rendering, lighting systems (sometimes *several* within the same engine!), physics systems, animation systems, AI, scripting, shaders out the ass, particle systems, dynamic environments, massive draw distances, magical texture/content streaming, sound and dialogue, audio/video/physics/etc. extensions, multi-threaded execution, networking, and more. It's a fuster-cluck at best.


I'll grant you we all want fewer bugs in our games, but let me ask you this: game development costs have steadily risen over the past decade. Would you be willing to steadily pay more money per game in order to keep the incidence of bugs down? How far would you be willing to take that? Prices have risen, but only modestly.

 
Apr 20, 2008
10,067
990
126
Originally posted by: BenSkywalker
One of the problems there is PC enthusiasts greatly underestimate the technical merits of what the consoles are capable of when coding directly to fixed hardware, not to mention how weak current CPUs in PCs are.

Weak? Are you kidding me? The Cell is extremely overrated as it can't process large threads effectively, making it only good for quick, small threads. The Xbox 360 has an inefficient 3.2GHz PPC processor that relies on clock speed more than on cache and memory bandwidth (something almost all IBM PowerPCs are incredible at).

Now look at the i7. You can build a complete setup under $1000. It can run 8 simultaneous threads (four cores, each running two threads with Hyper-Threading) and produces HUGE gains in multithreaded applications.

I would probably take back that statement.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The Cell is extremely overrated as it can't process large threads effectively, making it only good for quick, small threads.

And what type of code do games normally have? What on the CPU side of the equation is going to be holding them back? Loads of heavily branched integer data, or is it perhaps raw FP performance?

Now look at the i7.

OK, it has ~1/4 the raw FP power of Cell.

It can run 8 simultaneous threads

And how about its IPC? It could work 10 trillion threads at the same time; how much it can actually execute per clock, per core, is far more important, as long as the processor is kept 'fed'.

and produces HUGE gains in multithreaded applications.

That should be an absolute given. Cell is an utter dog if it isn't multithreaded, slower even than chips like the i7 in FP-intensive tasks (and that's saying something).

You can build a complete setup under $1000.

Yuck, wouldn't want to touch that, but that's another topic ;) The i7 is an exceptional general-purpose CPU, but it is utterly horrible in terms of raw FP performance, which is Cell's singular strong suit. Vector-based ops and physics calculations are extremely well suited to Cell, the same type of calculations that games tend to rely on very heavily. This is where, even today, current PC CPUs still aren't really close to the performance available on Cell. PCs have enormous advantages in many different areas; the CPU just isn't one of them (for games, of course ;) ).
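To put the "vector ops and physics" claim in concrete terms, here is a minimal sketch of the kind of FP-heavy, data-parallel loop games run every frame. It is my own illustration, not anything from the post; the file name, particle count, and initial values are made up. Every iteration is independent floating-point math over contiguous arrays, which is the workload SIMD units and Cell's SPEs are built for, with no branchy integer code in sight.

```cpp
// particles.cpp: build with g++ -O3 -std=c++17 particles.cpp
// Hypothetical illustration of a game-style FP workload: integrate particle
// positions and velocities under gravity. Structure-of-arrays layout, no
// branches, pure floating-point math over contiguous data, i.e. the pattern
// that vector units (SSE on x86, the SPEs on Cell) are designed to chew through.
#include <cstddef>
#include <iostream>
#include <vector>

struct Particles {                    // structure-of-arrays: each field is its own array
    std::vector<float> px, py, pz;
    std::vector<float> vx, vy, vz;
};

void step(Particles& p, float dt) {
    const float g = -9.81f;
    const std::size_t n = p.px.size();
    for (std::size_t i = 0; i < n; ++i) {   // independent iterations -> auto-vectorizable
        p.vz[i] += g * dt;
        p.px[i] += p.vx[i] * dt;
        p.py[i] += p.vy[i] * dt;
        p.pz[i] += p.vz[i] * dt;
    }
}

int main() {
    const std::size_t n = 1 << 20;          // a million particles (arbitrary)
    Particles p;
    for (auto* a : {&p.px, &p.py, &p.pz, &p.vx, &p.vy, &p.vz}) a->assign(n, 1.0f);

    for (int frame = 0; frame < 60; ++frame)  // simulate one second at 60 Hz
        step(p, 1.0f / 60.0f);

    std::cout << "z of particle 0 after 1 s: " << p.pz[0] << '\n';
    return 0;
}
```

Whether Cell actually turns that theoretical throughput into better games is exactly what the rest of the thread argues about.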
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I haven't bought a game at a brick-and-mortar store in many years.

Steam, Amazon.com, GoGamer. I haven't bought from GOG (Good Old Games) yet, but I'll be getting at least Fallout Tactics there, maybe Freespace 2 and a few more.
 

Tequila

Senior member
Oct 24, 1999
882
11
76
Some of the best PC games ever were created in the 1993-2000 era. There's hardly any substance or imagination anymore. The only modern games that I have enjoyed are CoD4, BF2, WiC and EQ2.

I have played (and still play some of) these titles more than any modern game:

Baldur's Gate 1 and 2
XCom
Panzer General and Allied General
Half-Life
Jagged Alliance and JA2
BF1942
Homeworld
EQ1
Civilization
Quake2 mods
Steel Panthers
Xwing
Descent
Counterstrike
Fallout
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,417
33,398
146
Originally posted by: Eeezee


Actually, I'd blame this more on the deep recession than anything else. Every industry has been hit pretty hard, and only a 14% drop is actually pretty good.
The industry has proven fairly recession-proof thus far. Despite the recession, "the industry full-year sales hit $21.33 billion (up 19 percent from 2007)" and "2008 was indeed another record shattering year for the U.S. video game industry." Nintendo was the biggest success, obviously.

 

Skunkwourk

Diamond Member
Dec 9, 2004
4,662
1
81
Originally posted by: JSt0rm01
I bought The Orange Box and L4D last year and that's it. With the economy the way it is, I think a lot of people are just gonna play what they have. I mean, I could entertain myself with L4D, CS:S and TF2 for a long time.

Same, those are the only two games I purchased last year as well.
 

PingSpike

Lifer
Feb 25, 2004
21,765
615
126
We don't really know how digital distribution has affected all of this. There aren't any numbers included here, so it's hard to say.

My gut feeling is that at least part of that 14% is just a "loss" to the digital column. Retail stores are in a self-reinforcing downward spiral with PC games: no one goes there to buy PC games because the selection is bad and poorly maintained, so the retailer concludes that PC games don't sell and slashes its PC games shelf space even further. This, combined with the fact that PC gamers are more likely than console gamers to be older internet bargain shoppers with broadband access, just means that this distribution channel is going to continue to be cannibalized by digital sales.

It is a recession, and it's not like other industries are throwing a party. But video games are pretty cheap entertainment comparatively, and it seems the industry overall is growing, so the recession probably explains fairly little of the drop.
 

Nik

Lifer
Jun 5, 2006
16,101
3
56
Keep up the good work, gamers. If the gaming industry doesn't stop selling us shitty games at outrageous prices, we'll just stop buying them :)
 

WaitingForNehalem

Platinum Member
Aug 24, 2008
2,497
0
71
Let's see... we're supposed to shell out $50 based on a description on a box and reviews. I want demos! What happened to those? Games used to have SP and MP demos, and you could decide whether you liked the game or not. Valve was great in having an L4D demo, but they annoyingly ended it and forced you to buy the full game. BF2 is a role model for demos, and developers could learn from it.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: BenSkywalker
The Cell is extremely overrated as it can't process large threads effectively, making it only good for quick, small threads.

And what type of code do games normally have? What on the CPU side of the equation is going to be holding them back? Loads of heavily branched integer data, or is it perhaps raw FP performance?
Despite what Sony told you, games are held back by branched integer data. Xbit Labs tested this with the E1200 and the results were conclusive.
http://xbitlabs.com/articles/c...y/celeron-e1200_8.html

If you look at the first two graphs, you'll see that an overclocked Celeron E1200 (3.4GHz) is a top performer when it comes to raw computing power. In the CPU-only test, it's actually quite a bit faster than the E6750. Now look down at real-world gaming performance. Suddenly the E6750 with lots of cache sprints ahead to a 30% lead in Quake 4, a 50% lead in Half-Life 2, a 30% lead in Crysis, a 50% lead in UT3, and a 50% lead in World in Conflict.

A processor with limited cache is not suitable for gaming. This is why you can't just buy a cheap processor, overclock, and expect it to work.
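Neither the thread nor the Xbit article includes code, but a small self-contained benchmark makes the cache argument tangible. This is a hypothetical sketch (the file name, array sizes and g++ invocation are my assumptions): it performs the same number of loads twice, once streaming through memory and once chasing a shuffled index chain so most loads miss cache, and the second pass is typically several times slower on any modern x86 CPU.

```cpp
// cache_demo.cpp: build with g++ -O2 -std=c++17 cache_demo.cpp
// Hypothetical illustration only: performs the same number of loads twice,
// once with streaming (prefetch-friendly) access and once by chasing a
// randomly shuffled index chain, so most loads miss the cache.
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = 16 * 1024 * 1024;        // 16M uint32_t = 64 MB, far bigger than any L2
    std::vector<std::uint32_t> data(n, 1);

    // next[i] tells us which element to visit after i (a shuffled index chain).
    std::vector<std::uint32_t> next(n);
    std::iota(next.begin(), next.end(), 0u);
    std::shuffle(next.begin(), next.end(), std::mt19937{42});

    auto time_it = [](auto&& fn) {
        auto t0 = std::chrono::steady_clock::now();
        auto result = fn();
        auto t1 = std::chrono::steady_clock::now();
        return std::make_pair(result, std::chrono::duration<double>(t1 - t0).count());
    };

    // 1) Streaming access: hardware prefetchers keep the core fed.
    auto [seq_sum, seq_t] = time_it([&] {
        std::uint64_t s = 0;
        for (std::size_t i = 0; i < n; ++i) s += data[i];
        return s;
    });

    // 2) Dependent "random" access: every load depends on the previous one,
    //    so each cache miss stalls the pipeline instead of being hidden.
    auto [rnd_sum, rnd_t] = time_it([&] {
        std::uint64_t s = 0;
        std::uint32_t i = 0;
        for (std::size_t k = 0; k < n; ++k) { s += data[i]; i = next[i]; }
        return s;
    });

    std::cout << "streaming:       " << seq_t << " s (sum " << seq_sum << ")\n"
              << "pointer-chasing: " << rnd_t << " s (sum " << rnd_sum << ")\n";
}
```

The gap you see is the cost of cache misses, which is the quantity both sides of this argument are really fighting over.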

Secondly, you just obliquely proved my point. They CAN'T test things as thoroughly as they used to because of the increased complexity of both the software and the PC environment. There are hojillions of hardware permutations available today, with hojillions of software permutations on top of them. This isn't 1994, when everyone was running a 486/Pentium and Windows 3.1 and five guys at id Software could perfect a sprite-based rendering engine within a reasonable time period. Modern engines have complex 3D rendering, lighting systems (sometimes *several* within the same engine!), physics systems, animation systems, AI, scripting, shaders out the ass, particle systems, dynamic environments, massive draw distances, magical texture/content streaming, sound and dialogue, audio/video/physics/etc. extensions, multi-threaded execution, networking, and more. It's a fuster-cluck at best.
They only need to program for 2 operating systems - Windows XP and Windows Vista. Actually they don't even need to officially support Vista; just program for XP if you want.

Everything I bolded is programmed to well-understood specifications and does not require special tweaking to account for system variations.
*3D rendering is done to DirectX or OpenGL specifications, the very same DirectX your Xbox 360 uses. As long as you code within the specifications, it will work on any video card.
*Lighting is still within DirectX.
*Physics is just a set of calculations like 4 x 5. As long as your game is programmed for an x86 compatible CPU, it will work on any modern x86 computer.
*Animation does not change when moving between platforms. No compatibility required.
*AI is just simple math; no compatibility issues between different x86 processors.
*Xbox and PC use the same DirectX shaders. Same particles. Dynamic environments??
*Draw distances will not cause the game to crash unless it runs out of memory (which actually can happen in GTA 4 because the draw distance is huge). Just make it possible to turn down the draw distance and there won't be a problem.
*Texture and content streaming changes quite a bit when moving from Xbox to PC, but your computer and my computer are the exact same. We both use Windows, we're both using x86 processors, both computers have hard drives, both have at least X amount of ram specified on the game box.

It's actually quite amazing how standardized computers are these days.
video = Direct3D or OpenGL
sound = DirectSound or OpenAL
input devices = handled by the operating system
math/instructions = x86 processor architecture

The reason these standards exist is because of the compatibility problems you were talking about. Games in the early 90s were not standardized at all. The game would ask you what your input device is, what your sound device is, what the IRQ is, and various other stupid questions that should be handled by the operating system. Since Windows 98 or so, I haven't seen any games asking questions like this. The video works right away, the sound works right away, the keyboard and mouse work right away.
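As a concrete illustration of that standardization point, here is a minimal sketch of a "game" that opens a window and clears the screen without a single line of hardware-specific code. The use of GLFW 3 with OpenGL is my choice for brevity (the post only names Direct3D/OpenGL and DirectSound/OpenAL), so treat it as an assumption rather than what any shipping game does.

```cpp
// window_demo.cpp: build with g++ window_demo.cpp -lglfw -lGL
// Hypothetical example: no hardware-specific code anywhere. The program asks the
// standard API for a window and a rendering context; which GPU or driver sits
// underneath is the platform's problem, not the game's.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;                       // talk to the OS, not the hardware

    GLFWwindow* win = glfwCreateWindow(640, 480, "standard-API demo", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);                     // any GL-capable card will do

    while (!glfwWindowShouldClose(win)) {
        glClearColor(0.1f, 0.2f, 0.3f, 1.0f);        // plain OpenGL calls, identical on every vendor
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);
        glfwPollEvents();                            // keyboard/mouse arrive through the OS
    }
    glfwTerminate();
    return 0;
}
```

Nowhere does the program ask which video card, sound device, or IRQ the machine has; that is exactly the contrast with the early-90s situation described above.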
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Despite what Sony told you, games are held back by branched integer data. Xbit Labs tested this with the E1200 and the results were conclusive.

I'm sorry, but where exactly was that conclusion reached? Two processors with identical FP capabilities, but one running on a considerably slower FSB with 1/8th the cache, will show performance limitations when poorly optimized OoO code creates lots of cache misses; that's more like it.

If you look at the first two graphs, you'll see that an overclocked Celeron E1200 (3.4GHz) is a top performer when it comes to raw computing power.

We aren't talking about how the current Intel chips benefit from the amount of cache on die. But the charts do show a few rather interesting things. First off, even using the Intel architecture with all of the other limitations mentioned above (the reduced FSB on the Celery is the biggest one), an 8x increase in cache only offers a fraction of the performance increase that a doubling in clock speed does. Why is that? If the code were truly as branch/cache dependent as you state, why does a straight clock-speed increase show a much higher performance return than an increase in the amount of cache? When even an OoO CPU running highly unoptimized code is STILL showing that raw computational power is the biggest factor in performance scaling, it should give you an idea.

Suddenly the E6750 with lots of cache sprints ahead to a 30% lead in Quake 4, a 50% lead in Half-Life 2, a 30% lead in Crysis, a 50% lead in UT3, and a 50% lead in World in Conflict.

An 8x increase in cache size. The poorest result from overclocking is a 66% improvement in Crysis; that's more than the largest gain seen from eight times the cache.

A processor with limited cache is not suitable for gaming.

"Not suitable" isn't accurate; "not ideal for general-purpose x86-platform gaming, all else being equal" is more like it. Seems to me: why would you see such a big difference in performance based on cache amounts? Probably due to cache misses. How could anyone ever work around such a problem? Maybe if they came up with a way to give direct memory access control over the on-die memory on the CPU, they could avoid cache misses altogether. Kind of like, exactly, how Cell is built :)

The most demanding game they used on those platforms, Crysis (judging by the one producing the lowest framerates): the OC'd Celery still managed to topple the E4500, which has four times the cache. It isn't just about one factor, but the largest element, as the article you link clearly demonstrates, is raw FP performance.
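For readers unfamiliar with how Cell sidesteps cache misses: the SPEs have no cache at all, only a small local store that the programmer fills with explicit DMA transfers, usually double-buffered so the next block is in flight while the current one is processed. The sketch below is a plain C++ analogy of that pattern, not real SPU code; std::async stands in for the asynchronous DMA, and the buffer size is an arbitrary assumption.

```cpp
// stream_demo.cpp: build with g++ -O2 -std=c++17 -pthread stream_demo.cpp
// Plain C++ analogy (not real SPU code) of Cell-style software-managed memory:
// explicitly "DMA" the next chunk into a small local buffer while the current
// chunk is processed, so memory latency overlaps with computation.
#include <algorithm>
#include <cstddef>
#include <future>
#include <iostream>
#include <numeric>
#include <vector>

constexpr std::size_t kChunk = 4096;   // pretend "local store"-sized buffer (assumption)

// Stand-in for an asynchronous DMA: copy one chunk of main memory into a local buffer.
std::future<void> fetch_async(const float* src, float* local, std::size_t count) {
    return std::async(std::launch::async,
                      [=] { std::copy(src, src + count, local); });
}

int main() {
    std::vector<float> main_memory(1 << 22, 1.5f);   // "main RAM" the worker never touches directly
    std::vector<float> buf[2] = {std::vector<float>(kChunk), std::vector<float>(kChunk)};

    double total = 0.0;
    const std::size_t chunks = main_memory.size() / kChunk;

    // Prime the pipeline: start fetching chunk 0 into buffer 0.
    auto pending = fetch_async(main_memory.data(), buf[0].data(), kChunk);

    for (std::size_t c = 0; c < chunks; ++c) {
        pending.wait();                                // chunk c is now in buf[c % 2]
        if (c + 1 < chunks)                            // kick off the next transfer immediately
            pending = fetch_async(main_memory.data() + (c + 1) * kChunk,
                                  buf[(c + 1) % 2].data(), kChunk);

        // Compute on the local buffer while the next chunk is still in flight.
        total += std::accumulate(buf[c % 2].begin(), buf[c % 2].end(), 0.0);
    }

    std::cout << "sum = " << total << '\n';
}
```

Whether you see that as elegance or as extra burden on the programmer is basically the PC-versus-Cell argument in miniature.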
 

ArmchairAthlete

Diamond Member
Dec 3, 2002
3,763
0
0
Originally posted by: BenSkywalker
I'm willing to wager the Wii games cost a hell of a lot less to develop than any of the top 10 PC games.

If you make a Wii game you have to pay Nintendo for licensing, dev kits, etc. And finding/paying people with Wii skill sets has to be harder than finding people with a PC background.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: BenSkywalker
Two processors with identical FP capabilities, but one running on a considerably slower FSB with 1/8th the cache, will show performance limitations when poorly optimized OoO code creates lots of cache misses; that's more like it.
You have the numbers backwards. A Celeron E1200 runs at stock with an 8x multiplier on a 200MHz FSB (1.6GHz total). These guys got it to 3.4GHz by cranking the FSB up to 425MHz. The stock frequency of the E6750 is 8x 333MHz (2.66GHz total).

That means the Celeron has 28% more bus speed and 28% more clock speed, and it still gets its ass kicked by a considerable margin simply because it lacks cache memory. In the MHz-to-MHz comparison, the lack of cache costs anywhere from a third to almost half of the performance (46% in Quake 4).
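For the record, the arithmetic behind those 28% figures, using only the clock and FSB numbers quoted above:

```latex
% Worked check of the 28% figures, no new data
\begin{aligned}
\text{E1200 (OC)}    &: 8 \times 425\,\text{MHz} = 3.40\,\text{GHz}\\
\text{E6750 (stock)} &: 8 \times 333\,\text{MHz} \approx 2.66\,\text{GHz}\\
\tfrac{425}{333} &\approx 1.28, \qquad \tfrac{3.40}{2.66} \approx 1.28
\end{aligned}
```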

We aren't talking about how the current Intel chips benefit from the amount of cache on die. But the charts do show a few rather interesting things. First off, even using the Intel architecture with all of the other limitations mentioned above (the reduced FSB on the Celery is the biggest one), an 8x increase in cache only offers a fraction of the performance increase that a doubling in clock speed does. Why is that? If the code were truly as branch/cache dependent as you state, why does a straight clock-speed increase show a much higher performance return than an increase in the amount of cache? When even an OoO CPU running highly unoptimized code is STILL showing that raw computational power is the biggest factor in performance scaling, it should give you an idea.
Diminishing returns. Having 2 video cards in SLI does not double your frame rate. Having 2 CPU cores instead of 1 does not double your zip file compression rate. Could I say those tasks are not GPU or CPU dependent simply because the scaling isn't linear? Of course not. In the link above, we see that increasing the cache 8x increases the performance in Quake 4 from 78fps to 145fps (86% faster). Similarly, if I added a bunch of RAM to my computer and it suddenly jumped 86% in load-time performance, I wouldn't hesitate to say that there was a significant memory bottleneck. If I change from a GeForce 6800 video card to a GTX 280, which is orders of magnitude faster, I don't see orders of magnitude gains in my frame rate. Should I be disappointed? Of course not. Nobody expects linear gains with these kinds of things.
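And the 86% figure works out directly from the Quake 4 numbers already cited:

```latex
% Cache-scaling gain from the Quake 4 numbers in the linked article
\frac{145 - 78}{78} \approx 0.86 \;\Rightarrow\; \text{an 86\% gain from the 8x larger cache}
```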

"Not suitable" isn't accurate; "not ideal for general-purpose x86-platform gaming, all else being equal" is more like it. Seems to me: why would you see such a big difference in performance based on cache amounts? Probably due to cache misses. How could anyone ever work around such a problem? Maybe if they came up with a way to give direct memory access control over the on-die memory on the CPU, they could avoid cache misses altogether. Kind of like, exactly, how Cell is built :)
You're right that I worded that badly. The Cell does have a lot of memory bandwidth and it does have a lot of raw processing power. The problem is that this never seems to translate into better gaming performance. Gamespot did a pretty good article about this:
http://www.gamespot.com/features/6202552/index.html

"Even draw distance [in Fallout 3] is better on the PC, as the rocks and a fence near the burned-out bus aren't even visible on the consoles. "

This might not be true for all games, but for every game I have ever played, the draw distance has been a major CPU bottleneck. World of Warcraft was very easy on the graphics card, and the frame rate was fantastic... until you turn up the draw distance in a major city. Then it turns into a slide show and Task Manager shows the CPU maxing out. This is very true for GTA 4 as well. On minimum draw distance, the game is actually playable. On max draw distance, the game is ridiculously CPU bottlenecked and the frame rate drops like a rock; even at lowest resolution the game is unplayable with that setting.
Here in the article we see that the consoles, including the PS3, need the draw distance turned down. I'm not saying this is a fact, but I would bet this is a CPU bottleneck since it is in every other game I own.

"When your character [in COD5] gets pulled up from the water at the start of the level in the Xbox 360 and PlayStation 3 versions of the game, he remains stationary. In the PC version, the waves actively push your character around, making screenshots more difficult to capture."

I don't know if this is processor related or if it's because consoles use different controls, but whatever. No conclusions, but it seems worthy of mentioning.

"Even with all the problems, GTAIV looks better on the PC by a wide margin. The PC's high resolution and draw-distance levels keep higher-quality textures, lighting, and transparency effects visible farther into the distance."

Again, high draw distance. There is no reason for the PS3 version to not have a ridiculously high draw distance if it's all floating point. The PS3's floating point power is like 10x that of any Intel CPU, so I don't know what to say here. It shouldn't be a video card bottleneck either since the consoles blur over after a short distance and run at a very low resolution. I think the PS3 version only runs at 640p - very low GPU demand. You still need CPU power to acknowledge that the object is there, but the GPU demand is still low because of the half-ass job rendering it.


Something is up. The PS3 has a lot more memory bandwidth and a lot higher floating point power than your regular Phenom or C2Q. The PS3 should have much better physics and draw distance in every case, but it doesn't. What is left? I'm thinking it's integer/cache.


edit: I also just realized this has nothing to do with PC game sales.
 

skace

Lifer
Jan 23, 2001
14,488
7
81
NPD continues to fail at accurately portraying PC numbers. Eventually they are going to be called to task for misrepresenting data.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Diminishing returns. Having 2 video cards in SLI does not double your frame rate.

If you are fully vid-card limited and there are perfect (or as close as one can get) drivers, it will come very close to doing exactly that.

Having 2 CPU cores instead of 1 does not double your zip file compression rate.

Try encoding; that is more performance-oriented and shows close to linear gains.

If I change from a GeForce 6800 video card to a GTX 280, which is orders of magnitude faster, I don't see orders of magnitude gains in my frame rate.

Run Crysis on Very High at, say, 19x12 (1920x1200), and you will in fact see an orders-of-magnitude difference in performance.

This is very true for GTA 4 as well. On minimum draw distance, the game is actually playable. On max draw distance, the game is ridiculously CPU bottlenecked and the frame rate drops like a rock; even at lowest resolution the game is unplayable with that setting.

Going to pull a quote from the article you linked for this one-

At the moment, no video card has enough RAM

They were using a GTX 280, which has twice the RAM of the entire PS3 (vid and main memory combined). Given that the PC is still held back because it doesn't have enough vid RAM (400% more than the PS3), I would wager rather heavily that that is where the issue lies. Sheer amount of RAM is one of the major edges PCs have, as is graphics processing power.

World of Warcraft was very easy on the graphics card, and the frame rate was fantastic... until you turn up the draw distance in a major city. Then it turns into a slide show and Task Manager shows the CPU maxing out.

One of the systems in my house is using a 256MB 6600. One of the kids was in Dal and their framerate plummeted; I swapped the color setting from 32-bit to 16-bit (isolating the problem) and their framerate increased tenfold (from 2fps to 20fps). While this was happening it was showing the CPU pegged, as it was busy trying to stream all of the data non-stop, but the CPU utilization dropped markedly once I made the swap to reduce the strain on the vid card.

Something is up. The PS3 has a lot more memory bandwidth and a lot higher floating point power than your regular Phenom or C2Q. The PS3 should have much better physics and draw distance in every case, but it doesn't. What is left? I'm thinking it's integer/cache.

RAM. When talking about huge draw distances, RAM is going to be your biggest factor. How about this: try running WoW on a PC with 512MB of system RAM and a 256MB vid card, or better yet, run GTAIV with that setup. Grab the fastest i7 processor you can find; I'd wager rather heavily you will find that the PS3 would obliterate it, badly.

edit: I also just realized this has nothing to do with PC game sales.

It does, though, as the overall gaming market is showing breakneck sales rates while the PC is showing declining retail sales. Something must be causing the shift, and we are discussing one of the elements behind it.

NPD continues to fail at accurately portraying PC numbers. Eventually they are going to be called to task for misrepresenting data.

NPD accurately reports what they track; they have never claimed otherwise. NPD does not track online sales for either consoles or PCs, and it doesn't track a small retail chain called Wal-Mart. They haven't ever claimed to track these things; they report the sample that they do track, and I have never seen anything to indicate that those numbers are not accurate.
 

skace

Lifer
Jan 23, 2001
14,488
7
81
Originally posted by: BenSkywalker
NPD accurately reports what they track; they have never claimed otherwise. NPD does not track online sales for either consoles or PCs, and it doesn't track a small retail chain called Wal-Mart. They haven't ever claimed to track these things; they report the sample that they do track, and I have never seen anything to indicate that those numbers are not accurate.

Well, the misrepresentation of PC game sales is going to fall on someone, whether it be NPD or someone else. You can only go on lying for so long. I realize they only report B&M sales; however, it has become painfully obvious that, in doing so, they are no longer creating an accurate portrayal of PC game sales.

Is Steam not an online retailer?