[ArsTechnica] Next-gen consoles and impact on VGA market


-Slacker-

Golden Member
Feb 24, 2010
Why don't you try clicking on his links and actually reading them-

Huh? The point the guy was making was that you have no proof the PS3 plays 3D 1080p Blu-rays on the CPU without GPU acceleration, especially since the GeForce 7000 series has that ability.

That, AND that you can play 3D 1080p on far weaker CPUs than an Athlon II X2 440.

http://www.cyberlink.com/prog/support/cs/product-faq-content.do?id=2576


Geeze, guy, don't tell me to go read his links if you don't seem to have done so yourself...
 

Cerb

Elite Member
Aug 26, 2000
Intel's highest performing part is now a tiny in order core used in large quantities
I'd love to see how well it performs as a telephone server, LAMP server, gaming CPU, etc. The world mostly runs on serial code with light concurrency. Right now Intel's highest-performance parts, for anything I'd care about, come with 4 giant cores apiece.

of course AMD and nVidia are taking the same approach).
NVidia is all rumors, ATM, but AMD is definitely taking the approach of fixing BD, and making better billion-xtor CPUs.

So here is my question, where do you think the line should be drawn?
Made for C89 (and similar/derived languages) applications, in a virtual-memory-supporting OS, top to bottom, with an apparently flat address space. Breaking the illusion for optimization is fine, and is done all the time. Not only that, but abstractions likely to hurt performance should be thin, and specifically implemented to have holes poked through them where needed. Not abstracting enough will just make people want to use somebody else's hardware.

A dumbed down processor easy to deal with like the i7
An i7 is hardly dumbed down. It's an extreme in the opposite direction, but hardware speculation and scheduling works, for application CPUs.

Do we aim for mediocrity, or is that too high? I understand, Cell makes it take the best doing their best work, but how far over should we move things to compensate?
Enough to make it usable by a wider range of programmers, and to allow those programmers to test and throw away code easily enough. It's not mediocrity, it's reality. With simple embedded software, having to mess with weird processors with weird memory views is OK, because there's not much else going on that might compete for CPU time or memory I/O--at least not that you don't have control over. Hell, it can be fun to see what works.

Nobody sane is going to argue that MMUs and related features don't have a real cost. But, they make actually using hardware well far easier, and it's hardly ever been much of a performance hit, outside of microbenchmarks, resulting in more and better utilization of hardware resources. Now, cache-coherency can be a huge cost, but so can doing it in software. There's no free lunch with that one. But, the ability to bypass cache-coherency as desired would get awfully close, in practice.

If it runs 10% as fast as what it could, and it's wasting time with global memory, or doing dumb things with instruction order, the other 90% can be gotten by doing it right in assembly, with a little fiddling...or at least it can be gotten to a point where memory bandwidth isn't enough, which ought to be fine. If that 10% is slower than the CPU can do it, it's probably not worth pursuing. If it's twice the CPU's speed, it'll clearly be worth it. In a big application, though, you won't know enough without experimenting with it in the context of the rest of the program running. Or, rather, by the time you could reason it out fully, you'd be well past all your deadlines.

The easier it is to program in the first place, the more chances for good uses and good optimization there will be. There is no dumbing down, or aiming for mediocrity, just accepting that complicating hardware can and does provide real benefits. Pushing all the complexity off to software makes the software too complex, to the point of eating up any chip speed gains, negating the benefit of the simpler logic.

CPUs don't have all these features to help dumb people or programs, they have them because they are the best ways to improve overall performance and reliability, and get people to actually want to use them and buy them.

The key is finding the right balance. The right balance for something going in a set-top is not going to look like a fat desktop CPU core. But if it's also got to run games, the hardware shouldn't be too dumbed down, because real code will have problems reaching spec sheet performance numbers, no matter how good the programmers are.

http://www.hardwarecanucks.com/foru...-a8-3850-apu-review-llano-hits-desktop-9.html

Funny, that is supposed to be the next generation, yet it fails a trivial task without help that the weak old Cell processor from years gone by does without issue (i3 also failed).
Other chips from 15+ years ago could probably do it, if someone was around to program them to. Video formats have been designed with simple vector processors in mind for ages, now. Desktop CPUs aren't that, and have only recently needed to care about it (previously, by the time anything like that mattered, CPUs were fast enough, but slow clock increases mean we need to get good vector extensions, something that's really long overdue). That specialized hardware can be hundreds, even thousands, of times faster at certain things is nothing new. The Cell can do it just like the A8+IGP.
 

cplusplus

Member
Apr 28, 2005
GeForce 7 GPUs supported 3D blu-ray later on;
Core 2 Duo supported 3D blu-ray playback;
3D Blu-ray playback has nothing to do with gaming performance, so using it to show Cell's superiority vs. modern x86 CPUs that can play Blu-ray just as easily is a red herring (see FX-8150 3D Blu-ray vs. Core i3, and then compare their gaming performance). It's meaningless to compare 3D Blu-ray playback. If, 5 years from now, some future CPU could run 100 separate 3D Blu-ray videos on 100 displays and not slow down, does it mean it can run tessellation, HDAO, depth of field, contact-hardening shadows, and a global illumination lighting model in games? There is no correlation.

I'm just pointing out the inaccuracies in your post, not trying to use 3D Blu-Ray playback to prove anything. I haven't actually given my opinion on the issue at hand.

Which is that the Cell was probably a good enough processor in 2006, and wasn't the reason the PS3 was priced so expensively (it was the Blu-ray drive that gave it the extra cost compared to the 360). Their actual problem was going all the way to 2005 thinking that they'd be able to put a Cell in as a GPU, which screwed them over because they had to rush to find a replacement GPU that ended up not being as good as the one in the Xbox, and probably wasn't as fully optimized for the console as it could have been if they'd had nVidia working on it for years instead of months.

Edit: That and the split RAM pool.
 

Blitzvogel

Platinum Member
Oct 17, 2010
I'd say the PS3's complex design overall is attributable to the Cell. The addition of the RSX makes up for the Cell's overall inability to act as a full GPU. The VRAM was added later to make up for the low amount of XDR and give the GPU its own VRAM to mess with. I would figure it sped up access time since it's directly attached to the RSX.

And it's all attributable to the Cell being the "big thing" for the system, and in a way it's a case study in why the graphics setup should be taped out before the CPU, if you think about it. The graphics processor is arguably more important, since the main appeal of interactive video games is the visual element. Driving the commands to a GPU to build a 3D image seems relatively easy (hence why the CPU only needs to be modest in comparison); it's the actual construction of the image that seems to be the real hard part, hence the focus on your graphics setup. Cell focused on graphics, but it also looks like Sony was just caught off-guard by how fast Nvidia and ATi had progressed. They knew the Cell was doomed as a competitive GPU by early 2005, I bet. E3 2005 probably solidified what they figured was inevitable, though at the same time I think Sony had announced they had gone with Nvidia already...../can't remember/

So yeah, next go around means building your system around familiar and proven hardware. GCN and AMD's Fusion processors are proven, somewhat. Fusion offers developers the chance, like with Cell, to employ heavy vectorization off the main CPUs for graphics, particles, physics, etc., but even with something as modest as a four-core Jaguar, they'd have the ability to ignore the stream units and just use the FPUs if that's all they really need. If anything, 128 GCN-style stream units on the APU is probably enough to give Jaguar the GPGPU kick it needs to be something different and crazy. At a modest 500 MHz, that's 128 GFLOPS, and purely for graphics it's enough to run the main system (as in Xbox Live) menus, DVD playback, etc., so the system doesn't have to use the dedicated GPU for comparatively mundane things. Of course 750 MHz would get us 192 GFLOPS, and 1000 MHz gets us 256......
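For anyone checking the arithmetic, those GFLOPS figures assume each stream unit retires one fused multiply-add (2 FLOPs) per cycle; a quick sketch of the estimate in Python:

```python
# Rough peak-throughput estimate for a block of GCN-style stream units.
# Assumes one FMA (2 FLOPs) per unit per cycle, which is what reproduces
# the 128/192/256 GFLOPS figures quoted above.
def gflops(stream_units: int, clock_mhz: float, flops_per_cycle: int = 2) -> float:
    return stream_units * clock_mhz * 1e6 * flops_per_cycle / 1e9

for mhz in (500, 750, 1000):
    print(f"128 units @ {mhz} MHz -> {gflops(128, mhz):.0f} GFLOPS")
```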
 

BenSkywalker

Diamond Member
Oct 9, 1999
Huh? The point the guy was making was that you have no proof the PS3 plays 3d 1080p blu rays on the cpu without gpu acceleration, especially since the gf 7000 series has that ability.

If the GF7 series had that capability you would have a good point, but it doesn't, and you don't :)

http://www.nvidia.com/object/3d-vision-system-requirements.html

GeForce 8 series can't accelerate BR3D, nor can the 9 series (nor most of the 2xx series, for that matter). As you will find with pretty much everything in our back and forth, if you do a smidge of research, he is wrong.

NVidia is all rumors, ATM, but AMD is definitely taking the approach of fixing BD, and making better billion-xtor CPUs.

Sorry, forget about that legacy money-pit CPU division they have; I meant the only part of AMD that has produced anything worthwhile in the last several years ;)

An i7 is hardly dumbed down. It's an extreme in the opposite direction, but hardware speculation and scheduling works, for application CPUs.

Sorry, that's what I meant. The i7 devoted all of its engineering effort to making atrocious code run as fast as possible; it uses a tiny fraction of its transistors for doing any actual number crunching.

Video formats have been designed with simple vector processors in mind for ages, now.

Semantics issue, but I would say it's far more accurate to say they have been designed for DSPs, and vector processors tend to mimic their capabilities in a superior fashion. I keep pounding on BR3D because, let's face it, it is a trivial task for any processor with a reasonable amount of compute functionality.
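A toy illustration of why that kind of workload is trivial for DSP-like hardware (this is just numpy as an example, not anything from the thread): a codec-style inner loop is the same arithmetic applied independently across a long run of samples, with no branching to speak of.

```python
import numpy as np

# One frame's worth of 8-bit luma samples (1920x1080), as a flat array.
frame = np.random.randint(0, 256, size=1920 * 1080, dtype=np.uint8)

# A codec-style inner loop: the same saturating arithmetic applied to every
# sample, with no data-dependent control flow. On a DSP or vector unit this
# is a handful of wide operations per batch of samples.
brighter = np.clip(frame.astype(np.int16) + 32, 0, 255).astype(np.uint8)

print(frame[:8], "->", brighter[:8])
```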

To the heart of your points.

How much do you expect the economy as a whole to pay to make your job easier? This is a *very* serious point. To give you everything you are looking for, you are looking at chewing up ~half the die of the CPU (not counting cache/eDRAM in that). If we apply that to a generic console, let's just use the PS720 as a purely hypothetical solution, and we take a close-to-pure vector CPU and add everything in to make it into something you are looking for, we either need to double the die size or halve the functional units. Over the course of a console life cycle, at cost, half of the CPU probably averages out to ~$20; for a system that sells 70 million units, that is $1.4 billion paid by consumers to make your job easier. Do you think that is reasonable? I'm not asking this to be obtuse or rude, it is an honest question. How much additional money should consumers be expected to pay to make the life of a coder easier?
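The $1.4 billion figure is just the per-unit estimate multiplied over the installed base; a back-of-the-envelope check:

```python
# ~$20 of extra CPU die cost per console, averaged over the life cycle,
# across ~70 million units sold (both numbers are the rough estimates above).
extra_cost_per_unit_usd = 20
units_sold = 70_000_000

print(f"${extra_cost_per_unit_usd * units_sold / 1e9:.1f} billion")  # $1.4 billion
```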

CPUs don't have all these features to help dumb people or programs, they have them because they are the best ways to improve overall performance and reliability, and get people to actually want to use them and buy them.

If you narrow 'CPUs' to x86 CPUs, I could agree with that. Outside of that I would vehemently disagree with you. Everyone outside of x86 is going for lower power, or more compute power. Even inside the x86 world, Intel is dropping billions going down and up (although they are also paying attention to the bulk of their market, which is as you describe).

If it runs 10% as fast as what it could, and it's wasting time with global memory, or doing dumb things with instruction order, the other 90% can be gotten by doing it right in assembly, with a little fiddling...or at least it can be gotten to a point where memory bandwidth isn't enough, which ought to be fine.

Whoa now, Cell is too hard and you want people to drop to assembly? Meh, I haven't even touched Hex in probably a decade :p

That specialized hardware can be hundreds, even thousands, of times faster at certain things is nothing new.

Yes, but we aren't talking about a DSP or fixed function hardware, they are 'general use' vector units.

The graphics processor is arguably more important since the main appeal of interactive video games is the visual element.

This perspective is held almost exclusively by PC gamers. Building around a GPU isn't a good idea unless you plan on using the GPU for reasonable computation loads. Pair dual 7970s with a Pentium II 400 MHz.... yeah, you have to figure out what your CPU can handle as a baseline. In the PC world, CPU advancement is glacial; that's why most new games are OK with processors several generations old. The embedded market doesn't work like that.
 

Cerb

Elite Member
Aug 26, 2000
Sorry, that's what I meant. The i7 devoted all of its engineering effort to making atrocious code run as fast as possible; it uses a tiny fraction of its transistors for doing any actual number crunching.
Occasional music and video playback, which work fine on 5+ year old computers (most of it on a Core Duo, these days), are most of my number crunching, so that works out fine. How fast is MIC, or a Radeon, going to be at running GCC, or Clang? Or, really, anything else dominated by branching? I care about branches, stacks, and cold caches when I need a fast CPU. ALUs are secondary, because the fastest ALUs on the planet are junk if they don't have anything to execute. They spend precious little on number crunching because that's not the most important performance metric, nor the main limiter of performance. By far the greatest bottlenecks are DRAM latency and DRAM bandwidth. DRAM bandwidth being thin means you can't predicate everything and do software speculation, because it bloats the code too much, which will eat up power, space, and time, negating any faster-ALU benefits. High DRAM latency means you can't keep ALUs fed without speculation and caching (the bandwidth limits keep software loading to local memories from being a viable alternative to caches).
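A minimal sketch of the latency point (numpy, purely illustrative): the same amount of data is read either sequentially, which prefetchers and caches love, or through a random gather, which stalls on DRAM latency. Exact timings vary by machine, but the gap is the whole argument.

```python
import time
import numpy as np

n = 20_000_000
data = np.ones(n, dtype=np.float32)
in_order = np.arange(n)              # walk memory sequentially
shuffled = np.random.permutation(n)  # same indices, random order

# Same work either way: gather every element once, then reduce.
for name, idx in (("sequential", in_order), ("random", shuffled)):
    t0 = time.perf_counter()
    total = data[idx].sum()
    print(f"{name:10s}: {time.perf_counter() - t0:.3f} s (sum={total:.0f})")
```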

That classes of computational problems can't be solved effectively by loops over arrays hardly makes code handling them atrocious. Having big old crusty code-bases and A-hole maintainers can do that, of course (GCC isn't so bad, itself, except for Greenspun's 10th rule, but there's always bloat problems with glibc).

How much additional money should consumers be expected to pay to make the life of a coder easier?
N/A. On the practical side of things, they could just get an XB360, which had software and hardware that were easier for developers to exploit in the first place. Also, in our actual reality, Sony's decision may have forced devs to work with the Cell, but the difficulties in using it are part of what killed it. Consumers will pay as much as Sony demands, or not, regardless of the chip, because those are their options. Other potential users of the chip, however, can make a decision based on many other factors. It doesn't matter how much space or power adding a little abstraction would take, because not having it limits wider adoption.

If you modify CPUs to x86 CPUs I could agree with that. Outside of that I would vehemently disagree with you. Everyone outside of x86 is going for lower power, or more compute power. Even inside the x86 world, Intel is dropping billions going down and up(although they are also paying attention to the bulk of their market which is as you describe).
So, what are we disagreeing on? x86 is following the rest of the world (see AVX2, for instance). It just happens to also fill a historical niche where 100W+ CPUs are acceptable, as long as they provide more performance than the last ones did. Even going lower power, they aren't giving up any die-eating speculation or scoreboarding (Atom being an anomaly, for the moment).

Meanwhile, ARM has been steadily adding those evil die-eating features to their cores, because they work much better at keeping power down than simpler CPUs do, since those simpler CPUs would need to run at so much higher speeds to compete that power would be unreasonable.

Whoa now, Cell is too hard and you want people to drop to assembly? Meh, I haven't even touched Hex in probably a decade :p
Or intrinsics, or clever PGO with annotations. Either way, at some point you have to tell the hardware how to do what it's doing, rather than just what to do, when the compiler isn't doing what you expect/want.

I don't see what hex would have to do with anything, though it's hard not to touch hex, unless you're only working in very high level languages, since it's pretty pervasive in common coding patterns of C and C derivatives.

Yes, but we aren't talking about a DSP or fixed function hardware, they are 'general use' vector units.
A vector processor is not a general use processor. It is specialized hardware, however programmable it may be. Truly fixed-function, or DSPs made specifically with video formats in mind, would be even faster and lower power still.

This perspective is held almost exclusively by PC gamers.
I don't even understand it, as a PC gamer. We had a couple years, around 2002-2003, when CPUs were less important than GPUs. Since then, every game has offered the ability to run on weaker GPUs than the fastest, but the CPU must be able to process a certain minimum amount of work before it can finally draw a frame, and keeping that under 16ms keeps taking faster and faster CPUs, much more than faster GPUs.
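For context, the 16 ms figure is just the per-frame budget at 60 fps:

```python
# Per-frame time budget at common targets; the CPU's work for a frame has to
# fit inside this window before the GPU can draw it.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```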
 

Blitzvogel

Platinum Member
Oct 17, 2010
I think what we need to be discussing is the impact of the next gen consoles on the PC gaming space. Will it be like the last generation where the cost of development requires multiplatforming, hence software parity across all platforms? Or will the advances in hardware, dev tools, etc allow developers to return to making exclusive mainstream titles for the PC again, or at least give special focus to the PC and not be afraid about profitability?

Unless the consoles just fall flat on their face in terms of sales, mainstream PC gaming and consoles are directly tied to each other now. Indie games are tied across all mainstream platforms (console, PC, mobile). As of late, I've figured that it's going to be less about hardware platforms and more about software. Going x86 with the next 360 is a smart move when you could effectively use the hardware to run all the same software you do with a PC, and vice versa, with minimal work involved in switching platforms. In fact, I expect the next 360 to be Win8 based (just heavily modified, and with most of the power-user features not included), and Win8 software to be available for it. The only difference from the true PC Win8 versions of software would be that it's download-only, in order to have tighter control over what is and isn't allowed on the system. Dedicated work on AMD's APUs due to being on a console can trickle into the PC space as well, and perhaps those who adopted Llano or Trinity (or Kaveri in the future) may end up being quite thankful they did.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Cerb- In general you seem to have a very introverted perspective on what should be done; it seems like what the market is doing in general doesn't matter much as long as things work well for you. Not saying there is anything wrong with that, it just is a different approach.

Also, in our actual reality, Sony's decision may have forced devs to work with the Cell, but the difficulties in using it are part of what killed it.

With the billions in revenue the most popular Cell platform is generating this quarter, I'm curious what you mean by 'killed it'? Just curious.

Or will the advances in hardware, dev tools, etc allow developers to return to making exclusive mainstream titles for the PC again, or at least give special focus to the PC and not be afraid about profitability?

If the market as a whole wants to see PC-exclusive titles, they need to do something radical and something that isn't seen often enough- buy PC games. It isn't a question of development tools, hardware or anything else. If people will buy games in large numbers, someone will make them. Unfortunately that just doesn't happen. Outside of MMOs and a Blizzard game here or there, the PC platform considers something a big hit if it clears a million units. For a AAA game, budget wise, that is pushing break even. That isn't enough revenue being generated. We know what PC hardware sales look like; take that and apply it to software sales and it's clear piracy is insanely detrimental to the PC gaming market.

Unless the consoles just fall flat on their face in terms of sales, mainstream PC gaming and consoles are directly tied to each other now.

No, they aren't. The consoles will go on without PCs and not bat an eye. The PC gaming market needs the console market to subsidize it. You can look over the financials for any of the major publishers; if you take out MMOs (which are showing large problems this year even, well, by their standards), PCs just aren't a major source of revenue anymore. IMO I'd much rather see the top-tier FPSes, RTSes and MMOs stick to the PC, with the consoles handling most of the rest. Unfortunately, now all the top FPSes seem to be console-first; luckily that isn't the case with the other two yet, but if sales trends keep going the way they are it may only be a matter of time. This generation the set-top consoles broke 2 billion in boxed retail unit sales (somewhere in the $100 billion range, and that is just boxed retail). If PC gamers start spending anything remotely approaching that on traditional core games, then we will see improvement in the PC gaming sector. That is *far* more important than the development tools we have at our disposal.

The PS3 gets bashed constantly for how obscenely difficult it is to program for, but if you go out and ask most cross-platform gamers, it has by far the best selection of exclusive games. Why? Because the audience will buy them.
 

Cerb

Elite Member
Aug 26, 2000
it seems like what the market is doing in general doesn't matter much as long as things work well for you. Not saying there is anything wrong with that, it just is a different approach.
Quite the opposite. I see more speculative OOOE CPUs everywhere, now even in telephones, with compute cores along for the ride. The general programmability is only increasing, as is real performance at low power envelopes, and it's not happening by way of high-GHz in-order CPUs. If iOS and Android, or x86 notebooks, don't represent major computing markets, we're not in the same reality. Today, iPhones and iPads are even popular gaming devices.

With the billions in revenue the most popular Cell platform is generating this quarter, I'm curious what you mean by 'killed it'? Just curious.
The only major user of it was and is Sony. It had 7 years to become more than a minor niche part, and hasn't. The best it's been able to do beyond that has been a little HPC for IBM, and the follow-ons to those have quietly not appeared. The platform is dead (or maybe zombified? Frozen? In suspended animation?). It was a neat effort, was technically novel, didn't suffer from more common problems like design-by-committee, and what follows it will be better off from its commercial failure...but the G80 family and its descendants have been following what the market has wanted, as one example, rather than what big suits have wanted to tell everyone they should have.

Assurance of overlapping development of future versions, which will improve both flexibility and performance, in ways that matter to users (which includes devs), is quite important. The G80 effectively evolved into Kepler, and everyone considering investing in the CUDA ecosystem expected that sort of thing, to help make it worth their while over time. G80 wasn't anything to write home about compared to the Cell (and, for any timing-critical work, Kepler would fall flat compared to it), but the fact that between the Cell's release and development death, NVidia had 3 uarchs released, and 2 more publicly in the pipe, compared to IBM giving the Cell DDR2 support, is a difference that's hard to overlook, especially when NVidia was behind by 2 years to start with.
 

Blitzvogel

Platinum Member
Oct 17, 2010
No, they aren't. The consoles will go on without PCs and not bat an eye. The PC gaming market needs the console market to subsidize it. You can look over the financials for any of the major publishers; if you take out MMOs (which are showing large problems this year even, well, by their standards), PCs just aren't a major source of revenue anymore. IMO I'd much rather see the top-tier FPSes, RTSes and MMOs stick to the PC, with the consoles handling most of the rest. Unfortunately, now all the top FPSes seem to be console-first; luckily that isn't the case with the other two yet, but if sales trends keep going the way they are it may only be a matter of time. This generation the set-top consoles broke 2 billion in boxed retail unit sales (somewhere in the $100 billion range, and that is just boxed retail). If PC gamers start spending anything remotely approaching that on traditional core games, then we will see improvement in the PC gaming sector. That is *far* more important than the development tools we have at our disposal.

MS, Sony, and Nintendo (especially Sony) are starting to make profitability a teeter-totter, knife-edge of a process. Their success could turn to complete hell in an instant with the way they have to spend money to compete with each other. I think the PC market has much more growth, and F2Ps will become more popular. The console guys need to let them in if they want that part of the pie. As loyal as console players love to act, I wouldn't be surprised if they're getting sick of the console rinse-and-repeat process. While the same issue exists on the PC, overall the plethora of titles, and of titles willing to deviate from the norm within the same genre, is amazing. Even if the PC isn't the most profitable or desirable sector, it's certainly the most stable, and developers might find themselves having to rely on it and mobile to make their profits in the future *if* the console makers ever dig themselves into a deep enough hole.

PC gaming is cheap these days. A laptop with fairly decent dedicated graphics isn't expensive, and the process is much more streamlined than ever. Unless the new systems are powerhouses, perhaps the PC will finally be back on the podium in terms of people noticing how much more fluid, dynamic, and smooth PC games can be in comparison.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Quite the opposite. I see more speculative OOOE CPUs everywhere, now even in telephones, with compute cores along for the ride. The general programmability is only increasing, as is real performance at low power envelopes, and it's not happening by way of high-GHz in-order CPUs. If iOS and Android, or x86 notebooks, don't represent major computing markets, we're not in the same reality. Today, iPhones and iPads are even popular gaming devices.

Intel is removing a great deal of the abilities of the i7 to get their UP parts into the mobile sector. nVidia, AMD and Intel are all dropping huge die budgets into pure compute logic (for Intel, not for their mainstream parts: the Xeon Phi). The most profitable CPU AMD has designed was an in-order core part (although it didn't make them much money, the Snapdragon part has been making Qualcomm money hand over fist). Pretty much desktop CPUs, and to some extent the UP market (Intel is heading in the exact opposite direction), are the only areas where we see the focus on running simple code.

The only major user of it was and is Sony.

It was designed for an embedded device by Sony? Do you consider the EE to be a failure? It's looking like it will move around 75 million units; I'm not sure I consider that a failure. Actually, most models of CPU from Intel would be considered an enormous hit if they moved that many units (granted, Intel spreads their sales across many models, but they also spend a *lot* more on R&D).

The platform is dead (or maybe zombified? Frozen? In suspended animation?)

I linked an article earlier already pointing out that the report you just linked was wrong. What IBM has stated was that Cell was getting put into the main POWER line (I believe the POWER8 revision). It makes a lot of sense for them; they can use the mainline POWER8 core along with as many SPUs as needed for the given configuration.

G80 wasn't anything to write home about compared to the Cell (and, for any timing-critical work, Kepler would fall flat compared to it), but the fact that between the Cell's release and development death, NVidia had 3 uarchs released, and 2 more publicly in the pipe, compared to IBM giving the Cell DDR2 support, is a difference that's hard to overlook, especially when NVidia was behind by 2 years to start with.

nVidia went in the direction of Cell. Using clusters of relatively simple vector units to produce massive computational power per mm² or per watt. The rise of GPGPU has more to do with the possibility of Cell's removal from mainstream use than anything else. Now, if you need compute power you can get it without using an exotic CPU design. Even Intel has dropped billions trying to get into the compute power market; they failed horribly with Larrabee, but Xeon Phi may work well in the HPC space.

Their success could turn to complete hell in an instant with the way they have to spend money to compete with each other.

What huge amounts of money do you think they are spending? Sony's push for Blu-ray penetration, which worked, and MS's huge outlay for building up XBLive, which also worked, were the main cost factors they faced that gave them negative results pressure. At this point in the game, their platforms are doing pretty well from a balance-sheet perspective. I'm not going to say the profitability factor won't be more conservative next generation, but they don't have any major new platforms they are trying to push either (in terms of things pushed in conjunction with their consoles).

I think the PC market has much more growth, and F2Ps will become more popular.

The PC market has been trending down for years now. This generation the consoles have cleared 200 million hardware units with an average tie rate closing in on ten. They could clear 3 billion retail units sold in terms of games for this generation. I'd love to see PC gaming make a strong comeback, but nothing exists to indicate that it will. The F2P model is nice, but a moderate hit on the consoles is pushing a quarter of a billion dollars in revenue generated. I'm not saying that's how I'd like to see things play out, but that is reality. Developers make games for the PC, lots of them; overall they just don't sell nearly enough to be competitive.

Even if the PC isn't the most profitable or desirable sector, it's certainly the most stable, and developers might find themselves having to rely on it and mobile to make their profits in the future *if* the console makers ever dig themselves into a deep enough hole.

It is also the least stable. If a team starts work on a 360 today it is *exactly* the same hardware it was ~seven years ago when it came out. On the PC side, you have hundreds of different configurations that have appeared in the last 18 months. The PC's versatility is both an enormous blessing, and an enormous burden.

PC gaming is cheap these days.

If people would buy games, it would take care of the market balance. No matter how cheap it gets, if people don't pay to play the games, then the consoles will continue to become more and more important (yes, it can still get a lot worse; I really hope it doesn't come to that :( ).
 

Cerb

Elite Member
Aug 26, 2000
Intel is removing a great deal of the abilities of the i7 to get their UP parts into the mobile sector.
Haswell and its future ilk look awfully powerful, to me. The next Atom we'll have to see about (if Silvermont is indeed OOOE, was Intel going to do it anyway, or should we thank Bobcat? Or, is it all just a rumor? :)).

nVidia, AMD and Intel are all dropping huge die budgets into pure compute logic(for Intel not for their mainstream parts, the Xeon Phi).
What else should they do with it (the CPUs are already big enough as it is, and more caches won't help desktops much)? Very few of us have any use for >4 cores, yet most people can use a good GPU, today. For the most part, the parallel computing bits are icing if you can use them, today. In a few years, I hope that will have changed, though, as software companies should have more motivation, with pretty much everybody having the necessary hardware.

The most profitable CPU AMD has designed was an in order core part
I don't get what you mean, unless you're making a sarcastic jab at their typical financial performance. They didn't make enough money from their ARM division, so it can't be that, and their last strictly in-order CPU was the K5, that came out 15 years ago. The MediaGX Geodes, while technically newer, they bought and improved upon, but didn't fundamentally design. The K6 and newer have all performed instruction re-ordering, and you could even argue that one wasn't a pure AMD design. The 29K died for the sake of x86, so I can't imagine it was more profitable than what followed, either. OTOH, the K8, Llano, and Bobcat have done quite well for AMD, but none of them are in-order.

It was designed for an embedded device by Sony? Do you consider the EE to be a failure?
The EE was never marketed by anyone as anything but the PS2 chip, so I don't see where it had a chance to succeed or fail. The market chooses consoles by game and network, and the games are chosen by publishers. The PS3 was going to sell, and did, by game selection and console pricing. It could have been as weak as the Wii, and still sold plenty (not as much, granted), or it could have been 2-3x better, and not sold more at all. The market success or failure of the Cell is based on users buying Cell products because it has a Cell, like the supercomputer(s) made with PS3s (Sony's loss was their gain, and then Sony removed the Linux option), or the IBM systems; and 3rd parties choosing to integrate the Cell into devices they are designing, where they could choose something else.

When the Cell was alleged to be followed by more configurations and future versions around launch, and then everything but Toshiba licensing single SPEs pretty much fell flat...that's a failure, for a design that was intended/marketed to flourish and keep on evolving into more mainstream uses, which never appeared. It went into a few high-end Sony and Toshiba TVs, as part of the early push, and then not much else happened. IBM made some blades with DP-optimized versions, and then they didn't follow those up, despite having new versions planned.

I linked an article earlier already pointing out that report you just linked was wrong. What IBM has stated was that Cell was getting put into the main POWER line(I believe POWER 8 revision). It makes a lot of sense for them, they can use the main line POWER8 core along with as many SPUs as needed for the given configuration.
Do you have a link for that? All I've seen is the 32 series died, and then they made typical vague statements. Even so, that tells me the blades and workstations must have sold like crap, because there's no way they could get Power money for them, and I have a hard time believing potential customers would pay Power prices, either, when Teslas are almost affordable, today.

nVidia went in the direction of Cell.
They went a direction the Cell didn't: an evolving architecture growing up to have the features of other modern computer parts, and making deals with major companies to help them take advantage of the hardware. CUDA, its SDK, and moving towards more sane memory models have helped make them successful in the market, because it has allowed good programmers to go ahead and extract value quickly, and great programmers to make people :eek: over what can be done. They could also show people getting actual value from their hardware in reasonable amounts of time, rather than just hypothetical demos. Some of that was technical, but most of it was JHH having more good business sense than all the suits at IBM and Sony combined.

Using clusters of relatively simple vector units to produce massive computational power per mm² or per watt. The rise of GPGPU has more to do with the possibility of Cell's removal from mainstream use than anything else.
IMO, if a Cell 2 came around in 2007-8 with bigger stores, faster buses, a better PPE, etc., and a Cell 3 came out in 2010-11 with more of that and a reduced or removed need for the semi-coherent explicit DMA push/pull mess, and all the associated marketing each time, NVidia might not have been the only strong player (software support for the Cell has improved a lot). I mean, I get all the limitations for a 1st gen part, but only researchers and DoD contractors are generally willing to invest in what looks like it could be a dead end. One of the several things that has kept NVidia's products compelling is that you can be reasonably assured there will be even better hardware and software from them every 2-3 years.

Now, if you need compute power you can get it without using an exotic CPU design. Even Intel has dropped billions trying to get into the compute power market; they failed horribly with Larrabee, but Xeon Phi may work well in the HPC space.
Intel will succeed, if just by force. Even if their success ends up only modest, they have too much to lose by letting JHH and others corner the market, and they are the only other company with a technology (x86+libs+compilers) that can come out and rival or best CUDA.
 

Arzachel

Senior member
Apr 7, 2011
@BenSkywalker: You seem to be of the opinion that triple-A game development is the only valid and viable way, which is so untrue it hurts. The concept of consoles and dedicated GPUs could vanish overnight and PC gaming would still be going strong.
 

BenSkywalker

Diamond Member
Oct 9, 1999
I don't get what you mean, unless you're making a sarcastic jab at their typical financial performance. They didn't make enough money from their ARM division

They sold off the designs from their ARM division to Qualcomm (the Snapdragon) for $65 million. It has made Qualcomm significantly more profit over the last couple of years than all of AMD combined.

Do you have a link for that?

Yep-

Development around the original Cell processor hasn't stalled and IBM will continue to develop chips and supply hardware for future gaming consoles, a company executive said.

"I think you'll see [Cell] integrated into our future Power road map. That's the way to think about it as opposed to a separate line -- it'll just get integrated into the next line of things that we do," Menon said. "But certainly, we're working with all of the game folks to provide our capabilities into those next-generation machines."

http://www.pcworld.idg.com.au/article/363760/cell_processor_development_hasn_t_stalled_ibm_cto_says/

That is from Q3 2010. I had a link where he explicitly stated they were integrating it into the POWER8 line; I can't find that one.

IMO, if a Cell 2 came around in 2007-8 with bigger stores, faster buses, a better PPE, etc.,

Cell+ did come out in 2008, though the main focus of that chip was significantly sped up DP performance.

Intel will succeed, if just by force. Even if their success ends up only modest, they have too much to lose by letting JHH and others corner the market, and they are the only other company with a technology (x86+libs+compilers) that can come out and rival or best CUDA.

I think JHH has more to do with Cell not being as big as IBM hoped than anything else. Yes, nV made the programming model easier for CUDA over time, although Kepler seems to be a bit of a step backwards compared to Fermi, but the rise of GPGPU is the largest hindrance to greater Cell success any way that I look at it. The Cell blades sold for ~$10K to $14K each; hard to argue they're worthwhile when they require custom coding anyway and you can get a Tesla for ~$2K-$3K.

@BenSkywalker: You seem to be of the opinion that triple A game development is the only valid and viable way which is so untrue it hurts.

The enormous advantage PCs or consoles have over the mobile platforms is AAA games. F2P/casual titles are growing significantly faster in the UP space than on any other platform.

The concept of consoles and dedicated GPUs could vanish over night and PC gaming would still be going strong.

Farmville may do OK, maybe some legacy titles like WoW; other than that, PC gaming would be in horribly bad shape without dedicated GPUs and particularly without the consoles as of now. It isn't the way I want things to be, it is the way they are.
 

Arzachel

Senior member
Apr 7, 2011
The enormous advantage PCs or consoles have over the mobile platforms is AAA games. F2P/casual titles are growing significantly faster in the UP space than on any other platform.

Farmville may do OK, maybe some legacy titles like WoW; other than that, PC gaming would be in horribly bad shape without dedicated GPUs and particularly without the consoles as of now. It isn't the way I want things to be, it is the way they are.

I presume you're a time traveler from the early 2000s. Minecraft, LoL, DayZ and Guild Wars 2 are all pretty damn huge in their own way, and those are just the names off the top of my head, ignoring Valve and Blizzard completely. Kickstarter shows that not only are people willing to pay for games, they are willing to pay just on a pitch. And for every well-known success, there are five lesser-known indie games that are amazing in their own right (everyone reading this post, go try Love; you won't regret it even if you just stumble around blindly for a bit).

I've heard people say that PC gaming is dead ever since I've played games on the PC.

EDIT: And I totally forgot about Mechwarrior Online and Planetside 2. And freeware/open source games, I love Dungeon Crawl.
 

Ancalagon44

Diamond Member
Feb 17, 2010
So, BenSkywalker, the PS4 has been announced, and it seems Sony has kicked Cell to the curb. What do you think about that? Do you agree with their decision?

This thread has been dead for 9 months. Please feel free to PM Ben or to start a new thread
-ViRGE
 