PlayStation 4 using a PowerVR GPU?


dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Ben, what's your take on this ancient AT article:

http://groups.google.com/group/alt.games.video.sony-playstation2/msg/62ff83d96ea78ea9?hl=en

The point is Anand (and devs according to him) was disappointed in Cell's performance for gaming use. This was written before the real thing was out, but still.

Thanks for posting that. I had been looking for a copy of that article for some time, since it was taken down. This article has been discussed in other threads, and if I recall correctly, Ben believes the article is complete nonsense. I don't entirely agree; I'm inclined to believe Anand knows what he's talking about.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Well I will say this much. Games programmed for the PS3 look better than games programmed for the 360. Granted the PS3 is (from what I hear) a super pain in da butt to program for. But it's pretty clear the Cell is doing good for a sub $300 machine. The Cell also went into production well before the i7. Although it's kind of an apples/oranges discussion.

As for what the PS4 will use? I think it's all BS to grab page hits right now. I don't think we will see real specs for a long time.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Heh, I got a good laugh out of that article when it was first published, and it is even more amusing to read through now with hindsight on our side.

Several different points. First, they talk about how the SPEs on Cell aren't going to be usable by devs; there is a YouTube vid floating around with UC2 devs running in debug mode showing six of them loaded at 100%, and the seventh was also being used. That is a reality that advancement of the libraries brings to the table. The article was wrong when it was first published, and hindsight amplifies this. When this was first published, I told them they should get a hold of Naughty Dog or Polyphony Digital and ask them what they thought of the development environment. They seemed to think that wasn't necessary, as somehow the PC-native devs who were being forced into the console market would know better than the guys who were quite used to in-order processors with funky architectures (for all its shortcomings on developer ease of use, Cell is easier than the EE it replaced).

Another area that is amusing is their stance that multi-core gaming wasn't going to take off in the next 3-5 years. Clearly, we know they were wrong there also. Probably the best example we have of that to date is GTA4, but many other titles are also pushing into utilizing quad cores, even on the PC where such configurations aren't a given.
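As a side note, the pattern those titles rely on can be sketched in a few lines: a frame's independent workloads fanned out across a pool of worker threads. The task names below are purely illustrative, not from any real engine.

```python
# Sketch of task-parallel game work: run a frame's independent
# workloads (AI, physics, audio, ...) across a worker-thread pool.
from concurrent.futures import ThreadPoolExecutor

def run_frame(tasks, workers=4):
    """Run independent per-frame tasks across `workers` threads.
    `map` preserves task order in the returned results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda f: f(), tasks))

tasks = [lambda: "ai done", lambda: "physics done",
         lambda: "audio done", lambda: "particles done"]
print(run_frame(tasks))
```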

They focus extensively on ease of use, and that is an interesting angle to take. In the real world, ease of use is absolutely nice to have, but if something is capable of more with a lot of extra work, odds are pretty high that someone is going to do it. For a developer coming from a P6 or later OoO architecture with Intel's incredibly good compilers, I'm sure Cell and Xenos came across as cryptic and damn near unusable. For those coming off of the old MIPS EE, they had a very different perspective.

I guess the best way to look at that article is that they were comparing the chips to Pentium Ds and saying they were inferior. Carmack is now talking about how much extra he can do on Cell versus the i7 after working with it for a few years. I pointed out to them back then the mistakes they were making in that article and told them who to contact for a counterpoint, but they weren't interested; the article was posted to try and build up the confidence of PC gamers or x86 devs who were dumbfounded by this funky technology.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Well I will say this much. Games programmed for the PS3 look better than games programmed for the 360. Granted the PS3 is (from what I hear) a super pain in da butt to program for. But it's pretty clear the Cell is doing good for a sub $300 machine. The Cell also went into production well before the i7. Although it's kind of an apples/oranges discussion.

As for what the PS4 will use? I think it's all BS to grab page hits right now. I don't think we will see real specs for a long time.

What's the best looking PS3 game? So far the best looking games I've seen this generation are the Gears of War games, though generic Unreal Engine titles usually aren't far behind.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
What's the best looking PS3 game? So far the best looking games I've seen this generation are the Gears of War games, though generic Unreal Engine titles usually aren't far behind.
I believe Uncharted 2 is by far the best looking console game. It really shows that if someone fully utilizes the PS3, it can be pretty darn impressive.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
What's the best looking PS3 game?

Probably, out of those shipping currently, UC2. I think GT5 looks a decent amount better, but that doesn't ship for another few months. GoW falls fairly far behind the pack compared to the best on the PS3: UC2, UC, GT5 P, and KZ2 are all fairly easily superior on the visual side of things. None of the Unreal Engine games have been impressive at all on the PS3 IMO; they have been rather mediocre to flat-out bad. The PS3 isn't well designed to leverage cross-platform technology; it does far better when it is dealing with purpose-built engines.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Probably, out of those shipping currently, UC2. I think GT5 looks a decent amount better, but that doesn't ship for another few months. GoW falls fairly far behind the pack compared to the best on the PS3: UC2, UC, GT5 P, and KZ2 are all fairly easily superior on the visual side of things. None of the Unreal Engine games have been impressive at all on the PS3 IMO; they have been rather mediocre to flat-out bad. The PS3 isn't well designed to leverage cross-platform technology; it does far better when it is dealing with purpose-built engines.
Yeah, I agree. Basically every cross-platform game looks slightly better on the Xbox 360. Parts of Ghostbusters looked like poo on the PS3 compared to the 360.
 

Rezist

Senior member
Jun 20, 2009
726
0
71
They weren't exactly wrong about the Cell; when I think about it, the 360 seems like an easier design with similar capability. UC2 and KZ2 are the best looking games. GoW 2 looks better than UC1 IMO (I have both). Maybe Ben can remember which ended up with more transistors, the 360 CPU or the PS3 CPU?

The cross-platform games are starting to equal out now, but don't forget the PS3 came out a full year later; that's huge in terms of hardware. Just think of video cards alone.

The 360, given its launch price, power, and release date, seemed like the most capable all factors considered, but it failed due to inadequate cooling. Even the DVD9s haven't held it back the way cartridges held back the N64.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
They weren't exactly wrong about the Cell; when I think about it, the 360 seems like an easier design with similar capability. UC2 and KZ2 are the best looking games. GoW 2 looks better than UC1 IMO (I have both). Maybe Ben can remember which ended up with more transistors, the 360 CPU or the PS3 CPU?

The cross-platform games are starting to equal out now, but don't forget the PS3 came out a full year later; that's huge in terms of hardware. Just think of video cards alone.

The 360, given its launch price, power, and release date, seemed like the most capable all factors considered, but it failed due to inadequate cooling. Even the DVD9s haven't held it back the way cartridges held back the N64.

165m transistors for the 360 CPU, 168m for the 8x Cell SPEs, and then add the PPE.
The PS3 has more hardware and more theoretical power, but it's all about using that power.
In terms of graphics, I thought they weren't all that far off in terms of comparable GPU generations (HD2000 vs 7800)? Plus the Xbox has the eDRAM for almost-free AA?
Not to mention the RAM differences: 256 + 256 for the PS3 vs 512MB shared for the Xbox (more flexibility; not sure if it's slower, though).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Yeah, I agree. Basically every cross-platform game looks slightly better on the Xbox 360. Parts of Ghostbusters looked like poo on the PS3 compared to the 360.

That's mainly because the 360 is easier to program for. So they then port it to the PS3. They also then port another stinker over to the PC. :mad:
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Maybe Ben can remember which ended up with more transistors, the 360 CPU or the PS3 CPU?

234 million transistors for Cell, a decent amount more than Xenon at 165 million.

Even the DVD9s haven't held it back the way cartridges held back the N64.

DVD9 is starting to be an issue, and by the end of the life cycle it is likely to be a reasonably large one. Certain games already can't run with all content without an install; others require disc swapping. Obviously the gap isn't as large as it was with carts compared to CDs (32MB to 650MB is a much larger relative gap than 9GB to 25GB, obviously), but it is increasingly a factor (not to focus too much on id, but they have made comments about how the 360 may end up with the poorest visuals because of DVD9 despite being on multiple discs; they weren't saying that would be the case, just that it was possible).
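A quick back-of-envelope check on the capacity gaps cited above, using the figures from the post:

```python
# Relative capacity gaps: N64 cart vs. CD, and DVD9 vs. single-layer Blu-ray.
cart_mb, cd_mb = 32, 650   # sizes as cited in the post
dvd9_gb, bd_gb = 9, 25

cd_vs_cart = round(cd_mb / cart_mb, 1)   # CD holds ~20x a 32MB cart
bd_vs_dvd9 = round(bd_gb / dvd9_gb, 1)   # Blu-ray holds ~2.8x a DVD9
print(cd_vs_cart, bd_vs_dvd9)
```

So the CD-era gap was roughly 20x, while this generation's disc gap is under 3x, which matches the post's point: a real factor, but a much smaller relative one.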

In terms of graphics, I thought they weren't all that far off in terms of comparable GPU generations? (HD2000 vs 7800), plus the Xbox has the EDRAM for almost free AA?

360's GPU is clearly superior to the one used in the PS3. The problem when looking at it from that angle is that Cell is really good at processing graphics itself. The original design goal was to use 2 Cell chips in the PS3 without a GPU at all, much like Intel was shooting for with Larrabee. At some point they figured out this was a bad idea(much like Intel again) but the fact remains that Cell handles certain types of graphic effects extremely well and it has a good deal of bandwidth between it and RSX. This makes a situation where a game that is pushing the 360's GPU will look like crap on the PS3 unless the devs significantly rework their assets and code base to make use of the Cell/RSX hybrid rendering setup that top PS3 devs use for exclusive content. This doesn't happen very often(id is the only one I can think of doing ports with this level of attention paid to each version) and is the reason ports tend to look best on the 360. On the other side of that, titles that utilize Cell heavily may not run on anything else at all(if it is just a matter of graphics effects, PC GPUs will be able to handle it, if they are using Cell for more advanced physics code or comparable then it won't run without being scaled down).

Not to mention the RAM differences, 256 + 256 for the PS3 vs 512MB shared for the xbox (more flexibility, not sure if it's slower though)

This depends on how you look at it. While the 360 is much closer to a pure UMA, both Cell and RSX can write to each other's memory. I don't want to say one is necessarily better than the other; there are certainly pros and cons to each setup. I think that both companies made the proper choice on memory topology given the chips they decided to use for their platforms. It would make little sense for Xenos/Xenon to have the kind of memory that Cell/RSX has, and vice versa.

That's mainly because the 360 is easier to program for.

It goes beyond that, though. When you have one platform with a superior GPU, a port should pretty much always look better on it. The 360 without a doubt has a superior GPU to the PS3; with its eDRAM, AA is free a great deal of the time (it could be always, if devs weren't utilizing creative frame buffer effects, which are becoming far more popular). Combine those factors and you have a situation in which ports should look better on the 360. On the other end of that, a game that's really optimized to push the PS3 won't run on the 360 at all. An interesting trade-off.
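For a rough sense of why the eDRAM's "free" AA has limits: the 360's eDRAM is 10MB, and a naive estimate (assuming 4 bytes of color plus 4 bytes of depth/stencil per sample) shows a 720p 4x MSAA framebuffer overflowing it, which is why devs end up tiling or dropping MSAA once their framebuffer setups get exotic.

```python
# Back-of-envelope framebuffer sizing against the 360's 10MB of eDRAM.
# Assumes 8 bytes per sample (4B color + 4B depth/stencil).
def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    """Framebuffer size in MB for a given resolution and MSAA level."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

print(round(framebuffer_mb(1280, 720, 1), 1))  # 720p, no AA: fits in 10MB
print(round(framebuffer_mb(1280, 720, 4), 1))  # 720p, 4x MSAA: does not fit
```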
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Does this mean Sony saw no hope in Larrabee? Thought Intel was working hard to get in the console business.


Well, Intel does own 25% of Imagination. With Larrabee being delayed, one never knows. I think Intel has their eye on Imagination, or they wouldn't have bought 25+% of the company. Intel has used their tech for years. Apple has also bought 15% of the company. So it is indeed interesting. But we went through all this a year ago, now didn't we.

No way will Intel ever buy NV. Not when they have shown such interest in PowerVR, as has Apple, and now it seems Sony as well.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
That's pretty intriguing. One would think, though, that if IMGTEC were really capable of putting out that level of performance in a chip of equal size to, say, an RV770 GPU, they would be dominating the entire high-end market by now. One can't deny their involvement in Intel's development of their IGP hardware (which admittedly sells really well), but if they could put that much performance under the hood, you would think they would find a way to put it on a dual-slot card with a 350mm by 350mm die and two 6-pin PCI-e power connectors, toss it in a box with some flashy CG chick with disproportionately large tits and soul-piercing eyes on the box, and position it in the market like an assault rifle in colonial America.


The timing here is intriguing, to say the least. The power of a desktop in an SoC; now that's interesting.

Multi-processing graphics technology delivers true scalability to 'super-high' performance points
Tokyo, Japan: Imagination Technologies, the leading multimedia chip technologies company, will begin shipping fully verified production quality IP for its POWERVR SGX543MP multiprocessor graphics cores to partners before the end of December 2009. Multiple lead partners have already been working with beta versions of the core for several months and several SoCs for a range of markets incorporating SGX MP cores are already in advanced design.

Imagination reports that three of its partners now have licences for SGX MP technology, which continues to extend Imagination's leadership of the embedded graphics acceleration market with multi-processor, shader-based solutions which address the rapidly growing demands for high performance graphics in a wide range of consumer electronics segments.

The technology can be delivered to customers in SGXMP2 (two-core) to SGXMP16 (16-core) variants. Graphics IP cores available from Imagination now range from SGX520, the world's smallest OpenGL™ ES 2.0 mobile core, to SGX543MP solutions for high-performance console and computing devices. POWERVR SGX543MP delivers performance comparable to many desktops, laptops and games consoles.

Further details of the innovative technology behind POWERVR SGX543MP are being revealed to press and customers at the Embedded Technology Show in Yokohama, Japan from 18-20th November 2009.

Tony King-Smith, VP marketing Imagination Technologies says: "With the ability to combine up to sixteen SGX543MP GP-GPU cores on a single SoC, we are now able to deliver capabilities to our licensing partners previously only thought the domain of the discrete GPU chipset vendors, while maintaining our unrivalled power, area and bandwidth efficiency."

The POWERVR SGX543MP family enables POWERVR SGX543 four-pipe programmable GP-GPU cores to be integrated in a high performance, multi-processor graphics solution without performance or silicon area compromises. SGX543MP enables highly linear scaling of all aspects of GPU performance, specifically vertex shading, pixel shading, primitive setup and overall GP-GPU* functionality, whilst maintaining full software compatibility and with virtually no overhead in bandwidth usage.

At 200MHz core frequency an SGX543MP4 (four cores) will deliver 133 million polygons per second*** and fill rates in excess of 4Gpixels/sec**. Higher frequencies or a larger number of cores each deliver more performance. At 400MHz core frequency an SGX543MP8 (eight cores) will deliver 532 million polygons per second*** and fill rates in excess of 16Gpixels/sec**.
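The figures in the release are internally consistent with its linear-scaling claim; a quick sketch (assuming perfectly linear scaling with both core count and clock, as the release asserts):

```python
# Check the press release's numbers against a linear-scaling model.
# Baseline from the release: SGX543MP4 @ 200 MHz -> 133 Mpolys/s, 4 Gpix/s.
def scaled(base, base_cores, base_mhz, cores, mhz):
    """Project throughput assuming perfectly linear scaling with
    core count and clock frequency (the release's stated behavior)."""
    return base * (cores / base_cores) * (mhz / base_mhz)

polys_mp8_400 = scaled(133, 4, 200, 8, 400)  # millions of polygons/s
fill_mp8_400 = scaled(4, 4, 200, 8, 400)     # Gpixels/s
print(polys_mp8_400, fill_mp8_400)
```

Doubling both cores (4 to 8) and clock (200 to 400 MHz) gives a 4x projection: 532 Mpolys/s and 16 Gpix/s, exactly the MP8 figures quoted.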

The highly efficient POWERVR SGX543MP family delivers near linear progression in vertex and pixel processing performance, unlike competitive solutions which scale only pixel performance. An SGX543MP2 delivers effectively twice the performance of a single SGX543 without compromise. And for a given workload the same bandwidth is required no matter how many cores are deployed – SGX543MP delivers faster performance by dividing the work on-demand, dynamically load balanced in parallel between cores.

The USSE2 (Universal Scalable Shader Engine v2), a key component of Series5XT architecture at the heart of SGX543MP is a scalable multi-threaded multimedia processing engine offering up to 2x the floating point throughput of earlier Series5 SGX IP cores. An extended instruction set with comprehensive vector operations and co-issue capabilities enables advanced geometry and pixel processing as well as GP-GPU tasks. These tasks are broken down into processing packets which are then scheduled across a number of multi-threaded execution units in the USSE2. This enables optimal hardware load balancing, maximum latency tolerance and efficient gate use, all accessed through a single software programming model and compiler.

POWERVR SGX543MP features:

the highest performance per mW of any embedded graphics core
highly linear scaling (over 95% efficiency) of performance in both geometry (vertex processing) and rasterisation (pixel/fragment processing)
dynamic load balancing and on-demand task allocation at the pipeline level
no fixed allocation of given pixels to specific cores, enabling maximum processing power to be allocated to the areas of highest on-screen action
scalable GP-GPU compute power, which can be fully utilised through all Khronos APIs including OpenGL ES 2.0/1.1, OpenVG™ 1.x and OpenCL™
use any number of cores, even or odd
no additional work for software developers; using one driver stack for all SGX cores means applications see a common SGX architecture via the standard APIs regardless of number of cores used
no additional CPU load or loss of performance when using multiple cores
individual cores can be disabled based on workload for optimal power saving
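A toy model of what the ">95% scaling efficiency" bullet implies for effective speedup (this is an illustrative formula, not Imagination's published methodology):

```python
# Toy scaling model: first core counts fully; each extra core
# contributes at least `efficiency` of a full core's throughput.
def effective_speedup(cores, efficiency=0.95):
    """Effective speedup over one core under a fixed per-core efficiency."""
    return 1 + (cores - 1) * efficiency

for n in (2, 4, 8, 16):
    print(f"{n} cores -> {effective_speedup(n):.2f}x")
```

Under this model a 16-core SGXMP16 would still deliver over 15x a single core's throughput, which is the substance of the "highly linear" claim.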
Editor's Notes

* GP-GPU stands for General-Purpose computation on Graphics Processing Units.

** All fill rate figures stated assuming a scene depth complexity of x2.5

*** All polygon throughput figures are based on real and achievable sustained throughput in a real SoC; they are not theoretical figures that can never be achieved in any practical application

About Imagination Technologies

Imagination Technologies Group plc (LSE:IMG) – a global leader in multimedia and communication silicon technologies – creates and licenses market-leading processor cores for graphics, video, multi-threaded embedded processing/DSP and multi-standard communications applications. These silicon intellectual property (IP) solutions for systems-on-chip (SoC) are complemented by a strong array of software tools and drivers as well as extensive developer and middleware ecosystems. Target markets include mobile phone, handheld multimedia, home consumer entertainment, mobile and low-power computing, and in-car electronics. Its licensees include many of the leading semiconductor and consumer electronics companies. Imagination has corporate headquarters in the United Kingdom, with sales and R&D offices worldwide. See: www.imgtec.com.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
lol, seriously though Nemesis, when was the last time Intel Capital invested in a smallish company/startup that went on to actually turn into something?

Getting VC'ed by Intel Capital is kinda like the tech industry's equivalent of being on the cover of Sports Illustrated; it's all downhill from there.

You ever notice we only ever hear of Intel investing in this company or that company? We never hear the other side of the story pan out, with Intel selling their shares for zillions of dollars 5 yrs later or becoming filthy rich by catching the next big wave of killer apps by being in on the ground floor.

It is a vestige of Andy's paranoia strategy: throw money at anything with a pulse, and if by chance they should grow to become a threat in 20 yrs, you at least own a piece of the new action.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well, I for one didn't understand that Imagination Tech was a startup company. When did they start? Add Intel's 25%+ to Apple's 15% and you get 40%+ ownership by two companies. Desktop performance on an SoC is great, and getting that efficiency as a bonus is so much more glitter.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
lol, seriously though Nemesis, when was the last time Intel Capital invested in a smallish company/startup that went on to actually turn into something?

Getting VC'ed by Intel Capital is kinda like the tech industry's equivalent of being on the cover of Sports Illustrated; it's all downhill from there.

You ever notice we only ever hear of Intel investing in this company or that company? We never hear the other side of the story pan out, with Intel selling their shares for zillions of dollars 5 yrs later or becoming filthy rich by catching the next big wave of killer apps by being in on the ground floor.

It is a vestige of Andy's paranoia strategy: throw money at anything with a pulse, and if by chance they should grow to become a threat in 20 yrs, you at least own a piece of the new action.

How about Xscale? They sold that for a nice profit, right?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
How about Xscale? They sold that for a nice profit, right?

They sold it for $600m; whether they netted any profit or took a loss once you factor in the R&D investments was never made a matter of public record. But if you had to hazard a guess, which would you suspect is more likely the case?
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
Ehhhh? We use voxels every day (high-end scientific visualization) but you need a truckload of memory and bandwidth - it's never meant to be for mainstream gaming...

I meant in games, duh.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Is there any need for deferred rendering if we have efficient z-culling?


The cool part about PowerVR (Imagination) is that when tile-based deferred rendering is operating, anything that isn't shown on the screen isn't rendered, not wasting transistors. Intel's GMA doesn't do this with the i3, but it will with Sandy Bridge. Looks like in a two-horse race, the third horse was a sleeper to almost all. I was betting on Larrabee or Imagination Tech. It's going to be really interesting, this 32nm process.
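For anyone unfamiliar with the idea, here is a toy sketch of the tile-based deferred approach (not Imagination's actual pipeline): resolve visibility within a tile first, then shade only the surviving fragments, so occluded pixels never cost any shading work.

```python
# Toy tile-based deferred rendering: per-tile visibility resolve,
# then shading of only the nearest (visible) fragment per pixel.
def shade_tile(fragments):
    """fragments: list of (x, y, depth, color) within one tile.
    Keep only the nearest fragment per pixel, then 'shade' those."""
    nearest = {}
    for x, y, depth, color in fragments:
        key = (x, y)
        if key not in nearest or depth < nearest[key][0]:
            nearest[key] = (depth, color)
    # Shading happens only now, once per visible pixel.
    return {pos: color for pos, (depth, color) in nearest.items()}

frags = [(0, 0, 0.9, "red"), (0, 0, 0.2, "blue"), (1, 0, 0.5, "green")]
print(shade_tile(frags))  # the nearer blue fragment hides red at (0, 0)
```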
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Bump to point out the stupidity of the other thread.

Actually, why is it stupid? Looks like Imagination is in our near future, bigger than anyone thought (almost anyone; a few exceptional geeks out there comprehended what the PowerVR articles were written about).

Two Imagination threads (topics) is nice. A break from all the ATI/NV BS! Besides, it'll be interesting to see who has better command of the FORCE: me, or unknown know-nothings.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Actually, why is it stupid? Looks like Imagination is in our near future, bigger than anyone thought (almost anyone; a few exceptional geeks out there comprehended what the PowerVR articles were written about).

Two Imagination threads (topics) is nice. A break from all the ATI/NV BS! Besides, it'll be interesting to see who has better command of the FORCE: me, or unknown know-nothings.

The other thread started with a link to a story from the same time as this thread was started, in 2009, about rumours of PowerVR in the PS4.

Other AT thread -> Xtremesystems thread -> some dutch website -> FNG story about PowerVR being used in PS4.

This thread -> FNG story.

The other thread is discussing a 3-month-old PowerVR story, the exact same story as this thread was discussing. For some reason someone resurrected the XS thread, and then someone here linked to it, so I am resurrecting this thread; surely we don't need another thread about the exact same story?
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Well then, by your reasoning the Fermi thread is stupid, because we had this thread in June of '09, 9 months ago. How many Fermi threads since then?

http://forums.anandtech.com/showthread.php?t=290742&highlight=300

nvidia first to market with DX11 GPUs?

--------------------------------------------------------------------------------

Well let's take bets.

I said Oct 24th for the GT300 and Dec 14th for the 58xx. Wreckage lost his bets. I, on the other hand, was pretty much spot on.
 