[ArsTechnica] Next-gen consoles and impact on VGA market

Status
Not open for further replies.

RussianSensation

Elite Member
Sep 5, 2003
The Last of Us, Uncharted 3.

2 games out of 300 do not make for a solid case, especially when nearly all cross-platform games run way better on the 1-year-older Xbox 360. The 360 also has the better-looking racer in Forza 4 than the PS3's best racer, and Gears of War 3 is hardly worse-looking. The Last of Us hasn't even come out yet, so using it is pointless right now.

That would be a fantastic point, if it were true. Go build a PC with an i7 and a 7800GT, put 256MB of RAM in it, and see how games run. The CPU is not the only factor.

So what's the point of wasting $ on a complex CPU and then crippling the console with a slower GPU? I am pretty sure 7800GT at 1024x768 or 1280x720 at LOW would provide similar graphics on the PC. Uncharted 3 runs below 1024x768; a 7800GT can easily render that level of graphics.

You just keep saying "360 good, PS3 bad". As far as them using the RSX: the original plan was to use dual Cells and no GPU at all. Everyone pointed out to KK that that was going to be an abject failure, so the RSX ended up being swapped in to replace the second Cell very late in development; it *increased* the cost of the system.

I am not arguing 360 vs. PS3. I am arguing that the Cell's supposed superiority for games accounts for $nil in the real world against the 360. The performance difference between the RSX and the R500 shouldn't be that dramatic. OTOH, what you are saying is that the Cell is dramatically better than any modern CPU, and thus by extension should more than make up for a 15-20% slower RSX. But it doesn't.

Face-Off: Sleeping Dogs

Xbox 360: [screenshot: 360_015.png]

PS3: [screenshot: PS3_015.png]


"The Xbox 360 game appears to enjoy higher-resolution normal maps and textures (top) and has less aggressive LODs. The Microsoft console inches ahead in terms of frame-rates, despite enjoying a 17 per cent resolution advantage. On balance the 360 game is the better buy: minor controller issues aside, the cleaner presentation is preferable over the murkier look of the PS3 game."

A $400, 1-year-old console looks better than a $600 console that had 12 months to beat the specs of the 360 and failed. That, in a nutshell, is why the Cell is a failure and an overpriced bucket of turd. Sleeping Dogs and GameSpot's 5-6 game comparison series between the PS3 and 360 continue to show that the 360 is the more powerful console for cross-platform games. That supports many people's view here that the Cell is not more powerful for games than even a tri-core PowerPC CPU; and even if it is theoretically, it's too hard to program for efficiently, which keeps it from being more powerful in the end for 99% of games. What you end up with is that the Cell's touted superiority is MIA.

The 4-5 generations ahead of Core 2 Duo claim you made isn't remotely true, because that should have been enough to make up for what is likely a minor 15-20% performance difference between the GPUs. A single core in the Core i7 Nehalem, which that Metro 2033 developer said is faster than the entire tri-core Xbox 360 CPU, is not 4-5 generations ahead of Core 2 Duo (it's only 1 generation, since Nehalem is around 17.5-20% faster in IPC than the Core 2 Duo/Core 2 Quad Conroe/Kentsfield generation was). Surely if the Cell were far superior to modern CPUs for gaming, as you keep falsely claiming, it should have shown its strength and more than made up for the minor difference in GPU power. Using your logic, it'll take Intel until 2015-2016 to match the performance of the Cell for games... ya, ok!

Basically, so far you have not provided sufficient support for your hypothesis that the Cell is actually (1) very fast for games compared to modern x86 CPUs, or (2) superior to modern x86 CPUs in terms of performance/watt or performance/$. All the comparisons of Xbox 360 vs. PS3 performance with more or less similar GPUs, developer/artist statements that the current consoles have completely run out of processing power, and developer remarks that the Cell is only as fast as a single Core i7 Nehalem all contradict your view. You have provided nothing to dispute these claims other than John Carmack's statements on Rage, where he himself admitted his team didn't allow for proper performance optimization on the PC.

None of this matters though because PS4 is ditching the Cell. I don't know any company that would throw out a $37.73 Emotion Engine CPU that was so amazing for games and replace it with a far more expensive AMD Fusion chip, unless Sony themselves are admitting that the Cell was a total failure for gaming.
 

BenSkywalker

Diamond Member
Oct 9, 1999
The 360 also has the better-looking racer in Forza 4 than the PS3's best racer, and Gears of War 3 is hardly worse-looking.

Take your PS3 and your 360 and do this simple little experiment. Fire up Forza 4 and GT5, start a race at night, in the rain, in 3D mode, and compare the graphics side by side. There are some rather huge differences in the technology of Forza 4 and GT5, even if you just look at the visual aspect.

So what's the point of wasting $ on a complex CPU

The CPU was both cheaper *and* more powerful than anything else. The general CPU design was so good that MS took the main core and used it in, oh wow, the 360. If you ever care to enlighten yourself-

http://www.amazon.com/The-Race-New-G.../dp/0806531010

Uncharted 3 runs below 1024x768.

Really?

In terms of framebuffer resolution, Uncharted 3 appears to be rendering once again in full 720p (1280x720) just like the last two games in the series. However, unlike both of those titles, this time around Naughty Dog seems to have used a different anti-aliasing solution for UC3.

http://imagequalitymatters.blogspot.com/2010/12/tech-analysis-uncharted-3-gameplay.html

IQMatters says it runs at 1280x720. You saw somewhere else that said it rendered lower? That was probably running in 3D mode.

I am pretty sure 7800GT at 1024x768 or 1280x720 at LOW would provide similar graphics on the PC.

http://www.youtube.com/watch?v=MvQmNA-IurI&feature=related

What game on the PC looks like that on low settings? Crysis isn't even close on low settings(not that Crysis cranked up doesn't look better, but on low? Not even close).

I am arguing that the Cell's supposed superiority for games accounts for $nil in the real world against the 360.

I know what you are saying; you just don't know what you are talking about. Are the best-looking PS3 games better than the best-looking 360 games? Yes. Not too many people argue that. Does the 360 have a faster GPU? Yes. Does the 360 have more available RAM? Yes. Does the 360 have *significantly* faster FB access? Yes. Is the 360 easier to develop for? Yes. And yet it loses in best-vs-best when talking about image quality. Hmmmm, what could it be?

A $400, 1-year-old console looks better than a $600 console that had 12 months to beat the specs of the 360 and failed.

Wow, you don't follow consoles at all, do you? A couple of things: first off, the PS3 launched at $499 (there was a $599 model available; it wasn't the base price). Second, and actually I should have covered this first, when arguing the merits of the 360 over the PS3, *never* bring up value. Ever. The 360 gets utterly obliterated; it isn't remotely close (go ahead over to the console forum, or the AV forum, or the AVS forums and ask; it is *not* a contest).

http://www.avsforum.com/t/650544/one-and-only-ps3-as-blu-ray-player-thread

32K posts in that particular thread. The PS3 was, by a *massive* margin, the best Blu-ray player you could buy when it launched. And the terrible pricing you mentioned?

http://www.pcmag.com/article2/0,2817,1977327,00.asp

That is one of the models it went head to head with, at a $999 price point a few months before the PS3 shipped. That was the reality of the market. Was that a BD-Live-capable player? Nope, the spec didn't exist yet. Was it a BD3D player? Nope, the spec didn't exist yet. Was the PS3 $400 cheaper with all of those features? Yes, yes it was. I know, you can quickly counter that the 360 was a DVD player, and those were still going for around $75; a bad point for you to bring up, but it is valid.

The launch PS3s also had full PS2 hardware included inside, so they had "perfect" BC; at the time the PS2 was still going for $150. Yes, the 360 kinda sorta had some BC too (it got better over time; it sucked at launch, I know, I was present).

So, the PS3 was the best Blu-ray player you could buy, and it was hundreds of dollars cheaper than the inferior products also available. It had built-in BC for, by far, the most popular console in history, and it cost $499. Oh yeah, it had the actual PS3 hardware going for it too.

In terms of what Sony put in the box versus MS, there isn't an argument; Sony *slaughtered* MS from an overall value perspective. If you tried to buy *just the drive* Sony included, in a consumer form factor, you were going to have to spend quite a bit more; it would have been slower (BRD load times were terrible on all first-gen players *except* the PS3); it would have been outdated when BD-Live rolled out; and if you replaced it with a new player at that point, it would have been outdated again when BD3D came out. Cell allowed the PS3 to be the best Blu-ray player on the market, and it was also the most affordable.

The 4-5 generations ahead of Core 2 Duo claim you made isn't remotely true, because that should have been enough to make up for what is likely a minor 15-20% performance difference between the GPUs.

So if someone upgrades from a Core 2 Duo to an i7, they should have better graphics? Are you even thinking for a split second about what you are saying? You keep talking up cross-platform games, or bringing up accountants' quotes for performance (Executive Producer? Really?). Can you even answer the trivial question I posed? If you can't, you really have *no clue* what we are even discussing.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Watching people argue about which crapbox is more "powerful" is like watching midget-MMA :whistle:

It's outdated 2006 hardware...it sucks...hence the term "consolitis"...ffs :cool:
 

Bobisuruncle54

Senior member
Oct 19, 2011
David Shippy, the Former IBM technical architect that worked on both the PS3's Cell-specific PPU chip and Xbox 360's Xenon CPU concluded that:

"At the end of the day, when you put them all together, depending on the software, I think they're pretty equal, even though they're completely different processing models."

And that's all there is to it. Both consoles are severely outdated and no amount of proclaiming about the Cell's supposed superior performance will ever materialize into anything you can substantiate.
 

RussianSensation

Elite Member
Sep 5, 2003
Watching people argue about which crapbox is more "powerful" is like watching midget-MMA :whistle:

It's outdated 2006 hardware...it sucks...hence the term "consolitis"...ffs :cool:

We are not arguing which console is better at all. What's being discussed is that the Cell's supposed superiority never showed up against the Xenon CPU in the Xbox 360. Thus, by transitivity: since modern x86 CPUs > the Xbox 360's tri-core PowerPC CPU for games, and since the Cell has not proven better at running games than the Xbox 360 over the last 6 years, it cannot be true that the Cell is superior to x86 processors for games, and it definitely cannot be true that the Cell is many generations ahead of current x86 processors at running game code in a real-world environment.

So if someone upgrades from a Core 2 Duo to an i7, they should have better graphics? Are you even thinking for a split second about what you are saying? You keep talking up cross-platform games, or bringing up accountants' quotes for performance (Executive Producer? Really?). Can you even answer the trivial question I posed? If you can't, you really have *no clue* what we are even discussing.

You can turn more settings up and have smoother framerates with a faster CPU. CPU limitations have been discussed for years. So yes, a Core i7 + GTX680 will have better graphics than a Core 2 Duo + GTX680, since the latter wouldn't be able to play the game smoothly with as many characters on screen, with as high physics effects, etc.

I am not going to continue arguing the Cell with you any longer. It's pointless since 99.999% of the world agrees that it's outdated junk and Sony will not be using it in their next console. If you want to debate how awesome the Cell is, please open a thread in the console section.
 

crisium

Platinum Member
Aug 19, 2001
I thought we all figured this out years ago. The Xbox GPU is marginally faster than the PS3 GPU. The PS3 CPU is substantially faster than the Xbox CPU. But the GPU is more important for games. So when developers make multiplatform games, they don't code them to take advantage of the superior PS3 CPU. The CPU's designated workload for multiplatform games is thus the same on both consoles, and consequently developers are left with a slightly slower PS3 GPU and have to ship the game with slightly reduced effects (usually hardly noticeable).

However, a game built from the ground up on the PS3 can take advantage of the Cell. The GPU is more important, true, but when the PS3 CPU can theoretically be nearly twice as fast as the 360 CPU, it is capable of more than compensating for a marginally slower GPU. This is why the Uncharted and Killzone series, along with some other exclusives, have visuals that can never be matched on the 360.

So it goes. Or so they say and has been said since around 2009 when Uncharted 2 and Killzone 2 wowed the console world but the PS3 had multiplatform problems (Bayonetta).
 

hawtdawg

Golden Member
Jun 4, 2005
The Cell was a retarded idea. All they really ended up doing was spending a ton of money on a CPU that developers ended up mostly using to aid the GPU, when they could have spent that same money on a much beefier GPU.
 

taltamir

Lifer
Mar 21, 2004
As an impartial observer I judge RussianSensation the victor; hands down, his arguments are rock solid and backed by evidence.
Quite a surprise, since I had believed (due to hearing the big lie told repeatedly) the hype that the "PS3 has the best-looking games this console generation due to the most powerful hardware", and he has shown that this is clearly not the case.
Thank you for educating me, RussianSensation.
I did suspect something was a bit off, though, when I found out the PS3 was incapable of running Skyrim properly: unlike the Xbox, which has 512MB of RAM, the PS3 has 2x256MB, half limited to the GPU and half to the CPU, with neither pool properly accessible by the other (access across pools is extremely slow). So Skyrim chokes on the PS3 due to lack of RAM, which it does not do on the Xbox 360.
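The split-pool point above can be illustrated with a toy budget check (a sketch; only the 512MB vs. 2x256MB pool sizes come from the post, and the workload numbers below are hypothetical):

```python
# Toy illustration of unified vs. split memory pools.
# Pool sizes (512MB unified vs. 256MB + 256MB) are from the post above;
# the workload numbers below are made up for illustration.

def fits_unified(cpu_mb, gpu_mb, total=512):
    # One shared pool: only the sum matters, so the split can skew either way.
    return cpu_mb + gpu_mb <= total

def fits_split(cpu_mb, gpu_mb, cpu_pool=256, gpu_pool=256):
    # Two fixed pools: each side must fit on its own.
    return cpu_mb <= cpu_pool and gpu_mb <= gpu_pool

# Hypothetical Skyrim-like workload: lots of CPU-side world state.
print(fits_unified(300, 180))  # True: 480MB total fits in one 512MB pool
print(fits_split(300, 180))    # False: 300MB overflows the 256MB CPU pool
```

The same total footprint fits one layout and not the other, which is the shape of the Skyrim problem described above.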

Watching people argue about which crapbox is more "powerfull" is like watching midget-MMA :whiste:

heh :)
Although to be fair we are all tech nerds here so we can argue about something technical even if it is irrelevant in the big scheme of things.
 

dangerman1337

Senior member
Sep 16, 2010
IIRC, a reason why Sony is probably ditching the Cell is that its compute functions (I think it has them) can be done on modern GPUs anyway, whereas the RSX couldn't really handle them back then.

So it goes. Or so they say and has been said since around 2009 when Uncharted 2 and Killzone 2 wowed the console world but the PS3 had multiplatform problems (Bayonetta).

Bayonetta's problems on the PS3 are mostly derived from the fact that the PS3 version wasn't done by the main team and IIRC it was originally going to be an Xbox 360 exclusive.
 

RussianSensation

Elite Member
Sep 5, 2003
However, a game built from the ground up on the PS3 can take advantage of the Cell. The GPU is more important, true, but when the PS3 CPU can theoretically be nearly twice as fast as the 360 CPU, it is capable of more than compensating for a marginally slower GPU. This is why the Uncharted and Killzone series, along with some other exclusives, have visuals that can never be matched on the 360.

Do you have any real-world test showing that a CPU such as the Cell can do vertex and pixel shader operations in games faster than the RSX GPU? Never in the history of CPU development has a CPU on its own run a 3D game faster than a similar-generation mid-range GPU, and the Cell and the RSX come from the same timeframe. Even a Core i7-3960X cannot run games faster than a 7800GT. GPUs are very parallel in nature; the RSX, derived from the 7800 series, has 24 pixel shaders, 8 vertex shaders, and 8 ROPs. There is no way a Cell can run games faster than the RSX, which is why Sony got rid of the 2-Cell prototype and had to put in a GPU: even two Cell CPUs couldn't run games on their own. The RSX was not a planned inclusion at the beginning of the PS3's architectural layout; it was added later in the planning stages when people realized that the Cell cannot run any games on its own, not even two of them.

Also, we have no evidence that if Naughty Dog had ported the game to the 360, it wouldn't run just as well there. Xbox 360 vs. PS3 aside, what's being claimed in this thread is that the Cell is actually superior at running games to any modern x86 CPU, with superior performance/watt and superior performance/$; a hypothesis that implies the PS4 will have a slower, more expensive, worse performance/watt CPU for no reason but apparently to run inefficient x86 code? :sneaky: There is no logic that can explain why Sony would abandon the Cell if it were so superior to x86 processors in all of these respects for running games.

IIRC a reason why Sony is probably ditching the cell is because its Compute functions (I think it has them) can be done on modern GPUs anyways since the RSX couldn't really then.

But if the Cell was so much superior at running game code to a modern x86 CPU, why couldn't Sony use a Cell 2.0/enhanced redesign and still pair it with a compute-capable HD7000 GPU? The Cell has just 1 general-purpose PowerPC core (the PPE) plus 7 active SPEs, and that main PowerPC core was shown to be very slow at running modern code on the Mac, before Apple woke up and replaced PowerPC with Intel Core processors. The PowerPC architecture was also used in the GameCube, and even there it didn't show a particular advantage against the Pentium III 733MHz in the Xbox 1. Resident Evil 4 looked great on the GameCube, but overall I cannot say that the GameCube had better graphics than the Xbox 1 did. Another point I made is that the GPU is really what's responsible for driving graphics in games, and it becomes the greater bottleneck sooner.

If Sony implements the Fusion APU + discrete GPU design, effectively enabling dual-graphics in the PS4, then even if MS puts a 16-thread (4 cores, 4 threads per core) PowerPC processor in the 720, it will not compensate long-term for the lack of GPU power against a quad-core CPU design with 2x HD7000 graphics. The CPU can help with some graphical aspects, AI, and physics, but for the most part what you see on the screen is driven by the GPU. With DX11 games, complex graphical effects such as global illumination/multiple area lights, contact-hardening shadows, tessellation/geometry, and HDAO/SSAO will push the GPU to its limits. Even a 16-core Cell will not help you with any of those effects, because it has no shot at all against a 600-800-shader GPU; it can't do tessellation, for starters. That's why we have 1536 shaders in a GTX680, and the 7970 has 32 Compute Units, 2048 shaders in total. Both Kepler and GCN GPUs have dedicated geometry engines to perform tessellation. All these next-generation DX11 features require as much GPU power as possible, which is why I think whichever console has the faster graphics card by at least a decent amount (40-50% faster) should have better-looking games overall. This could explain why Sony is gunning for the possible Fusion APU + discrete GPU approach: they finally saw the light and realized the GPU is what's driving the games!
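To put rough numbers on the shader-count argument above, here is the standard peak-FLOPS back-of-envelope (a sketch; the reference clock speeds and the per-SPE SIMD width are my additions, not figures from the thread):

```python
# Peak single-precision throughput ~= SIMD lanes x 2 FLOPs (multiply-add)
# per lane per clock x clock speed in GHz.

def peak_gflops(lanes, clock_ghz):
    return lanes * 2 * clock_ghz

gtx680 = peak_gflops(1536, 1.006)   # 1536 shaders at the ~1006 MHz reference clock
hd7970 = peak_gflops(2048, 0.925)   # 2048 shaders at the ~925 MHz reference clock
cell   = peak_gflops(7 * 4, 3.2)    # 7 active SPEs x 4-wide SP SIMD at 3.2 GHz

print(f"GTX 680:   ~{gtx680:.0f} GFLOPS")
print(f"HD 7970:   ~{hd7970:.0f} GFLOPS")
print(f"Cell SPEs: ~{cell:.0f} GFLOPS")
```

Even on pure theoretical FLOPS, before counting fixed-function work like tessellation and ROPs, the Cell sits more than an order of magnitude below a 2012 GPU.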
 

Blitzvogel

Platinum Member
Oct 17, 2010
The Cell was a retarded idea. All they really ended up doing was spending a ton of money on a CPU that developers ended up mostly using to aid the GPU, when they could have spent that same money on a much beefier GPU.

True. Cell was a waste in the end. GFLOPS don't mean anything on a CPU when they're working to aid a relatively underwhelming graphics processor versus something so efficient and excellently designed like Xenos.

The rumors of Tahiti level graphics in the 720 or PS4 are somewhat unlikely I think. It's probably too expensive, and releasing $400+ consoles in this day and age of "cheap electronics" could spell disaster when consumers are more involved with their mobile devices. Set top boxes for HDTVs are extremely cheap, and while they don't have the same graphics processing power as the consoles, they could in theory deliver similar gaming experiences with some relatively cheap processing upgrades. Custom built HTPCs are a fine example of consumers making cheap yet powerful boxes happen.
 

Bobisuruncle54

Senior member
Oct 19, 2011
True. Cell was a waste in the end. GFLOPS don't mean anything on a CPU when they're working to aid a relatively underwhelming graphics processor versus something so efficient and excellently designed like Xenos.

The rumors of Tahiti level graphics in the 720 or PS4 are somewhat unlikely I think. It's probably too expensive, and releasing $400+ consoles in this day and age of "cheap electronics" could spell disaster when consumers are more involved with their mobile devices. Set top boxes for HDTVs are extremely cheap, and while they don't have the same graphics processing power as the consoles, they could in theory deliver similar gaming experiences with some relatively cheap processing upgrades. Custom built HTPCs are a fine example of consumers making cheap yet powerful boxes happen.

They may have no choice but to do this, as they may need to sell the new consoles on their "wow factor". Many consumers think that because this generation is HD, you can't really improve the graphics or the games as a whole. Obviously we know this to be a load of cobblers, but the average consumer will need to see obvious improvements or they just will not see the appeal.
 

Blitzvogel

Platinum Member
Oct 17, 2010
Cape Verde would get you those big improvements compared to Xenos, despite diminishing returns. The first titles out the door would be described as looking like 360 titles but with higher resolution and AF; the 360 "suffered" the same description until fully-formed 360 titles began appearing in late 2006.

Xenos to a 1 GHz Cape Verde, in terms of rendering-power ratio, would be a jump similar to XGPU-to-Xenos: about five times the overall capability in the three basic areas:

GPU:                 XGPU     Xenos    Cape Verde
GFLOPS:              ~42      240      1280
Texels (MT/s):       ~1800    8000     40000
Pixels (MP/s, ROP):  ~950     4000     16000

It's quite a bit of room to grow in terms of graphical fidelity, especially with "real" tessellation and DirectCompute. It's not a big, expensive GPU core, and while I'm sure it can run hot given how fast it goes for how small it is, that can be easily managed.
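The "about five times per jump" claim can be sanity-checked directly from the table's own (approximate) figures:

```python
# Generational ratios computed from the approximate table values above.
gpus = {
    "XGPU":       {"gflops": 42,   "mtexels": 1800,  "mpixels": 950},
    "Xenos":      {"gflops": 240,  "mtexels": 8000,  "mpixels": 4000},
    "Cape Verde": {"gflops": 1280, "mtexels": 40000, "mpixels": 16000},
}

def ratios(newer, older):
    # Per-metric speedup of the newer part over the older one.
    return {k: gpus[newer][k] / gpus[older][k] for k in gpus[newer]}

print(ratios("Xenos", "XGPU"))        # roughly 5.7x / 4.4x / 4.2x
print(ratios("Cape Verde", "Xenos"))  # roughly 5.3x / 5.0x / 4.0x
```

Each jump lands in the 4x-6x range across all three metrics, consistent with the roughly-five-times characterization.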
 

jpiniero

Lifer
Oct 1, 2010
They may have no choice but to do this, as they may need to sell the new consoles on their "wow factor". Many consumers think that because this generation is HD, you can't really improve the graphics or the games as a whole. Obviously we know this to be a load of cobblers, but the average consumer will need to see obvious improvements or they just will not see the appeal.

There is no doubt this generation will be less of a wow factor than previous generations. This is something Sony and MS will have to deal with.

Xenos to 1 GHz Cape Verde

Ghz Cape Verde isn't going to happen.
 
Feb 4, 2009
Easy question here guys, lets face it 1080p is here to stay. How much more power is needed for 1080p games?
 

RussianSensation

Elite Member
Sep 5, 2003
Ghz Cape Verde isn't going to happen.

Thoughts on why?

The HD7950M (Pitcairn, 1280 shaders, 256-bit bus) uses just 50W of total power. I think they don't even need to focus on Cape Verde; if they can afford it financially, they can just go with a Pitcairn GPU.

Cost is the only reason it may not happen. Power consumption, heat, and cooling are not a problem at all.

Easy question here guys, lets face it 1080p is here to stay. How much more power is needed for 1080p games?

Actually, that's not an easy question. It depends on how good the games will look and what framerate you require for smooth gameplay. If next-generation console games only look like BF3/Metro 2033 for the next 3-4 years, all you'd need is a $190-220 HD7850/7870 and FXAA/MLAA/TXAA without MSAA, and that's fine. If they look like the new Dawn demo, GTX680 SLI will be squashed like a bug.

This is with just 1 character on the screen:
[animated GIF: 1343730217Q8scDBKVfx_2_4.gif]


The current estimate is that GPUs need to become 2000x more powerful to be sufficient for life-like graphics. So no matter what ends up in the PS4/Xbox 720, it will be too slow 5-6 years from the console's launch, assuming graphical development continues to grow.

"Creating a game that operates on a level of fidelity comparable to human vision, Sweeney says, will require hardware at least 2,000 times as powerful as today’s highest-end graphics processors. That kind of super-hi-def experience may be only two or three console generations away." ~ Kotaku, May 18, 2012

Also, 1080P is likely to become 'standard resolution' in 5-7 years once 4K TVs enter the mainstream market; and then we'll want even faster graphics cards and consoles.
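Sweeney's 2000x figure can be turned into a rough timeline with two assumed inputs (mine, not the article's): GPU throughput doubling roughly every 2 years, and a console generation lasting roughly 6 years:

```python
import math

target_speedup = 2000        # Sweeney's figure from the Kotaku quote above

# Assumed cadences (not from the article):
years_per_doubling = 2       # GPU throughput doubles roughly every 2 years
years_per_console_gen = 6    # a console generation lasts roughly 6 years

doublings = math.log2(target_speedup)    # ~11 doublings to reach 2000x
years = doublings * years_per_doubling   # ~22 years at that cadence
gens = years / years_per_console_gen     # ~3.7 console generations

print(f"~{doublings:.1f} doublings, ~{years:.0f} years, ~{gens:.1f} console generations")
```

That lands in the same ballpark as the "two or three console generations" quoted, if slightly less optimistic.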
 

Cerb

Elite Member
Aug 26, 2000
Easy question here guys, lets face it 1080p is here to stay. How much more power is needed for 1080p games?
At least as much as current $200 PC graphics cards w/ 2GB VRAM, then more elsewhere for the non-graphical goodies. Possibly even more, for future games.
 

taltamir

Lifer
Mar 21, 2004
Easy question here guys, lets face it 1080p is here to stay. How much more power is needed for 1080p games?

Do you know how much computational power is required to generate Avatar-level graphics in real time at 1080p?
Also, 2K, 4K, and 8K already exist.
 

Bobisuruncle54

Senior member
Oct 19, 2011
Easy question here guys, lets face it 1080p is here to stay. How much more power is needed for 1080p games?

Well AMD may have given away a few tidbits in this blog post with regards to tessellation and overall efficiency for rendering. I wouldn't be surprised if this approach was adopted for the Xbox 720.
 

Lonbjerg

Diamond Member
Dec 6, 2009
Well AMD may have given away a few tidbits in this blog post with regards to tessellation and overall efficiency for rendering. I wouldn't be surprised if this approach was adopted for the Xbox 720.

Wasn't that "stance" altered at AMD when NVIDIA released GF100?
From "tessellation is the new frontier"...to "TOO much tessellation" (as NVIDIA stomped their tessellation performance)?

Hard to keep track of what AMD thinks...they bounce around a lot ^^
 

Lonyo

Lifer
Aug 10, 2002
Wasn't that "stance" altered at AMD when NVIDIA released GF100?
From "tessellation is the new frontier"...to "TOO much tessellation" (as NVIDIA stomped their tessellation performance)?

Hard to keep track of what AMD thinks...they bounce around a lot ^^

Tessellation is a terrific tool to deliver more realism and visual fidelity. However, careless use of the technology can quickly overwhelm the GPU and cause it to perform less efficiently with no visible benefit in image quality
From the linked post.
Which is the same as they said when games came out.
Which also in fact agrees with what NV did by having a scaled number of tessellation units depending on GPU (while AMD was more fixed in their amount per GPU).

The idea of tessellation good. Unnecessary tessellation bad. Too much tessellation bad.
And AMD have had tessellation in their GPUs almost consistently since 2001 at least, and it was also included in the Xbox 360 anyway.
 

Lonbjerg

Diamond Member
Dec 6, 2009
From the linked post.
Which is the same as they said when games came out.
Which also in fact agrees with what NV did by having a scaled number of tessellation units depending on GPU (while AMD was more fixed in their amount per GPU).

The idea of tessellation good. Unnecessary tessellation bad. Too much tessellation bad.
And AMD have had tessellation in their GPUs almost consistently since 2001 at least, and it was also included in the Xbox 360 anyway.

Comparing TruForm to DX11 tessellation...:hmm::sneaky::D:thumbsdown:
 

Bobisuruncle54

Senior member
Oct 19, 2011
From the linked post.
Which is the same as they said when games came out.
Which also in fact agrees with what NV did by having a scaled number of tessellation units depending on GPU (while AMD was more fixed in their amount per GPU).

The idea of tessellation good. Unnecessary tessellation bad. Too much tessellation bad.
And AMD have had tessellation in their GPUs almost consistently since 2001 at least, and it was also included in the Xbox 360 anyway.

Well, there's no point in tessellation that you can't actually see, is there? :) AMD arrived at a designated sweet spot of one polygon every 16 pixels, which is much more sensible than 1 polygon per pixel, but I could imagine values lower than 16 being beneficial, perhaps as low as 9.

Also, with the other features that DX11 brings, such as full-resolution real-time reflections, it may make game development a bit easier. As more effects can be done in real time on the hardware, you don't have to pre-bake almost everything (like the Uncharted games and Gran Turismo 5 do) just to make it look good; it can also react to player interaction, providing a much more immersive experience.

With games such as Watch Dogs and Star Wars 1313 appearing, things look promising.
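The pixels-per-polygon sweet spots mentioned above translate into concrete on-screen triangle budgets; this is just arithmetic over the 16, 9, and 1 px/poly densities from that discussion:

```python
# Visible-polygon budgets implied by the pixels-per-polygon targets above.
resolutions = {"720p": 1280 * 720, "1080p": 1920 * 1080}

for name, pixels in resolutions.items():
    for px_per_poly in (16, 9, 1):
        budget = pixels // px_per_poly
        print(f"{name}: 1 poly per {px_per_poly:>2} px -> ~{budget:,} polygons on screen")
```

At 1080p, moving from the 16-pixel sweet spot to 9 pixels raises the budget from ~130k to ~230k on-screen polygons, while 1 per pixel would require over 2 million.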
 

RussianSensation

Elite Member
Sep 5, 2003
Based on what Cevat Yerli has stated on several occasions, in its current form tessellation is all pre-rendered assets in games. We have seen how extreme tessellation can make a game look worse than not having it enabled at all, and how excessive tessellation makes things look unnatural (Unigine Heaven Extreme vs. Normal). The next evolution should be adaptive tessellation that adjusts the level of tessellation in real time and doesn't require pre-rendered assets; CryEngine 3.4 shows a quick demonstration.
 

markydV2

Banned
Sep 2, 2012
Right, the replay mode looks great, but the actual game doesn't run at 1920x1080, and it doesn't have that 60 fps sense of speed either. GameTrailers has the review. They also mentioned that while GT5's core/premium group of cars (200) were well done, the rest (800 standard cars) look very plain and lack cockpit views or the ability to customize them in the same way as the premium models; in other words, not up to the rest of the game's standards. In local split-screen mode with a friend, GT5 cannot render other AI cars in the game, which is most likely a limitation of the PS3's graphics/CPU capabilities.

"The most disappointing and inconsistent aspect of Gran Turismo 5 is by far the visual presentation. It's hard to appreciate how good a car looks when it's covered with jagged, flickering shadows and poorly blended dust trails. Unbelievably, the background elements like crowds and trees are crudely constructed or completely 2-dimensional. The chasm in quality between the standard and the premium models is extreme indeed. Presentation Score: 8.6." ~ GameTrailers review (see link above)

And now Forza Motorsport 4 Review from Gametrailers @ 7 min mark:

"...intricately detailed racing models, for all 500 of its cars, with smooth shading, image-based lighting and reflection techniques that really let them shine. The handful of Auto Vista models [~25 of them] go beyond that with fully detailed engines and small touches like dials and labels. Presentation Score: 9.1"

In other words, Forza 4's car models are more consistent overall and the game runs faster than GT5 does. It looks like the Xbox 360 is actually the more powerful console. Whether this is because of cost cutting or the PS3's inability to handle 60 fps with as many detailed cars as Forza 4, I can't say for certain; but what I can say from this comparison, without a doubt, is that the Cell's power advantages are nowhere to be seen against the Xbox 360. The Xbox 360 itself, though, pales in comparison to a $50 AMD Phenom II X4 + $60 HD6670 setup today. So by extension, if the PS3 cannot convincingly outperform the Xbox 360, it cannot be faster than modern AMD/Intel CPUs for games.

If the Cell is supposedly 4-5 generations ahead of Core 2 Duo (the modern architecture at the time of the PS3), as you claimed, why do PS3 games look so outdated and hardly any better than Xbox 360 games? Dark Souls on PS3 vs. the PC version... Uncharted 3 doesn't even run at native 1280x720... etc., etc.

Sorry, I mixed up 2/3 with 3/4. Forza 4 definitely looks better than GT5. Here is a 2 min video head-to-head on graphics:
http://www.gametrailers.com/videos/fg0u7h/forza-motorsport-4-forza-4-vs--gran-turismo-5

Case in point, going back to Forza 4 vs. GT5: while you keep touting that the Cell is way ahead of PCs, the fact of the matter is the PS3 doesn't even have superior graphics to the Xbox 360, a 1-year-older console. That means the $$$ Sony spent on the Cell didn't materialize into any tangible benefit compared to the 3-core PowerPC Xenon in the 360. So how can you argue that the "alien" technology in the Cell is so much superior to modern x86 CPUs when the Cell + NV GPU combo couldn't even best the Xbox 360? That's ironic.

PS3's GT5: [gt5h.jpg]
360's Forza 4: [forza4d.jpg]
GT5: [gt5b.jpg]
Forza 4: [forza4b.jpg]
GT5: [gt5c.jpg]
Forza 4: [forza4c.jpg]
GT5: [gt5d.jpg]
Forza 4: [forza4d.jpg]
GT5: [gt5ex.jpg]
Forza 4: [forza4e.jpg]

Dude, you're completely biased towards cheaper products. This is such good proof. Forza 4 does not look better than GT5. I have both; so does the other poster. Everyone who has both admits GT5 looks more photorealistic and has better graphics. It's also hilarious how the screens use a non-premium car for GT5.

PS3 graphics in multiplatform games look a little worse because it's easier to design a game for the 360 and then port it over to the PS3. However, PS3 exclusives look better than 360 exclusives. I have both systems, and everyone else who has both agrees that this is the case.

Stop acting like you know everything when you don't.
 