First Unreal Engine 4 Screenshots now available


Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I sincerely doubt that Epic can convince both companies who have years upon years of research invested in those consoles to make any changes.

What research? Microsoft is practically buying off the shelf parts for the Xbox Next. :p

It would behoove MS, Sony, and any other console maker to listen to what the big dev houses want in hardware.

But, most likely, the next crop of consoles will focus on social and casual gaming rather than more serious titles. Hence, last year's midrange PC GPUs.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
A tweaked 6850 or 560 would be plenty for 1080p.

And those are already out of date, and since these consoles haven't even been announced yet, they'd become even more out of date.

Depending on when they plan to launch, they'll need a Radeon 78x0 design, or possibly even an announced future 8xx0 design. If they want to last for years, they need to come out of the gate swinging.

But, like I said before, the new emphasis is on casual and social gaming. That antique Radeon 4670 they're supposedly using will play Farmville incredibly well.
 

LumbergTech

Diamond Member
Sep 15, 2005
3,622
1
0
I hate the unreal engine....can't imagine they will make this one much better...

Anyone else feel this way? The games on the unreal engine just feel weird to me.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
Microsoft knows how. They buy in bulk to bring the price down. Then they undercut their own price in the hopes of making a profit on the games. They have done it before. They were selling PCs at less than cost to get Windows out into the marketplace. It is a reasonable strategy.

I won't dispute that but there is a limit to how much cost they can eat up. 360 is a great example, it was on par with the 7800 GT, a mid range GPU at the ~$299 price point.

I don't think gaming suffers because of console hardware ability, not at launch at least. It's the long lifespan of consoles that starts to slow things down since they are the priority platforms. Even updating a 3 year old PC has limitations that can't be overcome, let alone a static console that gets overrun by time.

Whatever a midrange GPU offers at the launch of next-gen consoles, Microsoft and Sony will match it. At that point it won't be an issue; it's later down the road when things get complicated, with PCs adopting new tech that doesn't get utilized because the focus is on developing for consoles that stand still, unlike dynamic PCs.
 

arkcom

Golden Member
Mar 25, 2003
1,816
0
76
Couldn't MS and Sony make a base 720p and higher end 1080p version with ~2x the gpu? As hardware costs come down, drop the lower end models. It seems like this would be simple enough.
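For what it's worth, the raw pixel-count gap between those two tiers is a bit more than 2x, so "~2x the GPU" is roughly the right ballpark. A quick sanity check, assuming GPU load scales about linearly with resolution (a simplification):

```python
# Pixel counts for the two proposed console tiers.
base = 1280 * 720    # 720p
high = 1920 * 1080   # 1080p
ratio = high / base
print(f"1080p pushes {ratio:.2f}x the pixels of 720p")  # 2.25x
```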
 

stuup1dmofo

Member
Dec 2, 2011
84
0
0
It's kind of like designing new ships but being restrained to the footprint of the Titanic because that was what the Panama Canal was designed for.
 

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
I hate the unreal engine....can't imagine they will make this one much better...

Anyone else feel this way? The games on the unreal engine just feel weird to me.

This. The engine tries to do too many things at once and many games end up 'feeling' the same because of it. I don't know how to describe it but I know what you mean.

Still psyched for what visuals UE4 might bring. UE3 is getting very long in the tooth.
 
Nov 26, 2005
15,194
403
126
This. The engine tries to do too many things at once and many games end up 'feeling' the same because of it. I don't know how to describe it but I know what you mean.

Still psyched for what visuals UE4 might bring. UE3 is getting very long in the tooth.

Talk about the same feeling, look at the Source engine for Couterdaytf2l4deat
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
Increasing RAM is nothing like changing out the processor or video card.

Probably not, but everything has its costs. Besides, upgrading from one model to the next in the same family isn't as difficult as, say, switching from AMD to Nvidia in the middle-to-late design phase. Adding more stream processors/CUDA cores shouldn't change programming much, if at all.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Looks nowhere near as good as the Samaritan demo, and the only reason Sony and Microsoft have chosen the conservative route this time is the balls-up they both made of the current generation. Too many hardware failures and stupid hardware choices. Take existing hardware and optimise it - lop off bits that developers won't miss. Don't create new custom chips that turn out to be crap for their intended purpose, or ignore your engineers to cut costs and then have to replace consoles en masse because you turned efficient thinking toward greed.

It's all pretty well documented by now - they only have themselves to blame, but of course consumers are the ones who suffer for their mistakes.

UE4 just looks like they've implemented GPU CUDA PhysX directly into UE3, and that's it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Rumor has it it'll be a 6770

I am afraid the current rumors are worse.

For Xbox "720" / Durango, it's rumored to have HD6670.
http://www.ign.com/articles/2012/01/24/xbox-720-will-be-six-times-as-powerful-as-current-gen

I am not sure I believe this rumor though. It sounds way too weak.

On the PS4 side, things aren't looking better though.

AMD's A8-3850 APU and HD 7670 GPU.
http://www.ign.com/articles/2012/04/04/sources-detail-the-playstation-4s-processing-specs

HD7670 is a re-badged HD6670. The combined power of HD6670 + A8-3850 (HD 6550D GPU) in CrossFire is <= HD6770 (and that's assuming perfect scaling).
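A rough back-of-the-envelope check of that claim, using public shader counts and clocks for these VLIW5 parts (treat the figures as approximations, and theoretical GFLOPS as only a loose proxy for gaming performance):

```python
# Single-precision shader throughput estimate in GFLOPS.
def gflops(shaders, mhz):
    # VLIW5 parts: 2 FLOPs (one MAD) per shader ALU per clock
    return shaders * mhz * 2 / 1000

hd6670 = gflops(480, 800)    # 768 GFLOPS
hd6550d = gflops(400, 600)   # A8-3850's integrated GPU, 480 GFLOPS
hd6770 = gflops(800, 850)    # 1360 GFLOPS

for scaling in (1.0, 0.7):   # perfect vs. more typical CrossFire scaling
    combined = hd6670 + scaling * hd6550d
    print(f"scaling {scaling:.0%}: {combined:.0f} GFLOPS vs HD6770 at {hd6770:.0f}")
```

Even with perfect scaling the hybrid CrossFire pair comes in under a single HD6770, which is itself just a re-badged HD5770.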

[Benchmark charts: AvP, Batman, Battlefield, Crysis 2, DiRT 3, StarCraft]

Based on these specs, next-generation tessellation is going to be severely limited, since the HD6670 has an awful tessellation unit. By the end of 2013/early 2014, this level of GPU power will be slower than AMD's APUs of the time and, by desktop PC standards, strictly budget level. Remember that the HD6770 itself is just a re-badged HD5770.

In my eyes, anything less than HD6950/GTX560Ti or even HD7850 is a complete disappointment for next generation consoles because I have a feeling they'll also stick around for 7-8 years. Selling a console in 2013/2014 with HD6670 GPU after nearly eight years of Xbox360/PS3 would be a major slap in the face. However, given that MS wants Xbox to be a media device first and foremost, I can easily imagine them spending $50-60 integrating Kinect and cost cutting on the GPU.

I would love to see the HD7950M from laptops make it into PS4/Xbox Next.
- ZeroCore power for amazing idle power consumption
- Just 50W of load power
- 2GB of VRAM
- Good tessellation performance on the AMD side

Sony is bleeding $. They can't afford to sell hardware at a loss as was the case for PS3. For that reason I am ruling out a fast GPU in their console.

For MS, they seem to care more about capturing more market share from social/casual gaming, focusing even more on Kinect and extending media functionality to get people to want an Xbox for things other than games. Given MS's lack of exclusive games, I am not seeing any evidence that MS cares about capturing even more hardcore gamers especially since Kinect has been a success for them. I think MS realizes the real growth is capturing some of Nintendo's fans instead and other casual gamers. The fraction of hardcore gamers who will leave PS3 for MS's console is probably not large enough, but there are tons and tons of casual gamers that might consider getting Xbox and playing arcade games, Minecraft style games.

Maybe if these consoles are very underpowered, developers will start making cutting edge games on the PC again and porting them back to consoles (especially since PS4 will have AMD CPU). :D

Here is another article on Unreal 4 engine and how it will simplify coding for developers. It has real time dynamic lighting model:

Unreal Engine 4.0 - The Imagination Engine: Why Next-Gen Video Games Will Rock Your World (WIRED Magazine)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
To be fair, console performance is usually better than PC for the same GPU, thanks to lower overhead (no thick Direct3D API layer). So a 6670 in a console may punch above its weight. Nevertheless it's apparently still too slow for Sweeney's tastes, hence his lobbying for better perf.

P.S. Nice find re: the Wired article, thanks!
 
Nov 26, 2005
15,194
403
126
I somewhat agree with the above. But these console makers would probably spend more than usual if they knew their options first, e.g. a 6670 vs. a 7950 or 570.
 

GrumpyGamerLP

Junior Member
May 17, 2012
9
0
0
You can't really judge a game engine from screenshots. You have to see it in action.

As far as what 3D chipsets they use in next-gen hardware, they really need to keep the price down. IMHO, the only reason the Wii gained so much market share is that it was so damned cheap compared to the other consoles. Price is a major factor in a global recession. I think a big reason no one has released a new console in so long is the simple fact that it's financial suicide to try to launch a new system with the economy the way it is.
 

TheSlamma

Diamond Member
Sep 6, 2005
7,625
5
81
I hate the unreal engine....can't imagine they will make this one much better...

Anyone else feel this way? The games on the unreal engine just feel weird to me.
Source makes me feel this way

but Unreal? not really.

I find Gears of War 3 way different than Thief 3 and the later Rainbow 6 games
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
Looks nowhere near as good as the Samaritan demo, and the only reason Sony and Microsoft have chosen the conservative route this time is the balls-up they both made of the current generation. Too many hardware failures and stupid hardware choices. Take existing hardware and optimise it - lop off bits that developers won't miss. Don't create new custom chips that turn out to be crap for their intended purpose, or ignore your engineers to cut costs and then have to replace consoles en masse because you turned efficient thinking toward greed.

It's all pretty well documented by now - they only have themselves to blame, but of course consumers are the ones who suffer for their mistakes.

UE4 just looks like they've implemented GPU CUDA PhysX directly into UE3, and that's it.

And you can tell all that from some pictures? Really?
 

PrayForDeath

Diamond Member
Apr 12, 2004
3,478
1
76
I find Gears of War 3 way different than Thief 3 and the later Rainbow 6 games

Not to mention Mass Effect, Batman, and Borderlands. Totally different experiences.


When Portal 2 first came out, my friend saw me playing it, and he was like "is this game made by half life's dev?" (he knew nothing about the game at the time)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You can't really judge a game engine from screenshots. You have to see it in action.

:thumbsup:

I think I read that only 14 people worked on that demo. 50 developers were employed for Gears of War 1, scaling up to maybe 80 for Gears of War 3.

Now compare that to hundreds that work on COD games that still look like ****. D:
 
Oct 30, 2004
11,442
32
91
It's a shame that studios and gamers are focusing so much on graphics quality when what we need is game quality. The UT3 engine may have been far more advanced than the engine that powered the original Unreal Tournament (UT99), but UT3 was a consolized abomination and a huge failure compared to UT99.

I don't have any faith in Epic's ability to produce a half-decent UT sequel.
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
It's a shame that studios and gamers are focusing so much on graphics quality when what we need is game quality. The UT3 engine may have been far more advanced than the engine that powered the original Unreal Tournament (UT99), but UT3 was a consolized abomination and a huge failure compared to UT99.

I don't have any faith in Epic's ability to produce a half-decent UT sequel.

The two are tied together, and game engines are really the key to cutting down the time it takes to pump out a high-quality game. Otherwise, developers end up writing a bunch of custom tools and take 5-6 years to release a game, as was the case with Final Fantasy XII and XIII. That's going to be unrealistic as we enter the next generation, where everything is an order of magnitude more complex.

However, when you read the article, you'll see that Unreal Engine 4 is about much more than just graphics: it gives non-programmers the ability to edit game worlds in real time without having to recompute in-world lighting (which can take hours). Another example is better scripting of events, again to avoid needing a specialized programmer each time something is changed.