AnandTech Forums > Hardware and Technology > CPUs and Overclocking
Old 11-04-2012, 06:21 PM   #26
N4g4rok
Senior Member
 
N4g4rok's Avatar
 
Join Date: Sep 2011
Location: Northwest Arkansas, USA
Posts: 285
Default

Quote:
Originally Posted by Dark Shroud View Post
The current Xbox 360s use a PowerPC "APU" where the AMD GPU is on the same die as the CPU.

I'm hoping the final systems will both have at least 8GB of RAM since it's so cheap now.
I don't think this is correct. I've cracked a few of them open, and they've got a PowerPC CPU on the left with an ATI Xenos chip next to it. They were two distinct chips sharing a bus. That was it.
__________________
Primary: Hades
> AMD Phenom II X4 970 @ 3.9 GHz + Antec Kuhler 620
> ASUS M4A79XTD EVO
> XFX Radeon R7950 Black Edition
> 128GB Samsung 830 SSD + 2TB Hitachi Deskstar HDD
> 8GB G.SKILL Ripjaws DDR3 RAM @ 1600 MHz
> NZXT Hades Case

Home Server: Charon
> AMD Sempron 145 @ 2.8 GHz + stock cooler
> GigaByte GA-M68MT-S2
> 2TB Seagate Barracuda Green HDD + 2TB Samsung Spinpoint HDD
> 4GB Crucial DDR3 RAM @ 1333
> NZXT Hush Case
N4g4rok is offline   Reply With Quote
Old 11-04-2012, 06:31 PM   #27
NUSNA_Moebius
Golden Member
 
NUSNA_Moebius's Avatar
 
Join Date: Oct 2010
Location: USA
Posts: 1,327
Default

Quote:
Originally Posted by N4g4rok View Post
I don't think this is correct. I've cracked a few of them open, and they've got a PowerPC CPU on the left with an ATI Xenos chip next to it. They were two distinct chips sharing a bus. That was it.
The revised systems have the CPU and GPU on a single die. I can't find any hard data on whether the eDRAM was integrated as well, but judging by Xbox 360 Slim motherboard shots, I would assume it was.
__________________
MY YOUTUBE CHANNEL! - Current muses: Ghost Recon: Phantoms, War Thunder, Firefall

Geezer computer - Phenom II x4 @ 2.8 GHz | Diamond ATi Radeon 5850 1 GB | 2 x 2 GB DDR3-1600 | MSI 770-C45 Mobo | Asus Xonar DS Sound card | Win7 64 | 1 TB HDD | LG 32" HDTV
NUSNA_Moebius is offline   Reply With Quote
Old 11-04-2012, 07:22 PM   #28
jpiniero
Golden Member
 
Join Date: Oct 2010
Posts: 1,571
Default

Quote:
Originally Posted by Fox5 View Post
It had 240 shaders total
Wikipedia says that Xenos has 48 shaders. Whether it's really comparable to the R600 shaders I don't know.

Quote:
but it wouldn't be competitive with the rumored Tahiti-level GPU in the Nextbox.
I haven't seen any recent rumors on the 720's GPU, but it was said to be more similar to the 7670 (480 shaders) than to Tahiti. Anything close to Tahiti isn't going to happen.

I'd say more likely the PS4's final system is:
2 SR (Steamroller) modules, 2.3-2.5 GHz
500-640 shaders
4-8 GB RAM, maybe GDDR5.

With the 720 being similar, except with some sort of multicore PowerPC CPU, one core being reserved for Kinect.
jpiniero is offline   Reply With Quote
Old 11-04-2012, 07:22 PM   #29
Haserath
Senior Member
 
Join Date: Sep 2010
Posts: 735
Default

Quote:
Originally Posted by N4g4rok View Post
I don't think this is correct. I've cracked a few of them open, and they've got a PowerPC CPU on the left with an ATI Xenos chip next to it. They were two distinct chips sharing a bus. That was it.
http://www.techradar.com/us/news/gam...plained-711942
Quote:
Originally Posted by jpiniero View Post
Wikipedia says that Xenos has 48 shaders. Whether it's really comparable to the R600 shaders I don't know.
http://www.anandtech.com/show/3774/w...-xbox-360-slim
Quote:
The core was made up of 48 shader processors and each SP could work on a vec4 plus a scalar op in parallel. These days we'd probably call it a GPU with 240 cores, although it's a bit dated from a functionality standpoint.

Last edited by Haserath; 11-04-2012 at 07:30 PM.
Haserath is offline   Reply With Quote
Old 11-04-2012, 07:35 PM   #30
Fox5
Diamond Member
 
Fox5's Avatar
 
Join Date: Jan 2005
Posts: 5,750
Default

Quote:
Originally Posted by jpiniero View Post
Wikipedia says that Xenos has 48 shaders. Whether it's really comparable to the R600 shaders I don't know.



I haven't seen any recent rumors on the 720's GPU, but it was said to be more similar to the 7670 (480 shaders) than to Tahiti. Anything close to Tahiti isn't going to happen.

I'd say more likely the PS4's final system is:
2 SR (Steamroller) modules, 2.3-2.5 GHz
500-640 shaders
4-8 GB RAM, maybe GDDR5.

With the 720 being similar, except with some sort of multicore PowerPC CPU, one core being reserved for Kinect.
48 VLIW5 shaders = 240 (according to how ATI/AMD counts them now).
It sounds pretty similar to the HD 2900 series GPUs, which AMD basically salvaged by cutting down die size (ditching the internal ring bus), pumping the shader count way up, and improving drivers. On a console, individual games could be optimized for the GPU, though older games wouldn't benefit from driver advances; I'd imagine the architecture fares better on a console than it did on a PC before ATI was able to optimize the JIT. (Although Wikipedia claims R500/Xenos is closer to R520/X1800 than to R600.)
__________________
ebay
Look up bluefox451

heatware
http://www.heatware.com/eval.php?id=35565
Fox5 is offline   Reply With Quote
Old 11-04-2012, 08:04 PM   #31
Haserath
Senior Member
 
Join Date: Sep 2010
Posts: 735
Default

Considering AMD counts GCN cores a bit differently than VLIW cores, a 7770-style GPU at 1 GHz or slightly less would outperform the Xbox 360 GPU by at least 6x.

GCN counts vector processors (and not the one scalar unit in the CU).
VLIW5 counts the 4 vector units + 1 scalar unit.

Basically it would be 640 vs. 192 units, and clocks on the 360 are at 500 MHz as well.
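A quick sanity check on that "at least 6x" (a rough Python sketch; unit counts as above, and the per-unit ops/clock assumption is mine, though it cancels out of the ratio anyway):

Code:
# Peak ALU throughput: 7770-class GCN part vs. Xenos.
xenos_units = 48 * 4   # 48 vec4+scalar SPs, counting only the vector lanes
xenos_clock = 500e6    # Hz
gcn_units   = 640      # 7770-class shader count
gcn_clock   = 1e9      # Hz

ratio = (gcn_units * gcn_clock) / (xenos_units * xenos_clock)
print(f"raw ALU throughput ratio: {ratio:.1f}x")  # ~6.7x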

I wouldn't mind seeing that kind of GPU in consoles as long as it kept them reliable and affordable.

The CPU side should at least come with a 3 GHz processor. There's no point in leaving an anemic CPU in place for the next 6+ years.

Power use can't really go over 150W, though. The Xbox 360s that RRoD'd pulled a bit over that.

What sucks is that these consoles might start on 40nm instead of 28nm...
Haserath is offline   Reply With Quote
Old 11-04-2012, 09:39 PM   #32
Aikouka
Lifer
 
Aikouka's Avatar
 
Join Date: Nov 2001
Posts: 24,609
Default

Quote:
1080p60 games in 3D
Did he misspeak? The HDMI spec doesn't support 1080p @ 60Hz with full-resolution 3D content (i.e. Frame Packing).
Aikouka is offline   Reply With Quote
Old 11-04-2012, 10:07 PM   #33
Schmide
Diamond Member
 
Schmide's Avatar
 
Join Date: Mar 2002
Posts: 4,252
Default

Quote:
Originally Posted by Aikouka View Post
Did he misspeak? The HDMI spec doesn't support 1080p @ 60Hz with full-resolution 3D content (i.e. Frame Packing).
HDMI 1.4b supports it. I don't think it would be much of a stretch to move from ATI's current 1.4a to 1.4b, considering they claim 4096x2160 resolution per display.

Quote:
Cutting-edge integrated display support

DisplayPort 1.2
- Max resolution: 4096x2160 per display
- Multi-Stream Transport
- 21.6 Gbps bandwidth
- High bit-rate audio
- Quad HD/4k video support
- 1080p60 Stereoscopic 3D

HDMI® (With 4K, 3D, Deep Color and x.v.Color™)
- Max resolution: 4096x2160
- 2560x1600p60 Stereoscopic 3D
- Quad HD/4k video support

Dual-link DVI with HDCP
- Max resolution: 2560x1600

VGA
- Max resolution: 2048x1536
__________________
All errors are undocumented features waiting to be discovered.
Schmide is offline   Reply With Quote
Old 11-04-2012, 11:04 PM   #34
Fox5
Diamond Member
 
Fox5's Avatar
 
Join Date: Jan 2005
Posts: 5,750
Default

Quote:
Originally Posted by Haserath View Post
Considering AMD counts GCN cores a bit differently than VLIW cores, a 7770-style GPU at 1 GHz or slightly less would outperform the Xbox 360 GPU by at least 6x.

GCN counts vector processors (and not the one scalar unit in the CU).
VLIW5 counts the 4 vector units + 1 scalar unit.

Basically it would be 640 vs. 192 units, and clocks on the 360 are at 500 MHz as well.

I wouldn't mind seeing that kind of GPU in consoles as long as it kept them reliable and affordable.

The CPU side should at least come with a 3 GHz processor. There's no point in leaving an anemic CPU in place for the next 6+ years.

Power use can't really go over 150W, though. The Xbox 360s that RRoD'd pulled a bit over that.

What sucks is that these consoles might start on 40nm instead of 28nm...
Except for the Wii U, why would they start at 40nm? Volume? Both NVIDIA and AMD video cards have shifted to 28nm, and AMD APUs will be there soon.
__________________
ebay
Look up bluefox451

heatware
http://www.heatware.com/eval.php?id=35565
Fox5 is offline   Reply With Quote
Old 11-04-2012, 11:35 PM   #35
Aikouka
Lifer
 
Aikouka's Avatar
 
Join Date: Nov 2001
Posts: 24,609
Default

Quote:
Originally Posted by Schmide View Post
HDMI 1.4b supports it. I don't think it would be much of a streach to move from ATI's current 1.4a to 1.4b, considering they claim 4096x2160 resolution per display.
HDMI 1.4a supports 4K. None of that means that it supports 1080p60 using Frame Packing.

http://www.hdmi.org/manufacturer/hdmi_1_4/

However, since 1.4b supports 1080p120 (as listed here: http://en.wikipedia.org/wiki/HDMI#Version_1.4), it can support 1080p60 with 3D. Although, consider this: the PS3 only supports HDMI 1.3 in hardware, yet it manages HDMI 1.4a through software.
Aikouka is offline   Reply With Quote
Old 11-04-2012, 11:52 PM   #36
Schmide
Diamond Member
 
Schmide's Avatar
 
Join Date: Mar 2002
Posts: 4,252
Default

Quote:
Originally Posted by Aikouka View Post
HDMI 1.4a supports 4K. None of that means that it supports 1080p60 using Frame Packing.

http://www.hdmi.org/manufacturer/hdmi_1_4/

However, since 1.4b supports 1080p120 (as listed here: http://en.wikipedia.org/wiki/HDMI#Version_1.4), it can support 1080p60 with 3D. Although, consider this: the PS3 only supports HDMI 1.3 in hardware, yet it manages HDMI 1.4a through software.
You do understand that frame packing is just two frames next to (or above/below) each other? The only thing I think 1.4b adds is making this auto-detectable.

Edit: 3D 1080p60 is just 3840x1080p60 or 1920x2160p60.

Edit: That being said, no 7770-class card is going to game at these 3D resolutions. I think they're talking about 3D graphics rendered at 1080p.
__________________
All errors are undocumented features waiting to be discovered.

Last edited by Schmide; 11-05-2012 at 12:01 AM.
Schmide is offline   Reply With Quote
Old 11-04-2012, 11:54 PM   #37
zebrax2
Senior Member
 
Join Date: Nov 2007
Posts: 821
Default

Quote:
Originally Posted by cytg111 View Post
- Why would you rip out the decoder? Is there any good reason not to leverage the vast x86 optimizations that go into today's x86/amd64 compilers?
The reasons I ask: 1) mainly, I'm just curious; 2) I'm trying to see if they could work around the x86 license limitations by customizing their current processors.
zebrax2 is offline   Reply With Quote
Old 11-05-2012, 04:40 AM   #38
NTMBK
Diamond Member
 
NTMBK's Avatar
 
Join Date: Nov 2011
Posts: 4,553
Default

Quote:
Originally Posted by nismotigerwvu View Post
I'd be VERY surprised if there wasn't a discrete GPU included in the final hardware. It would be interesting to see if they stay with an APU and perhaps use Crossfire, or even a setup with one GPU used for GPGPU while the other handles standard graphics duties.
I've seen this idea proposed in several places, and I still maintain that it's ridiculous. The difficulties developers had with the PS3's messy memory architecture should have scared everyone away from that kind of split model.

The main benefit of an AMD APU should be obvious: a shared memory address space. GCN can work in the x86 address space. You have a single memory bank of something high-bandwidth (high-end DDR3, DDR4, GDDR5, maybe even XDR2). Both the CPU and GPU sides of the APU work on the same objects in memory; there is no split, no "this half goes for CPU, this half for GPU," and no copying backwards and forwards between graphics and main memory. That removes a major bandwidth bottleneck as well as a developer headache, and it makes GPGPU tasks more practical: if the first pass of your algorithm depends heavily on single-threaded performance, use the CPU cores, and if the second pass needs multicore performance, use the GPU. Normally the practicality of this sort of thing is limited by PCIe bandwidth and latency, but not in a shared-memory APU model.
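To put a number on that last point, here's a rough sketch (Python; the buffer size and bus bandwidth are ballpark assumptions of mine, not measurements):

Code:
# Cost of round-tripping a working set between CPU and GPU passes over
# PCIe, vs. passing a pointer in a shared address space (effectively free).
buffer_gb = 64 / 1024      # 64 MB handed off between the two passes
pcie_gbps = 8.0            # ~PCIe 2.0 x16 effective bandwidth, GB/s
frame_ms  = 1000 / 60      # budget per frame at 60 fps

copy_ms = 2 * buffer_gb / pcie_gbps * 1000   # there and back again
print(f"PCIe round trip: {copy_ms:.1f} ms of a {frame_ms:.1f} ms frame")
# ~15.6 ms of a 16.7 ms frame -- which is why split CPU/GPU passes like
# this are impractical without shared memory.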

A games console is the ideal platform to make the most of AMD's claimed gains from using APUs. Adding a "Crossfire"'d card would make development a pain in the arse, again.
__________________
Quote:
Originally Posted by Maximilian View Post
I like my VRMs how I like my hookers, hot and Taiwanese.
NTMBK is offline   Reply With Quote
Old 11-05-2012, 05:29 AM   #39
Haserath
Senior Member
 
Join Date: Sep 2010
Posts: 735
Default

Quote:
Originally Posted by Fox5 View Post
Except for the Wii U, why would they start at 40nm? Volume? Both nvidia and amd video cards have shifted to 28nm, and AMD apus will be there soon.
My mistake, I was thinking of the Wii U launching this year rather than these consoles launching in late 2013. Very scatterbrained sometimes...

They should hopefully have the ability to release on 28nm.
Haserath is offline   Reply With Quote
Old 11-05-2012, 08:54 AM   #40
Cerb
Elite Member
 
Cerb's Avatar
 
Join Date: Aug 2000
Posts: 15,580
Default

Quote:
Originally Posted by zebrax2 View Post
BTW, can an x86 CPU still be considered x86 if you rip out the x86 decoder?
No, but then it wouldn't be any kind of CPU. The x86 memory model is baked into the hardware (8-bit to 64-bit support, and a fully hardware MMU), along with direct execution of many x86 instructions, optimizations for x86's specific handling of a C-like data stack, and lots of goodies keeping up with exceptions and flags.

Quote:
Originally Posted by nismotigerwvu View Post
Precisely; this says more about general hardware than anything specific: AMD x86 and DX11-class graphics hardware. I'd be VERY surprised if there wasn't a discrete GPU included in the final hardware.
Whether x86 or not, given the advantages of low-latency shared RAM access, I would hope that if they can't fit it on the die, they'll fit it on the package, so it won't be stuck with the added latency and/or reduced bandwidth of an external bus.
__________________
Quote:
Originally Posted by Crono View Post
I'm 90% certain the hipster movement was started by aliens from another galaxy who have an exaggerated interpretation of earth culture(s).
Cerb is online now   Reply With Quote
Old 11-05-2012, 10:28 AM   #41
sm625
Diamond Member
 
sm625's Avatar
 
Join Date: May 2011
Posts: 4,769
Default

An HD 7700-series GPU has 1.5 billion transistors. The A10 Trinity die is 1.3 billion transistors, with no more than 700 million of that making up the GPU. So they could easily double the GPU portion of an A10 and still be under 2 billion transistors. That's a very reasonable size, and the price wouldn't be too bad either: less than $150, and two years into the console's life that chip would be cut down to $60. It all sounds very reasonable to me.

Actually, going from 500 million transistors (Xbox 360) to only 2 billion seems kind of weak. Shouldn't we be looking at a ballpark of 3-4 billion? That's an APU the size of a 7970...
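Spelling out the arithmetic from the first paragraph (Python; all figures are the ones quoted above):

Code:
# Transistor budget for a doubled-GPU Trinity-style APU.
a10_total = 1300   # A10 Trinity die, millions of transistors
a10_gpu   = 700    # upper bound on the GPU's share

cpu_uncore  = a10_total - a10_gpu        # ~600M left for CPU + uncore
doubled_apu = cpu_uncore + 2 * a10_gpu   # double the GPU portion
print(f"doubled-GPU APU: ~{doubled_apu / 1000:.1f}B transistors")  # ~2.0B
# vs. ~0.5B for the Xbox 360, or the 3-4B ballpark mooted above.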
__________________
I am looking for a cheap upgrade to my 3 year old computer.
AT forum member #1: Buy a 4790k

I am looking for a way to get 10 more fps in TF2.
AT forum member #2: Buy a 4790k

Last edited by sm625; 11-05-2012 at 10:36 AM.
sm625 is online now   Reply With Quote
Old 11-05-2012, 11:01 AM   #42
Puppies04
Diamond Member
 
Puppies04's Avatar
 
Join Date: Apr 2011
Posts: 4,858
Default

Quote:
Originally Posted by VulgarDisplay View Post
Most games run at 640p and are upscaled. A few games run at 720p, and even fewer actually run at 1080p.
Also, don't current consoles run at 30 fps? I know there are gains to be made on a closed platform, but 60 fps at 1080p in 3D seems way too optimistic considering what the current-gen consoles can achieve.
__________________
Quote:
Originally Posted by moonbogg View Post
If you have kids, you need to kill them
My wedding pic http://i226.photobucket.com/albums/d...ps224f2a90.png
Puppies04 is offline   Reply With Quote
Old 11-05-2012, 11:21 AM   #43
Tuna-Fish
Senior Member
 
Join Date: Mar 2011
Posts: 538
Default

Quote:
Originally Posted by Fjodor2001 View Post
Then why not simply ask the developers to go and get whatever recent x86 CPU and GFX card they like, if it's not going to match the actual PS4 hardware in the end anyway? Why bother sending them a special "PS4 development PC"?
The software shipped in the box is an early version of what the final system will have, and is likely only made to work on that single system. Also, even if the final performance is not anywhere near the current devkit, several details of it may be very similar to the final platform. Things like how the memory bus is shared between GPU and CPU (Can you just pass pointers? Can they even be virtual pointers or do you have to translate them?) actually affect early engine development more than the raw performance numbers.

Quote:
Originally Posted by ShintaiDK View Post
If you go by the same rumourmill crap, "insider sources," and "already shipped" claims, then it seems the Xbox 720 will be Intel and NVIDIA.
I know from a source I consider very reliable that the earliest Xbox Next devkit was indeed Intel + NVIDIA. The later ones aren't. As with last gen, the earliest dev kits are really just whatever platforms the OS/API devs happened to build their stuff on. As the release date grows nearer, the devkits become more similar to the final hardware.

Quote:
Originally Posted by zebrax2 View Post
I believe they ship with extra RAM because they have to run the development software on it.
This would be correct. Dev kits almost always have more RAM than the final machines. The final dev kits (i.e. not this one yet) might be an exception this time around, as all the sensible memory tech available is point-to-point, so once they move to the final silicon there is no simple, cheap way to ship more memory in the system.

Quote:
BTW, can an x86 CPU still be considered x86 if you rip out the x86 decoder?
If a tree falls in the forest...

Well, it still needs some kind of decoder. What would you put in its place?

It would obviously not be directly compatible with any x86 software, but some x86 features, such as strict memory ordering, would still be visible.

Quote:
Originally Posted by Arkaign View Post
8GB would be nice, but I think even 4GB would do the trick provided it's hellishly fast RAM. The A10 is incredibly sensitive to memory speed in desktop benches, with DDR3-2133 making a monster of an improvement over 1066 and 1333.

If they managed to get something north of 2133 on a double-wide bus it'd be pretty solid.
Right now, the speculation is divided between DDR4 and GDDR5. DDR3 will be on its way out when these are released, and would be expensive to supply for the latter half of the consoles' lifetime.

Quote:
Originally Posted by Dark Shroud View Post
Well, Sony did use XDR in the PS3; if they used XDR2 in the PS4 it would give insane performance. If they use XDR2 I would live with 4GB of RAM.
XDR2 does not even exist outside of paper. Also, as I understand it, Sony got hurt more than a little this gen for using boutique RAM that had almost no other users. In chip manufacturing, getting volumes up is the number one rule for keeping costs down.

They were both very active participants in the DDR4 spec, which I take to mean that's what they'll use. GDDR5 would give more bandwidth, but would be considerably more expensive. (Especially in the long term, as PCs can be expected to use DDR4 for a very long time, whereas GDDR5 will probably be replaced by something else in GPUs within a few years.)

Quote:
Originally Posted by alyarb View Post
The Xenos GPU in the Xbox is based on R600, the first of the unified VLIW5 series. Nothing like the X800/X1800.
No, Xenos is vec4 + scalar, which is not the same as VLIW5. It would be more correct to say that R600 was a further development based on Xenos.

Quote:
Originally Posted by Haserath View Post
Considering AMD counts GCN cores a bit differently than VLIW cores, a 7770-style GPU at 1 GHz or slightly less would outperform the Xbox 360 GPU by at least 6x.
Actually, by much more than that. The vec4 unit is only useful if there are four independent identical operations you can do on the same fragment/vertex. There are for vertex processing and other geometry, but there generally aren't for pixel processing. vec4+s is only as good as 2 or 3 scalar operations for most pixel work, and pixel processing is (way) more than half of all shader processing.
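A toy issue model of that claim (Python; the instruction mix is invented for illustration, not taken from any real shader):

Code:
# vec4+scalar issue: one vector op (width <= 4) per clock on the vec4
# unit, plus one co-issued scalar op when the next op in line is scalar.
def cycles_vec4s(widths):
    cycles, i = 0, 0
    while i < len(widths):
        cycles += 1
        i += 1                                  # op on the vec4 unit
        if i < len(widths) and widths[i] == 1:
            i += 1                              # co-issued scalar op
    return cycles

# Pixel-shader-ish stream: mostly scalar, occasional vec3/vec4 math.
ops = [1, 1, 1, 3, 1, 1, 1, 1, 4, 1, 1, 1]
c = cycles_vec4s(ops)
print(f"{sum(ops) / c:.1f} scalar-equivalent ops/clock vs. 5 peak")  # ~2.4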
Tuna-Fish is offline   Reply With Quote
Old 11-05-2012, 11:24 AM   #44
Kenmitch
Diamond Member
 
Kenmitch's Avatar
 
Join Date: Oct 1999
Location: 92557
Posts: 5,509
Default

AMD in the console doesn't equate to AMD in your desktop. I'm not sure why a lot of members think you'll get subpar performance if AMD is inside your console.

Console = the ability to fully optimize the OS and game(s) for fixed hardware.

The bar is set by the console manufacturer(s), not by the suppliers of the hardware. It's a prove-it-or-lose-it deal when it comes to the underlying hardware in question. I would think it's already proven that the underlying AMD hardware can meet or exceed the requirements set by the console manufacturer(s).

It's not up to AMD, Intel, or NVIDIA to dictate the performance of your console; they only supply the power that was asked for in the first place. Consoles are like Apple products in a way: you only get what they think you need/want.
__________________
HeatWare
Email
Kenmitch is online now   Reply With Quote
Old 11-05-2012, 04:25 PM   #45
Haserath
Senior Member
 
Join Date: Sep 2010
Posts: 735
Default

Quote:
Originally Posted by Tuna-Fish View Post
Actually, by much more than that. The vec4 is only useful if there are four independent identical operations you can do on the same fragment/vertex. Which there are for vertex processing and other geometry, but generally aren't for pixel processing. vec4+s is only as good as 2 or 3 scalar operations for most pixel processing, and pixel processing is (way) more than half of all shader processing.
I need to stop pulling stuff out of my arse (even though speculation is entertaining), but do you have a source for that, or any experience in the field?
Haserath is offline   Reply With Quote
Old 11-05-2012, 05:14 PM   #46
Aikouka
Lifer
 
Aikouka's Avatar
 
Join Date: Nov 2001
Posts: 24,609
Default

Quote:
Originally Posted by Schmide View Post
You do understand that frame packing is just two frames next to (or above/below) each other? The only thing I think 1.4b adds is making this auto-detectable.
I know a modest amount about 3D because I spent a while researching it. I had an older Denon 1910 AVR, which doesn't have 3D support. I wanted to do 3D with my TV, and I was bewildered as to why the 1910, which seemed quite capable, could not actually handle it. There were really two reasons why it couldn't...

1) Not having the proper spec (minimum of 1.4a) means it's incapable of saying, "Hey! I can do 3D!"

2) The extra frame being packed in offsets the audio, which the receiver would be unable to process.

As long as I wanted to keep the Denon in the loop between my 3D-capable device (PS3) and my TV, I could not use 3D. The only way to make it work would have been to route the video around the AVR and feed the audio into it over an optical cable, and using an optical cable would have sacrificed lossless audio. In other words... a giant mess!

Quote:
Originally Posted by Schmide View Post
Edit: 3D 1080p60 is just 3840x1080p60 or 1920x2160p60.
Neither of those is correct, because there's actually a 20-pixel spacer between the two frames. Although, if I remember correctly, they're laid out vertically, and it's also a 48 Hz refresh rate, because HDMI 1.4a only supports 1080p 3D at 24 Hz (i.e. an actual 48 Hz across the combined frames).
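Some back-of-the-envelope pixel-clock arithmetic for those modes (Python; I'm pulling the CEA-861 timing totals from memory, so treat the exact figures as approximate):

Code:
# Frame packing stacks both eyes (plus an active-space gap) into one
# tall frame, so the pixel clock roughly doubles vs. the 2D mode.
TMDS_MAX_MHZ = 340                  # HDMI 1.4 single-link ceiling

def pclk_mhz(h_total, v_total, hz):
    return h_total * v_total * hz / 1e6

# 2D 1080p totals with blanking: 2750x1125 @ 24 Hz, 2200x1125 @ 60 Hz;
# frame packing doubles the vertical total.
print(f"1080p24 packed: {pclk_mhz(2750, 2250, 24):.1f} MHz")   # 148.5
print(f"1080p60 packed: {pclk_mhz(2200, 2250, 60):.1f} MHz")   # 297.0
# Both fit under 340 MHz, so raw link bandwidth isn't the blocker;
# 1080p60 frame packing just isn't among the 3D formats 1.4a defines.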

Quote:
Originally Posted by Schmide View Post
Edit: That being said, no 7770-class card is going to game at these 3D resolutions. I think they're talking about 3D graphics rendered at 1080p.
That's another reason why I was curious about it. Current-generation games don't render 3D content at 1080p because (1) they can't handle it, and (2) HDMI 1.4a restricts you to 24 Hz at 1080p. I've tried using the latter... it's not pleasant.
Aikouka is offline   Reply With Quote
Old 11-06-2012, 07:23 AM   #47
Tuna-Fish
Senior Member
 
Join Date: Mar 2011
Posts: 538
Default

Quote:
Originally Posted by Haserath View Post
I need to stop pulling stuff out of my arse (even though speculation is entertaining), but do you have a source for that, or any experience in the field?
My source is a few years trying to optimize shaders for the damn things.
Tuna-Fish is offline   Reply With Quote
Old 11-06-2012, 11:42 AM   #48
happysmiles
Senior Member
 
happysmiles's Avatar
 
Join Date: May 2012
Posts: 344
Default

I have a ridiculously strong feeling that Kaveri, with its fully shared memory, will be the basis of the PS4's custom chip.

Why lift heavy with one when you can do the lifting with two?

Regardless, the gap between the 720 and the PS4 should be smaller this time, which is easier on developers and nice for us consumers! I'd imagine their current goal is true 1080p at 30 fps with all the eye candy enabled, while leaving some headroom for the future.

4K would be the next step (I'm guessing this next gen will run 8 years, so it wouldn't surprise me).
happysmiles is offline   Reply With Quote
Old 11-06-2012, 12:39 PM   #49
Ventanni
Senior Member
 
Join Date: Jul 2011
Posts: 834
Default

If I'm not mistaken, couldn't AMD also drop support for some instruction sets (like MMX, SSE, etc.) that Sony deems unnecessary? Realistically, if Sony is going to do an APU here, it's going to have to be manufactured very cheaply en masse, and large dies will cut into that. But couldn't there be savings to be found by culling some transistors from the CPU as well?
__________________
Desktop: Core i7 3770k, 8GB, Geforce 560 Ti
HTPC: Core2 Q6600, 4GB, Geforce 285
Ventanni is offline   Reply With Quote
Old 11-06-2012, 12:48 PM   #50
NTMBK
Diamond Member
 
NTMBK's Avatar
 
Join Date: Nov 2011
Posts: 4,553
Default

Quote:
Originally Posted by Ventanni View Post
If I'm not mistaken, couldn't AMD also drop support for some instruction sets (like MMX, SSE, etc.) that Sony deems unnecessary? Realistically, if Sony is going to do an APU here, it's going to have to be manufactured very cheaply en masse, and large dies will cut into that. But couldn't there be savings to be found by culling some transistors from the CPU as well?
a) Removing the vector units would be a MASSIVE redesign of the core.

b) Why would you want to remove the vector units? Game code is an ideal use case for vector units.
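To illustrate (b), here's a minimal sketch of the kind of batched math game code is full of (Python/NumPy standing in for what a compiler would lower to SSE/AVX; the entity-update workload is invented for illustration):

Code:
import numpy as np

# Per-frame game work: integrate positions for thousands of entities.
# Structure-of-arrays plus wide component math is exactly the shape of
# workload CPU vector units (MMX/SSE/AVX) were added for.
def step_scalar(pos, vel, dt):
    for i in range(len(pos)):          # one entity at a time
        pos[i] += vel[i] * dt

def step_batched(pos, vel, dt):
    pos += vel * dt                    # one wide op over all entities;
                                       # NumPy lowers it to SIMD loops

n   = 10_000
pos = np.zeros((n, 3), dtype=np.float32)
vel = np.random.rand(n, 3).astype(np.float32)
step_batched(pos, vel, np.float32(1 / 60))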
__________________
Quote:
Originally Posted by Maximilian View Post
I like my VRMs how I like my hookers, hot and Taiwanese.
NTMBK is offline   Reply With Quote