People forget that the RSX, even though technically the same G70 chip as in the 7800 GTX, was actually slower in practice. Per the RSX Wiki page, it had only a 128-bit memory controller (instead of the 7800 GTX's 256-bit controller) and half the ROPs of the G70 found in the 7800 GTX.
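For a rough sense of what the narrower bus costs, here's a back-of-envelope bandwidth calculation. The bus widths and memory clocks are the commonly cited specs for each part, not taken from the wiki quote, so treat this as a sketch:

```python
# Peak memory bandwidth = bus width in bytes x effective transfer rate.
# Clock/bus figures are the commonly cited specs (an assumption, not from the post).
def bandwidth_gb_s(bus_bits: int, effective_mt_s: float) -> float:
    return (bus_bits / 8) * effective_mt_s * 1e6 / 1e9

rsx      = bandwidth_gb_s(128, 1400)  # RSX: 128-bit, 700 MHz GDDR3 (1.4 GT/s)
gtx_7800 = bandwidth_gb_s(256, 1200)  # 7800 GTX: 256-bit, 600 MHz GDDR3 (1.2 GT/s)
print(f"RSX: {rsx:.1f} GB/s vs 7800 GTX: {gtx_7800:.1f} GB/s")
# -> RSX: 22.4 GB/s vs 7800 GTX: 38.4 GB/s
```

That's roughly 40% less bandwidth before you even count the halved ROPs.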
:thumbsup: Excellent points. I've been saying for a while that the GPU in the PS3 was junk on day 1 and nowhere near a 7950GT/7900GTX. Around the PS3's launch, the 8800GTX completely owned the PS3's/Xbox 360's GPUs, and as I said, 20 months out the GTX280 was leagues apart, 5.75-6X faster than the PS3's GPU. Essentially, the RSX became shockingly outdated less than 2 years after launch, with paltry memory bandwidth and a major VRAM bottleneck relative to the best PC cards such as the HD4870 and GTX280. To this day there is a myth that the PS3/360 were uber-powerful consoles for years after launch, when in fact they became outdated more quickly relative to the PC GPUs that followed, and most PC gamers refuse to acknowledge this data.
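To put a number on the bandwidth side of that claim (GTX 280 figures are the commonly cited specs, so this is just a sketch, and note the 5.75-6X figure above refers to overall performance, not bandwidth alone):

```python
# Bandwidth-only comparison, same formula as above.
gtx_280 = (512 / 8) * 2214e6 / 1e9  # 512-bit bus, GDDR3 at ~2.21 GT/s -> ~141.7 GB/s
rsx     = (128 / 8) * 1400e6 / 1e9  # 128-bit bus, 1.4 GT/s -> 22.4 GB/s
print(f"GTX 280 has {gtx_280 / rsx:.1f}x the RSX's memory bandwidth")  # ~6.3x
```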
To this day, no game ever released for the PS3/360 matched 2007's Crysis 1 on the PC. Yet CDPR already claims that Witcher 3 on the PS4 will equal Witcher 2's PC graphics. That's pretty good considering the generation is just getting started.
Intel wouldn't sell it below $100 without the GPU? Intel is currently selling i3s below $100 with the IGP included ($99 at MC, and that's one of the higher-end i3s).
You are not seeing the big picture. A Core i3 + Intel motherboard would still use more power than an 8-core Jaguar part. Secondly, Intel motherboards/chipsets cost significantly more as a total package. Finally, a Core i3 would have been an epic fail for a console, since allocating 1 core for OS/background tasks would leave just half of the Core i3 for gaming. You think reserving 2 cores for background/OS, as the PS4 does, wouldn't have been a major issue for a Core i3? We've already seen the original Xbox use an Intel Pentium chip, and it didn't make squat of a difference against the GameCube. In fact, the less powerful PS2 easily outsold both.
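The core-budget arithmetic behind that point, as a sketch (the PS4's two reserved Jaguar cores are widely reported; the "one core reserved" i3 split is the hypothetical from this post, not a documented spec):

```python
# What fraction of CPU cores remains for the game once the OS reservation
# is taken out? (The i3 scenario is hypothetical, per the post above.)
def cores_left_for_game(total: int, reserved_for_os: int) -> float:
    return (total - reserved_for_os) / total

print(f"8-core Jaguar, 2 reserved: {cores_left_for_game(8, 2):.0%} left")  # 75%
print(f"2-core i3, 1 reserved:     {cores_left_for_game(2, 1):.0%} left")  # 50%
```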
A 7970M performs similarly to the console GPUs but IS NOT COMPARABLE TO THE CONSOLE CHIPS. The 7970M gets binned; a console APU either passes or fails the bin, with no die harvests, which is why the consoles run comparatively low clocks. The console APU functions like a poorly binned desktop chip (extra voltage to keep the poorer dies going), therefore there is no rhyme or reason in comparing it to a highly binned and expensive mobile chip.
Your usage of mobile chips is completely absurd.
It's not absurd. As in the other thread, you are not understanding the main point I am making. The best performance/watt is seen in the mobile space. Therefore, the 680M/680MX/775M/780M and 7970M represent the best possible GPU performance in an energy-constrained space that you could have had at that time. Using those chips as a reference point tells us the absolute best reasonable ~100W GPU performance that the PS4 could have had.
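A rough GFLOPS-per-watt comparison illustrates why mobile parts set that upper bound. The shader counts, clocks, and TDPs below are the commonly cited specs, so take this as a sketch rather than a measurement:

```python
# Peak GFLOPS = shaders * 2 FLOPs/clock (FMA) * clock; perf/watt = GFLOPS / TDP.
# All figures are commonly cited specs, not from the post itself.
def gflops_per_watt(shaders: int, clock_ghz: float, tdp_w: float) -> float:
    return shaders * 2 * clock_ghz / tdp_w

print(f"HD 7970M (1280 SPs @ 0.85 GHz, ~100W): {gflops_per_watt(1280, 0.85, 100):.1f} GFLOPS/W")
print(f"HD 7870  (1280 SPs @ 1.00 GHz, ~175W): {gflops_per_watt(1280, 1.00, 175):.1f} GFLOPS/W")
# The binned mobile part: ~21.8 GFLOPS/W vs ~14.6 for its desktop sibling.
```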
As I already said, unless Sony or AMD wanted to take a hit on yields and release a fully unlocked 7970M with 1280 SPs, the only way to tangibly improve performance was to use a chip like the 780M (whether it's desktop or mobile has no relevance to the upper performance boundary, since the 780M tells us the best NV could have done at that time). Under the best-case scenario, we would have had a PS4 with a 30-40% faster GPU, but considering the 780M sold for $750 at retail, such an optimized desktop/mobile NV chip would have been completely infeasible to use in a PS4.
Furthermore, given that the PS4's GPU is at least 30-40% faster than the XB1's but we aren't seeing PS4 games look much better, even if the PS4 had a GPU 30-40% more powerful, 90% of developers would never have targeted that level of performance, leaving both the XB1 and Wii U on the sidelines. The risk of making 1 of 3 consoles too powerful is that most developers will never spend the time and $ extracting maximum performance out of it. Therefore, using a much more expensive 780M-style GPU in the PS4 would have been a big waste of time and resulted in massive losses for Sony or no profit for NV.
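A quick FLOPS check on those 30-40% figures (shader counts and clocks are the commonly cited specs; treat it as a sketch, since raw FLOPS ignore bandwidth and architectural differences):

```python
# Peak GFLOPS = shaders * 2 FLOPs/clock (FMA) * clock in GHz.
def gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

ps4  = gflops(1152, 0.800)  # 18 CUs
xb1  = gflops(768, 0.853)   # 12 CUs
m780 = gflops(1536, 0.823)  # GTX 780M, Kepler mobile flagship
print(f"PS4 over XB1:  {ps4 / xb1 - 1:+.0%}")   # ~ +41%
print(f"780M over PS4: {m780 / ps4 - 1:+.0%}")  # ~ +37%
```

Both ratios land right inside the 30-40% range claimed above.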
Right now the PS4 is powerful enough and cheap enough that it hit the perfect sweet spot for a next-gen console. 18.5 million sales reflect that. A PS4 with a 780M-style GPU and an i5, priced at $599, would have completely bombed against this holiday's XB1 going for $330-350. ^_^
An i3 can't even stream to Twitch with any intensive game running on a PC. Going with a dual core is a horrible idea. With Twitch taking up most of a core on an i3, plus the OS, what do you have left?
i3? You're out of your mind thinking that's a good idea.
Well, apparently there are A LOT of PC gamers on our forum who think an overclocked G3258 is better than the FX8000/9000 series for gaming and that dual core is still awesome for modern gaming. When "unknown" sites show that modern titles really run much better on an FX8000/i5, those results tend to be ignored since they're "not the popular opinion".
The other point brought up is that MS's cost saving from fusing the CPU and GPU together later in the Xbox 360's life was noticed by ALL of the major console makers. How would MS/Sony have realized the power savings and cost reductions of future die shrinks had they gone with an Intel+AMD or Intel+NV solution? Considering MS's experience with real-world facts on the yields, costs and benefits of going APU with the 360's later SKUs, I am sure they carefully considered the alternative of going i3 + NV/AMD stand-alone GPU.
Since the XB1 was meant to be a media device that promotes multi-tasking such as Skype + video streaming + gaming, a Core i3 would have been a massive fail for such a console.
It's shocking how many people are disappointed with the current consoles considering their prices and that we have games like Infamous Second Son, Driveclub and Ryse: Son of Rome providing amazing graphics on a $350-400 console. Sure, with the cost of PC games and the flexibility of mods, I still like PC gaming, but for the millions of gamers out there, the PS4 is a FAR better designed/balanced gaming console than the PS3/Xbox 360 ever were. The major issues with the PS4 are its poor media playback capability and the lack of BC support.
I mean, we will have Witcher 3 and Uncharted 4 on the PS4. Maxing out Witcher 3 on the PC will require a GPU that costs more than a PS4 alone. The diminishing returns of the highest graphics settings beyond medium/high are extremely costly, requiring 3-4X the graphical power of the PS4's GPU for a less-than-commensurate increase in graphical quality. The average gamer would never pay $600+ for a PS4 that could run Witcher 3 at Ultra vs. a $350-400 XB1/PS4 that could run it at Medium/High. That's why the goal of consoles is not to shove a 4790K with a 780Ti in there and sell it for $400.

The last generation of consoles has shown that the historical loss-leader model (printer+ink / razor+razor blades) is no longer sustainable for consoles. It's simply too risky, and it takes too long to recoup the losses on hardware, which unnecessarily extends the normal life-cycle of consoles, thus actually hurting the industry long-term. Game publishers/developers would also like the adoption of next-gen consoles to happen as quickly as possible in order to justify the large expenditures associated with game development today. This is why this generation we are seeing $299-399 consoles.