How the PlayStation 4 is better than a PC


smakme7757

Golden Member
Nov 20, 2010
1,487
1
81
It's all just hyperbole to get the marketing engine going.

Let's keep in mind that these consoles have been in development for a long time. They are not using hardware from today; they are using hardware from last week. Not to mention they are static, i.e. you get what you get. That's it.

With that being said, it is interesting how they have developed the console to decrease latency between main system memory and the GPU. A direct bypass will drop the latency by quite a bit, so it's nothing to sniff at. Let's face it: system memory is pretty flipping slow compared to how many instructions a GPU or CPU can execute in a millisecond, or even a nanosecond.
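To put rough numbers on that (a back-of-envelope sketch; the latency, clock, and issue width below are illustrative assumptions, not PS4 specs):

```python
# Back-of-envelope: instructions a CPU core could have retired while
# waiting on a single main-memory access. All figures are assumptions
# for illustration, not measured console numbers.
dram_latency_ns = 100   # typical DRAM round-trip latency
cpu_clock_ghz = 1.6     # roughly a Jaguar-class core clock
issue_width = 2         # instructions retired per cycle (assumed)

cycles_stalled = dram_latency_ns * cpu_clock_ghz      # ns x cycles/ns
instructions_lost = cycles_stalled * issue_width
print(f"~{cycles_stalled:.0f} cycles, ~{instructions_lost:.0f} instructions per miss")
# -> ~160 cycles, ~320 instructions per miss
```

Every round trip you avoid between the CPU, system memory, and the GPU buys back that kind of stall.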

But it's still just a standalone unit. They are all the same, they are all static, and developers can fine-tune their software for the exact components in these devices.

Again, it's just hyperbole. Everyone knows that technology thunders forward. Saying that a static box is better than system X or system Y just doesn't hold up; X and Y will most likely change over time.

Say they get released in December. They will already be a full year behind current technology.

Bottom line:
- They have made improvements to make the internal components more efficient.
- They can fine-tune code for the system.
- They can provide 1080p with quality graphics.
- They are (hopefully) simple to use and fun to play.
- They are not going to push the boundaries compared to a modern PC in any way, shape, or form when they are released (Christmas, I guess).
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
You seem to have missed the part that it's too expensive even for "Titan-like" hardware ;)

I didn't miss anything, I just assumed he was talking about "true" GI. Because the 680 already ran what UE4 had, and the GTX 780 is upwards of 50% faster than the 680.

You read it another way?
 

Spjut

Senior member
Apr 9, 2011
932
162
106
I didn't miss anything, I just assumed he was talking about "true" GI. Because the 680 already ran what UE4 had, and the GTX 780 is upwards of 50% faster than the 680.

You read it another way?

Well, this was the quote. Sounds to me like he thinks UE4's previous GI is too expensive for both the PS4 and PCs with Titans.

Digital Foundry: We've seen Unreal Engine 4 step back from true real-time global illumination. Is it simply too expensive, even for next-gen consoles? Can you talk us through 4A's GI solution?

Oles Shishkovstov: Actually that's not true global illumination, but more of a really advanced algorithm producing convincing results. Yes, all that voxelisation and cone tracing is very expensive, too expensive even for Titan-like hardware.

I did a lot of research on GI during our last project, but we did not ship with it. The fundamental problem is: when an artist tweaks lighting on PC (with GI) it usually looks like crap on current-gen consoles (without GI). Next-gen console will solve it, enabling us to use some kind of real-time GI, so both the PC and consoles will get it. Personally I still lean towards coarse scene voxelisation and tweaking from here, quite possibly live with some amount of light leakage.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Well, this was the quote. Sounds to me like he thinks UE4's previous GI is too expensive for both the PS4 and PCs with Titans.

How can it be too expensive for Titan if the 680 was demoed running it already?

It sounds more plausible that he's saying true real-time lighting is too expensive even for Titan, and that what UE4 showed was just a complex approximating algorithm (which was still too much of a drain for a GPU slower than the 680).

I can't imagine someone standing there saying "it's too much for a GPU that is 50% more powerful to run" while staring at a slower GPU running it. I think we have to take a guess at what he was actually trying to say in an off-the-cuff remark.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
They said something about how inefficient SVOGI was and that they now have a better way to implement GI. Nothing about "consoles made us remove it".
With that being said, it is interesting how they have developed the console to decrease latency between main system memory and the GPU. A direct bypass will drop the latency by quite a bit, so it's nothing to sniff at.
I think latency-free CPU-GPU communication will be the next big thing, just like the unified shader architecture was last time.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
They said something about how inefficient SVOGI was and that they now have a better way to implement GI. Nothing about "consoles made us remove it".

I think latency-free CPU-GPU communication will be the next big thing, just like the unified shader architecture was last time.

Digital Foundry: We've seen Unreal Engine 4 step back from true real-time global illumination. Is it simply too expensive, even for next-gen consoles? Can you talk us through 4A's GI solution?

Oles Shishkovstov: Actually that's not true global illumination, but more of a really advanced algorithm producing convincing results.

Again, looking at the context: the writing isn't exemplary, so there is no reason to assume the dev is so stupid that he would say SVOGI is too expensive for Titan while it was shown running on a 680 with far better IQ than what the PS4 showed.

Before I question the intelligence of the Dev, I'd consider the writing ability of the person who didn't even spell voxelization correctly.

I think it's safer to assume actual real-time lighting would be too hard to run, even on Titan-grade hardware, rather than presume an algorithm shown running on a 680 would be too much for a Titan to handle. Though I could be wrong; it just seems like poor writing.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I think it's safer to assume actual real-time lighting would be too hard to run, even on Titan-grade hardware, rather than presume an algorithm shown running on a 680 would be too much for a Titan to handle. Though I could be wrong; it just seems like poor writing.

Yes. It doesn't matter whether Titan can handle it or not (quite a small market share to develop around anyway). What matters is: is it the best way to implement GI? Can something be done better, using fewer resources? That was most likely the case here.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
I'm surprised no one has brought this up, but TFLOPS can't be compared like this.

680 - 3.09 TFLOPS
7970 GHz - 4.3 TFLOPS
Titan (GPU Boost at ~980 MHz) - ~5 TFLOPS

At 1080p:

The 7970 GHz is not (4.3/3.09) ≈ 39% faster on average than the 680.

The Titan is more than (5/4.3) ≈ 16% faster than the 7970 GHz.
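For anyone checking the arithmetic, the ratio math in that quote works out like this (TFLOPS figures as listed above):

```python
# Ratio math behind the quoted percentages (TFLOPS as listed above).
gtx_680 = 3.09
hd_7970_ghz = 4.3
titan = 5.0   # approximate, with GPU Boost at ~980 MHz

print(f"7970 GHz vs 680:   +{(hd_7970_ghz / gtx_680 - 1) * 100:.0f}%")  # ~ +39%
print(f"Titan vs 7970 GHz: +{(titan / hd_7970_ghz - 1) * 100:.0f}%")    # ~ +16%
# The quote's point: real-game performance gaps do not track these
# FLOPS ratios, especially across different architectures.
```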

I know I said this before, but I will try again. We are using GFLOPS because we lack better info and any benchmarks. It is a rough estimate for today; when better info becomes available we can reconsider it.

Moreover, nobody here was worried when Nvidia compared its Titan to the PS4 using GFLOPS. Nobody mentioned anything about GFLOPS then.

I can see a clear pattern here from some posters: when X is used against the PS4, it is accepted without any worry; when X is used to the benefit of the PS4, the same people have objections.


I like how you don't link your quotes anymore.

http://www.linkedin.com/today/post/article/20130522214715-10904058-the-technology-behind-xbox-one

Where do these "5 years" come from? I can't seem to find it.

Carmack and other devs have also said console games will still mostly target 30 fps... Still no convoluted reason for that, I see?

LOL. The whole web reproduced the nonsensical quotes from the EA representative and you 'cannot' find the "five years ahead"...

http://www.destructoid.com/ea-ps4-xbox-one-a-generation-ahead-of-gaming-pcs-254360.phtml

Do you still insist on the 30 fps? Do you know how many fps a GTX 680 gets in the same demos? 30 fps or less.

They have not said they will target 30 fps because the hardware is lacking (which is your persistent belief), but because some of them prefer 30 fps and devote the rest of the performance to other things, such as novel physics effects. It has been announced that the PS4 will have physics effects beyond PCs.

Moreover, when they say 30 fps they don't mean 30 fps average.


2x of a similar PC is one thing, tri-SLI Titans is another

2x is the minimum for the same hardware. The PS4 has modified hardware (the so-called supercharged architecture) that you cannot find in any PC.

Evidently, the supercharged architecture has limits, and a PC with three Titans will be faster than the PS4.

Your claim has been corrected before, but don't forget to ignore this the next time you repeat the same flawed "tri-SLI Titans" claim.


FUD, and false, all of it.

If the "draw call limitation" were any real showstopper... PCs today wouldn't have better IQ than console games... but they do.

But I will still enjoy my rig (born in 2008) running circles around the performance of the PS4.

Call it what you want; the 2-3x overhead is a well-known fact.

PCs today? Are you aware that current PCs can be 10x or even 16x more powerful than the PS3 and Xbox 360?

As said before, let us know how your noisy, hot, power-wasting rig runs circles around it while remaining in the same place when it comes to first-party titles.


I didn't miss anything, I just assumed he was talking about "true" GI. Because the 680 already ran what UE4 had, and the GTX 780 is upwards of 50% faster than the 680.

You read it another way?

Epic has clearly stated that SVOGI was only a "prototype", not the final thing. SVOGI was too demanding for current PCs. They used SVOGI in the Elemental demo running on an i7 + GTX 680, but they avoided it in the extended section they showed at GDC 2013. Moreover, the PC with the GTX 680 couldn't manage 30 fps at 1080p.

Finally, that demo is not a game. It runs under totally controlled conditions, so they can ensure the demo never exceeds the hardware's limits. That cannot be done in a real game. Even if a Titan could handle SVOGI in the demo, it does not follow that it could handle a full game with SVOGI. If Epic were dumb enough to follow your flawed logic, they would be trying to sell an engine whose content is barely playable on the 0.1% of high-end gaming PCs. Bravo!

SVOGI has been replaced with a more efficient GI solution offering ultra-realistic lighting using IES profiles.


The PS4 is basically i3 levels of CPU performance (on GDDR5, which will decrease CPU performance) and a 7790. We are expected to believe from the marketing that this will translate into Titan levels of GPU performance and i7 SB-E levels of CPU performance just because it's a console OS instead of Windows. That is kind of an incredible claim, and it requires a lot more than just "a console is worth 2-3x performance because of Windows". Windows isn't exactly slow; it's the most advanced operating system on the planet. It needs a lot more explanation than what has been shown, along with some benchmarks to prove the point.

The GDDR5 latency myth has been debunked on AT. As shown in other threads, the latency is only slightly higher than DDR3's and is more than compensated for by the giant bandwidth gain.

The overhead introduced by Windows is very well known. Just read Carmack, Huddy, Lottes... explaining it to you.

Windows is not "the most advanced operating system on the planet", but one of the more backward. No serious task where speed, security, efficiency, or stability is crucial uses Windows. E.g. the most advanced supercomputers on Earth don't run Windows (it would be ridiculous :awe:)
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
A good interview with the chief technical officer of 4A, the studio behind Metro: Last Light. The second half is about next-gen possibilities. He too says consoles can do at least 2x what a similar PC can do.
http://www.eurogamer.net/articles/digitalfoundry-inside-metro-last-light
With heavy optimizations, none of which will be present at launch. Furthermore, that quote is based on current consoles, which are not x86-based (and whose optimizations could not be carried over to PC).

False. He is referring to next-gen consoles, and he did not mention "heavy optimizations":

Digital Foundry: Do you think that the relatively low-power CPUs in the next-gen consoles (compared to PC, at least) will see a more concerted push to getting more out of GPU Compute?

Oles Shishkovstov: No, you just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware.
By low-level access to hardware he means something like Sony's libGCM low-level API (whose efficiency is about 2x that of DX11 on Windows).

Heavy optimization is done without APIs, coding directly to the metal. Sony also allows this kind of direct access to the hardware, but that super-efficient approach will only be used late in the PS4's life.
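To illustrate the general idea of per-call overhead (this is only an analogy in Python, not D3D11 or libGCM code): the same work issued as many tiny calls versus one batched call.

```python
# Analogy for API overhead (not actual D3D11/libGCM code): identical
# arithmetic issued element by element vs. in one batched call. The
# fixed cost per call is what a thin console API minimizes.
import timeit
import numpy as np

data = np.arange(100_000, dtype=np.float64)

def per_element():
    # One interpreter-level "call" per element: overhead dominates.
    return [x * 2.0 for x in data]

def batched():
    # Overhead paid once for the whole batch.
    return data * 2.0

print("per-element:", timeit.timeit(per_element, number=10))
print("batched:    ", timeit.timeit(batched, number=10))
# The batched version is typically 50-100x faster: same math, far
# fewer crossings of the expensive call boundary.
```

Whether the PC-vs-console gap is really 2x in practice is exactly what's being argued in this thread; the sketch only shows why fewer, cheaper calls help at all.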
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Moreover, nobody here was worried when Nvidia compared its Titan to the PS4 using GFLOPS. Nobody mentioned anything about GFLOPS then.

Exactly, because everyone knew it's a meaningless metric. No one would be talking about it now either, except you keep bringing it up.

2x is the minimum for the same hardware. The PS4 has modified hardware (the so-called supercharged architecture) that you cannot find in any PC.

You can't find a single Titan or a single i7 in a PS4 either, so whatever point you think you're trying to prove here, you're failing. You're wrong, period. It doesn't matter how many times you try to explain it. You seem to think we don't understand; on the contrary, the rest of us understand all too well. Explain it again and you do nothing more than add another point in the "wrong" column.

There's a reason why you'll see random GFLOPS claims and no reputable person claiming, or coming to the conclusion, that you would need 3 Titans based on those numbers. No, they're smarter than that. They'll throw out meaningless GFLOPS numbers and allow the gullible to draw their own conclusions. So far, in this entire forum, there's only one person gullible enough to claim you need 3 Titans. There are a couple of other AMD/console advocates here, and even they have distanced themselves from this ridiculous notion.

And then there's this:

Ahahaha

[attached image]


hahaha.

AMD needs to sprinkle some magic pixie dust on Epic.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
False. He is referring to next-gen consoles, and he did not mention "heavy optimizations":

By low-level access to hardware he means something like Sony's libGCM low-level API (whose efficiency is about 2x that of DX11 on Windows).

Heavy optimization is done without APIs, coding directly to the metal. Sony also allows this kind of direct access to the hardware, but that super-efficient approach will only be used late in the PS4's life.


Right from the article

Digital Foundry: Do you think that the relatively low-power CPUs in the next-gen consoles (compared to PC, at least) will see a more concerted push to getting more out of GPU Compute?

Oles Shishkovstov: No, you just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware

With the fixed platform and low-level access you can optimize heavily. A fixed platform and low-level access do nothing unless you actually use the fact that they are available and optimize for them.

I don't get it: you toss out 2 Titans and then quote sources which say only 2x. :confused:

Edit: Not to mention the fact that if you are going to compare TFLOPS, then at least compare them within the same architecture.
2x 1.84 = 3.68 TFLOPS, roughly a 7950 with boost, or a bit less than a 925 MHz 7970.



Hell, I'll even toss this in.

HD Graphics 2000 (766 MHz, Arrandale) - 38 GFLOPS
HD Graphics 3000 (1300 MHz) - 130 GFLOPS
HD Graphics 4000 (1300 MHz) - 330 GFLOPS
HD Graphics 4600 (1300 MHz) - 416 GFLOPS
HD Graphics 5200 (1300 MHz) - 832 GFLOPS

AMD

A Trinity A10 at 800 MHz is around 600 GFLOPS.
Kaveri (512 GCN SPs at 800 MHz) is around 820 GFLOPS.

An EX APU might be pretty close to 1.2 TFLOPS.
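If you want to sanity-check the theoretical numbers above, the usual convention for AMD parts is shaders × 2 FLOPs per clock (multiply-add) × clock. A quick sketch (shader counts taken from public specs; treat as approximate):

```python
# Theoretical single-precision throughput, AMD convention:
# GFLOPS = shader count x 2 FLOPs/clock (multiply-add) x clock in GHz.
def gflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz

print(gflops(384, 0.8))    # Trinity A10 (384 VLIW4 SPs)  -> ~614, "around 600"
print(gflops(512, 0.8))    # Kaveri (512 GCN SPs)         -> ~819, "around 820"
print(gflops(1152, 0.8))   # PS4 (1152 GCN SPs @ 800 MHz) -> ~1843, i.e. 1.84 TFLOPS
```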

IF this rate continues, I expect Intel to catch up with the Xbox One's theoretical GFLOPS (around 1.2 TFLOPS) in a couple of years for its highest-end IGP (literally, at 50% more theoretical power, a really high-end Intel IGP could match the Xbox One with Broadwell; not that far out there considering Broadwell is supposed to bring a revamped IGP). That of course assumes Intel can keep delivering these kinds of gains. For standard IGP models I'd say about 5 years at current rates. However, again you have to look at the efficiency of those GFLOPS: the highest-end Intel HD 4000 cannot match a 630M with a similar number of theoretical GFLOPS.

And Intel already has things similar to HSA and hUMA with Haswell.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Not to mention the topic was the anemic tablet CPU: 2x slow is still slow x 2.

I've never seen a single dev say anything about direct GPU overhead.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
There is an 8-core phone coming out this year!
Zo my god! It will have #X the power of the avg PC when... xxx... xxx... xxx

The PS4 will not be had by most who want to buy one in 2013. I wonder if the White Knight will get one; I bet not. We should take a poll. Not a PC gamer, and won't own the console he is whoring/defending to the death, LOL :)
 

Schmeh39

Junior Member
Aug 28, 2012
17
0
61
LOL. The whole web reproduced the nonsensical quotes from the EA representative and you 'cannot' find the "five years ahead"...

http://www.destructoid.com/ea-ps4-xbox-one-a-generation-ahead-of-gaming-pcs-254360.phtml

Did you even bother to read the article that Rajat Taneja wrote? Or did you just find the article where the author put it in the best possible light? Rajat never wrote anything about "five years." Those were the words of the author of the summary piece you quoted.

2x is the minimum for the same hardware. The PS4 has modified hardware (the so-called supercharged architecture) that you cannot find in any PC.

Evidently, the supercharged architecture has limits, and a PC with three Titans will be faster than the PS4.

Your claim has been corrected before, but don't forget to ignore this the next time you repeat the same flawed "tri-SLI Titans" claim.

Even 2 Titans is ridiculous. Please explain to me, other than the GDDR5, what is so supercharged about the PS4. And don't say HSA (just another marketing term from AMD) or hUMA; you do realize they will be releasing APUs for the PC market that have this too. As has been pointed out to you many, many times before, the CPU is targeted at tablets and other small-form-factor devices. If it were anywhere near as powerful as an FX or an i5 or i7, they would have already released it for high-end systems, where there is a lot more margin than in consoles.

As said before, let us know how your noisy, hot, power-wasting rig runs circles around it while remaining in the same place when it comes to first-party titles.

What do first-party titles have to do with how powerful or efficient a machine is? If that logic held, I could argue that since neither the PS4 nor the Xbox One has Mario, the Wii U must be more powerful than both. In fact, since you can't play Mario on the Department of Energy's Titan supercomputer, the Wii U must be more powerful than the most powerful supercomputer in the world.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Call it what you want; the 2-3x overhead is a well-known fact.

PCs today? Are you aware that current PCs can be 10x or even 16x more powerful than the PS3 and Xbox 360?

As said before, let us know how your noisy, hot, power-wasting rig runs circles around it while remaining in the same place when it comes to first-party titles.



More FUD.

Sorry for you, but when the PS3 launched, I had... get ready for it:
a P4 2.4GHz with RDRAM... and an AGP Gainward BLISS 7800GS+

As you can see here:
http://www.theinquirer.net/inquirer/news/1028220/gainward-agp-wolf-sheep-clothing


Not only did it perform better than the PS3... it also had better IQ.


I wasn't limited to upscaled 720p with no AA, no AF, subpar shadow maps, and the rest of the crappy "features" of consoles.

And I am sorry to disappoint you... but I actually think my rig consumes less power today... than when I first built it.

It was an i7 920 (45nm) + GTX 285 (55nm) rig with a slight OC.
It's now an i7 990X (32nm) + Titan (28nm) with no OC.

I bet you that I use less power...but have higher performance.
And thus you fail once again.

What are your PC specs again?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Did you even bother to read the article that Rajat Taneja wrote? Or did you just find the article where the author put it in the best possible light? Rajat never wrote anything about "five years." Those were the words of the author of the summary piece you quoted.



Even 2 Titans is ridiculous. Please explain to me, other than the GDDR5, what is so supercharged about the PS4. And don't say HSA (just another marketing term from AMD) or hUMA; you do realize they will be releasing APUs for the PC market that have this too. As has been pointed out to you many, many times before, the CPU is targeted at tablets and other small-form-factor devices. If it were anywhere near as powerful as an FX or an i5 or i7, they would have already released it for high-end systems, where there is a lot more margin than in consoles.



What do first-party titles have to do with how powerful or efficient a machine is? If that logic held, I could argue that since neither the PS4 nor the Xbox One has Mario, the Wii U must be more powerful than both. In fact, since you can't play Mario on the Department of Energy's Titan supercomputer, the Wii U must be more powerful than the most powerful supercomputer in the world.

Great first post! Welcome to the forums!
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Even 2 Titans is ridiculous. Please explain to me, other than the GDDR5, what is so supercharged about the PS4. And don't say HSA (just another marketing term from AMD) or hUMA; you do realize they will be releasing APUs for the PC market that have this too. As has been pointed out to you many, many times before, the CPU is targeted at tablets and other small-form-factor devices. If it were anywhere near as powerful as an FX or an i5 or i7, they would have already released it for high-end systems, where there is a lot more margin than in consoles.
There are desktop APUs, but...
7750 GDDR3 vs 7750 GDDR5
[benchmark chart: 7750 GDDR3 vs 7750 GDDR5]

If that doesn't help you get the idea why desktop APUs are crippled, nothing will.

What do first-party titles have to do with how powerful or efficient a machine is? If that logic held, I could argue that since neither the PS4 nor the Xbox One has Mario, the Wii U must be more powerful than both. In fact, since you can't play Mario on the Department of Energy's Titan supercomputer, the Wii U must be more powerful than the most powerful supercomputer in the world.
Ohhh... now I get it. Nothing to lose, eh? I always said the Internet in caves was a bad idea.
 

showb1z

Senior member
Dec 30, 2010
462
53
91
LOL. The whole web reproduced the nonsensical quotes from the EA representative and you 'cannot' find the "five years ahead"...

Do you still insist on the 30 fps? Do you know how many fps a GTX 680 gets in the same demos? 30 fps or less.

Look buddy, I linked the original post by the EA guy on LinkedIn. Where does it say five years? Nowhere.
You're not only twisting and turning things out of context now; you're plain making things up and lying to further your delusions. Just because you put something between quote tags doesn't make it a reality. And just because the truth doesn't fit your agenda doesn't make it any less so.

What does it matter what a GTX 680 got in some demo? Get another one, or wait till 20nm next year. Ta-dah, your performance just doubled. Because you know, that's how it goes in PC tech.

Well, have fun with your PS4 that you probably won't even buy.
 

Schmeh39

Junior Member
Aug 28, 2012
17
0
61
There are desktop APUs, but...
7750 GDDR3 vs 7750 GDDR5
[benchmark chart: 7750 GDDR3 vs 7750 GDDR5]

If that doesn't help you get the idea why desktop APUs are crippled, nothing will.

I'm not arguing that GDDR5 is not a good thing. APUs are absolutely bandwidth-limited, but what I'm saying is that GDDR5 alone, which is really the only part of the PS4 that can't be duplicated in PCs, is not enough to make the PS4 some "supercharged architecture" that destroys all other systems. If GDDR5 really were going to magically make the PS4 APU that powerful, then the PS4 would completely wipe the floor with the Xbox One, since not only does it have more compute units, but the Xbox One is running DDR3.
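For what it's worth, the peak-bandwidth arithmetic behind that comparison is simple (figures are the publicly quoted specs; treat them as approximate):

```python
# Peak memory bandwidth = bus width in bytes x data rate in GT/s.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

ps4 = bandwidth_gbs(256, 5.5)     # GDDR5 @ 5.5 GT/s -> 176 GB/s
xb1 = bandwidth_gbs(256, 2.133)   # DDR3-2133        -> ~68 GB/s
print(f"PS4: {ps4:.0f} GB/s, Xbox One: {xb1:.0f} GB/s")
# Note: the Xbox One also has 32 MB of fast eSRAM on-die, which is
# precisely why raw bandwidth alone doesn't settle the argument.
```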

Ohhh... now I get it. Nothing to lose, eh? I always said the Internet in caves was a bad idea.

Not sure what you are talking about, but okay.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm not arguing that GDDR5 is not a good thing. APUs are absolutely bandwidth-limited, but what I'm saying is that GDDR5 alone, which is really the only part of the PS4 that can't be duplicated in PCs, is not enough to make the PS4 some "supercharged architecture" that destroys all other systems. If GDDR5 really were going to magically make the PS4 APU that powerful, then the PS4 would completely wipe the floor with the Xbox One, since not only does it have more compute units, but the Xbox One is running DDR3.

From the specs we've seen, the PS4 will completely wipe the floor with the Xbox One.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If the PS4 is anything like the XBO, it will be a DRM-infested thing where prices for used games go through the roof, and more and more titles (maybe eventually ALL titles) will be unplayable offline.

http://www.dailytech.com/Microsoft+...One+Games+Unplayable+Offline/article31630.htm

http://www.forbes.com/sites/insertcoin/2013/05/26/if-used-games-die-will-gamestop-follow/

http://techcrunch.com/2013/05/12/what-games-are-there-is-no-iron-throne-of-games-any-more/

You wanted cheap and easy? You get what you pay for, console lovers. Oh no, wait, you don't even get that much! Console games already cost more than PC games at launch (and even more after sales), and these new XBO restrictions will make consoles the old beater trucks of the gaming world: cheap to buy, expensive to operate. PCs won't be that much more expensive to buy (remember, if you want to compare costs on an even playing field, assume you can hook the PC up to a TV; then you only need a video card that runs games on High at 720p, which is less than half the pixels of 1080p, and most people can't even tell the difference between 720p and 1080p on their HDTV screens anyway: http://www.engadget.com/2006/12/09/1080p-charted-viewing-distance-to-screen-size/), and PCs are cheaper to operate given all the Steam/GoG/etc. sales.
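The pixel arithmetic behind that 720p claim, for the record:

```python
# 720p vs 1080p pixel counts.
p720 = 1280 * 720      # 921,600 pixels
p1080 = 1920 * 1080    # 2,073,600 pixels
print(p720 / p1080)    # ~0.44 -> 720p is less than half the pixels of 1080p
```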

Oh, but the XBO and PS4 can double as HTPCs, you say? So what. PCs already did that, and they did it better, with more bays to expand into with cheap, non-proprietary hard drives, Blu-ray drives, and USB remotes if you so desired. Furthermore, you can hammer out spreadsheets and word documents with a keyboard and mouse, edit photos more deeply than superficial app programs can, play RTS games, and more. Consoles cling to their exclusives and to multiplayer games that haven't been ported to PC (yet?) like Mario Kart, but PCs have exclusives too (e.g., most RTS games like SC2/Total War/CoH2/etc., Diablo 3, Dota 2, most serious FPSes (mouse >>> joystick, and no, this is not up for debate; read this: http://www.eurogamer.net/articles/ms-killed-pc-xbox-cross-platform-play ), most serious RPGs (WoW, LOTRO, Elder Scrolls Online, etc.), and many indie games), and you can make same-screen multiplayer games on PC as well. Some people already have, though admittedly there haven't been many breakout hits yet.

And as someone who has triple-wide monitors, I can only laugh at console gamers' forced tunnel vision. Even if multi-monitor is expensive and rare even in the PC world, it's POSSIBLE. Good luck ever finding console games that allow you to use three TV screens.

P.S. Yes, sometimes you have driver issues or whatever, but it's not as if console games are bug-free either, and in any case I welcome the additional hurdles of PC gaming, since they keep idiot children with IQs matching their (barely) two-digit ages less prevalent in multiplayer games.

[image: "PC gaming master race" meme]
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
That's all well and good blastingcap, but I can't afford six Titans, a SR3, and two E5-4650's just to get beat by these new magic consoles! They're to the metal bro, don't you understand HSA, libGCM, HUMA, APU, DPA, ZYX, ROGB, RTRQ, WTFBBQ? It's the return of FX bro, didn't you hear?
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
That's all well and good blastingcap, but I can't afford six Titans, a SR3, and two E5-4650's just to get beat by these new magic consoles! They're to the metal bro, don't you understand HSA, libGCM, HUMA, APU, DPA, ZYX, ROGB, RTRQ, WTFBBQ? It's the return of FX bro, didn't you hear?

As funny as I find this, I can't help but worry that someone, somewhere is taking this seriously.
 