How the PlayStation 4 is better than a PC

Status
Not open for further replies.

lilrayray69

Senior member
Apr 4, 2013
501
1
76
I think the PS4 has a 7850, which is mid-range today.

I don't see what the big argument is really...

Do any of you own a Rift? Are those even for sale? I imagine they'd be very, very expensive. I've never played a 3D game on either PC or console, nor do I care to unless it becomes pretty standard and not expensive.

The console and PC offer similar experiences in different ways, with key differences. A console is for anyone who wants to buy something they don't need to put together or configure: just plug it in and pretty much go. There's a universal friends list for all your games, which makes it easy. Everyone playing runs on the same specs, so you don't have to worry about someone else having an advantage because they use a $2,000 PC and a $200 gaming mouse/keyboard with a 50mbps internet connection.
Also, consoles are starting to offer more via their arcade/online systems for selling indie games and whatnot. And you can of course stream video/music, run TV apps, or whatever on them as well. They make for a pretty good, simple-to-use entertainment system.

PCs can do a lot of that, and can do a whole lot of other things consoles can't. But they may cost more and may require more time/input/effort on the part of a person to really utilize. And yeah, you can build a PC for around the same price that will perform just as well as, if not better than, a new-gen console, but again that requires selecting the parts, putting them together, dealing with drivers and other software, and other things that a lot of people just don't want to do. They'd rather buy it all wrapped up in a console and be done with it. And that's fine for some people.

They're just different, and I don't think they need to be compared, nor does one need to be "superior". The only thing I will say in that avenue is that I have never had as much fun playing a game as I've had playing on a console with a friend right next to me. I've come close with a few online games, and occasionally a PC LAN type thing, but playing games like Ghost Recon or Rainbow Six split-screen with a friend is one of the most fun things I can remember.
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
^Wrong.

Oculus Rift uses separate images per eye. Your eyes are each looking at a completely separate portion of a single screen, and that screen is split down the middle, each side a clone of the other, albeit one side angled a bit differently. It provides realistic depth and natural convergence, much like how you see the world right now. It's why Oculus feels as real as it does... it's tricking the brain into the convergence of what it then perceives as the real world. Oculus is a true 3D environment, unlike the stereoscopic 3D of today's passive and active TVs and monitors, which provide a quasi-3D (and very "bleh") effect based off two images on a single screen mashed together and slightly offset. It works nothing like 3D Vision, which was all active stereoscopic based. To even compare them proves your lack of education in the 3D or VR field.

Uh...no.

Both "traditional" shutter glasses and VR provide the same stereoscopic effect: each provides a unique image to each eye; the method of delivery is just different. The only physical differences between the two technologies, to my knowledge, are these:

1) In VR you use a single panel split in half, or two synced panels, one per eye, which means each eye sees the refresh at the same time, whereas with shutter glasses each eye sees alternate frames, generally requiring twice the frame rate to get similarly smooth motion.
2) In VR your entire FOV is filled by the screen, compared to shutter glasses, where only the image inside the border of the 3D display appears 3D.
3) In VR you tend to get head tracking, which provides a sense of feedback and greater immersion.

There are no other significant technical differences in the way stereoscopic 3D is implemented; the term "quasi-3D" doesn't mean anything. VR might look and feel better than monitor-based stereoscopic 3D for the reasons above, but both provide full stereoscopic 3D, just delivered via different methods.
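Point 1 can be made concrete by modelling the per-eye update rate for the two delivery methods (a minimal sketch; the refresh rates below are illustrative, not measurements of any particular display):

```python
def per_eye_hz(display_hz: float, method: str) -> float:
    """Return how many new frames each eye sees per second."""
    if method == "shutter":
        # Active shutter glasses alternate eyes on successive refreshes,
        # so each eye only sees every other frame.
        return display_hz / 2
    elif method == "split_panel":
        # A VR headset splits one panel (or uses two synced panels),
        # so both eyes update simultaneously on every refresh.
        return display_hz
    raise ValueError(f"unknown method: {method}")

print(per_eye_hz(120, "shutter"))      # 120 Hz monitor + shutter glasses -> 60.0 per eye
print(per_eye_hz(60, "split_panel"))   # 60 Hz headset panel -> 60.0 per eye
```

Which is exactly why shutter-glasses setups generally want double the display refresh rate for comparably smooth motion.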

That's BS like a million times over. There is nothing you can do with any amount of money or expertise that will make PC gaming as good as console gaming for local multiplayer on a single box on a couch in front of a tv.

Sorry, but this is flat-out wrong; my setup is cheap and works fine.

My living room has a 125" 1080p projector screen and a 5.1 audio receiver, and both the PS3 and the PC output to the screen and the audio. I have several of my PS3 controllers paired with my PC (rather than the PS3) via Bluetooth and configured to mimic the XInput of Xbox controllers, which works with basically every modern game.

So let's compare.

PC screen res = True 1080p for everything
PS3 screen res = 720p or less for most games

PC frame rate = 60fps or above always
PS3 frame rate = 30fps average if you're lucky with dips

PC Sound = Full 5.1 Dolby surround
PS3 Sound = Full 5.1 Dolby surround

PC Controllers = Full PS3 wireless controller support or any other USB controller you prefer
PS3 Controllers = Limited range of compatible controllers

I regularly play local co-op on both the PS3 and the PC with friends, and the PC is without a doubt hands-down better. At 125" from about 10ft away, 1080p looks divine; anything less, like 720p, looks absolutely diabolical. The frame rate drops on the PS3 make a lot of titles almost unplayable, especially in split-screen or co-op modes.

It cost about £15 for an HDMI cable to the projector, £10 for a coax run to my receiver, and about £10 for a handful of USB Bluetooth dongles to pair with the PS3 controllers; the driver software is free.

Both have their pros and cons.

I do miss the sense of community that going to LAN parties had.

I host about 4 LANs a year for close friends, and we still have a riot. It's all about the community of being together: getting absolutely shit-faced, playing drinking LAN games, eating snacks and takeaway together, putting porn on in the background when drunk, playing card games and board games. Sometimes we play console games on the big screen too; 8-player Bomberman on the PS3 was hilarious when drunk :)
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
@Beavermatic:

So I decided to do a little test this morning, and now I'm a bit worried about the Oculus Rift. As you said, the Oculus Rift feels more like you are there. I can only assume it is because the screens are closer to your eyes, so you see very little of anything else; it covers your peripheral vision. Well, I have a 27" 3D Vision monitor, and I decided to bring it right up to my face to experience the same effect (with no head tracking). Now almost all my vision is completely covered by the screen.

I then loaded up Farcry 3 with the fix here http://helixmod.blogspot.com/2012/12/farcry3-dx9-only-3d-vision-fix.html

Now I started a new game and enjoyed. Here are my observations:
1) It definitely is more immersive when your vision is completely covered in the 3D image. It was very impressive in that regard, and I may play with the monitor closer to me in the future as a result. It may not be as close, but the image is many times larger, so it should be similar to what the Oculus Rift brings.
2) This may have been a Farcry 3 thing, as it does load textures slowly in real time, but textures looked pretty poor at times, even at 1080p, though at other times they looked good. The Rift's much lower resolution concerns me, but I assume the prototype is just a starting point.
3) The big downside: I was getting sick quickly. It felt like motion sickness. Perhaps it was because everything was so much more real-looking that the bobbing of the view made me motion sick, or maybe something else was at play, but whatever it was, I had to stop after 5-10 mins.

I may play around with the distance more. Right in my face, just a couple of inches from my nose, was too much, but it did feel more like I was there.
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
One thing I like about PC over console...

Xbox game
Xbox 360 game
Xbox 720 game

Vs.

PS game
PS2 game
PS3 game
PS4 game

Vs.

PC game

One ever-growing library.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
That is absolutely true. You can play PS2 games on the PS4 too, though you have to pay for them again and stream them from Gaikai, since Sony didn't want the hassle of emulating/porting all those games from the previous architecture to x86. But since it's x86 now, it might be much easier for the PS5 to play PS4 games.
 

lagokc

Senior member
Mar 27, 2013
808
1
41
That is absolutely true. You can play PS2 games on the PS4 too, though you have to pay for them again and stream them from Gaikai, since Sony didn't want the hassle of emulating/porting all those games from the previous architecture to x86. But since it's x86 now, it might be much easier for the PS5 to play PS4 games.

...assuming there will be a PS5 and when it comes Sony won't choose to save money by not including a blu-ray drive.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
...assuming there will be a PS5 and when it comes Sony won't choose to save money by not including a blu-ray drive.

If the state of broadband penetration wasn't so awful worldwide (US especially) I would think they would have ditched the drive already.

I don't think I have touched my disc drive in my PC in a few years. Since I installed windows I think?

Actually it would be awesome if they would offer a version for less money that didn't include the drive, and you downloaded everything online.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
@Beavermatic:

So I decided to do a little test this morning, and now I'm a bit worried about the Oculus Rift. As you said, the Oculus Rift feels more like you are there. I can only assume it is because the screens are closer to your eyes, so you see very little of anything else; it covers your peripheral vision. Well, I have a 27" 3D Vision monitor, and I decided to bring it right up to my face to experience the same effect (with no head tracking). Now almost all my vision is completely covered by the screen.

Interesting. I might try this with my 24" 3D monitor; in fact I might try it with my 30" 2560x1600, as that thing comes close to filling your vision without needing to be too close, plus it has a high PPI. Sadly it's not 3D, though. I bet if the frame of the glasses ends at the frame of the monitor, that'll feel a lot more like the Oculus, just without head tracking.

Where's that picture of some dude with his jumper pulled over his head and wrapped around his monitor (creating tunnel vision) when you need it?
 
Last edited:

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I think people are just upset (whether they realize it or not) that companies are finding out graphics are NOT where the money is. We've come to a point where the exponential increase in time and cost of graphics fidelity combined with the ever shrinking bleeding edge sphere is making companies reconsider their approach to development. I don't have much faith in Sony as a company, but maybe someone in the Playstation department has enough sense to read the writing on the wall and not repeat previous mistakes.

Plus they learned not to waste a billion dollars developing their own proprietary hardware. It's much cheaper to use modified AMD chips. And they obviously have corporate espionage. Instead of making a giant, watt-consuming, heat-producing behemoth like the original 360 and PS3, they went with a 1152-shader Radeon. Why? Because the Xbox 720 went with a 768-shader Radeon. If this were the fierce 2005-2006 graphics battle, they would have gone with a 1280-1536+ shader GPU and the box would be loud and hot. But both sides are making sure not to repeat the mistake, although the Xbox will be disappointingly slow (I think the PS4 is taking a good middle ground: avoiding trying to cram a 7950 in there, but not going with a slow 7790-class GPU like Microsoft is).
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Interesting. I might try this with my 24" 3D monitor; in fact I might try it with my 30" 2560x1600, as that thing comes close to filling your vision without needing to be too close, plus it has a high PPI. Sadly it's not 3D, though. I bet if the frame of the glasses ends at the frame of the monitor, that'll feel a lot more like the Oculus, just without head tracking.

Where's that picture of some dude with his jumper pulled over his head and wrapped around his monitor (creating tunnel vision) when you need it?

I'm not sure that would be all that comfortable on a 2D screen. Part of the reason it works in 3D is that objects appear to converge at a distance into the screen, so you aren't cross-eyed when looking at things.
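That convergence-behind-the-screen effect falls out of basic stereo geometry: the on-screen separation between the left- and right-eye images determines how far behind (or in front of) the screen plane an object appears. A minimal sketch, where the eye separation and viewing distances are illustrative assumptions:

```python
def parallax_mm(eye_sep_mm: float, screen_m: float, object_m: float) -> float:
    """On-screen parallax needed to place an object at apparent distance object_m,
    for a viewer with the given eye separation sitting screen_m from the display.
    Positive parallax pushes the object behind the screen plane."""
    return eye_sep_mm * (object_m - screen_m) / object_m

# ~65 mm is a typical interocular distance; distances are illustrative.
print(parallax_mm(65, 0.7, 0.7))   # object on the screen plane -> 0.0 mm (eyes converge on the screen)
print(parallax_mm(65, 0.7, 7.0))   # object 10x further away -> ~58.5 mm separation
```

Zero parallax means both eyes converge on the screen surface itself, which is exactly the cross-eyed close-viewing situation a flat 2D image forces on you.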
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Plus they learned not to waste a billion dollars developing their own proprietary hardware. It's much cheaper to use modified AMD chips. And they obviously have corporate espionage. Instead of making a giant, watt-consuming, heat-producing behemoth like the original 360 and PS3, they went with a 1152-shader Radeon. Why? Because the Xbox 720 went with a 768-shader Radeon. If this were the fierce 2005-2006 graphics battle, they would have gone with a 1280-1536+ shader GPU and the box would be loud and hot. But both sides are making sure not to repeat the mistake, although the Xbox will be disappointingly slow (I think the PS4 is taking a good middle ground: avoiding trying to cram a 7950 in there, but not going with a slow 7790-class GPU like Microsoft is).

Where did you read the MS specs, since nothing has actually been announced? Not saying you are wrong; I have just not seen this info posted anywhere.

And personally, I am happy they did not go with a huge power-sucking machine.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
One thing I like about PC over console...

Xbox game
Xbox 360 game
Xbox 720 game

Vs.

PS game
PS2 game
PS3 game
PS4 game

Vs.

PC game

One ever-growing library.

Ohh... not really. Go play Warhammer 40k Final Liberation or Chaos Gate... or a ton of other games. Windows fails to be backwards compatible more often than you may think.
Ever heard of GOG.com?
 

thelastjuju

Senior member
Nov 6, 2011
444
2
0
I'm surprised how complicated some of you are making this.. PC will always be superior to consoles for the ONE following reason:

OPTIONS > NO OPTIONS :D:thumbsup:

Ohh... not really. Go play Warhammer 40k Final Liberation or Chaos Gate... or a ton of other games. Windows fails to be backwards compatible more often than you may think.
Ever heard of GOG.com?

True, if you run Windows Vista, 7, or 8 you are bound to run into all sorts of compatibility issues.

You still have options though:

> Dual Boot with Windows XP for compatibility with old school games
> Virtualization
> Hunt around for fixes/workarounds/patches (75-80% of the older games that won't work with Win7 CAN RUN, given fixes available through a simple Google search)
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Two-year-old midrange hardware is exactly what's in the PS4, something along the lines of a GTX 570 or 580... which is two years old. I forget the exact AMD card it is equivalent to that's running inside the PS4, so I'm comparing it to Nvidia's tech atm.

All of the hardware in the PS4, no matter what extended services support or add-on processor or GPU extensions they've revised to "support", has been out for a while... that GPU is aged and that CPU is a good couple of years old. A quick look at its specs compared to AMD's CPU and GPU product lines will tell you exactly that. Now, that's not to say they didn't add some new extension-set support to the processor or modify the GPU a little bit... but whoop-dee-doo. It's equivalent to adding shoes and a new saddle to a horse that is near the end of its riding days.

I had (2) 7800 GTXs when the Xbox 360 released. A single 7800 GTX whooped its ass in GPU power multiple times over, and was capable of running games in 32-bit color, with numerous options for antialiasing, hi-res textures, and all the additional postprocessing, at beyond 1920x1080 resolutions. And what is the 360 still pumping? Color-banded, piss-poor low color depth, weak AA, embarrassing texture quality, and it can barely manage most of its games at their native resolution of 720p. You can upscale all you want on your HDTVs... but your little Xenos was barely pumping at 720p.

I know, it hurts coming to the realization that what's inside your new $500 toy is recycled PC tech of yore, but that's exactly what it is. They can sugarcoat it all they want with "well, we've added THIS!" or "well, we updated it with THAT!", but that's all it is... sugarcoating. Underneath it's still just a ball of poo.
So we went from 2-3 year old mid-range to 2-year-old HIGH-END. It's not recycled PC tech of yore; it's cut-down cutting-edge PC hardware. Only the PS3's GPU was already obsolete when it launched, not Xenos. Nice comparing two GPUs that each cost as much as a whole console and calling the console crap. Try running any new games on those cards; good luck. On consoles you still can, because of optimizations and some visual trickery. It's as if I bought a second Titan and called the PS4 crap because it can't do what my PC can, notwithstanding the fact that one of my graphics cards sucks as much juice as the whole console and that the console costs 8x less than my PC.
PC screen res = True 1080p for everything
PS3 screen res = 720p or less for most games

PC frame rate = 60fps or above always
PS3 frame rate = 30fps average if you're lucky with dips

PC Sound = Full 5.1 Dolby surround
PS3 Sound = Full 5.1 Dolby surround

PC Controllers = Full PS3 wireless controller support or any other USB controller you prefer
PS3 Controllers = Limited range of compatible controllers

Yeah, right. I'm pretty sure I would get dips below 60fps even on a Titan at 1080p in some very demanding games like Crysis 3. The card alone costs $1,000.
 
Last edited:

Beavermatic

Senior member
Oct 24, 2006
374
8
81
So we went from 2-3 year old mid-range to 2-year-old HIGH-END. It's not recycled PC tech of yore; it's cut-down cutting-edge PC hardware. Only the PS3's GPU was already obsolete when it launched, not Xenos. Nice comparing two GPUs that each cost as much as a whole console and calling the console crap. Try running any new games on those cards; good luck. On consoles you still can, because of optimizations and some visual trickery. It's as if I bought a second Titan and called the PS4 crap because it can't do what my PC can, notwithstanding the fact that one of my graphics cards sucks as much juice as the whole console and that the console costs 8x less than my PC.


Yeah, right. I'm pretty sure I would get dips below 60fps even on a Titan at 1080p in some very demanding games like Crysis 3. The card alone costs $1,000.


You've got to be joking. At release, a 570 was mid-range. Now, a 580 might be considered high-end to some... but I honestly consider the 590, or SLI'd 580s, to be high-end. Of course, my expectations are high... I just upgraded from two GTX 680s to two GTX Titans. Don't ask me why, because I have no idea... I was just in the mood to do so. And single versions of either of these cards were slamming my games at 2560x1600... with everything maxed out (and using FXAA or 2x MSAA). With two cards, I am able to max everything out and do 8x+ MSAA in most games, and it runs like butter. The only exception is Crysis 3, where I've got to leave MSAA at 4x, or I might start seeing a noticeable difference in framerate.

And as far as Xenos goes, it WAS obsolete. There's no argument there. It was custom-tailored tech based on a preexisting older GPU (it was technically the bastard child of an AMD X-something card; I can't remember which card specifically) for a console, on technology that already existed. A mid-range video card at the time of the 360's release (and likely even a low-end one) on PC ran laps around it. It doesn't matter what additional extensions or extended architecture enhancements they retrofitted onto it... it was weak sauce to start with. You can put lipstick on a pig all day long, but in the end, it's still just a pig. Going back and adding a new architecture extension may have given the card a slight boost, but not nearly enough to compete with a high-end PC graphics card of the time.


I get the appeal of consoles. I do. I have a PS3 and an Xbox 360 sitting right next to my TV. I also have an $8,000 gaming PC I'm typing on right now that I built, and that I update about every year. I understand the appeal of consoles... simplicity, price, first-party titles, and the ability to get decent-quality gaming at a low price and play what everyone else is playing. But to sit here and argue the technical aspects of PC and console and try to justify a console even coming close to being technically superior in any way, shape, or form is just ridiculous. It's, as some might say, "going full retard". It would be like me trying to argue the computational power of my $8,000 rig against, say, a $16 million supercomputer.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
So we went from 2-3 year old mid-range to 2-year-old HIGH-END. It's not recycled PC tech of yore; it's cut-down cutting-edge PC hardware. Only the PS3's GPU was already obsolete when it launched, not Xenos. Nice comparing two GPUs that each cost as much as a whole console and calling the console crap. Try running any new games on those cards; good luck. On consoles you still can, because of optimizations and some visual trickery. It's as if I bought a second Titan and called the PS4 crap because it can't do what my PC can, notwithstanding the fact that one of my graphics cards sucks as much juice as the whole console and that the console costs 8x less than my PC.


Yeah, right. I'm pretty sure I would get dips below 60fps even on a Titan at 1080p in some very demanding games like Crysis 3. The card alone costs $1,000.

Sure, but it never touches 30. Most developers say 30fps is enough for consoles, and they have been building games below 1080p. I think this is silly, but that's what some are saying. Personally, I feel 60fps is the prime target if at all possible. Heck, when I am playing Crysis 3 at 2560x1440, I'm not seeing much below 40fps with SMAA.

LAN parties died for a reason, internet gaming is superior.

I think people started to get over the whole lugging-your-PC-to-someone's-garage thing. My PC isn't lightweight, and those big heatsinks on the CPU shouldn't be moved around much if possible anyway.

It's a whole lot more convenient to just load up a game at home and go to it over the internet. I think once DSL and Cable became so prevalent, there was no need to do the whole LAN Party thing as much either. Since almost nobody is using dialup anymore, the ping times and such are less of a factor.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I know, it hurts coming to the realization that whats inside your new $500 toy is recycled pc tech of yore, but thats exactly what it is. They can sugarcoat it all they want with "well we've added THIS!" or "well we updated it with THAT!", but thats all it is... sugarcoating. Underneath its still just a ball of poo.

Not a great way to have a reasonable discussion with someone. That's the way adolescents talk to each other.

As was stated, this is not 2-year-old tech. The APU that's used in the PS4 has yet to be released and will be state of the art when it is.
 

Beavermatic

Senior member
Oct 24, 2006
374
8
81
Not a great way to have a reasonable discussion with someone. That's the way adolescents talk to each other.

As was stated, this is not 2-year-old tech. The APU that's used in the PS4 has yet to be released and will be state of the art when it is.

I'm not trying to have a reasonable discussion with you. This thread lost any credibility for a reasonable discussion as soon as I read the title, because it's a freaking joke. And yes, talking to anyone who believes the "APU in the PS4 will be state of the art" is like talking to a little kid.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
You've got to be joking. At release, a 570 was mid-range. Now, a 580 might be considered high-end to some... but I honestly consider the 590, or SLI'd 580s, to be high-end. Of course, my expectations are high... I just upgraded from two GTX 680s to two GTX Titans. Don't ask me why, because I have no idea... I was just in the mood to do so. And single versions of either of these cards were slamming my games at 2560x1600... with everything maxed out (and using FXAA or 2x MSAA). With two cards, I am able to max everything out and do 8x+ MSAA in most games, and it runs like butter. The only exception is Crysis 3, where I've got to leave MSAA at 4x, or I might start seeing a noticeable difference in framerate.

And as far as Xenos goes, it WAS obsolete. There's no argument there. It was custom-tailored tech based on a preexisting older GPU (it was technically the bastard child of an AMD X-something card; I can't remember which card specifically) for a console, on technology that already existed. A mid-range video card at the time of the 360's release (and likely even a low-end one) on PC ran laps around it. It doesn't matter what additional extensions or extended architecture enhancements they retrofitted onto it... it was weak sauce to start with. You can put lipstick on a pig all day long, but in the end, it's still just a pig. Going back and adding a new architecture extension may have given the card a slight boost, but not nearly enough to compete with a high-end PC graphics card of the time.


I get the appeal of consoles. I do. I have a PS3 and an Xbox 360 sitting right next to my TV. I also have an $8,000 gaming PC I'm typing on right now that I built, and that I update about every year. I understand the appeal of consoles... simplicity, price, first-party titles, and the ability to get decent-quality gaming at a low price and play what everyone else is playing. But to sit here and argue the technical aspects of PC and console and try to justify a console even coming close to being technically superior in any way, shape, or form is just ridiculous. It's, as some might say, "going full retard". It would be like me trying to argue the computational power of my $8,000 rig against, say, a $16 million supercomputer.
Oh, please. The GTX 580 was a high-end card when it was released; it was the fastest and most expensive card, and the GTX 590 came MUCH later. Well, maybe not the fastest; the Radeon 5970 was very competitive. The point is it was high-end then, and it was as overpriced compared to the rest of the cards as the Titan is now.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Not a great way to have a reasonable discussion with someone. That's the way adolescents talk to each other.

As was stated, this is not 2-year-old tech. The APU that's used in the PS4 has yet to be released and will be state of the art when it is.

State of the art compared to another APU. What we have been saying all along is that people are running 6-core SB-E CPUs at 4.5GHz and SLI 680s, or even SLI Titans. That's a ton more power than any APU.
 

Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
I'm not trying to have a reasonable discussion with you. This thread lost any credibility for a reasonable discussion as soon as I read the title, because it's a freaking joke. And yes, talking to anyone who believes the "APU in the PS4 will be state of the art" is like talking to a little kid.

By all means show us a CPU that is more complex than the APU they designed specifically for PS4.
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
...assuming there will be a PS5 and when it comes Sony won't choose to save money by not including a blu-ray drive.

Why wouldn't they? They own Blu-ray. Most likely the PS5 will have dual lenses: one backwards compatible with Blu-ray, and one for whatever the new 4K disc format turns out to be. If Sony is being cheap, the format they might cut backwards compatibility for is DVD.
 

Beavermatic

Senior member
Oct 24, 2006
374
8
81
By all means show us a CPU that is more complex than the APU they designed specifically for PS4.

/facepalm. First off, APU is a blanket term for an architecture that integrates multiple processing units. You can't compare it to just a CPU, since an APU integrates a CPU and a GPU.

Second, I can easily out-power the APU of the PS4 with my machine right now, multiple times over: an OC'd Intel i7 3930K and (2) GTX Titans in SLI. Outside of the comparable "APU" components, there's also 32GB of extremely low-latency DDR3, some pretty speedy Samsung solid state drives for the OSes, RAIDed 1TB hard drives for games and apps respectively, a DVD burner, a Blu-ray burner, a mechanical Razer keyboard, a RAT mouse, a 1000W gold-rated PSU, an S-IPS HP ZR30w monitor at 2560x1600, and various other things.

In every way, shape, and form... I've trumped a PS4 or the next Xbox. And my last 3 systems also trumped the PS4. It isn't even comparable. It would be like comparing a Ferrari to a Honda Civic.
 