Up-front costs for PCs are exaggerated. A midrange PC is more than capable of hooking up to a TV and outputting at 720p, same as consoles. The main reason consoles are perceived to be cheaper than PC gaming is the up-front cost, but that's apples and oranges, because you're comparing a TV experience to a monitor experience. Compare TV vs. TV at 720p vs. 720p and the truth emerges. Yes, you'll need a table or something to use a mouse in a living room, but I think that's a fairer comparison than comparing TV 720p to monitor 1080p+. A console outputting lower-res textures at 720p on a TV can't fairly be compared to high-res 1080p on a desktop monitor, to say nothing of PC graphics mods that add high-res textures or extra effects like TressFX.
And yes, believe it or not, 5770s are good overclockers, and you can get ~6850 performance out of them if you push the clocks high enough, which is enough for today's games at 720p. Maybe not tomorrow's, though, which is why I was using a 3470 + 7850 as my hypothetical gaming rig to compare to the PS4. The 7850 is a fantastic overclocker (typically a 25% OC even on stock voltage, so it wouldn't add that much load). (Btw, I would use Haswell + whatever comparable GPU is out at the time of the PS4's launch, but I don't know exactly when that will be, so I'm playing it safe and comparing against current PC hardware rather than future hardware.)
Note that VGA cards will keep declining in price, so you can't compare today's VGA card prices to a future PS4's price. If you don't agree with that, fine: then I get to compare the cost of a 7950 (with four free games!) ten years from now to the cost of a PS4 next year. Do you think that's fair? Of course not. Keep it apples-to-apples.
The bottom line is that up front costs for PCs aren't necessarily that much higher than for consoles, especially if you already have a Windows license from an old build or something.
Then you get into operating costs.
As you implicitly admitted, console games cost more even when used, and you say that's because of the used-game market, and that maybe prices will come down if used games become harder to sell or more expensive. That may be true, but there are limits to that argument:
MSFT and Sony don't make much money on the hardware itself; they make it on the games. Publishers have to pay a licensing fee to MSFT/Sony when they publish console games, and there is no analogous fee for PC games, so PC games have an inherent cost advantage. It's like buying a smartphone on contract vs. buying a phone outright and then getting a cheaper monthly rate from T-Mobile or another carrier that discounts plans when you bring your own phone. If T-Mobile offers you a "free" phone but charges you $20 per month more than the identical plan would cost if you brought your own, identical phone, was that "free" phone truly free? Of course not.
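To make the analogy concrete, here's the math with a hypothetical contract term (the 24-month figure is my assumption; the $20/month premium is from the example above):

```python
# Hidden cost of the "free" phone in the contract analogy.
# Assumptions: $20/month plan premium (from the example),
# 24-month contract term (assumed typical, not stated above).
premium_per_month = 20   # dollars
contract_months = 24     # assumed contract length

hidden_cost = premium_per_month * contract_months
print(hidden_cost)  # 480
```

So that "free" phone quietly costs $480 over the contract, the same way console licensing fees are baked into every game you buy.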
I am not "compaining" about this, I am stating a fact. Why should I "compain" anyway, given that I don't own consoles?
Plus, as I already mentioned, launch prices are lower for PC than consoles for the same games on average. Many times you will see the same game launch for $60 on console but only $50 on PC.
XBL and PSN offer services, but so does Steam, and Steam doesn't charge a monthly fee. I haven't heard one way or the other whether Sony will start charging for PSN, but if they do, you should factor that cost in as well.
Electricity costs are difficult to compare without hard numbers from Sony, but what we know so far: the PS3 Slim draws about 100W under load, and a midrange PC is in the 200-300W range, say 250W for a modern Ivy Bridge + 7850 system. I already showed you my numbers for what a 150W difference costs per year with a fairly heavy gaming load: barely more than $10. And that's just the first year. PCs keep shrinking parts and improving performance per watt, so the electricity gap will keep narrowing; comparing a 2014 PC to the PS4 is probably more apt.
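For anyone who wants to check the "barely more than $10" figure, here's the arithmetic. The 150W difference is from above; the 2 hours/day of gaming and $0.10/kWh rate are my assumptions for a "fairly heavy gaming load":

```python
# Yearly cost of a 150 W power-draw difference.
# Assumptions (mine): 2 hours of gaming per day, $0.10 per kWh.
watts_diff = 150
hours_per_day = 2
rate_per_kwh = 0.10

kwh_per_year = watts_diff / 1000 * hours_per_day * 365  # ~109.5 kWh
cost_per_year = kwh_per_year * rate_per_kwh
print(f"~${cost_per_year:.2f} per year")  # ~$10.95 per year
```

Double the playtime or the electricity rate and it's still only ~$20-40 a year, hardly a deal-breaker either way.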
You called me out on estimating 35W idle for a modern midrange gaming rig that I defined as something like a 3470 + 7850 with the usual mobo/RAM/fans/SSD/optical drive, claiming "35W my ass." Yes, you are an ass for spreading FUD and using your ancient GPU as an example. I've done extensive testing with an 80+ Platinum PSU, and 35W is not unreasonable. An 80+ Gold PSU, an Ivy Bridge 3470, a 7850, and Samsung DDR3-1600 at 1.35V should get you there, and even with higher-voltage RAM, an HDD instead of an SSD, and an 80+ Bronze PSU, you're still in the 40-45W range at worst. Take a look at this thread, for instance:
http://www.silentpcreview.com/forums/viewtopic.php?f=28&t=64394 That's not a very efficient PSU, and the RAM was 1.65V, yet it still had amazingly low power draw. Add back the idle power of a 7850 (10W) and it's still in the 35W ballpark:
http://www.techpowerup.com/reviews/AMD/HD_7850_HD_7870/24.html If you undervolt the CPU, GPU, or RAM, wattage can potentially go even lower.
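Here's the back-of-envelope tally behind that 35W estimate. The per-component figures are rough estimates consistent with the linked results, not measured values, and PSU conversion losses would add a few watts at the wall:

```python
# Rough idle-power budget for the 3470 + 7850 rig described above.
# Component figures are my estimates, not measurements.
idle_watts = {
    "i5-3470 + motherboard + RAM": 20,  # platform idle, speed-stepped
    "HD 7850 (idle)": 10,               # per the TechPowerUp link
    "SSD + case fans": 3,
}

total = sum(idle_watts.values())
print(total)  # 33 -> the ~35W ballpark, before PSU losses
```

Swap the SSD for an HDD and use a cheaper PSU and you land in the 40-45W worst case I mentioned, still nowhere near 100-150W.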
And that's before the versatility factor: PCs can do more than gaming consoles and are more flexible to upgrade (more bays, more sockets, and more display options if you want to, say, go triple-screen 3D), with potentially cheaper parts if consoles force you into proprietary hard drives or the like. Good joke, by the way, about learning assembly language and programming a console to do things like Photoshop; good luck with that, and with the lack of a mouse as well.
And quit it with the Steam surveys; they don't prove anything. By "office" machine I meant non-gaming machines, including, yes, small home-office machines and outdated machines. I already explained this, just as I pointed out that it's meaningless to compare the PS4 to the outdated PCs out there. If you don't understand why, how about we compare the Steam Hardware Survey "average PC" to the "average" console in operation since 2004? Do you see now why it's meaningless to talk about outdated hardware? Let's compare apples to apples: modern PCs to a modern console like the PS4, OK?
The point is that consoles aren't a "tiny fraction" of the cost of PCs, as Galego claimed, especially if you look at overall lifetime costs.
As for the rest of console gamers being "adult peasants": you said it, not me. I never said such a thing, but I'm not going to argue with you about that statement either.
Now PS4 = 5770. Fanboy much? And you said it yourself: 720p on console =/= 720p on PC.
Top-end APU? Yeah... the one from the PS4?
Truth is, used games are the reason console games are more expensive and don't go on sale as fast as PC games. That is what M$ wants to change and what you are bashing. Yet you also compain about console game prices. Pick a side!
35W my ass... And you have no clue what you're talking about. To compare against that "idle" 100W of the PS3 Slim you need to check your power consumption in the BIOS. How much is that? 150W? And why even compare stone-age tech to current gen? You know that when the PS3 was released, power-saving features were not a priority? You know that my 8800GTS, which is younger than the PS3, draws 60W at idle?
Ahahaha... 75..what?
I want to see how one Jaguar core and a zero-core GCN GPU draw 75 watts...
A 150W difference isn't much? Yet in the GTX 700 thread, 20 watts lower power consumption is HUUUGE, to the point where the 7970 GHz doesn't justify it with the performance it offers.
Yeah... I forgot you get your PC hardware upgrades for free... Who do you work for?! AMD? Nvidia? Intel?
Used-games gain? XBOX Live and PSN give you something for your money.
With some "tweaking" you can do everything on a console. It's just a barrier for brainless kids not knowing assembler.
Yeah... maybe you should add virtual machines to that? Srsly? Who da faq has Steam on their office PC?
The rest, who are not kid peasants, are adult peasants... kk