I hate what the CPU is becoming!

Axonn

Senior member
Oct 14, 2008
216
0
0
Hello everybody ::- ). My first post here. I'm going to try and make it a meaningful one. Please take into consideration that I am talking purely from an ENTHUSIAST's standpoint.

I hate the CPU's future. I hate the fact that pointless transistors and technology are being shoved down our throats. I never had and (hopefully) never will have integrated graphics in my computer. A discrete GPU cannot be replaced by these pathetic Sandy Bridge or Bulldozer architectures (at least not for 10 years). OK, so they cater to the mainstream. I've got no problem with that. But WHY are we, the people who buy their BEST products and keep their top product lines viable, being treated like this?

I want to upgrade my CPU next year, but I hate the choices: Sandy Bridge has a stupid integrated GPU in all versions, while AMD is going for the exact same crap. So I'm going to shove my money into something I will NEVER USE. Annoying!

If only they had thought to bring hybrid GPU switching to the desktop! But did they? Of course not. *sigh*. I would understand the usefulness of an integrated GPU if, in normal OS use (no games), the discrete GPU were COMPLETELY SHUT DOWN and the system fell back on the CPU's integrated graphics.

I agree that it's a good package... for the masses. But not for us who still buy discrete GPUs. Discrete GPUs will not be replaced by any CPU GPU mongrel anytime soon. It's impossible (for the immediate future) due to the huge advantages a dedicated card has over some nickel-sized partition on a CPU.

I WANT CLEAN CPUs!
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
Correct me if I'm wrong but aren't there basically "two" lines of CPUs going to be made? The ones with the integrated GPUs, and the "classic" CPU only?

Honestly I haven't kept up with it all.
 

Terzo

Platinum Member
Dec 13, 2005
2,589
27
91
I don't know much about CPU technology, but does having the IGP actually hurt anything? If you're not using it, I don't imagine it would draw any power. Hell, I like having mobo GPUs so that I can more easily test the computer when something goes wrong (no need to use a video card for output).

My only guess is that you could argue the R&D (and silicon, I guess) put into the IGP could instead be used to build a beefier CPU?
 

Kivada

Junior Member
Sep 10, 2010
23
0
0
The entire hardware industry is going multi-core and GPGPU, while the majority of software houses are nearly a decade behind in implementing 64-bit, multi-core and GPGPU support.

We as top-end buyers actually make up less than 1% of total sales, but we get turned into these companies' free marketing wing, making buying decisions for other people.

Eventually everything will be an SoC; they are cheaper to make. At the very least, current and future IGPs can be used as OpenCL and DirectCompute cores.
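To make that concrete (a minimal sketch, assuming an OpenCL SDK and runtime are installed; the exact header path and link flag depend on the vendor), the IGP simply shows up as one more compute device the runtime reports, right next to any discrete card:

```c
/* List every OpenCL device the drivers expose, IGP included.
   Build with something like: cc list_cl.c -lOpenCL (vendor-dependent). */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        /* CL_DEVICE_TYPE_ALL picks up the IGP as well as any discrete GPU or CPU driver */
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256];
            cl_uint compute_units = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(compute_units), &compute_units, NULL);
            printf("Device: %s (%u compute units)\n", name, compute_units);
        }
    }
    return 0;
}
```

Any kernel you can queue on the discrete card you can also queue on the IGP; it's just another device handle, only slower.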
 

Flipped Gazelle

Diamond Member
Sep 5, 2004
6,666
3
81
Hello everybody ::- ). My first post here. I'm going to try and make it a meaningful one. Please take into consideration that I am talking purely from an ENTHUSIAST's standpoint.
<snip>

I think you need to research more before ranting.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Correct me if I'm wrong but aren't there basically "two" lines of CPUs going to be made? The ones with the integrated GPUs, and the "classic" CPU only?

Honestly I haven't kept up with it all.

Yes, there will be "enthusiast" CPUs that are derivatives of their enterprise gear and that, as far as I know, do not have an IGP embedded in them.
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
FYI, Bulldozer does not have a GPU, at least for the first generation.

Now take a chill pill, man. The world is going to end in 2012; be happy with what you have now.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I hate the CPU's future. I hate the fact that pointless transistors and technology are being shoved down our throats. I never had and (hopefully) never will have integrated graphics in my computer.

I WANT CLEAN CPUs!

Don't worry. It's likely that as the two become more and more integrated they will find synergy. At the moment the integration is simple enough that the two mostly just handle separate tasks, but there will be code that crosses over.

Plus, when they get embedded DRAM or dense SRAM on-package or stacked, only the highest-end GPUs will have an advantage over the processor graphics parts on 3D code.

Ultimately the goal is to have the serial-processing cores and parallel-processing cores combined for maximum efficiency. You can't complain about that can you?
 

Axonn

Senior member
Oct 14, 2008
216
0
0
Flipped Gazelle: maybe I do. But then that is why I posted, right? So people in the know can clear some things up for me ::- P.

yasasvy: *grin*.

SparkyJJO: didn't know that Bulldozer won't have a GPU. That's good news.

Kivada: damn, I thought we were more than 1% :;- D. True, an SoC is cheap. But an SoC sucks compared to a good discrete CPU + GPU. Best performance.
 

Bman123

Diamond Member
Nov 16, 2008
3,221
1
81
Maybe you are complaining because you feel like you are gonna overpay for the CPU since it has new technology that you're not gonna need, but what do I know.
 

degibson

Golden Member
Mar 21, 2008
1,389
0
0
Kivada: damn, I thought we're more than 1% :;- D. True, SOC is cheap. But SOC sucks compared to a good discrete CPU + GPU. Best performance.

Best performance because current-generation graphics are highly optimized for computation density, not low-latency communication between CPU and GPU. It takes hundreds of nanoseconds to transfer one byte between GPU and CPU today! Software knows that, and only does embarrassingly parallel, huge working set, massively recurrent computations on the GPU.

Admittedly, first- and probably second-generation CPU+GPU offerings will probably be terrible -- especially because software will lag, badly. But give it time! Eventually, best (game) performance will come from a CPU+GPU sharing a die. Because software will evolve to understand that CPU-GPU-CPU latency will drop to a couple of nanoseconds. Suddenly things that never made sense before work faster than ever.
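To put rough numbers on that argument (purely illustrative figures of my own, not measurements): once the CPU-GPU round trip shrinks from thousands of nanoseconds to a few, the amount of work needed to make offloading worthwhile collapses. A quick back-of-the-envelope sketch in C, with every constant an assumption:

```c
/* Back-of-the-envelope: when does shipping work to the GPU pay off?
   All numbers below are illustrative assumptions, not measurements. */
#include <stdio.h>

int main(void) {
    double transfer_ns   = 10000.0; /* assumed round-trip cost over the bus, per batch */
    double cpu_ns_per_op = 1.0;     /* assumed cost per element on the CPU */
    double gpu_ns_per_op = 0.05;    /* assumed cost per element on the GPU */

    /* Offloading wins once  n*gpu + transfer < n*cpu,
       i.e.  n > transfer / (cpu - gpu). */
    printf("Discrete: offload pays off above ~%.0f elements per batch\n",
           transfer_ns / (cpu_ns_per_op - gpu_ns_per_op));

    /* Shrink the transfer cost to a few ns (on-die sharing) and the
       break-even point collapses, which is the argument above. */
    transfer_ns = 5.0;
    printf("On-die:   offload pays off above ~%.0f elements per batch\n",
           transfer_ns / (cpu_ns_per_op - gpu_ns_per_op));
    return 0;
}
```

The exact constants don't matter; the point is that the break-even batch size scales linearly with the transfer cost, so cutting latency by three orders of magnitude opens up whole classes of small, chatty workloads.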

In the meantime, you'll still have your discrete GPU parts (NVidia ain't dying any time soon) and CPUs will work just fine if their on-die GPUs are offline.
 

MrPickins

Diamond Member
May 24, 2003
9,125
792
126
Ultimately the goal is to have the serial-processing cores and parallel-processing cores combined for maximum efficiency. You can't complain about that can you?

This was my thought.

Nothing wrong with having a SIMD core in your CPU. I just hope they can come up with a standard API or set of OPs so they can be widely utilized.
 

Axonn

Senior member
Oct 14, 2008
216
0
0
I am familiar with integer/floating point, low latency, all that stuff. I'm not debating that. Of course, for >90% of people the CPU/GPU union will pay off, especially at later stages (and hell, for most people, even Sandy Bridge will rock compared to the crap Intel gave them so far!).

But... when does the CPU/GPU mongrel get 2 GB of dedicated GDDR5 RAM? ::- ) When does it get a socket capable of pouring 250 W into it? Yeah, for Solitaire and HD movies it will be a cool thing, but for tri-monitor super-quality 3D gaming? ::- D. In how many years? I can't even see a 3rd-generation Fusion product capable of doing that. Fusion is not enough. More is needed. Much more. And that simply outgrows the CPU.

Let me put it this way: I think a good fusion would be moving the GPU *NEAR* the CPU, maybe even on-die, but with a huge amount of super-fast RAM next to it. We need to get below 20 nm to be able to move huge GPUs on-die with the CPU, however. And I'm not even sure that will ever compete with discrete solutions.
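For some context on why the memory point matters (ballpark figures only, and the parts picked for comparison are my own assumption, not anything from the roadmaps): an IGP has to share the CPU's dual-channel DDR3, while even a mid-range discrete card gets an order of magnitude more bandwidth from its own GDDR5.

```c
/* Rough memory-bandwidth comparison behind the argument above.
   Figures are ballpark assumptions for 2010-era parts. */
#include <stdio.h>

int main(void) {
    /* Dual-channel DDR3-1333: 2 channels x 64 bits x 1333 MT/s, shared with the CPU */
    double ddr3_gbs  = 2 * 64.0 / 8 * 1333e6 / 1e9;   /* ~21 GB/s */
    /* 256-bit GDDR5 at 4.8 Gbit/s per pin, dedicated to the GPU */
    double gddr5_gbs = 256.0 / 8 * 4.8e9 / 1e9;       /* ~154 GB/s */

    printf("IGP sharing dual-channel DDR3:    ~%.0f GB/s\n", ddr3_gbs);
    printf("Discrete card, 256-bit GDDR5 bus: ~%.0f GB/s\n", gddr5_gbs);
    return 0;
}
```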
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Why stop at GPUs? Why not complain that the CPU is using up your precious transistors to support uOps that you personally will never run?

:D
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Let me put it this way: I think a good fusion would be moving the GPU *NEAR* the CPU, maybe even on-die, but with a huge amount of super-fast RAM next to it. We need to get below 20 nm to be able to move huge GPUs on-die with the CPU, however. And I'm not even sure that will ever compete with discrete solutions.

Processor graphics parts won't replace discrete GPUs. But they will provide value in things that a CPU or GPU alone can't provide. Sure, in absolute performance terms nothing will replace having a 500 mm² GPU and a 300 mm² CPU. But can they do that in a laptop while providing reasonable battery life, or without costing $1k each? Or will a separate system be able to communicate as fast as two parts on the same die?

Before, integration was entirely about cost. Now it's about perf/x and $/x.

Again, you complain about having "useless" die area on a CPU. With simple side-by-side integration, I agree. But that will change.
 

nenforcer

Golden Member
Aug 26, 2008
1,779
20
81
Well, I vote with my wallet and refuse to purchase either the Core i3 with integrated video or, now, Sandy Bridge.

I will also probably avoid the initial Fusion cores to see how those pan out.

Some people think nVidia's time as a stand-alone supplier is limited, but I think they will be just fine being the last independent discrete graphics supplier.

It will be really bad when non-computer-savvy people purchase these machines and then their kid wants to play a game which won't run; they will either be forced to upgrade the CPU or, hopefully, at least have the ability to add a discrete graphics card.

Swapping the CPU is also more of a pain than just taking a video card in and out.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
I would just like to chime in that I am disappointed that we are getting quad-core Sandy Bridges with graphics that I will just disable, instead of hex-core Sandys without trash graphics for the same cost.

I am waiting for Bulldozer now for this reason alone.
 

WildW

Senior member
Oct 3, 2008
984
20
81
evilpicard.com
I'm quite upbeat on the whole GPU+CPU thing. Hopefully it will bring an end to laptops and desktops with hopeless integrated graphics. In a couple of years' time, almost every PC out there should have usable mid-range graphics capability to serve as a base configuration for PC games. Then, with any luck, the PC gaming market will be revitalized as never before.
 

OBLAMA2009

Diamond Member
Apr 17, 2008
6,574
3
0
Apparently he psychotically objects to higher performance and smaller form factors. Maybe you should put a petition together to repeal Moore's Law.
 

Kivada

Junior Member
Sep 10, 2010
23
0
0
Agreeing with WildW here: it's only a good thing. Even if you don't use it as a GPU, it's still going to be useful as a GPGPU chip.

I've actually wondered why AMD hasn't revived the AiW lineup with a dedicated set-top-box computer based around the 905e and a Go Green edition of the HD5750, using the HD4295 as a dedicated GPGPU with 128-256MB of GDDR5, and soldering in 2-4GB of 1.3v 1333 DDR3.

It would make one very good HTPC if it was all built into a single slim mATX-sized box, sacrificing upgradeability and including a modified Mythbuntu or similar with Android. Toss in a pair of Bluetooth gamepads, include any and all games and game-dev tools they can, and start digging into the gaming market from the homebrew side.
 

Axonn

Senior member
Oct 14, 2008
216
0
0
OBLAMA2009: I am not "psychotically" objecting to that. Apparently you didn't understand anything of what was said so far: I was saying that the inclusion of those transistors FOR GRAPHICS is pointless when I've got a discrete GPU. That's all. As Acanthus very well put it: you could get a better CPU instead! D'uh.

I prefer no integrated GPU: less money, and more transistors for what I really care about (the CPU, because I've got a discrete GPU).
 

sandorski

No Lifer
Oct 10, 1999
70,790
6,349
126
lol. I remember similar rants when Dual Core began. It'll take a few years, but when the potential of this gets tapped you'll be all over it because it will make a big difference.
 

RavenSEAL

Diamond Member
Jan 4, 2010
8,661
3
0
I somewhat agree with the OP. CPU+GPU is:

1) overkill
2) going to increase temperatures, which leads to #3
3) going to hammer overclocking.

So unless they build a line of CPUs without an integrated GPU, I'll stick with my E8500 for a few more years, or I'll simply buy an i7 when the prices drop a bit.