Well, I called it wrong... I guess the "gimmick" is here to stay

Andvari

Senior member
Jan 22, 2003
612
0
0
Five years ago when I built my P4 system with a Radeon 9800, I thought, "Man, PCs have really come a long way." Not long afterwards, nVIDIA invented SLI. I immediately thought, "Man, what a gimmick. This definitely won't last. They can't just cop out of making real progress by slapping two cards together for an improvement."

Boy was I wrong. Not only did it last, but ATI made their own version. The school of thought even spilled over into CPUs, with multiple "cores." To me, it seemed like they couldn't invent something superior, so they just doubled up on existing technology. Tripled. Quadrupled. And so on.

I mean clearly, it works. And it is far more practical with CPUs than GPUs, both in terms of money and performance. I guess it just seemed stupid to me because it seemed too simple. Too obvious. It just seems like a solution that an average Joe would come up with. "Hey Joe, I got a new processor." To which he replies, "Dur, well I've got TWO new processors! I win!"



Anyhoo, I just wanted to mention how horribly wrong I was in predicting technology on this one. I've usually had good luck with such shots in the dark. I remember back when I was a kid reading computer magazines in the grocery store, drooling over the latest 3dfx Voodoo cards, and I saw an ad for something called nVIDIA. I dunno if I was simply impressed by the logo or what, but I immediately thought, "Wow, this graphics company is going places." Heh.
 

Andvari

Senior member
Jan 22, 2003
612
0
0
^Oh, whoops. Well, I guess it just became more mainstream then. Or maybe I lived under a rock in the past. Either way. =p
 

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
You do realize the general trend is for technology to double in capacity/ability roughly every couple of years. Going at that rate or just under it, you buy something and within a year or two at most it's outdated :D
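That doubling claim is easy to sanity-check with a little compound-growth arithmetic (a quick Python sketch; the 2-year doubling period and the function name are assumptions for illustration, since quoted figures for the doubling period range from 18 to 24 months):

```python
# Rough Moore's-law arithmetic: relative capability after `years`,
# assuming capability doubles every `doubling_period_years`.
def capability_after(years, doubling_period_years=2.0, start=1.0):
    return start * 2 ** (years / doubling_period_years)

# With a 2-year doubling period, hardware bought today is outclassed
# by a factor of ~4 after four years.
print(capability_after(2))  # 2.0
print(capability_after(4))  # 4.0
```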
 

Andvari

Senior member
Jan 22, 2003
612
0
0
^Yeah, I know. I just expect a 2 to be succeeded by a 4; not two 2s.

I want one thing to double in ability, not quantity.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
From my observation... the silicon industry was beginning to realize that Moore's law is not going to be able to continue forever.

The limitation is hard to overcome: lithography uses light, and the wavelengths of available light only go so small.

To get smaller, there are a couple of options. X-ray lithography has seen minor use at various companies but never became widespread; even though it's a superior technology, it never gained much steam. X-rays are difficult to work with and focus, hazardous for humans, etc.

The ultra-thin (~2-3 nm) gate oxide also came with its own limitations...


For a while, the silicon industry was hitting a wall. That's when you started seeing dual and quad cores: if you can't do more with one chip, just pump out multiple cores. Graphics cards are already internally parallel, so things like SLI make a lot of sense. We're already seeing multi-GPU cards coming out soon as well.


Now... Intel is back in full force with superior technology, which once again advances the field and allows smaller and faster processors. The gate oxide problem has been overcome for the time being using hafnium oxide, and this was not a trivial solution by any means; silicon is much more compatible with silicon oxide as a gate oxide, as one might expect.


However, the lithography problem still exists.

X-ray or electron-beam lithography may have to go mainstream at some point. X-ray is one option, but since electrons can be tuned to almost any wavelength, you could theoretically create nearly atomic-scale transistors using e-beam litho.
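The "electrons can be tuned to any wavelength" point follows from the de Broglie relation: crank up the accelerating voltage and the electron wavelength shrinks. A minimal sketch (non-relativistic approximation, which is reasonable at modest e-beam voltages; the function name is just for illustration):

```python
import math

# de Broglie wavelength of an electron accelerated through V volts,
# non-relativistic: lambda = h / sqrt(2 * m_e * e * V)
H = 6.626e-34         # Planck constant, J*s
M_E = 9.109e-31       # electron rest mass, kg
E_CHARGE = 1.602e-19  # elementary charge, C

def electron_wavelength_nm(volts):
    momentum = math.sqrt(2 * M_E * E_CHARGE * volts)
    return H / momentum * 1e9  # metres -> nanometres

# At 10 kV the wavelength is ~0.012 nm, far smaller than any usable
# light wavelength, which is why e-beam litho can in principle pattern
# near-atomic features.
print(round(electron_wavelength_nm(10_000), 4))  # 0.0123
```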

However, the limitation with e-beam litho is that it is incredibly slow... it can take months and months just to make a lithography mask. And keep in mind, an Intel Pentium processor has MANY, MANY masks... I don't know the exact number, but it's probably in the 50-100 range.

I am very excited about the hafnium oxide solution (not to mention the newer metal gate versus the traditional polysilicon gate); some acquaintances of mine actually worked on this technology.

However, no one really knows what the next few years hold in terms of processor technology.

 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
Originally posted by: Andvari
^Yeah, I know. I just expect a 2 to be succeeded by a 4; not two 2s.

I want one thing to double in ability, not quantity.

Well, Nvidia has Tri-SLI and ATi has Spider, so it's basically happening. :)
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Still, SLI and CF are hardly mainstream solutions. I'd love to see the breakdown of SLI/CF vs. single-GPU users; I'd bet that at least 99% still run single GPUs at this stage.

It's more for extreme gamers and enthusiasts than anything, I guess.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
I thought I read not too long ago in one of the AT articles that approx. 90% of desktop PCs shipped have no discrete GPU at all (probably 98% for laptops). SLI/CF is, in an ideal world, actually the most efficient way of improving the gaming experience.

Say SLI/CF works as intended without bugs or incompatibility, and think about:

- Will a $1K CPU give 10 times the gaming performance of, say, a $100 one?
- How about $300 memory? Will it give you double the performance of $50 memory, not to mention the 6 times implied by the price?
- Motherboard, HDDs, everything else: same story

Nothing will give a financially proportionate improvement in gaming experience other than GPUs. Unfortunately, the implementation of SLI/CF isn't ideal (a bug fest), and the GPU market is way too volatile (there is always the fear that something new is coming soon). Right now it looks like both AMD and NV are trying to fix the driver/compatibility issues first. That makes sense, since clean code for multi-GPU under DX10 can serve them for some time to come.

P.S.: Why is this thread in CPU & OC forum?

 

NaOH

Diamond Member
Mar 2, 2006
5,015
0
0
SLI IS a marketing gimmick that doesn't return the performance gains you'd expect after dropping double what you'd normally spend. Nvidia knows this as well. The only thing it's good for is increasing your e-peen.
 

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
That isn't true. Sure, they're being slow about driver and bug issues, but the core fact is that when you're a serious enthusiast, every little improvement in performance is worth it. SLI and CF were made for enthusiasts (who have the money anyway); that's the reason most people don't use it. In terms of the price/performance ratio, yeah, it sucks, but if you have the money and want the performance boost, why not? Some guy in India is building a billion-dollar home, and his logic is: I have the money and I want it, so why not?
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
It's not a "gimmick" unless you can get better performance from a single card for cheaper.

SLI is very good for people who have the money for it.


I am personally more excited about dual GPU cards than SLI/crossfire.

I personally like to use my PCI ports, but apparently hardcore gamers have never heard of PCIe RAID (of course, you could do SLI plus PCIe RAID if you have a tri-SLI mobo...)
 

Andvari

Senior member
Jan 22, 2003
612
0
0
I think most of you are missing my point here, except wired.

This isn't about GPUs or CPUs specifically. It's just about technology in general. Basically I'm bugged by the school of thought that instead of making an actual successor to a piece of technology, hardware designers are simply doubling up on existing technology.

I put this in the CPU forum because I'm mainly bugged by the CPU aspect of it. With GPUs, it's completely left up to the consumer. With CPUs, we buy a single chip with multiple cores. This doesn't affect the consumer, but I just think it looks bad for Intel/AMD/whoever. Where are the 8 cores? The 16 cores? Why stop there? The real question is: where is the new SINGLE-core processor that outperforms all these multi-cores?

It's like taking a long road trip: you drive your 15-gallon-gas-tank car until you run out of gas, and then another 15-gallon-gas-tank car is waiting for you there. That's effectively 30 gallons, but it's 2 cars. Where's the 35-gallon-gas-tank car?

No matter how you slice it, it's obviously more potential power for the consumer, which I guess is good in the end. With GPUs, it costs money. With CPUs, it's just unimpressive theory-wise.



EDIT: By the way, those questions are all rhetorical now, since wired already addressed them with the modern failings of Moore's law. I was just trying to better explain my intent with this thread for those of you who were off track. =p
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Thanks for the vote of confidence :)


In any case, I wouldn't be surprised if we started seeing dual box systems for the hardcore enthusiasts in a few years' time.

Box 1: dual quad-core CPUs
Box 2: 3-4+ video cards, and a bridging CPU to interface it with box 1

Of course box 2 could be much smaller... but you can imagine what kind of kickass system would be born from such a setup
 

Amaroque

Platinum Member
Jan 2, 2005
2,178
0
0
I disagree with the OP. I've used dual 200MHz PPro systems, and they were still smoother-running than the 600MHz P3s. Granted, the P3 would do single tasks much faster, but I still found the two PPros a hugely smoother computing experience.
 

krnmastersgt

Platinum Member
Jan 10, 2008
2,873
0
0
Well, what's wrong with doubling it up? Making last generation's performance run at a cooler temperature in a smaller size, and fitting 4 cores into 1 CPU, is a technological advancement; you can't expect there to be a breakthrough with every single generation of computer components. Moore's law doesn't state what the advancement has to be: going from 2 to 4 cores doubles its capabilities, and going to 8 cores next year will too. But as we all know, there is a physical limit to how fast our technology can grow without some sort of revolutionary breakthrough in manufacturing. Also, wired, have you seen Skulltrail? It's basically Box 1 and Box 2 put together; you should read up on it, I find it ridiculously cool :p
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Originally posted by: krnmastersgt
Well, what's wrong with doubling it up? Making last generation's performance run at a cooler temperature in a smaller size, and fitting 4 cores into 1 CPU, is a technological advancement; you can't expect there to be a breakthrough with every single generation of computer components. Moore's law doesn't state what the advancement has to be: going from 2 to 4 cores doubles its capabilities, and going to 8 cores next year will too. But as we all know, there is a physical limit to how fast our technology can grow without some sort of revolutionary breakthrough in manufacturing. Also, wired, have you seen Skulltrail? It's basically Box 1 and Box 2 put together; you should read up on it, I find it ridiculously cool :p

i'll check it out later, thanks for the rec
 

aussiestilgar

Senior member
Dec 2, 2007
245
0
0
Servers have been doing multi-core for years. Why not? At some point it costs more money and effort to create a faster technology than it does to innovate a different path with the technology you have now. Performance still increases; it's just a different method. I think it's more than likely that we'll be using multi-core, multi-threaded CPUs and GPUs in one rig soon enough.
 

Amaroque

Platinum Member
Jan 2, 2005
2,178
0
0
Originally posted by: aussiestilgar
Servers have been doing multi-core for years. Why not? At some point it costs more money and effort to create a faster technology than it does to innovate a different path with the technology you have now. Performance still increases; it's just a different method. I think it's more than likely that we'll be using multi-core, multi-threaded CPUs and GPUs in one rig soon enough.

Well posted.
 

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
Intel did try to keep pushing single-core processors, but they failed to cope with the heat.

They got diminishing returns on single cores.

It's like a car engine... you can make a single piston and shaft more efficient, but there's a point where diminishing returns kick in and it's better to just slap 8 together and make a V8 engine.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: Andvari
To me, it seemed like they couldn't invent something superior, so they just doubled up on existing technology. Tripled. Quadrupled. And so on.
Well, you're correct but that doesn't mean the underlying problem isn't significant. It's getting increasingly hard to build better and better single cores. Multicore is in fact an easy way for designers to attain more performance.
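The usual way to quantify why more cores is the easy but imperfect win is Amdahl's law: the serial fraction of a workload caps the speedup no matter how many cores you throw at it. A minimal sketch (illustrative numbers, not benchmarks):

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work can be parallelized (and the remaining 1-p stays serial).
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A 90%-parallel workload on 4 cores gets ~3.1x, not 4x...
print(round(amdahl_speedup(0.9, 4), 2))          # 3.08
# ...and even with a million cores it can never reach 10x (1 / (1 - 0.9)).
print(round(amdahl_speedup(0.9, 1_000_000), 2))  # 10.0
```

This is exactly the gap between "two 2s" and "a 4": a genuinely faster single core speeds up everything, while extra cores only speed up the parallel part.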



 

Falloutboy

Diamond Member
Jan 2, 2003
5,916
0
76
At least to me, the most logical way for graphics to move forward is to create, each generation, a core that can be put in an MCM package, then put 1-4 of them on a card for the different product levels. Seems to me this would be a very economical way to go, since you're only producing 1 core at any one time.