Uh Oh. Remember, the key word is "MAY"


ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: Aikouka
Originally posted by: ArchAngel777
Context, context, context... What was the topic I was referring to? SLI/Crossfire. Why was AGP stupid in relation to SLI? The answer is obvious... Troll elsewhere.

The topic you were referring to was the state of video cards in the old PCI market and how we had SLI. Then you went on to talk about PCI-E and said that it'll be fine as long as they don't do anything stupid again. Then you sneakily put "AGP" in parentheses, which meant the prior statement referred to AGP as one of the original stupid actions. Then my statement went on to remark that even though we "lost" SLI capabilities with the switch to AGP, there were plenty of benefits.

Although I don't see how you can get off calling me a troll when you're not even on topic ^_^. In fact, your "troll" defense is just something you're using so that your lackluster statement doesn't demean you in any manner. It ranks up there with the Chewbacca defense :p.

Originally posted by: zephyrprime
Read up on computer history. The industry didn't move from PCI to AGP. It moved from VESA local bus to AGP.

Sigh, why do you people do this to yourselves? Why make audacious claims like you're some sort of sage when I can easily go to Wikipedia and prove you wrong?

I'll just grab a quote or two from the VESA Bus page or the PCI page.

http://en.wikipedia.org/wiki/Peripheral_Component_Interconnect
The PCI bus is common in modern PCs, where it has displaced ISA and VESA Local Bus as the standard expansion bus, but it also appears in many other computer types.

http://en.wikipedia.org/wiki/VESA_Local_Bus
By 1996, the Pentium (driven by Intel's Triton chipset and PCI architecture) had eliminated the 80486 market, and the VESA Local Bus with it. Many of the last 80486 motherboards made have PCI slots in addition to (or completely replacing) the VLB slots.

Now, the only way you can say that I'm wrong is if you get all haughty and say, "Well, I meant buses designed specifically for graphics use!" First of all, that was not even what I said originally. Second of all, the VESA bus was an extension of the ISA bus, and they both had to work together; ISA handled more than just graphics in this case.

Apop knew what I meant, he understood the context of my post... But that might be a bit much to expect from you.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: beggerking
Originally posted by: zephyrprime

Actually, those are all advantages ;).
Read up on computer history. The industry didn't move from PCI to AGP. It moved from VESA local bus to AGP.

it's ISA -> EISA -> VLB -> PCI -> AGP -> PCI-E

You are right. My mistake.

 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: apoppin
Originally posted by: ArchAngel777
Originally posted by: GundamSonicZeroX
Originally posted by: apoppin
Originally posted by: ShadowOfMyself
You should really change the title of this thread.. Soon its gonna be invaded by wreckage + gstanfor VS apoppin, etc, all because some dude who calls himself an analyst said so

indeed

his analysis is based on 2 faulty premises:

1) that AMD will be the only platform using ATi GPUs

2) that Crossfire will fail
:laughing: That's dumb! It's bad business to limit your consumers like that. And about Crossfire failing... I think that both SLI and Crossfire may be done away with someday (not soon, but maybe someday). I mean, it didn't last too long when 3dfx did it; why would it last longer if Nvidia/ATi did it?


Because we have the PCI-E interface now. The reason you could not SLI was the AGP interface specification. Though there was some talk about it being possible to do PCI + AGP, that never became reality, because we didn't see it, did we? SLI/X-fire is here to stay unless they do something stupid again (AGP).

at least in the short-term ... 5 years or so

who can see the future clearly?

i imagine the new AMD platform may change things

ideally there would be ONE card doing the work of two ... sli/xfire generates more heat and noise than a single card ... perhaps the GX2 is the wave of the future . . . multi cores instead of multi cards

we had multicore cards. i owned a 3dfx 5500 AGP that had 2 cores. it's the wave of the past, not the future. crossfire/SLI are here to stay since they allow you nearly double the performance of the best card out there.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
If Nvidia is all that's left, wouldn't that mean a monopoly? And isn't a monopoly illegal?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
While I think we'll still see discrete GPUs supplied by DAMiT for the near future, I think all discrete GPUs will cease to exist on the high-end market at some point. I think eventually all of our GPUs will be unified with the CPU.
 

sandorski

No Lifer
Oct 10, 1999
70,853
6,391
126
Originally posted by: wizboy11
If Nvidia is all that's left, wouldn't that mean a monopoly? And isn't a monopoly illegal?

No. They may be the last (serious) player, but they just ended up there through no fault of their own. If they end up in that position, I also think it would be a sign that they soon would have no market in which to "monopolize".
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: josh6079
While I think we'll still see discrete GPUs supplied by DAMiT for the near future, I think all discrete GPUs will cease to exist on the high-end market at some point. I think eventually all of our GPUs will be unified with the CPU.

I disagree. I don't think a CPU/GPU will ever be as powerful as a separate GPU. I think it will be a way to make cheaper integrated graphics.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Wreckage
Originally posted by: josh6079
While I think we'll still see discrete GPUs supplied by DAMiT for the near future, I think all discrete GPUs will cease to exist on the high-end market at some point. I think eventually all of our GPUs will be unified with the CPU.

I disagree. I don't think a CPU/GPU will ever be as powerful as a separate GPU. I think it will be a way to make cheaper integrated graphics.

Perhaps. There are a lot of factors involved with making a CPU / GPU hybrid that may turn out to be more beneficial than discretes, but we'll have to just wait and see.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
As I have said in the other thread, a CPU/GPU hybrid will allow integration into smaller devices (ultra mini compact notebooks?) and/or lower production cost and/or extend battery life, but it will not be faster.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: beggerking
As I have said in the other thread, a CPU/GPU hybrid will allow integration into smaller devices (ultra mini compact notebooks?) and/or lower production cost and/or extend battery life, but it will not be faster.

not in the beginning, perhaps

 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: apoppin
Originally posted by: beggerking
As I have said in the other thread, a CPU/GPU hybrid will allow integration into smaller devices (ultra mini compact notebooks?) and/or lower production cost and/or extend battery life, but it will not be faster.

not in the beginning, perhaps


Probably not for some time to come either. First offering of this type of CGPU will start small. Maybe even hand held devices. They don't need much power compared to desktops and highend lappys. Eventually, the company (Intel/DAMMIT/Nvidia) will gradually design for budget/midrange lappys. Then servers. Then finally, something worthy of a gamers desktop. Until then, discrete will still rule for the desktop. This may take years.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
If ATI goes out of business, then we are all doomed: with Nvidia the only major player in the game and no competition, they'll get sloppy and we'll get crappy cards.

Hopefully AMD will not eliminate ATI's high-end market. Now I can hate them and not feel bad about it, since Intel finally got back in the game with Core 2 Duo ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: beggerking
As I have said in the other thread, a CPU/GPU hybrid will allow integration into smaller devices (ultra mini compact notebooks?) and/or lower production cost and/or extend battery life, but it will not be faster.

not in the beginning, perhaps


Probably not for some time to come either. First offering of this type of CGPU will start small. Maybe even hand held devices. They don't need much power compared to desktops and highend lappys. Eventually, the company (Intel/DAMMIT/Nvidia) will gradually design for budget/midrange lappys. Then servers. Then finally, something worthy of a gamers desktop. Until then, discrete will still rule for the desktop. This may take years.

ok . . . my prediction . . . by q4 2011 . . . 5 short years . . . the CPU-GPU hybrid will compete successfully - performance-wise - with 'traditional' discrete HW . . . and show distinct advantages for whatever the 'platform' is designed for . . . i.e. we will have very specialized platforms optimized just for gaming

i believe you are looking at a wave of the future about to become the incoming tide. AMD is just positioning itself to get ahead of it... as is intel . . . and nvidia.
 

her34

Senior member
Dec 4, 2004
581
1
81
Originally posted by: josh6079
Originally posted by: Wreckage
Originally posted by: josh6079
While I think we'll still see discrete GPUs supplied by DAMiT for the near future, I think all discrete GPUs will cease to exist on the high-end market at some point. I think eventually all of our GPUs will be unified with the CPU.

I disagree. I don't think a CPU/GPU will ever be as powerful as a separate GPU. I think it will be a way to make cheaper integrated graphics.

Perhaps. There are a lot of factors involved with making a CPU / GPU hybrid that may turn out to be more beneficial than discretes, but we'll have to just wait and see.

what are some of the factors?
 

her34

Senior member
Dec 4, 2004
581
1
81
Originally posted by: apoppin
Originally posted by: keysplayr2003
Originally posted by: apoppin
Originally posted by: beggerking
As I have said in the other thread, a CPU/GPU hybrid will allow integration into smaller devices (ultra mini compact notebooks?) and/or lower production cost and/or extend battery life, but it will not be faster.

not in the beginning, perhaps


Probably not for some time to come either. First offering of this type of CGPU will start small. Maybe even hand held devices. They don't need much power compared to desktops and highend lappys. Eventually, the company (Intel/DAMMIT/Nvidia) will gradually design for budget/midrange lappys. Then servers. Then finally, something worthy of a gamers desktop. Until then, discrete will still rule for the desktop. This may take years.

ok . . . my prediction . . . by q4 2011 . . . 5 short years . . . the CPU-GPU hybrid will compete successfully - performance-wise - with 'traditional' discrete HW . . . and show distinct advantages for whatever the 'platform' is designed for . . . i.e. we will have very specialized platforms optimized just for gaming

i believe you are looking at a wave of the future about to become the incoming tide. AMD is just positioning itself to get ahead of it... as is intel . . . and nvidia.

how can a hybrid cpu-gpu overcome system bandwidth limitation?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: her34

how can a hybrid cpu-gpu overcome system bandwidth limitation?

i just read up on that . . . now where?
:Q

it seemed quite possible

edit: not in bookmarks and evidently older than 6 days.

it was in an article that talked about Folding at home, Stream and the new AMD platform . . .

or was it 3 articles? :p
:confused:
i read so much about new tech recently that i am on info overload ... and now i get to play video games . . . NWN2 runs on Win2K but is a BEAST ... i doubt my forthcoming x1950p will be 'enough' ... my x850xt is struggling.
:Q

i now have a 176 page NWN2 manual to read . . .
:frown:

see you Nov 8 or so

:D

to answer your question . . . "no idea" ... but they are smart guys and gals.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
Originally posted by: ArchAngel777
Apop knew what I meant, he understood the context of my post... But that might be a bit much to expect from you.

Hah, nice ;). I love how people who realize they've lost the basis of their argument turn to defaming their "opponent", which just shows how weak they really are. So don't try to pull a John Kerry and escape a stupid/badly-worded comment just to make yourself look better. You've already shown how much you know by attempting to refer to the switch from the becoming-saturated PCI bus (which isn't really the case anymore, as most devices have moved off the PCI bus and onto direct connections to the northbridge or PCI-E lanes) to the AGP bus as stupid.

Now, I'm going to explain why I lashed out against your comment, so you understand where I'm coming from. See, your comment attempted to look at a minor piece of the graphics community (3dfx's Scan-Line Interleave) and say that AGP was stupid in comparison to PCI because it removed the ability to have multiple GPUs. Well, actually... I should add that the other poster was right in that AGP 3.0 did add the ability for more than one AGP Master (card) per AGP Target (chipset). Although this was fairly late in the game, and I don't think manufacturers had time to create designs, as I believe AGP 3.0's introduction was not far off from the introduction of PCI-E 1.0. So why would you refer to a move as stupid just because it removed a niche product that so few people actually used (Voodoo 2 only, unless you count the Voodoo 5 5500's built-in SLI, 7950 GX2-esque)? That is worse than complaining about the nForce 3 because they removed SoundStorm!

Frankly, you need to get off your high horse and accept that your comment was off-base and has little merit. Just because the move to AGP removed SLI does not mean that a similar change from PCI-E to another multi-GPU-less setup would be, according to you, comparably stupid. I think we all know that AGP 1x provided twice the bandwidth of standard PCI's 133 MB/s, and it was dedicated to the video card alone.
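For what it's worth, the AGP-versus-PCI bandwidth point is easy to sanity-check with quick arithmetic. Here is a minimal Python sketch using the textbook clock and width figures for each bus; these are theoretical peaks, not measured throughput:

```python
# Theoretical peak bus bandwidth: clock (MHz) x data-path width (bytes)
# x transfers per clock. Textbook peak figures, not real-world throughput.

def peak_bandwidth_mb_s(clock_mhz, width_bits, transfers_per_clock=1):
    """Peak bandwidth in MB/s (1 MB = 10**6 bytes)."""
    return clock_mhz * (width_bits / 8) * transfers_per_clock

buses = {
    "PCI (33.33 MHz, 32-bit)": peak_bandwidth_mb_s(33.33, 32),     # ~133 MB/s
    "AGP 1x (66.66 MHz)":      peak_bandwidth_mb_s(66.66, 32),     # exactly 2x PCI
    "AGP 4x":                  peak_bandwidth_mb_s(66.66, 32, 4),  # 4 transfers/clock
    "AGP 8x":                  peak_bandwidth_mb_s(66.66, 32, 8),  # 8 transfers/clock
}

for name, mb_s in buses.items():
    print(f"{name}: {mb_s:.0f} MB/s")
```

The doubled base clock plus the strobe multipliers, on a point-to-point link instead of a shared bus, is why the move to AGP was a bandwidth win for graphics even though it dropped 3dfx-style SLI.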

Also, a little tidbit of information about how to post properly: learn to quote only what you're referring to, and you'll avoid making a page-long post that contains only a sentence of your own, like you just did :p.

Now, to go back to the topic... somewhat:

From what it sounds like, CPU/GPU combinations may have a separate set of memory, similar to what they have now, for the graphics portion. There's a reason they use the graphics DRAM on video cards and it isn't for the premium price ;). I imagine that this will change eventually as most solutions tend to be basic tie-ins and then evolve into more of a true combination (Sort of like how Kentsfield is just two dual-cores smacked together instead of 4 true cores). Although, how well something is designed is typically dependent on how much R&D time is involved.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i hate being lectured . . . especially by someone who is unqualified to give one :p


anyway, the new hybrids are supposed to have much more shared bandwidth available ... i don't think we are going to get a lot of details on the new architecture for quite a while ... there is a lot of speculation

HOWEVER, it does seem to be the 'future' . . . since DAAMit, intel and nvidia are all exploring it . . . that seems - at least imo - to be a good 'indicator' ;)

EDIT . . . so, in a way, the OP is entirely correct . . . AMD/ATI may withdraw from the discrete GPU biz ... eventually

nvidia, also
:Q
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
geforce 7950 gtx has memory bandwidth of 44.8GB/sec

core 2 duo L2 cache is ~20GB/sec
That's referring to the GPU's bandwidth for its memory/core interaction, not the system bandwidth limitation. The GB/sec for GPU/CPU communication is currently a lot lower.

As you can see in this picture, if a CGPU were to come about, it would free up some traffic on the chipset, and the GPU-to-CPU communication could be faster than 8-8.5 GB/sec.

I'm not saying that a shared-cache method would be the technique to use when cutting down latency. There are other means of doing so, one of them being Torrenza and AMD's HTX socket.
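To put rough numbers on that gap, here is a minimal Python sketch. The 256-bit, 1400 MT/s memory interface and the 1066 MT/s, 64-bit front-side bus are assumptions chosen to match the figures quoted in the thread, not official specs:

```python
# Peak bandwidth = transfer rate (MT/s) x data-path width (bytes).
# Compares a GPU's local memory bus against the CPU<->chipset link of the era.

def bandwidth_gb_s(transfer_rate_mt_s, width_bits):
    """Peak bandwidth in GB/s (1 GB = 10**9 bytes)."""
    return transfer_rate_mt_s * 1e6 * (width_bits / 8) / 1e9

gpu_local_mem = bandwidth_gb_s(1400, 256)   # 256-bit GDDR3 at 1400 MT/s -> 44.8 GB/s
front_side_bus = bandwidth_gb_s(1066, 64)   # 1066 MT/s FSB, 64-bit -> ~8.5 GB/s

print(f"GPU local memory: {gpu_local_mem:.1f} GB/s")
print(f"Front-side bus:   {front_side_bus:.1f} GB/s")
print(f"Local memory is ~{gpu_local_mem / front_side_bus:.1f}x wider")
```

The local memory pool is roughly five times wider than the CPU's link to the rest of the system, which is why a hybrid part would likely still need its own high-bandwidth memory rather than relying on the system bus alone.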
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
geforce 7950 gtx has memory bandwidth of 44.8GB/sec

core 2 duo L2 cache is ~20GB/sec
That's referring to the GPU's bandwidth for its memory/core interaction, not the system bandwidth limitation. The GB/sec for GPU/CPU communication is currently a lot lower.

why does the GPU need to communicate with the CPU, or vice versa?

the only bottlenecks are the CPU to/from memory bandwidth
and
the GPU to/from memory bandwidth
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,407
8,595
126
Originally posted by: apoppin

i.e. we will have very specialized platforms optimized just for gaming
*cough*xbox*cough*