From China with Love [G80, R600 and G81 info]

Page 7 - AnandTech Forums

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: apoppin
Originally posted by: Wreckage
Originally posted by: apoppin
FACT: nvidia IS the little guy now

FACT: NVIDIA is the ONLY guy now.

ATI is gone, nothing more than a brand name and soon nothing more than a label printed on an integrated chipset from AMD.

The R600 will be the last high end card from ATI.

just because you post the same ridiculous lie a thousand times doesn't make it true
:thumbsdown:

Actually it is true..
ATI was bought by AMD to fight the bigger war: AMD vs Intel.

Nvidia is now the only guy in the high-end market (which is smaller than the CPU market). This is bad news for us.. less competition = higher prices.
 

nrb

Member
Feb 22, 2006
75
0
0
Originally posted by: apoppin
i seriously doubt that they will give up the 'high end' . . . AMD will not accept 2nd best nor buying GPUs from a monopoly....
But why would AMD care about either of those things? Your argument seems to be based purely on emotion rather than business. You think AMD will somehow feel degraded or inferior because they are participating in a market without making the most powerful component available in that market, and will therefore pull out all the technological stops in order to restore their sense of pride and self-respect.

I just don't buy that. The only reason why AMD would concentrate resources on any market is if they think they can make a profit by doing so.

And why on Earth would AMD care about how much money you have to pay for a video card they aren't supplying? Why would they care whether Nvidia has a monopoly or not, so long as they aren't losing money as a result?

If you want to convince people you need to come up with a business case for AMD participating in what is currently a non-profitable market for ATI.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: Creig
Originally posted by: nrb
Is AMD interested in the GPU market? Yes, absolutely - there's a lot of money to be made in integrated GPUs. That's not the same thing as being interested in the high-end market.

If we follow that logic, then Nvidia shouldn't be interested in the high-end market either. But there's obviously money to be made at all levels of performance. Remember, the specs and features of today's flagship products will be found in tomorrow's mid and low end cards. So it's not as if all the money in R&D is going to waste.

And, as apoppin mentioned, a company will have increased sales throughout their whole lineup if their flagship product is recognized as the top performer. So there's yet another reason for AMD to continue producing high performance video cards that can compete directly with those from Nvidia.

This is already AMD's philosophy against Intel. And look at how quickly they grew after they released the Athlon. All the review sites declared AMD the performance winner, the enthusiasts jumped onboard with AMD and now AMD is a major player in the CPU market. Why would anybody think that they'll do anything differently with ATI? Just because they build performance CPUs doesn't mean they can't build performance GPUs as well.

It doesn't mean they can't, but it doesn't mean they want to, either.

AMD can now build a GPU that rivals nVidia in the next few years with ATI in their possession. That's not what we're debating. The argument is why would AMD have any interest in building a GPU that rivals nVidia in the next few years?

To add to their revenues? They'll certainly benefit from revenues, but be plagued with lower margins, dividing their resources, losing significant amounts of money poured into R&D, and ultimately shifting away from their main battle, which is with Intel.

Your argument seems to be "AMD will stay in the high-end discrete market because they can and it gives them more money." I'm afraid you guys aren't looking at the bigger picture, nor the deeper picture. There's more involved than AMD buying ATI and all of a sudden milking all of ATI's profits for the years to come.


Also, RedBox, I agree. I think we'll definitely see R600 and then R620. R680 will launch based on how successful R600 and R620 were, imo. After the R6** architecture, though, is when I believe we'll start to see a pull-out by AMD. I predict we may see variants of R700, since development on that project has already begun, but whether it competes with nVidia's G90/100 remains to be seen.

I actually think AMD could be rather successful just staying in the mid-range and lower-end markets. And the argument that "high-end would affect mainstream sales" wouldn't apply, because they wouldn't have any high-end to begin with.

That's just my opinion, anyways.

Nelsieus
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Why would AMD stop making high-end GPUs when it's the segment that gets the most limelight? Wouldn't having the high-end performers be good for PR and sales marketing?

Wouldn't this be bad business for AMD? ATi was making money in all sectors, so why cut off roughly 50% of ATi's revenue (the high end, which in the long term trickles down into the mid-range, then the low end)? Isn't that a bad business strategy?
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: apoppin
it is an acquisition that ATi wanted

not a hostile takeover. ;)

more like a marriage

HM! I wonder why a company would actually WANT to be taken over?

Problems, perhaps?
YUP.

And no, it wasn't a marriage in any sense.. that would be a MERGER.

Originally posted by: apoppin
From China with Love [G80, R600 and G81 info]
i guess the info 'fits' . . . even if the card mightn't :p

This is so weak of you.

Complaining about the GTX being an inch or two longer? LOL

As if you will be buying one anyway? Why comment on such an insignificant detail and try to make it a big deal? It's not.
You won't buy one; you are looking at outdated X1950Pro cards for $199. :disgust:

Ok, so cut down the GTX by some minuscule amount, and slow it down, and I'm sure you'd be buying! Right. I'm sure then you'd be the G80's biggest proponent if they just made a product to YOUR specifications.

The rest of us want the fastest card ever produced, with the most advanced feature set the world has ever seen.
Guess you're more interested in shaving an inch off.. looks like you are commenting on big-boy hardware when you need to be playing with low-powered, outdated stuff.. Amoppin.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: nrb
If you want to convince people you need to come up with a business case for AMD participating in what is currently a non-profitable market for ATI.

Quit speaking logically.
That doesn't apply here. This is an ATI fanboy haven. Hush with that reasonable, logical approach you are employing.
You are only allowed to flame people and mob them, while ignoring all the statements and logical arguments contained in the posts of those you attack.

Originally posted by: Drayvn
Why would AMD stop High End GPUs as its the one which gets the most lime light. Wouldn't have the High End performers be good for PR and sales marketing?

Wouldn't this be bad for business for AMD, since ATi was making money in all sectors, why pretty much cut off 50% of ATis revenue (High End, meaning that in the long term, High End would turn into mid, then low etc)? Isn't that a bad business strategy?

I agree with your analysis. My point is that while AMD probably doesn't -want- to stop or hurt high-end GPU development..
it's that they likely won't have much of a choice. If they do try to take on NV in high-end GPUs and Intel on CPUs, I think that's a recipe for disaster.. while definitely creating an enemy out of a former ally (Nvidia).

There's definitely room for companies to jockey around a bit for the next few years.. as AMD reorganizes itself, and Intel figures out how they are going to go after AMD.. and as NV sits in the middle of everyone.
There's a lot in the air. I wouldn't mind seeing AMD still work closely with NV, or NV work closely with Intel. Or for AMD/Intel to go head to head on CPUs and NV to continue as a niche company.
All I can say with certainty is that all 3 (NV/ATI/Intel) are now in stable positions. Before, ATI and NV were going to run each other out of business.. well, one of them was going to go eventually, and it just happened to be ATI.

I think AMD would love to continue in the high-end GPU market; they are probably constantly weighing the costs vs benefits of such an idea.
I'm guessing they are going to see how R600 pans out; if they make a killing.. of course they'll continue on.. AS LONG as it doesn't slow down their advance on Intel's market.
In honesty, they aren't big enough, nor is AMD reorganized enough at this point, to be ready for a two-front war against NV and Intel.

Intel's market > Nvidia's. AMD did not take over ATI to help ATI with its former goals.. but rather to expand AMD's reach and refine their platform/product.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
This link from lopri's thread confirms my 15K 3DMark06 score with a 5GHz C2D.
13.7K on a Core2 Quad @ 3.6GHz

Also there's a C2E @ 3.8GHz with an overclocked GTX getting 12.6K

Judging from the 3.6GHz Core2 Quad result.. I don't think it's any stretch of the imagination that a Core2 processor @ 5GHz could hit 15K, which is only 1.3K above the 3.6GHz quad number.
It appears G80 scales very well with faster CPUs. It's about time we got this kind of firepower.

15K is probably about as good as a single, stock 8800GTX is capable of pulling off.. with the appropriate CPU and with current, immature drivers. :D :beer: :thumbsup:
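For what it's worth, a quick back-of-envelope on those numbers (the scores are the ones quoted above; the points-per-GHz figure is just my own linear interpolation, not anything 3DMark actually guarantees):

```python
# 3DMark06 totals quoted in this thread (approximate points)
scores = {
    "C2E @ 3.8 GHz + OC'd GTX": 12600,
    "Core2 Quad @ 3.6 GHz":     13700,
    "C2D @ 5.0 GHz (claimed)":  15000,
}

# The 5 GHz claim sits 1.3K above the 3.6 GHz quad result
gap = scores["C2D @ 5.0 GHz (claimed)"] - scores["Core2 Quad @ 3.6 GHz"]

# Naive points-per-GHz between those two data points -- purely
# illustrative, 3DMark06 does not actually scale linearly with clock.
per_ghz = gap / (5.0 - 3.6)
print(gap, round(per_ghz))  # 1300 points, roughly 929 points/GHz
```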
 

XNice

Golden Member
Jun 24, 2000
1,562
0
76
I am boggled why everyone sees this ACQUISITION as a means to directly attack Nvidia. Like so many of the more logically minded posters have said, they are trying to battle/compete with Intel. Nvidia is too much of an ally for AMD to do anything that would directly and negatively affect them.

Nvidia is arguably the reason why AMD is where it is now..... nForce, anyone? Yeah, we know VIA had a huge part before the nForce series, but NV made award-winning chipsets long before they started producing chipsets for Intel CPUs.
 

Elfear

Diamond Member
May 30, 2004
7,081
596
126
Originally posted by: Elfear
Originally posted by: Crusader


You obviously haven't read my direct response to this, or chose to ignore it because most of you would rather attack someone than take into account their actual viewpoints.

Crusader's view -
Doesn't matter what happens to "ATI" branded video cards. They can go under. That'd be great. Why? This is a free market; someone will take their place if there's a profit to be made.
ATI isn't necessary; someone else will step in.
That's why I don't care. Kinda funny watching the ATI fanboys squeal though, I can admit that much.

I'm guessing no one here is in any sort of private business? There are plenty of capable companies that'd get into the market if they could A) survive Nvidia, and B) consider it worthwhile/profitable.


I'll take a stab at it. What about barriers to entry? It's not like opening up a competing lemonade stand across the street from your neighborhood rival. To get into the graphics card business takes some serious capital, some very talented engineers and programmers, and even then you're starting out behind the big boys by a long shot. If Nvidia and ATI have been in the graphics business for years and years now (i.e. they have LOTS of experience) and they've been working on G80/R600 for a long time now, how is an upstart company supposed to directly compete with them? They couldn't, not for many years to come and it wouldn't be a very lucrative business getting there.

That's a very basic business principle. Industries with little threat of substitutes, high entry barriers, and heavy capital requirements don't have new guys popping up all over the place. It would be a lose-lose situation for everyone if ATI stopped making high-end video cards, no matter which team you root for.

You never did answer my rebuttal here, Crusader. I'm not saying you're totally wrong, but you definitely have some flaws in your logic.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: Elfear
Originally posted by: Elfear
Originally posted by: Crusader


You obviously haven't read my direct response to this, or chose to ignore it because most of you would rather attack someone than take into account their actual viewpoints.

Crusader's view -
Doesn't matter what happens to "ATI" branded video cards. They can go under. That'd be great. Why? This is a free market; someone will take their place if there's a profit to be made.
ATI isn't necessary; someone else will step in.
That's why I don't care. Kinda funny watching the ATI fanboys squeal though, I can admit that much.

I'm guessing no one here is in any sort of private business? There are plenty of capable companies that'd get into the market if they could A) survive Nvidia, and B) consider it worthwhile/profitable.


I'll take a stab at it. What about barriers to entry? It's not like opening up a competing lemonade stand across the street from your neighborhood rival. To get into the graphics card business takes some serious capital, some very talented engineers and programmers, and even then you're starting out behind the big boys by a long shot. If Nvidia and ATI have been in the graphics business for years and years now (i.e. they have LOTS of experience) and they've been working on G80/R600 for a long time now, how is an upstart company supposed to directly compete with them? They couldn't, not for many years to come and it wouldn't be a very lucrative business getting there.

That's a very basic business principle. Industries with little threat of substitutes, high entry barriers, and heavy capital requirements don't have new guys popping up all over the place. It would be a lose-lose situation for everyone if ATI stopped making high-end video cards, no matter which team you root for.

You never did answer my rebuttal here, Crusader. I'm not saying you're totally wrong, but you definitely have some flaws in your logic.

Well, let's not overlook Matrox or even S3G. I mean, obviously nobody else is as far along as nVidia or ATI, but it wouldn't necessarily be like having to start from scratch.

It wouldn't totally be a lose-lose, either, imho. It would basically slow an otherwise rapid industry down (which I think some people might be partial to, including myself). Yes, that means less innovation, perhaps, but also not having to buy a new GPU every few months to stay on top of things.

Don't get me wrong, I think it's unfortunate that ATI is / will go down like this. And there are definitely some big disadvantages to it, like you mention. But I'm also looking at a few implications that might not be so bad. I doubt they will outweigh the negatives, but we'll have to see.

 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: Elfear
Originally posted by: Elfear
Originally posted by: Crusader


You obviously haven't read my direct response to this, or chose to ignore it because most of you would rather attack someone than take into account their actual viewpoints.

Crusader's view -
Doesn't matter what happens to "ATI" branded video cards. They can go under. That'd be great. Why? This is a free market; someone will take their place if there's a profit to be made.
ATI isn't necessary; someone else will step in.
That's why I don't care. Kinda funny watching the ATI fanboys squeal though, I can admit that much.

I'm guessing no one here is in any sort of private business? There are plenty of capable companies that'd get into the market if they could A) survive Nvidia, and B) consider it worthwhile/profitable.


I'll take a stab at it. What about barriers to entry? It's not like opening up a competing lemonade stand across the street from your neighborhood rival. To get into the graphics card business takes some serious capital, some very talented engineers and programmers, and even then you're starting out behind the big boys by a long shot. If Nvidia and ATI have been in the graphics business for years and years now (i.e. they have LOTS of experience) and they've been working on G80/R600 for a long time now, how is an upstart company supposed to directly compete with them? They couldn't, not for many years to come and it wouldn't be a very lucrative business getting there.

That's a very basic business principle. Industries with little threat of substitutes, high entry barriers, and heavy capital requirements don't have new guys popping up all over the place. It would be a lose-lose situation for everyone if ATI stopped making high-end video cards, no matter which team you root for.

You never did answer my rebuttal here, Crusader. I'm not saying you're totally wrong, but you definitely have some flaws in your logic.

his logic isn't flawed...neither is yours..

someone will step in to replace ATI (Ageia??), but they will be producing entry-level cards. It'll be a few years before they have a GPU that can compete in the high-end market.

This is exactly what happened when:
1. Intel 486 vs AMD 386 = we paid a premium for a crap i486 CPU
2. Nvidia GeForce 1 vs ATi Rage = we paid a premium for a GeForce GPU
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
his logic isn't flawed...neither is yours..

someone will step in to replace ATI (Ageia??), but they will be producing entry-level cards. It'll be a few years before they have a GPU that can compete in the high-end market.

This is exactly what happened when:
1. Intel 486 vs AMD 386 = we paid a premium for a crap i486 CPU
2. Nvidia GeForce 1 vs ATi Rage = we paid a premium for a GeForce GPU
I'm still waiting for some good competition against Creative.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
Originally posted by: Nelsieus
It doesn't mean they can't, but it doesn't mean they want to, either.

Nor does it mean they intend to pull out of the high-end market.



Originally posted by: Nelsieus
AMD can now build a GPU that rivals nVidia in the next few years with ATI in their possession. That's not what we're debating. The argument is why would AMD have any interest in building a GPU that rivals nVidia in the next few years?

To add to their revenues? They'll certainly benefit from revenues, but be plagued with lower margins, dividing their resources, losing significant amounts of money poured into R&D, and ultimately shifting away from their main battle, which is with Intel.

Why would they need to shift anything away from Intel? They still have all the ATI employees and engineers. They will probably downsize ATI a bit, just to get rid of managerial and executive positions that are now duplicated between ATI/AMD, but ATI can continue (and is continuing) business as usual.



Originally posted by: Nelsieus
Your argument seems to be "AMD will stay in the high-end discrete market because they can and it gives them more money." I'm afraid you guys aren't looking at the bigger picture, nor the deeper picture. There's more involved than AMD buying ATI and all of a sudden milking all of ATI's profits for the years to come.

I don't think YOU'RE looking at the big picture. ATI now has access to all the patents, licensing and technology of AMD, and vice versa. Both ATI and AMD are now in an even stronger position to bring new technology to their respective production lines than they were before the acquisition. ATI wasn't in any danger of going under before the buyout, and now they have even greater technological resources to draw upon. It makes no sense to say AMD bought ATI for $5.4 billion only to tell ATI to close up shop.



Originally posted by: Nelsieus
I actually think AMD could be rather successful just staying in the mid-range and lower-end markets. And the argument that "high-end would affect mainstream sales" wouldn't apply, because they wouldn't have any high-end to begin with.

I don't understand the logic of your argument here. Again, sales of mid and low end cards are partially fueled by the hype surrounding high end cards. So by not having a high end card, mid and low end sales would suffer. Possibly to the extent that a company would not be able to survive.

I'm sure that if Nvidia thought they would do better by not offering a high end card, they would have by now. They aren't coming out with cards like the G80 just for the sheer joy of building it. They're doing it to make a profit. The same applies to ATI.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
his logic isn't flawed...neither is yours..

someone will step in to replace ATI (Ageia??), but they will be producing entry-level cards. It'll be a few years before they have a GPU that can compete in the high-end market.

This is exactly what happened when:
1. Intel 486 vs AMD 386 = we paid a premium for a crap i486 CPU
2. Nvidia GeForce 1 vs ATi Rage = we paid a premium for a GeForce GPU
I'm still waiting for some good competition against Creative.

I was an MX300 fan, but the SB Live just sounded too good...

and for some reason fewer people care about sound cards nowadays.. 3D sound positioning was a big thing back when 5.1 speakers were expensive.. now 5.1 speakers are so cheap that people just don't need a good soundcard anymore...

also, integrated sound is getting much better now..
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
...let's not forget the integrated market largely outweighs the discrete desktop market, which is why Intel continues to be the number one supplier of graphics. This is the arena AMD wants to compete in, not the smaller, niche market that nVidia seems to be covering and shows no sign of letting up on.
If the integrated market were really that enticing for businesses, Nvidia and ATI would have been competing in the integrated sector instead of building discrete power-houses. I mean, if the integrated solution were really that appealing for large corporations such as Nvidia and ATI, they both would have been making nothing but integrated solutions for years so as to reap the rewards from its huge user base.

Also, what would happen if there were a combined CPU/GPU die? It would eliminate a huge amount of the bottleneck in GPU/CPU communication, thereby "killing" PCI-E x16 and possibly advanced chipsets as we know them. (Although chipsets may still become more advanced for reasons other than trafficking video signals back and forth [i.e. dedicated physics, etc.])

An integrated solution isn't demanding enough to warrant the design of a CPU/GPU substrate. Also, Nvidia isn't planning on making CPUs just so that it can sit out with an overpriced, high-end, discrete video card while making integrated CPU/GPU solutions. These first CPU/GPU designs will have more power than a simple integrated Intel part, otherwise they wouldn't be pushing for the design.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I was an MX300 fan, but the SB Live just sounded too good...

and for some reason fewer people care about sound cards nowadays.. 3D sound positioning was a big thing back when 5.1 speakers were expensive.. now 5.1 speakers are so cheap that people just don't need a good soundcard anymore...

also, integrated sound is getting much better now..
The point is, no one will step up quickly enough to really compete if indeed ATI stops making cards. After a period of time passes in a monopoly, the same thing would happen, "less people care about [video cards] nowadays...integrated [solutions] are getting much better now.."

It's not a "someone will step up and everything will be like normal again" scenario. By the time someone "steps up", CPU/GPU designs will be the hot zone and discrete methods will be a dying industry.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
actually, the M200/GF6100 are attempts to reap rewards from the integrated market.. Personally I would prefer an M200 or 6100 over anything Intel offers..

I believe the CPU/GPU die idea is an attempt to cut production costs even further than integrated graphics.. more of a universal CPU/GPU for desktops, notebooks, cellphones, PDAs, etc. PCI-E x16 bandwidth isn't a problem for today's video cards.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: beggerking
actually, the M200/GF6100 are attempts to reap rewards from the integrated market.. Personally I would prefer an M200 or 6100 over anything Intel offers..

I believe the CPU/GPU die idea is an attempt to cut production costs even further than integrated graphics.. more of a universal CPU/GPU for desktops, notebooks, cellphones, PDAs, etc. PCI-E x16 bandwidth isn't a problem for today's video cards.

You might be right about the cutting of manufacturing costs, although they won't see profits from such a move until it has recouped the R&D money poured into the CPU/GPU technology.

As far as PCI-E x16 bandwidth goes, it isn't the bandwidth that creates the bottleneck but the fact that instructions have to travel through the chipset before reaching the CPU. Having on-die interaction between the two on one piece of substrate would be much faster.
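To make the chipset-hop argument concrete, here's a toy round-trip model. Every latency number in it is made up purely for illustration (not a measurement of any real chipset or CPU); the point is only that removing the intermediate hop shrinks the round trip:

```python
# Hypothetical per-hop latencies in nanoseconds -- invented for
# illustration only, not measurements of any real hardware.
CPU_TO_CHIPSET_NS = 100   # CPU <-> northbridge/chipset hop
CHIPSET_TO_GPU_NS = 100   # chipset <-> GPU over PCI-E
ON_DIE_NS = 10            # direct on-die CPU <-> GPU interconnect

def discrete_round_trip_ns():
    """CPU -> chipset -> GPU and back: two hops each way."""
    return 2 * (CPU_TO_CHIPSET_NS + CHIPSET_TO_GPU_NS)

def fused_round_trip_ns():
    """CPU and GPU on one die: a single short hop each way."""
    return 2 * ON_DIE_NS

print(discrete_round_trip_ns(), "ns vs", fused_round_trip_ns(), "ns")
```

With these invented numbers the discrete path takes 400 ns per round trip against 20 ns on-die; the ratio, not the absolute figures, is the argument.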
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
I was an MX300 fan, but the SB Live just sounded too good...

and for some reason fewer people care about sound cards nowadays.. 3D sound positioning was a big thing back when 5.1 speakers were expensive.. now 5.1 speakers are so cheap that people just don't need a good soundcard anymore...

also, integrated sound is getting much better now..
The point is, no one will step up quickly enough to really compete if indeed ATI stops making cards. After a period of time passes in a monopoly, the same thing would happen, "less people care about [video cards] nowadays...integrated [solutions] are getting much better now.."

It's not a "someone will step up and everything will be like normal again" scenario. By the time someone "steps up", CPU/GPU designs will be the hot zone and discrete methods will be a dying industry.

different market.

most people acquire more information from what they see than from what they hear. No matter how great the soundcard/speaker setup is, our hearing is limited.

That is not the case with video cards and monitors. Increased monitor sizes require higher resolutions, which require better video cards.

Color-wise we have already attained our physical limit at 16 million colors (we can actually see a lot fewer).
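That 16-million figure is just 24-bit "true color" arithmetic, 8 bits per channel:

```python
# 24-bit color: 8 bits each for red, green and blue
bits_per_channel = 8
channels = 3
colors = 2 ** (bits_per_channel * channels)
print(f"{colors:,}")  # 16,777,216 distinct colors
```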

 

Elfear

Diamond Member
May 30, 2004
7,081
596
126
Originally posted by: beggerking

his logic isn't flawed...neither is yours..

someone will step in to replace ATI (Ageia??) , but they will be producing entry level cards. It'll be a few years before they will be able to have a GPU to compete in high-end market.

This is exactly what happened when
1. Intel 486 vs Amd 386 = we paid premium for crap i486 cpu
2. Nvidia Gforce1 vs Ati Rage = we paid premium for gforce Gpu

Hmm. Well, my limited knowledge of GPU history is that even though ATI was founded in 1985, it never became a serious contender to Nvidia until around 2000-2001 with the original Radeon and the Radeon 8500. Nvidia was founded in 1993, but I don't know if Nvidia and ATi were direct competitors at that time or not. It took ATI a long time to be competitive with Nvidia, and they have been duking it out for 6+ years now with no one else getting close to their market share.

Look what happened to XGI. They tried to compete with the big boys in 2003 when they introduced the Volari Duo, and look where they are now. All the other graphics card companies are hardly a blip on the radar screen. I just don't see someone walking in and being directly competitive with Nvidia (assuming ATI quits making high-end stuff) in a couple of years. I imagine it would be a long road for a company that decided to compete in this industry. A long road on which we as consumers would be stuck with ho-hum GPUs at high price levels. Doesn't sound that great to me.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
Originally posted by: beggerking
actually, the M200/GF6100 are attempts to reap rewards from the integrated market.. Personally I would prefer an M200 or 6100 over anything Intel offers..

I believe the CPU/GPU die idea is an attempt to cut production costs even further than integrated graphics.. more of a universal CPU/GPU for desktops, notebooks, cellphones, PDAs, etc. PCI-E x16 bandwidth isn't a problem for today's video cards.

You might be right about the cutting of manufacturing costs, although they won't see profits from such a move until it has recouped the R&D money poured into the CPU/GPU technology.

As far as PCI-E x16 bandwidth goes, it isn't the bandwidth that creates the bottleneck but the fact that instructions have to travel through the chipset before reaching the CPU. Having on-die interaction between the two on one piece of substrate would be much faster.

I'm saying cost + size --> mobile phones, PDAs, ultraportable notebooks, etc. A larger market for a single CPU = mass production = lower unit cost = more profit.

on-die would surely be faster, but is it the bottleneck? Are you sure it is a bottleneck for today's video cards?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
different market.

most people acquire more information from what they see than from what they hear. No matter how great the soundcard/speaker setup is, our hearing is limited.

That is not the case with video cards and monitors. Increased monitor sizes require higher resolutions, which require better video cards.

Color-wise we have already attained our physical limit at 16 million colors (we can actually see a lot fewer).
Of course it's a different market. If it were the same we wouldn't be talking about Nvidia and ATI, since Creative would have the crown.

The point is, the sound industry has one discrete kingpin, with integrated solutions slowly getting better, enough to offer some competition. That is bound to happen as time goes on and technology advances. The same would be true if the video market had one discrete kingpin, with time passing and integrated technology natively becoming more advanced.

If ATI does indeed leave the market entirely, it won't be pretty. And even if some competition does come about, it won't be for a while, at which point Nvidia will be on to CPU/GPU solutions (maybe). It's not a pretty "someone's gonna come around and keep prices low and technology advancing" situation.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
on-die would surely be faster, but is it the bottleneck? Are you sure it is a bottleneck for today's video cards?
Not in today's, but when compared to the traffic of an on-die GPU/CPU solution, yes.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: josh6079
Of course it's a different market. If it were the same we wouldn't be talking about Nvidia and ATI, since Creative would have the crown.

The point is, the sound industry has one discrete kingpin, with integrated solutions slowly getting better, enough to offer some competition. That is bound to happen as time goes on and technology advances. The same would be true if the video market had one discrete kingpin, with time passing and integrated technology natively becoming more advanced.

If ATI does indeed leave the market entirely, it won't be pretty. And even if some competition does come about, it won't be for a while, at which point Nvidia will be on to CPU/GPU solutions (maybe). It's not a pretty "someone's gonna come around and keep prices low and technology advancing" situation.

Exactly what I said.... someone will come in to compete, unlike the soundcard industry, which is pretty much stagnant. Video cards will keep advancing, but prices will be up to Nvidia, not to the market .. :(

When compared to the traffic of an on-die GPU/CPU solution, yes.

but is it a bottleneck that would affect system performance?