Is it just me or do we need a 3rd player in the GPU industry?


wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
In 2007, I remember VIA launched the Volari Duo graphics card, and it performed similarly to the Nvidia 6800 GT, but it had lower image quality, and people didn't know anything about VIA; after that I never heard about them again. And to be honest, Intel made a mistake by pushing the x86 instruction set too hard; they should have used PowerVR technology, after all Intel has a stake in it. Another player to recognize is Qualcomm with their Adreno GPU, and Qualcomm is a bigger player than VIA.
 

dust

Golden Member
Oct 13, 2008
1,339
2
71
We sure need another player in the market. It worries me how comfortable both of the existing players have become with their current status; neither is eager to really pull ahead anymore. We get a ~15-25% performance improvement with every generation and a huge load of marketing bull.


To whoever said a single player is all they need: would you like to pay north of $600 for a card that barely pulls ahead of the same tier from last generation? You certainly haven't thought that through.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Competition is good, but this is one tough industry to break into and succeed in.
 

ShelbyGT500

Junior Member
Jan 27, 2007
8
0
0
Won't happen; the free market can only bear what it can. Look at how many businesses are shutting down in the current economy.

We sure need another player in the market. It worries me how comfortable both of the existing players have become with their current status; neither is eager to really pull ahead anymore. We get a ~15-25% performance improvement with every generation and a huge load of marketing bull.

To whoever said a single player is all they need: would you like to pay north of $600 for a card that barely pulls ahead of the same tier from last generation? You certainly haven't thought that through.

15-25% is pretty good; the problem is that nothing on the software side is really pushing the hardware to be better.
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
We sure need another player in the market. It worries me how comfortable both of the existing players have become with their current status; neither is eager to really pull ahead anymore. We get a ~15-25% performance improvement with every generation and a huge load of marketing bull.

To whoever said a single player is all they need: would you like to pay north of $600 for a card that barely pulls ahead of the same tier from last generation? You certainly haven't thought that through.

A big part of the problem is that the majority of monitors are currently stalled at 1080p, which is actually a step backwards compared to a few years ago. Until we get to the next standard of 4K, and hopefully at least 120Hz, we're going to be stuck here for a while. That's why Nvidia put out 3D Vision and AMD put out Eyefinity: to justify the need for the power of the newer GPUs.
 

Throckmorton

Lifer
Aug 23, 2007
16,830
3
0
No. If there were a third GPU company, and each of the three got one third of sales, they'd all be worse off. You can't advance technology as fast as they have been without being able to sell enough units to recoup the R&D costs.

And then they might all go out of business and we'd be left with INTEL.
 

veri745

Golden Member
Oct 11, 2007
1,163
4
81
I'd like to see at least a third company in the graphics card industry. It would bring fresh new products and good competition.

I don't understand why that's not the case yet...


I like how you say "yet" as though there used to be fewer players in the GPU space and the future trend will be an increasing number of competitors.

More correct would be to say, "I don't understand why that's not the case *anymore*," since S3, 3dfx, and Matrox are out of business or no longer compete in this space.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
I'd like to see at least a third company in the graphics card industry. It would bring fresh new products and good competition.

I don't understand why that's not the case yet...

It's like the Republicans and the Democrats: they both work together to drown out any competition.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
There's simply too much that Nvidia and ATI don't do right, in my opinion.

ATI's filtering may not be as good, they do something to their z-range that makes it shorter, and ATI isn't interested in supporting older games. They also don't have PhysX.

Nvidia doesn't do scaling through HDMI in many cases, they don't have as good OpenGL performance anymore, they only partially support older games (it would be nice if their drivers forced a 32-bit RGBA frame buffer when an application asks for a 16-bit format, and it would also be nice if they included a check box in the drivers to force a complementary FP32 Z-buffer), and they don't have edge-detect AA. Overall, Nvidia is better, but they're still not too good.

It would also be nice if one of them included a Glide wrapper in their drivers that presents a 3dfx hardware ID so it works with all Glide games.

I really wish Intel had released Larrabee, even if it was slower than Fermi, because it emulated rasterization (except for textures) in software. Even though emulating ROPs and depth units is slower, I think it's a good idea.

What are Intel's drivers like on Sandy Bridge's integrated GPU? I'm sure that if Intel made a discrete GPU, its drivers would be better than those of ATI and Nvidia.

Even though they offer no discrete GPU, Imagination Tech is making its presence felt in phones and tablets.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
http://www.youtube.com/watch?v=XVZDH15TRro

Rasterization's golden age is near its end as computational power has gotten us closer to affordable real-time raytracing. Intel isn't completely out of the game, but it will take a while before they have something cost-competitive.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Your link is broken; here is a proper one: http://www.youtube.com/watch?v=XVZDH15TRro

And the demo in question uses 4 separate servers, each running Knights Ferry, working together to render the game. The laptop only displays it.
http://www.zdnet.co.uk/news/process...up-on-intels-knights-ferry-platform-40089094/

Thanks, the link had already been fixed, though.

Yes, hence my saying "affordable" raytracing. Ze guy who explains ze demo talks about ze setup and how ze laptop is merely displaying ze output.

Given the rapid increase in computational power per dollar in the last several decades, I'm betting that affordable real-time raytracing's day will come. Not yet. But it's not as far off as one might think.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
How do you know how far off I think it is?

Anyway, in each demo image they had ONE high-quality object (car, chandelier) and a bunch of low-quality, nasty, blocky, ancient, crap-looking ones (original resources from the original engine).
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
How do you know how far off I think it is?

Anyway, in each demo image they had ONE high-quality object (car, chandelier) and a bunch of low-quality, nasty, blocky, ancient, crap-looking ones (original resources from the original engine).

Was I talking about you?

I fully expect NVIDIA to have the resources (human, financial, and otherwise) necessary to combat Intel if the grudge match moves beyond rasterization (same old, same old; criticize current-gen raytracing resolutions, textures, etc. if you want, but it lets you do things that are impracticable with rasterization) and on into 3D, raytracing, holograms, etc.

I am not so sure that AMD has the resources necessary to keep up. They may end up selling their graphics business to NVIDIA, INTEL, or someone else, if the situation gets bad enough.

Thus the graphics race may end up being between Intel and NVIDIA--still a two horse race, just with different horses.

By the way, those of you talking about barriers to entry should know that barriers are not impenetrable. Look at what happened to NIKON, for instance, once ASML gained some steam.

A resurgent APPLE has made great inroads into Microsoft's one-time OS monopoly.

INTEL has its hands full with ARM, and if it weren't for its fabs, the situation would be even more serious.

Speaking of INTEL, it used to own the CPU market even more than it does now, and if AMD hadn't been so mismanaged, perhaps it would have been able to continue its success past the Athlon series. Even so, AMD went from near-zero to ~30% of the server market for a while.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
criticize current-gen raytracing resolutions, textures, etc. if you want
1. I wasn't criticizing current ray tracing solutions, I was criticizing this demo.
2. I don't want to criticize it, I am just being unbiased and honest.
3. This demo takes a super ancient, crappy-looking (by today's standards) game (Wolfenstein 3D), slaps ray tracing on it, and replaces one select object per area.
Everything there was crap besides the nice shiny stuff they inserted (car and chandelier), and they seemed to be proud of the fire even though it looked terrible.
I mention this because I am concerned about the performance of a game that is 100% modern, with high-quality textures, physics, models, etc.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
1. I wasn't criticizing current ray tracing solutions, I was criticizing this demo.
2. I don't want to criticize it, I am just being unbiased and honest.
3. This demo takes a super ancient, crappy-looking (by today's standards) game (Wolfenstein 3D), slaps ray tracing on it, and replaces one select object per area.
Everything there was crap besides the nice shiny stuff they inserted (car and chandelier), and they seemed to be proud of the fire even though it looked terrible.
I mention this because I am concerned about the performance of a game that is 100% modern, with high-quality textures, physics, models, etc.

Fair enough, but I am completely underwhelmed with the state of rasterized graphics, too. Why do you think NVIDIA is pushing 3D and Surround and hardware PhysX, and INTEL is pushing raytracing (aside from how it works better with their more complex cores)? Rasterization is the same old, same old, no matter how many tricks you throw in to simulate accurate shadows or whatnot. Due to hardware limitations, rasterization has many more years to live, but I hope we aren't still stuck with it by 2020.
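Since the shadow comparison keeps coming up: here is a minimal, purely illustrative sketch of the per-pixel idea (in Python; the two-sphere scene, light position, and every number are made up for illustration and have nothing to do with Intel's demo). A primary ray finds the nearest surface, then a shadow ray toward the light answers the visibility question directly, which is the kind of effect a rasterizer approximates with shadow maps and similar tricks.

[CODE]
import math

# Purely illustrative scene: a large sphere, a small occluder, one point light.
SPHERES = [((0.0, 0.0, -3.0), 1.0),   # (center, radius) - the main object
           ((1.2, 1.2, -1.8), 0.3)]   # small sphere placed so it casts a shadow
LIGHT_POS = (4.0, 4.0, 0.0)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def nearest_hit(origin, direction):
    """Return (distance, sphere_center) of the closest intersection, or None."""
    best = None
    for center, radius in SPHERES:
        oc = sub(origin, center)
        b = 2.0 * dot(oc, direction)          # direction is unit length, so a == 1
        c = dot(oc, oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue
        for t in ((-b - math.sqrt(disc)) / 2.0, (-b + math.sqrt(disc)) / 2.0):
            if t > 1e-4 and (best is None or t < best[0]):
                best = (t, center)
    return best

def shade(origin, direction):
    hit = nearest_hit(origin, direction)
    if hit is None:
        return 0.0                             # background
    t, center = hit
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = norm(sub(point, center))
    to_light = norm(sub(LIGHT_POS, point))
    diffuse = max(dot(normal, to_light), 0.0)  # simple Lambert shading
    # Shadow ray: an exact visibility query toward the light, the effect a
    # rasterizer would fake with shadow maps. (For brevity this ignores the
    # distance to the light, which is fine for this toy scene.)
    if diffuse > 0.0 and nearest_hit(point, to_light) is not None:
        return 0.05                            # occluded
    return diffuse

# One primary ray per "pixel" of a tiny 8x8 image, camera at the origin.
for y in range(8):
    row = [shade((0.0, 0.0, 0.0), norm((x / 3.5 - 1.0, 1.0 - y / 3.5, -2.0)))
           for x in range(8)]
    print(" ".join(f"{v:.2f}" for v in row))
[/CODE]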

Stuff like ONLIVE isn't going to work well with fast-paced games where timing is critical, such as FPS's, but imagine something like ONLIVE paired up with Intel cloud-powered raytracing:

http://news.yahoo.com/s/pcworld/20110304/tc_pcworld/intelhopestoboostcloudgamingwithraytracing

P.S. http://www.youtube.com/watch?v=ianMNs12ITc and http://www.youtube.com/watch?v=FL7dUcKk9F0 -- and that's with current-day technology. Give it several years, and we'll see what happens.

P.P.S. I came across this which I can't read and which is difficult to translate, but it appears to be a cloud raytracing demo: http://www.youtube.com/watch?v=IeXubCHQIUo Very impressive if they are already doing this in real time over the web, but I doubt it.
 

biostud

Lifer
Feb 27, 2003
18,251
4,764
136
Competition is good, but I can't really see any missing features at the moment, or anything that would need a new competitor, and it seems like performance/watt is better now than ever.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Stuff like ONLIVE isn't going to work well with fast-paced games where timing is critical, such as FPS's, but imagine something like ONLIVE paired up with Intel cloud-powered raytracing

OnLive has, due to budget and costs, allocated anemic amounts of GPU per individual, such that a $200 card gets you a better image rendered at home.

I am not saying that ray tracing isn't awesome; it is, and it's how movies are made. And I think 5 to 10 years from now (closer to 10) there is a good chance of everything being raytraced in real time. But OnLive will not bring it about, because it is far behind on the hardware curve and is doomed to fail for a plethora of reasons I have discussed elsewhere (which is good; I want it to fail because of another set of reasons I mentioned already).

Let's stick to the discussion at hand. They have 4 servers ray tracing that one game for display on a laptop monitor.
Scale up the resolution to 1080p and you get about a 3x increase in computational cost; scale down those super fancy objects slightly (they were overdone), scale the rest of the room up a lot to match the quality of the good objects, and throw in some tessellation to further reduce costs... I am guessing you can do that within a 2x budget.
This puts you at 6x the cost, so 4 servers become 24 servers. Now, this is early hardware, and it's the stupid Intel x86-everywhere initiative that just doesn't work, so let's say you dump it for a real parallel computing architecture like Fermi for a good 2x-4x performance gain; you are back at 4 to 8 servers.
Now, it is critical that we know how many cards are in each server. If each server has 4 cards... well, that is very bad. If each server is just running one Knights Ferry card... then I have to ask why put it in 4 servers? If we are lucky and it's 2 cards per server, we are still looking at 8 to 16 cards... so you need cards that are 8x-16x what we have today, or two cards in SLI/CrossFire that are each 4x-8x the performance of a top card today... at BEST.
This will happen eventually, say in 5 to 10 years.

Those are tons of assumptions and rough guesses... made worse by how scarce the info they gave us is, but that is why I put in ranges to compensate and say "at best."
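For anyone who wants to poke at those numbers, here is a rough sketch of that back-of-envelope arithmetic; every multiplier below is a guess pulled from the post above, not a measured figure, and dividing 24 servers by a 2x-4x gain actually lands around 6-12 rather than the 4-8 quoted, which is a good reminder that this is order-of-magnitude territory at best.

[CODE]
# Back-of-envelope estimate following the post above.
# Every multiplier here is a guess from the post, not a measured figure.

demo_servers = 4                # Knights Ferry servers used in Intel's demo
res_scale = 3.0                 # guessed cost of moving the demo up to 1080p
content_scale = 2.0             # guessed net cost of bringing all assets up to quality
gpu_speedup_range = (2.0, 4.0)  # guessed per-unit gain from a Fermi-style architecture

total_cost = res_scale * content_scale     # ~6x the demo workload
x86_servers = demo_servers * total_cost    # ~24 Knights Ferry servers

gpu_servers = [x86_servers / s for s in gpu_speedup_range]
print(f"~{x86_servers:.0f} x86 servers, or roughly "
      f"{gpu_servers[1]:.0f}-{gpu_servers[0]:.0f} GPU-based servers")
[/CODE]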
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,773
3,149
136
Unfortunately a lot of this was lost in a forum crash, but it really is by far and away the most technical discussion I have seen on the internet about ray tracing.

Real-Time Ray Tracing: Holy Grail or Fools' Errand? (partial reconstruction)

General consensus: ray tracing has just as many issues as rasterization; you have to hack at it just as much as rasterization to make it work in certain situations. Both rasterization and ray tracing have their advantages, and there is nothing to stop you from using both at the same time.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
AMD and nVidia are competing so well right now. It's hard to envision someone else coming in and bringing out better products. Intel seems to be making some headway with their integrated GPUs, but it's going to be quite some time before they can create something more serious.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
AMD and nVidia are competing so well right now. It's hard to envision someone else coming in and bringing out better products. Intel seems to be making some headway with their integrated GPUs, but it's going to be quite some time before they can create something more serious.

Intel's main problem is that they are trying to extend their monopoly. Instead of scaling up their IGP, they are trying to replace GPUs with an inefficient x86 multicore implementation so that they can extend their monopoly into new fields and crush the competition... the problem is that they pay too heavy a performance penalty for it.

It is the exact same marketing-driven design that resulted in the P4 being the failure that it was. Unless they wise up and tell their engineers "design the most powerful and power-efficient GPU you can" rather than "design a GPU that will allow us to extend our x86 monopoly," they will not be able to make headway in the field, unless competing fabs continue to struggle and allow Intel to leverage a frightening process advantage to mitigate their marketing-driven design shortcomings.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Intel's main problem is that they are trying to extend their monopoly. Instead of scaling up their IGP, they are trying to replace GPUs with an inefficient x86 multicore implementation so that they can extend their monopoly into new fields and crush the competition... the problem is that they pay too heavy a performance penalty for it.

It is the exact same marketing-driven design that resulted in the P4 being the failure that it was. Unless they wise up and tell their engineers "design the most powerful and power-efficient GPU you can" rather than "design a GPU that will allow us to extend our x86 monopoly," they will not be able to make headway in the field, unless competing fabs continue to struggle and allow Intel to leverage a frightening process advantage to mitigate their marketing-driven design shortcomings.

I agree with the above, which is also applicable to smartphones and tablets... trying to x86 the world because of your patents isn't always the most efficient thing to do. ARM has exploited this by designing mobile computing with energy efficiency in mind right from the get-go. It should be interesting to see where the battle lines shift in PC-vs-mobile, and if ARM ends up dominating the server market due to energy efficiency. So long as INTC maintains its process advantage, I say fat chance, but I suspect that INTC's process advantage will shrink in the next several years as it hits the brick wall of physics... everyone will have to move to some newer tech soon to keep performance/dollar and performance/watt growing at a fast clip.

And yes, affordable raytracing (i.e., on par with rasterization at similar performance) is not going to happen overnight, or even in the next few years.

I brought up Intel as an example of a potential third-party entrant into high-end graphics. The way things are shaking out, I can see a three-horse race in high-performance gaming graphics for a while: NVDA, AMD, and INTC.