Mr. Pedantic
We sure need another player in the market. It worries me how both existing players have grown comfortable with their current status; neither is eager anymore to really pull ahead. We get a ~15-25% performance improvement with every generation and a huge load of marketing bull.
To whoever said one player is all we need: would you like to pay north of $600 for a card that barely pulls ahead of the same tier from last generation? You certainly haven't thought that through.
I'd like to see at least a third company in the graphics card industry. It would bring fresh new products and good competition.
I don't understand why that's not the case yet...
There's simply too much that nvidia and ATI don't do right, in my opinion.
ATI's filtering may not be as good, they do something to their Z-range that makes it shorter, and ATI isn't interested in supporting older games. They also don't have PhysX.
Nvidia doesn't do scaling through HDMI in many cases, their OpenGL performance isn't what it used to be, and they only partially support older games (it would be nice if their drivers forced a 32-bit RGBA frame buffer when an application asks for a 16-bit format, and it would also be nice if they included a check box in the drivers to force a complementary FP32 Z-buffer), and they don't have edge-detect AA. Overall, nvidia is better, but they're still not too good.
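That kind of override wouldn't even be hard to expose. Here's a rough sketch of the idea as a user-mode wrapper around the Win32 ChoosePixelFormat call; the hooking approach and exactly which fields get bumped are my assumptions, not how nvidia's driver actually works:

```c
/* Sketch only: upgrade an app's 16-bit pixel format request to 32-bit
   before the real driver sees it. A real override would live inside
   the driver, not in a user-mode wrapper like this. */
#include <windows.h>

int WINAPI ChoosePixelFormat_Forced32(HDC hdc, const PIXELFORMATDESCRIPTOR *ppfd)
{
    PIXELFORMATDESCRIPTOR pfd = *ppfd;   /* copy the app's request */
    if (pfd.cColorBits <= 16) {
        pfd.cColorBits = 32;             /* force 32-bit RGBA color */
        pfd.cAlphaBits = 8;
    }
    if (pfd.cDepthBits <= 16)
        pfd.cDepthBits = 32;             /* force a full-precision Z-buffer */
    return ChoosePixelFormat(hdc, &pfd); /* hand the upgraded request on */
}
```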
It would also be nice if one of them included a Glide wrapper in their drivers that reports a 3dfx hardware ID, so it works with all Glide games.
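The hardware-ID part of such a wrapper is the trivial bit; the real work is translating the rendering calls. A toy sketch of just the identification piece, using the Glide 3.x grGetString entry point (the numeric constants below are placeholders; a real wrapper would take them from glide.h):

```c
/* Toy sketch: answer a game's Glide hardware query with a 3dfx ID.
   Constant values are placeholders standing in for the real glide.h ones. */
typedef unsigned long FxU32;

#define GR_HARDWARE 0xa1
#define GR_RENDERER 0xa2
#define GR_VENDOR   0xa3
#define GR_VERSION  0xa4

const char *grGetString(FxU32 pname)
{
    switch (pname) {
    case GR_VENDOR:   return "3Dfx Interactive";
    case GR_RENDERER: return "Glide";
    case GR_VERSION:  return "3.0";
    case GR_HARDWARE: return "Voodoo Graphics"; /* what the game's check wants to see */
    default:          return "";
    }
}
```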
I really wish Intel had released Larrabee, even if it was slower than Fermi. What made it interesting is that it emulated the whole rasterization pipeline in software, everything except texturing. Even though emulating ROPs and depth units is slower, I think it's a good idea.
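To make that concrete, here is roughly what "emulating a ROP" means: the depth test and pixel write that a normal GPU handles in fixed-function ROP/depth hardware, done per pixel on the CPU cores instead. A made-up minimal version, nothing like Intel's actual code:

```c
/* Conceptual sketch of a software ROP: the depth-tested pixel write a
   GPU normally does in fixed-function hardware, done on the CPU. */
#include <stdint.h>

typedef struct {
    float    *depth;   /* depth buffer */
    uint32_t *color;   /* RGBA8 color buffer */
    int       width;
} Framebuffer;

/* Emulated depth unit + ROP: GL_LESS-style test, then color write. */
static void rop_write(Framebuffer *fb, int x, int y, float z, uint32_t rgba)
{
    int i = y * fb->width + x;
    if (z < fb->depth[i]) {
        fb->depth[i] = z;
        fb->color[i] = rgba;
    }
}
```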
What are Intel's drivers like on Sandy Bridge's integrated GPU? I'm sure that if Intel made a discrete GPU, its drivers would be better than those of ATI and nvidia.
Your link is broken; here is a working one: http://www.youtube.com/watch?v=XVZDH15TRro
And the demo in question uses 4 separate servers, each running Knights Ferry, working together to render the game. The laptop only displays it.
http://www.zdnet.co.uk/news/process...up-on-intels-knights-ferry-platform-40089094/
How do you know how far off I think it is?
Anyway, in each demo image they had ONE high-quality object (car, chandelier) and a bunch of low-quality, nasty, blocky, ancient-looking ones (original resources from the original engine).
"criticize current-gen raytracing resolutions, textures, etc. if you want"
1. I wasn't criticizing current ray tracing solutions; I was criticizing this demo.
2. I don't want to criticize it; I am just being unbiased and honest.
3. This demo takes a super-ancient, crappy-looking (by today's standards) game (Wolfenstein 3D), slaps ray tracing on it, and replaces one select object per area.
Everything there was crap besides the nice shiny stuff they inserted (the car and the chandelier), and they seemed proud of the fire even though it looked terrible.
I mention this because I am concerned about the performance of a game that is 100% modern, with high-quality textures, physics, models, etc.
Stuff like OnLive isn't going to work well with fast-paced games where timing is critical, such as FPSs, but imagine something like OnLive paired up with Intel's cloud-powered ray tracing.
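Back-of-envelope on why the timing hurts for an FPS (every figure below is an assumed round number, just to show the order of magnitude):

```c
/* Rough input-to-photon budget for a cloud-rendered FPS; all figures
   are assumptions, not measurements. */
#include <stdio.h>

int main(void)
{
    int network_rtt = 60; /* ms, client <-> datacenter round trip */
    int render      = 16; /* ms, one 60 fps server frame */
    int encode      = 10; /* ms, video encode on the server */
    int decode      = 10; /* ms, decode + display on the client */

    printf("~%d ms added on top of normal local input lag\n",
           network_rtt + render + encode + decode);
    return 0;
}
```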
AMD and nVidia are competing so well right now that it's hard to envision someone else coming in and bringing out better products. Intel seems to be making some headway with their integrated GPUs, but it's going to be quite some time before they can create something more serious.
Intel's main problem is that they are trying to extend their monopoly. Instead of scaling up their IGP, they are trying to replace GPUs with an inefficient x86 multicore implementation so that they can extend their monopoly into new fields and crush the competition... the problem is that they pay too heavy a performance penalty for it.
It is the exact same marketing-driven design that resulted in the P4 being the failure that it was. Unless they wise up and tell their engineers "design the most powerful and power-efficient GPU you can" rather than "design a GPU that will allow us to extend our x86 monopoly", they will not be able to make headway into the field. That is, unless competing fabs continue to struggle and allow Intel to leverage a frightening process advantage to mitigate their marketing-driven design shortcomings.