thilanliyan
Lifer
http://www.tomshardware.com/ne...idia-geforce,7944.html
Not bad for a first go...unless G300/RV870 are like double the performance.
Originally posted by: Bob Dylan
:music: the times, they are a changin' :music:
Originally posted by: BFG10K
I doubt this is anywhere near true. Even if the hardware is that fast (which I doubt), the GTX285 is also fast because of good drivers. Based on Intel's track record with the GMA, I don't see good drivers coming into the picture for a long time.
Originally posted by: SickBeast
If that performance estimate is true, then the Larrabee will be faster than I thought it would be. That said, it's not good enough. IMO intel is releasing this thing in a desperate attempt to keep x86 relevant. IMO it will fail.
Originally posted by: Zstream
If we are discussing "game performance", then I say who cares!
Originally posted by: zagood
That's the big question though. I wouldn't doubt that Larrabee could match a GTX285 in CUDA/Stream/GPGPU tasks, but that stupid question pops up: "can it play Crysis?"
Originally posted by: zagood
I'd just like to see something concrete to figure out what the hell this chip is supposed to do.
Originally posted by: SickBeast
If that performance estimate is true, then the Larrabee will be faster than I thought it would be. That said, it's not good enough. IMO intel is releasing this thing in a desperate attempt to keep x86 relevant. IMO it will fail.
Originally posted by: Keysplayr
Whatever Larrabee is, another alternative is always better, whether it has GTX285 performance, 9500GT performance, or anything in between.
Originally posted by: Rusin
Larrabee is a huge and very expensive chip to make. One 300mm wafer could hold only about 64 Larrabees, whereas for comparison one 300mm wafer could hold 94 of Nvidia's 65nm GT200 chips. This means Larrabee couldn't fight the GTX 285 price-wise.
Originally posted by: Scali
Originally posted by: Rusin
Larrabee is a huge and very expensive chip to make. One 300mm wafer could hold only about 64 Larrabees, whereas for comparison one 300mm wafer could hold 94 of Nvidia's 65nm GT200 chips. This means Larrabee couldn't fight the GTX 285 price-wise.
You don't know that.
nVidia has to have a third party like TSMC manufacture their GPUs. Intel does everything in-house, and on a huge scale. Intel's production facilities are also more advanced than TSMC's, and their 45 nm process is very successful and mature, where TSMC is struggling with 40 nm, and 55 nm is still the bread-and-butter of nVidia and AMD GPUs.
Aside from that, yes you may get 94 GT200 chips out of a 300mm wafer, but how many of those are GTX285? Probably a minority, as most of them are salvaged as GTX260/275/295.
So it's really hard to make a comparison between nVidia and Intel... There are so many factors involved here.
Intel has proven before that they can beat a competitor with a much larger chip. The Pentium 4/D were much larger than the competing Athlon XP/64/X2 processors. Instinctively you might think that AMD would get the lowest prices and/or the highest profits, but in fact it was Intel undercutting AMD's prices with the Pentium D series, and Intel had the better business results during that era, so they would have had the higher profit margins.
I think Intel will take advantage of their production capabilities in a similar vein here.
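For what it's worth, the 94-dies figure quoted above for GT200 is consistent with the standard first-order dies-per-wafer approximation. A quick sketch (die areas here are assumptions based on public reports, e.g. GT200 at 65nm being roughly 576 mm²; Larrabee's die size was never officially disclosed):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order estimate: wafer area divided by die area,
    minus a correction for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# GT200 @ 65nm, assuming ~576 mm^2 per die (public estimate, not official):
print(dies_per_wafer(300, 576))  # -> 94, matching the figure quoted above
```

Note this says nothing about yield or binning, which, as pointed out above, is where the real cost comparison gets murky.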
Originally posted by: Keysplayr
You know, I'm still not quite sure what Larrabee is, from a physical standpoint. Is it a discrete PCI-e card? Is it a CPU? Will a specialized motherboard with a new socket be needed? Anyone know? I haven't really been keeping up. It seems like Larrabee has been in development for 30 years now. It got boring.
Originally posted by: ArchAngel777
GMA still has pathetic drivers despite Intel telling us for the last 2 generations (X3100 and X4500) that they are really going to improve their drivers.
Most definitely, especially with OpenGL. I saw some driver queries being run on a GMA and it failed almost every test it was given.
BTW, BFG - Nice work on your Anti Aliasing Comparison. I thought it was well written and easy to understand. I learned a few things that other reviewers didn't explain very well in the past.
Thanks for the kind words; I can't wait to do the write-up for nVidia's hardware. 🙂
Originally posted by: Zstream
If we are discussing "game performance", then I say who cares!
Considering gaming is by far the biggest market these parts compete in, anyone who wants to be a viable discrete GPU competitor should care. Remember that Larrabee is a discrete PCIe part, not some kind of new CPU socket.
The meaning of this chip is not to dominate games or to have any real impact in that segment.
Then it can't really compete with ATi's or nVidia's GPUs.