GF100 Previews from Digital Experience


Nemesis 1

Lifer
Dec 30, 2006
3D Vision Surround (3 monitors in 3D) requires 2 GF100 cards.

http://www.tomshardware.com/news/Nvidia-3D-Surround,9394.html

So, total cost of ownership:
2 x GF100 cards
1 x SLI-enabled motherboard
3 x 120 Hz TN monitors
1 x 3D glasses
----------------
= lol


Mind you, I'm happy to see that they recognize the significance of 3+ monitor output (even sans 3D Vision). No word on whether you need two identical cards.

Side note: a projector setup is an interesting way to avoid bezels.

http://www.seamlessdisplay.com/products_radius320.htm
 

Shilohen

Member
Jul 29, 2009
That company must be ridiculously happy with the current trend. I guess we should see more like it quite soon.
 

SlowSpyder

Lifer
Jan 12, 2005
I don't understand what is so funny, actually. The way I see it, this is more flexibility for GT200 or, potentially, Fermi owners to have surround gaming with their 2D or 3D displays in SLI. I think it's great, and I was hoping for surround gaming when Nvidia introduced SLI multi-monitor support a short time ago with one of their Big Bang driver releases.

The only problem I see with it is that it's obviously meant to be a very high-end setup, so the cost isn't an issue, but I don't know how many people want to use their ultra-high-end rig with 22" monitors.

I'm personally not interested in Eyefinity, and I'm not interested in Nvidia's copy-cat technology either. 3D could be something of interest one day, but certainly not when I'm limited to relatively expensive 22" monitors.

But as you said, at least the option is there; if you want it, it's there from Nvidia... or, more accurately, it will be there in a few months or so.
 

Seero

Golden Member
Nov 4, 2009
The only problem I see with it is that it's obviously meant to be a very high-end setup, so the cost isn't an issue, but I don't know how many people want to use their ultra-high-end rig with 22" monitors.

I'm personally not interested in Eyefinity, and I'm not interested in Nvidia's copy-cat technology either. 3D could be something of interest one day, but certainly not when I'm limited to relatively expensive 22" monitors.

But as you said, at least the option is there; if you want it, it's there from Nvidia... or, more accurately, it will be there in a few months or so.
I remember my old second-hand 19" CRT cost me 300 bucks. I know people who go with triple projectors for gaming, where the bulbs cost like 400 bucks every now and then. As of now, those 120 Hz LCDs are still rare, but they are the trend for next year.

Money is not a problem; the problem is I don't have it.
 

Wreckage

Banned
Jul 1, 2005
I'm personally not interested in Eyefinity, and I'm not interested in Nvidia's copy-cat technology either.

Copy-cat technology? NVIDIA's Quadro line had support for 3 displays long before the 5xxx series launched. Not to mention Matrox. So who's copying who?
 

Genx87

Lifer
Apr 8, 2002
I am impressed by the complexity of the scenes these cards are rendering at fluid frame rates.
I hope Anand does a comparison between C2D and the i5/i7 line of chips with the newest-generation cards at average resolutions. I really want to get a new card, but I'm not sure if it is worth upgrading the rest of my computer.
 

Nemesis 1

Lifer
Dec 30, 2006
Copy-cat technology? NVIDIA's Quadro line had support for 3 displays long before the 5xxx series launched. Not to mention Matrox. So who's copying who?

Tell me something. What difference does it make who did what first? It's pushing it into the mainstream that matters, and really it doesn't matter who does that, because there's more than one player here. In five years you'll be telling us all how NV invented x86. GROW UP.
 

SirPauly

Diamond Member
Apr 28, 2009
Hey, greedy, dishonest, arrogant Nvidia wants all your money, there's nothing funny about it... ;)

Interesting. Exactly how is it greedy, dishonest and arrogant for a feature to let older-generation tech drive not only 3D but 2D displays as well? You're not forced to buy next-generation hardware for multi-monitor support, which may add value to current hardware for some time.

You're not forced to use only Fermi, or only 3D displays, with surround view. That's flexibility and choice -- not greed, dishonesty and arrogance, to me.

You're complaining for the sake of complaining, and the ironic part is that this is more choice and flexibility.
 

solofly

Banned
May 25, 2003
Interesting. Exactly how is it greedy, dishonest and arrogant for a feature to let older-generation tech drive not only 3D but 2D displays as well? You're not forced to buy next-generation hardware for multi-monitor support, which may add value to current hardware for some time.

You're not forced to use only Fermi, or only 3D displays, with surround view. That's flexibility and choice -- not greed, dishonesty and arrogance, to me.

You're complaining for the sake of complaining, and the ironic part is that this is more choice and flexibility.

As long as it ONLY applies to NV... (I've seen your posts at Rage3D, mister flexibility... lol)
 

Lonyo

Lifer
Aug 10, 2002
Copy-cat technology? NVIDIA's Quadro line had support for 3 displays long before the 5xxx series launched. Not to mention Matrox. So who's copying who?

NV aren't copying anyone.
Multi-display 3D-accelerated graphics across two cards has been around for ages.
I did 3 monitors with a 7800GT + 7200GS years ago for games.

It's just... it wasn't really an easy solution, it didn't work well, and it either required an unusual system setup plus software config, or an expensive card.
AMD have brought it to the mainstream, and NV still won't have, even with their new implementation, which requires two graphics cards.

It could be compared to what you love so much: PhysX.
PhysX was around before NV bought it and repurposed it, and it wasn't (hell, it still isn't) mainstream, but they made it a lot more accessible by bringing it to cheaper cards rather than requiring special setups with a specific accelerator card.
 

solofly

Banned
May 25, 2003
The funny part is that the generous, honest, humble ATI doesn't want any of your money.

I'm sure they do, but they aren't forcing things, locking people in, or playing dirty tactics like Nvidia is, and that's the difference...
 

SlowSpyder

Lifer
Jan 12, 2005
Copy-cat technology? NVIDIA's Quadro line had support for 3 displays long before the 5xxx series launched. Not to mention Matrox. So who's copying who?

It's not about support for three displays; it's about stretching games across 3-6 high-resolution (2560x1600) monitors on an affordable part made for the masses.

When I wrote 'copy-cat' I didn't mean it in a derogatory way: AMD came out with Eyefinity on the 5xxx cards, and now it looks like Nvidia will do the same with Fermi to match them. I guess I should have put a disclaimer right after I mentioned 'copy-cat' so you wouldn't get your panties in a bunch... God forbid someone says something that gives AMD credit (even when it's due) instead of your BFF.
 

SirPauly

Diamond Member
Apr 28, 2009
It's not about support for three displays; it's about stretching games across 3-6 high-resolution (2560x1600) monitors on an affordable part made for the masses.

When I wrote 'copy-cat' I didn't mean it in a derogatory way: AMD came out with Eyefinity on the 5xxx cards, and now it looks like Nvidia will do the same with Fermi to match them. I guess I should have put a disclaimer right after I mentioned 'copy-cat' so you wouldn't get your panties in a bunch... God forbid someone says something that gives AMD credit (even when it's due) instead of your BFF.

Mr Flexibility mode:

That's a good thing, though, because more gaming-experience flexibility is offered. When nVidia offered their super-sampled TA with the 68XX family, ATI countered by offering a similar feature. What difference does it make which company was first, as long as someone can enjoy the feature from nVidia or ATI?

When Eyefinity was offered, it was about time one of the powerful IHVs did it, and I was hoping nVidia would counter with a 3D surround-view type of feature. Gamers win by having more choice, with both offerings having their pros and cons.

This is why competition is great.

End of Mr. Flexibility mode:
 

Janooo

Golden Member
Aug 22, 2005
So 380 will be close to 300W?
http://www.pcworld.fr/2010/01/08/ma...dissipation-thermique-tres-importante/468101/
It does not look good.

Google translation:
The engineer also gave Fermi's power consumption: 300 watts!
That's the limit allowed by PCI-Express 2.0, and remember that Fermi is a single-GPU card! Delays, 300-watt power consumption, significant heat -- for a chip we're told is a technological marvel, Fermi already seems to have been born under a bad sign.
He also whispers that the most powerful Fermi would beat a Radeon HD 5870 but not a 5970.
To go beyond the 5870, the clocks would have to be raised... which would then exceed the 300-watt limit.
Fermi's life could therefore be short, and the next generation could arrive much more quickly...
 

Shilohen

Member
Jul 29, 2009
How serious and reliable is PCWorld usually?

If we are to believe the article, there's a rumor that GF100 has the same performance as a 5870. However, from other posts it seems that other rumors say it's 35% faster, and yet others say more like only 15% faster.

I find the "Fermi certified" case cooling for SLI quite scary as well.
 

toyota

Lifer
Apr 15, 2001
So 380 will be close to 300W?
http://www.pcworld.fr/2010/01/08/ma...dissipation-thermique-tres-importante/468101/
It does not look good.

Google translation:
A single-GPU Fermi will NOT consume 300 watts. That's just what a lot of ignorant people are saying because it has a 6-pin and an 8-pin connector. Most of those same ignorant people don't realize that other cards have had 6-pin and 8-pin connectors too. It just means it could have a TDP that's close to 225 watts, so those connectors are there just for reassurance.
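For reference, here is a rough sketch of the PCI Express power-budget arithmetic behind that argument, using the standard spec limits for the slot and the auxiliary connectors (generic spec numbers, not measured GF100 figures):

```python
# Rough PCI Express power-budget arithmetic (spec ceilings, not GF100 measurements).
SLOT_W = 75        # a PCIe x16 slot can deliver up to 75 W
SIX_PIN_W = 75     # each 6-pin PEG connector adds up to 75 W
EIGHT_PIN_W = 150  # each 8-pin PEG connector adds up to 150 W

six_plus_six = SLOT_W + 2 * SIX_PIN_W               # 225 W ceiling
six_plus_eight = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W ceiling

print(six_plus_six, six_plus_eight)  # -> 225 300

# A 6+8-pin layout only raises the *ceiling* to 300 W; it tells you the board
# may need more than the 225 W a 6+6-pin layout could supply, not that the
# card actually draws 300 W.
```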
 

Shilohen

Member
Jul 29, 2009
A single-GPU Fermi will NOT consume 300 watts. That's just what a lot of ignorant people are saying because it has a 6-pin and an 8-pin connector. Most of those same ignorant people don't realize that other cards have had 6-pin and 8-pin connectors too. It just means it could have a TDP that's close to 225 watts, so those connectors are there just for reassurance.

Hence the question of reliability. If the article is accurate, then the author got that information from a Cooler Master engineer. Now the question is whether he knows what he's talking about and/or whether this was a misunderstanding between the author and said engineer.