GT240 - nvidia's flagship for this holiday season


SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
You guys are also forgetting that nVIDIA is replacing the G98/G96/G94 cores with GT218/GT216/GT215 cores respectively. The old cards aren't going to be around for much longer.

Just like how AMD is going to replace the HD48xx series: atm the HD5770 performs worse than an HD4890, and HD4870 cards can be had for less money. But they are going to get EOLed real soon.

AMD really needs a part between the 5770 and the 5850. Something like a 5830 with another cluster of SPs disabled from the 5850 would be great, but who knows how many GPUs they are getting with that many defective clusters. As of right now, the 5770 performs a little worse than the 4890 and is in the $170ish range, and the next card up is the 5850 at $299. They're really leaving a pretty large gap in between, and chances are its performance would be what most of us want... faster than a 4890 but cheaper than the 5850, and 'fast enough' assuming it was a 1280SP part or something. I'd look Nvidia's way, but it could be late spring before we see anything exciting from them. But I suppose I'm getting off the subject of the GT240 now.
 
Aug 11, 2008
10,451
642
126
Considering these cost $90+ for a card without a power connector, and you can get a whole new PSU with a PCI-E connector for $40ish (400CX), paying much of a premium for a connector-less card has its limits.


I agree. However, I feel comfortable replacing a graphics card myself, but not really comfortable replacing the power supply. I do not want to take the computer in for service to have the power supply replaced either. But you are correct: I have spent enough on 3 low-power cards to have had the power supply replaced and bought a mid-level graphics card such as an HD4850.

I originally put in an HD2600 Pro, which I was satisfied with for its time. I replaced that with an HD4650 DDR2, which was basically a waste; I was really not satisfied with that card. I replaced the 4650 with a Galaxy low-power 9600GT, and am quite happy with the performance. However, it does run hot unless the fan is cranked to more than 50 percent of max.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I agree. However, I feel comfortable replacing a graphics card myself, but not really comfortable replacing the power supply. I do not want to take the computer in for service to have the power supply replaced either. But you are correct: I have spent enough on 3 low-power cards to have had the power supply replaced and bought a mid-level graphics card such as an HD4850.

I originally put in an HD2600 Pro, which I was satisfied with for its time. I replaced that with an HD4650 DDR2, which was basically a waste; I was really not satisfied with that card. I replaced the 4650 with a Galaxy low-power 9600GT, and am quite happy with the performance. However, it does run hot unless the fan is cranked to more than 50 percent of max.


If you can replace a video card then you can replace a power supply. There are typically four screws holding it in the case, and then you just follow the wire sets. It's that easy.

The thing is, this card gives you 9600 performance for much more than 9600 money. It's not a bad performer for the power used, but it's literally about 2x the price it should be. The features this thing provides have just caught up to AMD cards that have been on the market for over a year. This card just doesn't make sense for 99% of us at its current price.
 
Aug 11, 2008
10,451
642
126
If you can replace a video card then you can replace a power supply. There are typically four screws holding it in the case, and then you just follow the wire sets. It's that easy.

The thing is, this card gives you 9600 performance for much more than 9600 money. It's not a bad performer for the power used, but it's literally about 2x the price it should be. The features this thing provides have just caught up to AMD cards that have been on the market for over a year. This card just doesn't make sense for 99% of us at its current price.


Actually, I got the card at Best Buy on closeout for $80.00, which I consider not too bad. Somewhat more than a 4670 from Newegg, but I had a gift card to use also.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
...putting yet more strain on the already overburdened & underperforming TSMC 40nm line, right? Sounds like a winner to me!
 

ther00kie16

Golden Member
Mar 28, 2008
1,573
0
0
nVidia's official statement about the lack of SLI support is "We did not see it fit for this product." Tells you how much crap they themselves think this card is, when even the lowly 6200 had SLI.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I can give you reasons why I will buy a GT220 or GT240 videocard, despite the fact that they are a bad value and not DX 11:

--- they are about the maximum length that will fit in my crowded case
--- I can use the Nvidia Control Panel to create and scale correctly just about any custom resolution (a priority for me). Try to do that with Catalyst.
--- I'm only a light gamer
--- I'm keeping my Win XP/DX9 machine for another two and a half years
--- My system is already running on GF8200 integrated graphics.

I repeat: I agree the GT220 and GT240 are too expensive for their performance. But considering my own variables I'll reluctantly accept paying more for an NVidia product.

If you're only a light gamer then a 9600 GSO probably makes more sense for you. That card has been hovering around the $40 range for 9 months now and would save you a lot of money. Also, the GT240 doesn't add any meaningful new features over the 9600 GSO (no DX11, no Eyefinity, and the GSO already does PhysX), so there's no reason to spend the extra money on it.

Nice comment about using gf8200 integrated graphics btw. THAT is a good reason to pick one card over another. Now that I think about it, I think that my next video card purchase will be a larrabee because my cpu is intel.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
nVidia's official statement about the lack of SLI support is "We did not see it fit for this product." Tells you how much crap they themselves think this card is, when even the lowly 6200 had SLI.

The lack of SLI is surprising. I wonder how much it takes to add SLI? I would think it's just the connection on the PCB, with everything else handled through drivers... but maybe not, I really don't know. But I can't imagine it would have cost much to have it there. It just seems like an odd choice to not even give customers the option on this part.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
...putting yet more strain on the already overburdened & underperforming TSMC 40nm line, right? Sounds like a winner to me!

Could be some truth to that... no doubt both Nvidia and AMD have negotiated "quotas" of existing 40nm capacity and it is on a "use it or lose it" basis.

Yields may be craptastic enough that it wasn't worth Nvidia's (or TSMC's) effort to push for an A2-stepping production release of Fermi (so they may as well spend the idle time taping out an A3 stepping). But at the same time, Nvidia knows they need to be strategic in the use of their existing 40nm capacity allocation, lest AMD successfully convince TSMC that a few more wafer starts on Cypress won't overly burden the fab.

What would you do if you were an Nvidia decision-maker? I'd find something to consume my allocation of 40nm capacity just to keep it out of the hands of my competitor until the yields improve (then I'd switch over to starting Fermi wafers), or until capacity increases to the point that it is no longer practical to keep "sucking up capacity" with my token low-grade GPU parts.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
If you can replace a video card then you can replace a power supply. There are typically four screws holding it in the case, and then you just follow the wire sets. It's that easy.

The thing is, this card gives you 9600 performance for much more than 9600 money. It's not a bad performer for the power used, but it's literally about 2x the price it should be. The features this thing provides have just caught up to AMD cards that have been on the market for over a year. This card just doesn't make sense for 99% of us at its current price.

Please don't take this the wrong way, but if you're afraid to change out a PSU then you need to go buy a Dell. You unscrew it, unplug the connectors, screw in the new PSU, plug in the connectors, and voila, you're done! If you are that concerned about it you could make a list of each connector and where it goes, but honestly this isn't rocket science.

You appear to be the target market for a gt240. Go buy one asap before they're all gone.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Please don't take this the wrong way, but if you're afraid to change out a PSU then you need to go buy a Dell. You unscrew it, unplug the connectors, screw in the new PSU, plug in the connectors, and voila, you're done! If you are that concerned about it you could make a list of each connector and where it goes, but honestly this isn't rocket science.

You appear to be the target market for a gt240. Go buy one asap before they're all gone.

I assume you meant to quote someone else, maybe the person I quoted and told that it's easy to change the power supply? I've changed many power supplies and have been building my own computers for about 10 years now, and enjoy doing it, so I'll pass on buying a Dell. :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
c'mon steve, we all know you aren't capable of swapping out your own PSU, we've been meaning to talk to you about this embarrassing subject for you for some time now...but since bryan let the cat out of the bag I guess we might as well get it out in the open...steve, we are here to help you, please just get a dell dude, your wife will stop crying at night when she thinks you are asleep and the rest of us pinkie-swear to not think any less of you than we already do. honest. for realz this time. cereouslee.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
c'mon steve, we all know you aren't capable of swapping out your own PSU, we've been meaning to talk to you about this embarrassing subject for you for some time now...but since bryan let the cat out of the bag I guess we might as well get it out in the open...steve, we are here to help you, please just get a dell dude, your wife will stop crying at night when she thinks you are asleep and the rest of us pinkie-swear to not think any less of you than we already do. honest. for realz this time. cereouslee.

Ooo... you guys are on to me. I would be better at swapping power supplies, but Nemesis wrote the instructions for me. :)

The wife already goes to bed less than satisfied and a little disappointed; now that she knows I don't know the difference between active PFC and passive PFC, it's no wonder she cries at night!
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
nVidia's official statement about the lack of SLI support is "We did not see it fit for this product." Tells you how much crap they themselves think this card is, when even the lowly 6200 had SLI.

It's probably just disabled in the drivers; I doubt SLI requires any real hardware support.

But I wouldn't be surprised to see SLI die with nvidia's chipset business. At the very least, only the highest end motherboards will include SLI, so there's no reason for anything but the highest end cards to have it.
 

BernardP

Golden Member
Jan 10, 2006
1,315
0
76
Yields may be craptastic enough that it wasn't worth Nvidia's (or TSMC's) effort to push for an A2-stepping production release of Fermi (so they may as well spend the idle time taping out an A3 stepping). But at the same time, Nvidia knows they need to be strategic in the use of their existing 40nm capacity allocation, lest AMD successfully convince TSMC that a few more wafer starts on Cypress won't overly burden the fab.

What would you do if you were an Nvidia decision-maker? I'd find something to consume my allocation of 40nm capacity just to keep it out of the hands of my competitor until the yields improve (then I'd switch over to starting Fermi wafers), or until capacity increases to the point that it is no longer practical to keep "sucking up capacity" with my token low-grade GPU parts.

Hmmm. Very good points... Now we could just add Charlie's musings and innuendo about the oddly-low 40nm yields at TSMC and the overall picture would almost be complete...

http://www.semiaccurate.com/2009/11/16/ati-58xx-parts-delayed-bit-more/
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Hmmm. Very good points... Now we could just add Charlie's musings and innuendo about the oddly-low 40nm yields at TSMC and the overall picture would almost be complete...

http://www.semiaccurate.com/2009/11/16/ati-58xx-parts-delayed-bit-more/

Charlie's needless innuendo is all that I find odd about his articles. That he found a few equally easily baffled engineers to confirm his desire to see mountains where there are molehills is of no surprise to me. I worked side by side with plenty of morons in the fab; god only knows what kind of absurdly inaccurate picture they painted for the vendors when they went on their business lunch and dinner meetings...

When I read a Charlie article like that I just sigh and move on. He's got just enough knowledge to be dangerous: he misrepresents the likelihood of any given scenario in a way that makes it seem unavoidable to the otherwise uneducated layperson looking in from outside the industry. And yet he's just crackers enough that no one who really knows what is going on is about to waste their life doing the kind of charity work needed to further educate the fellow.

If Nvidia were a female and not a business entity you'd swear charlie exhibits all the signs and symptoms of a stalker that has a restraining order out on him. Like a spurned lover, "if I can't have her then no one will! She wouldn't sleep with me so I'm going to go psycho and not rest a minute until the rest of the world is convinced she is a whore!". Dude needs medical help and that's my professional opinion.
 

Cattykit

Senior member
Nov 3, 2009
521
0
0
I don't get these cards at all.

They are horribly off target when it comes to pricing
They don't really compete with anything (due to the price)

If someone can give a decent reason for buying one of these over a 9800GT, HD4770, HD4850, or GTS250, please let me know.

And by the way, Wreckage, did you even see what cards were in the review? Doesn't exactly look like hothardware.com included the cards in the sweet spot (mentioned above).

You should know people have different needs. For some, GPU may stand for gaming processing unit; for some it's graphics processing unit.
For gaming, yes, the 240 is quite ugly. For someone like me, it's quite attractive.
You see, there are a lot of people like me who buy a GPU for 2D graphical needs. In other words, Photoshop and NLE. People like me want a GPU that consumes as little power as possible while providing okay performance for once-in-a-while gaming (had it not been for Modern Warfare 2, I'd have said I don't game at all). Really, what would be the point of having a card that can do 100+ fps in Crysis when I play it once, or not even that? It just sucks up power doing nothing much. That's just pure waste.

One more thing is CUDA. After all these years, CUDA is still in its infant stage, but there's big hope for it among 2D graphics users. Especially so considering the upcoming CS5 is believed to have CUDA support, and more NLE tools are making promises. In this area Nvidia really shines, and the GT240 will be regarded as a great GPU for it: it behaves well in terms of power consumption and its performance is more than adequate.
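
To give a concrete idea of the kind of work I mean, here is a minimal sketch of a per-pixel brightness adjustment in CUDA, the sort of embarrassingly parallel 2D job that photo and NLE tools offload to the GPU. This is a hypothetical toy example (the kernel and all the names are made up, not code from any shipping application):

#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* Hypothetical example: add a brightness offset to an 8-bit grayscale frame. */
__global__ void brighten(unsigned char *pixels, int n, int offset)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = (int)pixels[i] + offset;                              /* adjust */
        pixels[i] = (unsigned char)(v > 255 ? 255 : (v < 0 ? 0 : v)); /* clamp  */
    }
}

int main(void)
{
    const int n = 1920 * 1080;                  /* one HD frame of pixels */
    unsigned char *h = (unsigned char *)malloc(n);
    for (int i = 0; i < n; ++i) h[i] = 100;     /* dummy mid-gray frame */

    unsigned char *d;
    cudaMalloc((void **)&d, n);
    cudaMemcpy(d, h, n, cudaMemcpyHostToDevice);

    brighten<<<(n + 255) / 256, 256>>>(d, n, 20);   /* +20 brightness */
    cudaMemcpy(h, d, n, cudaMemcpyDeviceToHost);

    printf("pixel 0 after adjustment: %d\n", h[0]); /* expect 120 */
    cudaFree(d);
    free(h);
    return 0;
}

Every pixel is independent, which is exactly why even a modest part like the GT240 can chew through a filter like this faster than the CPU.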

Do I like this situation? Hell no, but as of now, for someone like me, choices are quite limited. I can't go ATI due to CUDA, and nVidia's offerings sucked major ass at least until the 240 came along.

Just go over to photo editing forums, especially NLE forums; you'll be amazed how different the atmosphere is there.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126

Even if you need to stick with Nvidia, the GT240 doesn't appear to make much sense. There is a GTS250 on Newegg right now for $99.99 AR. Cheaper and better performance (I would assume better performance with CUDA as well). The GT240 w/GDDR5 would be a much better part if it were $70; it just doesn't make much sense, even against other Nvidia parts, at its current price.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Even if you need to stick with Nvidia, the GT240 doesn't appear to make much sense. There is a GTS250 on Newegg right now for $99.99 AR. Cheaper and better performance (I would assume better performance with CUDA as well). The GT240 w/GDDR5 would be a much better part if it were $70; it just doesn't make much sense, even against other Nvidia parts, at its current price.

Actually, the GT240 would be a more capable card for GPGPU apps. Because the GT215 core inherits the GPGPU capabilities of the GT200, it supports CUDA 3.0 (quite a large difference against the G92) and may even support DP, something that could be found out (probably not). It also has 96 ALUs, which isn't too far behind the GTS250.

From a spec point of view, I believe the GT240 only falls behind the 9600GT because of its 8-ROP limitation. In situations where it's pixel-fillrate limited, the latter will clearly outperform the former, since it has 16 ROPs.

Previous-generation cards have outperformed the new ones most of the time because of the price advantage. However, this always happens during transitions, and soon the G9x parts are going to be replaced altogether. Once the G92/GT200 is replaced, these parts will start to make sense (and also once the prices start to settle/mature a bit).
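
On the DP question, that actually is something anyone with the card could check: CUDA exposes double precision on devices with compute capability 1.3 or higher, and the runtime API reports each device's compute capability directly. A minimal sketch (the output formatting is my own):

#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        fprintf(stderr, "no CUDA-capable device found\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        /* double precision requires compute capability 1.3 or higher */
        int has_dp = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        printf("device %d: %s, compute capability %d.%d, DP: %s\n",
               i, prop.name, prop.major, prop.minor, has_dp ? "yes" : "no");
    }
    return 0;
}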
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Why do they keep releasing the same old cards on g92 architecture and give them a different label?

Why do people keep on making this claim? They always make updates or improvements.
The problem with this card isn't its architecture; it's that it is underperforming and overpriced.
 

Cattykit

Senior member
Nov 3, 2009
521
0
0
CattyKit said:
People like me want a GPU that consumes as little power as possible

Even if you need to stick with Nvidia, the GT240 doesn't appear to make much sense. There is a GTS250 on Newegg right now for $99.99 AR. Cheaper and better performance (I would assume better performance with CUDA as well). The GT240 w/GDDR5 would be a much better part if it were $70; it just doesn't make much sense, even against other Nvidia parts, at its current price.

http://www.techpowerup.com/reviews/MSI/GeForce_GT_240/28.html

Read. And read Cookie Monster's reply again.
 

Cattykit

Senior member
Nov 3, 2009
521
0
0
BTW, this whole thing would get quite interesting if Fermi falls behind ATI's offerings in terms of gaming performance.

The majority, if not all, of people on hardware sites would trash Fermi for what it doesn't do well, while graphics people would drool all over it (not because of what it can do as of now, but what it will be able to do).

Again, the thing is... people on hardware sites think of GPU as Gaming Processing Unit, ignoring the fact that there's a vast number of people who don't agree with such a term.

Though those people are never mentioned in hardware forums, there are people who solely care about 2D graphics performance, power consumption, fan noise, and heat. For them, CUDA is the best thing. Though CUDA has been nothing more than marketing hype so far, it's the driving force trying to push this whole Gaming Processing Unit approach toward a Graphics Processing Unit one. For NLE people, this is like the 2nd coming of Jesus Christ, I tell ya.
 
Aug 11, 2008
10,451
642
126
Actually, after playing Modern Warfare 2 and seeing the overall state of PC gaming in general, nVidia may ultimately be on the right track in trying to expand the use of "graphics cards" into other areas. Is it really worth having an expensive gaming PC with a 300 to 500 dollar graphics card when nearly every game is ported from consoles to the computer and plays more and more like a console game? Sorry, go ahead and rip me if you want, but I am really disappointed with the games being made for the PC now.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Actually, after playing Modern Warfare 2 and seeing the overall state of PC gaming in general, nVidia may ultimately be on the right track in trying to expand the use of "graphics cards" into other areas. Is it really worth having an expensive gaming PC with a 300 to 500 dollar graphics card when nearly every game is ported from consoles to the computer and plays more and more like a console game? Sorry, go ahead and rip me if you want, but I am really disappointed with the games being made for the PC now.

And what does that have to do with the GT240?

:\