Nvidia cuts out reviewers for the GTS250


Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Creig
Originally posted by: keysplayr2003
And you missed the biggest point of all. These sites would not exist if it wasn't for the hardware. Nothing to write reviews on means no review site. I hope you didn't think that people go to H to see the wonderful banner ads there.

And yes, I suppose you're right about results being skewed in favor of the manufacturer if they show superiority in some way, shape or form. In this case, beyond only gaming. Gaming isn't the only thing these cards are about anymore. Things are changing much faster than anyone ever anticipated. Suddenly, the GPU became a threat to the CPU. That happened very fast and is gaining momentum. You can't ignore these things.

I didn't miss anything. People go to the websites to read unbiased reviews about computer hardware. It is in the site's best interest to provide consistent, unbiased reviews that allow direct comparison to hardware from various manufacturers. By providing these reviews, they can generate traffic to their site which in turn provides revenue through banner ads.

We all know this already. Almost sounds like you are just discovering this for yourself the way you recited the process there. ;)

If a company dictates the terms of the benchmarks, that site can no longer provide direct comparison between products as the results will most likely be skewed. Also, by compelling reviewers to benchmark certain features, they are trying to generate free buzz for something that may or may not interest the reviewer and his/her readers.

Game benchmarks would be run anyway. Can you tell me exactly how it is "biased" if Nvidia wanted H to bench the latest CUDA and PhysX apps? What is biased about that?
Why isn't Nvidia allowed to request a review of these apps? H said no, so Nvidia said no.
Hell, half the people on this forum dislike H's benchmark methods anyway and always discount them because there are too many open "variables" and such to be considered a trustworthy source of information. So, no big loss, right? Some folks say H will purchase their own GTS2x0's to bench, but some say they might not even bother. Who knows.


Nvidia should just leave well enough alone and let their cards speak for themselves. If there is enough public interest, the extra features like PhysX and CUDA will be benchmarked simply to satisfy the reviewers' audience desire to read about them and generate more website traffic. If not, then maybe those features simply aren't important.

Well, I can see PhysX being the less important feature, to be sure. CUDA-based apps, on the other hand? They're pretty far from unimportant. Can you sit there and say CUDA isn't important for the world right now? Be warned, I have a smattering of links here to knock you down if you say it's unimportant. So be realistic, or I'll put you there. ;)
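For anyone wondering what a "CUDA-based app" actually looks like under the hood, here's a minimal, hypothetical sketch: a kernel that adds two arrays on the GPU. It's only an illustration of the programming model, not one of the apps Nvidia asked reviewers to bench.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread handles one array element; the card runs thousands of these in parallel.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million floats
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers, plus copies across the bus
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("hc[0] = %f\n", hc[0]);          // expect 3.000000
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Compile it with nvcc and that's the whole app; real CUDA workloads are just much bigger versions of the same copy-in, compute, copy-out pattern.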

 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: chizow
The G92 GTS replaced the G80 GTS actually, so my comparison is correct. The only problem in the original transition from G80 to G92 was the 8800GT and GTS staying with 8800; they should've been 9800, but without a halo product that wasn't going to happen, which is why we had the additional rebrand of G92 parts from 8800 to 9800.
For some reason, though, you left the GTS G92 out of your comparison. Also, you forgot that the 8800GTS did offer 1GB models, so 1GB on the GTS 250 is nothing new.

Yep, Nvidia parts are adept overclockers; we know this because it's been proven time and again. But thanks for confirming with your own experiences. The 55nm core shrink enabled higher clocks on both the core and shader rather than cutting power draw, although later board revisions did require less power, as demonstrated by the shift from 2x6-pin to a single 6-pin.
Yes, the G92 has some headroom, but the G92b didn't increase that headroom. As I pointed out, my 8800GTS G92 runs just fine at 800MHz with only a 6-pin connector. The 8800GTS also offered 1GB models, so what's really so great about the GTS 250? *increased factory clocks*

Why would they replace it when it's still competitive in all market segments? As for the GTS 250, it's not meant for the sub-$100 market; sounds like the GTS 240 may fill that role. The GTS 250 looks poised to land somewhere in between the 4850 and 4870 512MB in both price and performance.
If nVidia can do it, then all the power to them. The 4870 512MB is already at $145 AR, so I really don't see this card having much room on price.

http://www.newegg.com/Product/...x?Item=N82E16814161268
http://www.newegg.com/Product/...x?Item=N82E16814161236
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Had? So what happened to it? It wasn't the same super overclocking sample producing artifacts all over your QW screenshots in the 9.2 driver thread, is it?
Sold it along with the rig it was in to a friend.

The screenshots I posted in the 9.2 driver thread are with the X1950 XTX I'm borrowing from my brother. It's a driver rendering error, one that is corrected by disabling Catalyst A.I.

You don't correct artifacts due to overclocks by disabling Catalyst A.I. Don't strawman.

It wasn't difficult to get a significant overclock at stock voltages on those parts; voltmodding was only necessary for "Extreme" overclocks.

Right. They overclocked. So did the X1900 I had. That was my point.

It did it well, and even when I later built a water cooling setup I could continue upping the clocks without ever having to resort to a volt-mod. Thus, it was a good overclocking card from ATi, contrary to what you want to believe based on them not selling them at factory overclocks.

If I can swing the money soon, I think a GTX 260 would be a great buy, but hearing that they cut components like the Volterra chip hurts that card's overclocking capability.

Possibly...

No. It will. No matter what you want to believe, taking away the ability to adjust voltages through software will put a limit on the degree of overclock you get without volt-modding. Period. Overclocking 101.

...but you'll be able to quickly tell from what kind of OC variants the OEMs are offering.

Just as you would "be able to quickly tell" from what users are getting.

Again, not only is there a lack of overclocked parts...

As I said earlier, I don't know why ATi doesn't sell more SKUs as factory overclocked models, but I do know that just because they don't doesn't mean the cards can't overclock well.

...there haven't been rebrands or refreshes to close the gap in performance.

Yes, you're right, as you said there haven't been any rebrands ;)

This of course makes no sense, as ATI has been trailing the competition, so leaving performance on the table makes absolutely no sense at all.

They took a completely different strategy this time around. It's been discussed and you can read it here.

I'll leave the last word to you, as I believe my point is made. Though, something tells me your next post will continue to insinuate that 4800s are the worst cards on the planet.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
For some reason, though, you left the GTS G92 out of your comparison. Also, you forgot that the 8800GTS did offer 1GB models, so 1GB on the GTS 250 is nothing new.
Why would I include the G92 GTS when it didn't fill the market segment or replace any of the parts I listed? As for a 1GB GTS, it was offered by a single board partner I believe for a limited time. Still, it wouldn't have performed as well as a GTS 250 with higher clocks on a 55nm process.

Yes, the G92 has some headroom, but the G92b didn't increase that headroom. As I pointed out, my 8800GTS G92 runs just fine at 800MHz with only a 6-pin connector. The 8800GTS also offered 1GB models, so what's really so great about the GTS 250? *increased factory clocks*
Sure it did, particularly with the shader clocks. I believe there were factory clocked parts hitting 2000 on the shader with overclocked results hitting 2200 along with higher clocks on the core in the 850MHz range. Can your 8800GTS do that? Didn't think so.

As for what's so great about the GTS 250, nothing much, but it looks like it's going to offer enough improvements over the 9800GTX+ to not only clearly beat the 4850, but also to beat the 4870 512MB in numerous instances. Not too bad, certainly worthy of a rebrand, with a spot just below the GTX 260.

If nVidia can do it, then all the power to them. The 4870 512MB is already at $145 AR, so I really don't see this card having much room on price.

http://www.newegg.com/Product/...x?Item=N82E16814161268
http://www.newegg.com/Product/...x?Item=N82E16814161236
Looks like they can: the GTS 250 is rumored to be $130, and $129 is quoted by Xbit.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
Sold it along with the rig it was in to a friend.

The screenshots I posted in the 9.2 driver thread are with the X1950 XTX I'm borrowing from my brother. It's a driver rendering error, one that is corrected by disabling Catalyst A.I.

You don't correct artifacts due to overclocks by disabling Catalyst A.I. Don't strawman.
Oh, I never meant to suggest the artifacting was due to overclocking; I'm well aware of ATI's Z/depth buffer woes. I was just establishing a baseline reference for my BS meter. ;)

Right. They overclocked. So did the X1900 I had. That was my point.
No, your point was that they didn't overclock well without voltmods; the link I provided showed they overclocked as well as or better than what you claimed with your X1900, without the need for any such mods.

It did it well, and even when I later built a water cooling setup I could continue upping the clocks without ever having to resort to a volt-mod. Thus, it was a good overclocking card from ATi, contrary to what you want to believe based on them not selling them at factory overclocks.
So what were your final overclocks on said water-cooled, overclocked card? Again, just setting a baseline for the BS meter.

No.....Overclocking 101.
Funny, because Overclocking 101 also tells you that increasing voltage won't always result in increased clocks, and that additional clocks may be achieved without any additional voltage if voltage is already sufficient. Kinda like how the voltage on GT200s is already high enough to maximize overclocks, and increasing it yields very little, if any, benefit.

Just as you would "be able to quickly tell" from what users are getting.
And what have you seen from GTX 260 or 285 users that would make you think they overclock worse than the 65nm versions? If anything, they overclock better.

As I said earlier, I don't know why ATi doesn't sell more SKUs as factory overclocked models, but I do know that just because they don't doesn't mean the cards can't overclock well.
I know why, they like being 2nd? That doesn't make any sense.....

Yes, you're right, as you said there haven't been any rebrands ;)
Of course not, I clearly stated high-end. ;) There's been plenty of rebranding on low-end garbage though.

They took a completely different strategy this time around. It's been discussed and you can read it here.
How is that different than before? Overclock the hell out of their chips to stay competitive, underclock the rest to meet market segments.

I'll leave the last word to you, as I believe my point is made. Though, something tells me your next post will continue to insinuate that 4800s are the worst cards on the planet.
They're not the worst; why would I say such a thing? They're just not the best. ;)
 

mmnno

Senior member
Jan 24, 2008
381
0
0
Originally posted by: chizow
Originally posted by: mmnno
Yes undoubtedly, except you managed to miss the mark completely in this thread.

The problem with random, arbitrary naming conventions is that when you're breaking them once or twice a generation, they're no longer conventions. Yeah, they're just names, and shuffling them around isn't going to be a serious obstacle to purchasing, but it definitely seems like an attack on me as a potential customer.

Incrementing the designations at the top of the line in concert with a high-percentage performance increase doesn't bother me so much. The 9800GTX was annoying because it flubbed the second half of that routine. The GTS 250 may not be any different from the R200, but since that was in the non-gamer segment (also contemporary with the GF4MX, a far greater transgression) it wasn't seen as objectionable. As you said, the names are arbitrary, so it's a combination of factors that makes this scheme irritating.
I haven't missed the mark; it is what it is: a rebrand, just as the RV770/4970 is going to be a rebrand of old tech. As long as the naming conventions fall in line with performance relative to their product designations, what's the problem? In the case of this rebrand, the change clarifies the performance level, if anything.

The only problem with arbitrary naming conventions is when people like you try to claim one is intrinsically better than another, or that one makes more sense than another. For example, ATI just rolled out their RV770LE refresh, which happens to perform better than their 4830, yet they're naming it the 4750 and RV740 on a 40nm process. Does it make sense? No, it doesn't. I guess we need a rebrand, right? If not, ATI's naming convention is clearly flawed, right?

You missed the mark on hypocrisy, even though you judged yourself 'undoubtedly' right.

4970 is not a rebrand. It's in the same 4xxx series and using the same gen GPU. That's like calling the 8800 Ultra a rebrand.

RV740 is a rebrand, and they do need to fix the 4830 situation, because the tech has moved beyond their current product line. Just like what we had with the 8800GT supplanting the G80 8800GTS. Rather annoying to try to keep up with that crap, and very annoying to keep your friends and family from getting ripped off by the older, low-value parts.

If nVidia was doing the same thing ATi is with RV790, they would be introducing a card called the 9900GTX. That would be stupid and calling it GTS 250 is probably a better idea, but that fact doesn't make the marketing more palatable.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: chizow
Why would I include the G92 GTS when it didn't fill the market segment or replace any of the parts I listed? As for a 1GB GTS, it was offered by a single board partner I believe for a limited time. Still, it wouldn't have performed as well as a GTS 250 with higher clocks on a 55nm process.
Why wouldn't you list the 8800GTS G92? It's not like the 9800GTX replaced the 8800 Ultra for some people, yet you put it in your comparison. The Gainward 8800GTS 1GB ran at a stock clock of 730MHz, which is a nice clock; with OCing it comes very close to that GTS 250.

Sure it did, particularly with the shader clocks. I believe there were factory clocked parts hitting 2000 on the shader with overclocked results hitting 2200 along with higher clocks on the core in the 850MHz range. Can your 8800GTS do that? Didn't think so.
I can hit those factory clocks with my 8800GTS, and on the core I could come very close. I'm sure with a volt mod I could hit that 850, though. The real question is how much OC headroom the GTS 250 is going to offer.

As for what's so great about the GTS 250, nothing much, but it looks like it's going to offer enough improvements over the 9800GTX+ to not only clearly beat the 4850, but also to beat the 4870 512MB in numerous instances. Not too bad, certainly worthy of a rebrand, with a spot just below the GTX 260.
You're more than likely using the pcgameshardware.com review for your info, but that review looked for cases where 512MB cards were limited. Also, they're using outdated drivers on the ATi card, and they didn't compare titles like Fear 2, where the 4870 rocks. Overall the 4870 512 is still going to be the faster card, as long as the user doesn't overdo the memory limits with HD packs or crazy Crysis Warhead settings.

$130 for the 512MB version; if that's the case, the 4870 512MB is worth the $15 more for the extra memory bandwidth. The 1GB at $150 is nice, though.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Originally posted by: Creig
I didn't miss anything. People go to the websites to read unbiased reviews about computer hardware. It is in the site's best interest to provide consistent, unbiased reviews that allow direct comparison to hardware from various manufacturers. By providing these reviews, they can generate traffic to their site which in turn provides revenue through banner ads.
We all know this already. Almost sounds like you are just discovering this for yourself the way you recited the process there.

No, just explaining it all out to you since you seem to not understand how the process works.


Originally posted by: keysplayr2003
Originally posted by: Creig
If a company dictates the terms of the benchmarks, that site can no longer provide direct comparison between products as the results will most likely be skewed. Also, by compelling reviewers to benchmark certain features, they are trying to generate free buzz for something that may or may not interest the reviewer and his/her readers.
Game benchmarks would be run anyway.

Yes, but if reviewers are only allowed to bench certain games or certain resolutions, then the review is no longer valid, as it paints an unrealistic picture of a card's performance.

Originally posted by: keysplayr2003
Can you tell me exactly how it is "biased" if Nvidia wanted H to bench the latest CUDA and PhysX apps? What is biased about that?
Why isn't Nvidia allowed to request a review of these apps? H said no, so Nvidia said no.
Hell, half the people on this forum dislike H's benchmark methods anyway and always discount them because there are too many open "variables" and such to be considered a trustworthy source of information. So, no big loss, right? Some folks say H will purchase their own GTS2x0's to bench, but some say they might not even bother. Who knows.

There's nothing wrong if a manufacturer "requests" that certain features be tested. But if that company sets the condition that a reviewer MUST benchmark certain features in order to receive a sample, then it is no longer a "request", but a demand.


Originally posted by: keysplayr2003
Originally posted by: Creig
Nvidia should just leave well enough alone and let their cards speak for themselves. If there is enough public interest, the extra features like PhysX and CUDA will be benchmarked simply to satisfy the reviewers' audience desire to read about them and generate more website traffic. If not, then maybe those features simply aren't important.
Well, I can see PhysX being the less important feature, to be sure. CUDA-based apps, on the other hand? They're pretty far from unimportant. Can you sit there and say CUDA isn't important for the world right now? Be warned, I have a smattering of links here to knock you down if you say it's unimportant. So be realistic, or I'll put you there.

:roll:

As I stated, it should be up to the reviewers and their respective audiences to determine what should be contained in their reviews, not Nvidia.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: mmnno
You missed the mark on hypocrisy, even though you judged yourself 'undoubtedly' right.
How could I miss the mark on hypocrisy when I've never claimed to distinguish between the ridiculousness of arbitrary naming conventions? The whole point is, there is no 'undoubtedly' right lol.

4970 is not a rebrand. It's in the same 4xxx series and using the same gen GPU. That's like calling the 8800 Ultra a rebrand.
Sure it is; it doesn't follow the established RV770 naming convention of 48XX, like 4830, 4850, and 4870. But I guess that's because they also rebranded the chip to RV790 even though it's the same chip. See how neat and clean that was before the rebrand? As for the 8800 Ultra, it did follow the same naming scheme and kept the 8800, only adopting a new surname, which is consistent with their GT/GTS/GTX/Ultra usage of the past.

RV740 is a rebrand, and they do need to fix the 4830 situation, because the tech has moved beyond their current product line. Just like what we had with the 8800GT supplanting the G80 8800GTS. Rather annoying to try to keep up with that crap, and very annoying to keep your friends and family from getting ripped off by the older, low-value parts.

If nVidia was doing the same thing ATi is with RV790, they would be introducing a card called the 9900GTX. That would be stupid and calling it GTS 250 is probably a better idea, but that fact doesn't make the marketing more palatable.
LOL, it's only annoying if you overcomplicate the issue and try to make a stink over trivial naming conventions.
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: Idontcare
I am MOST interested in what Anand has to say about this when/if he gets a review sample.

Anand is not getting a review sample.

Derek is. ;)

Review should be up on 3/3 if UPS doesn't lose the card.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
Why wouldn't you list the 8800GTS G92? It's not like the 9800GTX replaced the 8800 Ultra for some people, yet you put it in your comparison. The Gainward 8800GTS 1GB ran at a stock clock of 730MHz, which is a nice clock; with OCing it comes very close to that GTS 250.
I wouldn't list the 8800GTS G92 for the same reason I didn't list the 8800GTS G80, lol, isn't this obvious? And where did I claim the 9800GTX replaced the Ultra? It replaced the GTX. If I showed a product replacing the Ultra, it'd be the 9800GX2 as the halo product.

And Gainward? I guess that's why we didn't hear much about that part, they only serve smaller markets nowadays.

I can hit those factory clocks with my 8800GTS, and on the core I could come very close. I'm sure with a volt mod I could hit that 850, though. The real question is how much OC headroom the GTS 250 is going to offer.
Again, there's no doubt Nvidia parts are great overclockers, we know this. We also know your 65nm G92 can't hit the same clocks as the 55nm G92s. Got it. As for how much headroom is left on the GTS 250? Sounds like about 100MHz on the core and 200MHz on the shader at least. Not too bad.

You're more than likely using the pcgameshardware.com review for your info, but that review looked for cases where 512MB cards were limited. Also, they're using outdated drivers on the ATi card, and they didn't compare titles like Fear 2, where the 4870 rocks. Overall the 4870 512 is still going to be the faster card, as long as the user doesn't overdo the memory limits with HD packs or crazy Crysis Warhead settings.

$130 for the 512MB version; if that's the case, the 4870 512MB is worth the $15 more for the extra memory bandwidth. The 1GB at $150 is nice, though.
Like I said, the differences look to be enough to place it firmly between the 4850 and 4870 512MB in both price and performance and it manages to beat the 4870 in many instances. Certainly seems worthy of its rebranded GTS 250 designation. But I guess we'll find out for sure soon enough. :)
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Originally posted by: chizow
I wouldn't list the 8800GTS G92 for the same reason I didn't list the 8800GTS G80, lol, isn't this obvious? And where did I claim the 9800GTX replaced the Ultra? It replaced the GTX. If I showed a product replacing the Ultra, it'd be the 9800GX2 as the halo product.
It's not really obvious why you didn't list the 8800GTS G92, which clearly should have been listed as it's the G92 chip. Yeah, you listed the 8800GTX, my bad.

And Gainward? I guess that's why we didn't hear much about that part, they only serve smaller markets nowadays.
Still, the card does exist, so 1GB G92s are nothing new. Also, it has 730/1050 clocks, which are very close to the GTS 250 1GB's 738/1100 clock speeds.
http://forum.beyond3d.com/show...p=1270590&postcount=39

Again, there's no doubt Nvidia parts are great overclockers, we know this. We also know your 65nm G92 can't hit the same clocks as the 55nm G92s. Got it. As for how much headroom is left on the GTS 250? Sounds like about 100MHz on the core and 200MHz on the shader at least. Not too bad.
I hit 820MHz at stock, versus some G92b OCs which hit 840, a whole 2% difference.
http://enthusiast.hardocp.com/...l?art=MTU1NCw4LCw5MA==
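(For reference, taking those two figures at face value: (840 - 820) / 820 ≈ 2.4%, so "a whole 2%" is about right.)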

Like I said, the differences look to be enough to place it firmly between the 4850 and 4870 512MB in both price and performance and it manages to beat the 4870 in many instances. Certainly seems worthy of its rebranded GTS 250 designation. But I guess we'll find out for sure soon enough. :)
The 1GB model is only going to beat the 4870 512 when it's memory limited or the title really favors Nvidia drivers. If the clocks are still 738MHz then NO, it doesn't, but let's see what happens.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
It's not really obvious why you didn't list the 8800GTS G92, which clearly should have been listed as it's the G92 chip. Yeah, you listed the 8800GTX, my bad.
I didn't list every G92, just as I didn't list every G80, just as I didn't list every G92b. How is this not obvious? lol. I listed 1 part with its logical successor or predecessor, listing something like the G92 GTS would not only be redundant, it wouldn't be accurate, as once again, the G92 GTS replaced the G80 GTS. How is this hard to understand?

Still, the card does exist, so 1GB G92s are nothing new. Also, it has 730/1050 clocks, which are very close to the GTS 250 1GB's 738/1100 clock speeds.
http://forum.beyond3d.com/show...p=1270590&postcount=39
And? The GTS 250 is a rebranded overclocked 1GB G92 GTS available from one vendor in select markets, or a 1GB 9800GTX+. Great lol. I guess the 1GB G92 GTS was the best part nobody knew about.

I hit 820MHz at stock, versus some G92b OCs which hit 840, a whole 2% difference.
http://enthusiast.hardocp.com/...l?art=MTU1NCw4LCw5MA==
I thought it was 800? Not that it matters. Again, you keep saying your G92 was a great overclocker; no one is denying that, we know Nvidia parts are awesome like that. But it still can't hit the clock speeds seen with a 55nm die shrink; there's no shame in that.
860/2133/1245
855/2200/1275
863/2071/1250

The 1GB model is only going to beat the 4870 512 when it's memory limited or the title really favors Nvidia drivers. If the clocks are still 738MHz then NO, it doesn't, but let's see what happens.
Right, where did I say differently? The GTS 250 soundly beats the 4850 and even manages to beat the 4870 512MB in some games. We'll have to see on the clocks, but I'm sure we'll see significant factory overclocks on these parts, which will certainly come close to that 4870 512MB even in non-VRAM-limited situations and games.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
No, your point was that they didn't overclock well without voltmods; the link I provided showed they overclocked as well as or better than what you claimed with your X1900, without the need for any such mods.

One thing I want to be clear on. I never said that the 78-7900 series didn't overclock. They did, pretty well in fact, as you've shown.

All I said was that compared to the X1900 counterparts, it was more difficult due to not being able to control voltages at a whim. Before he got the X1950 XTX, my brother had the 7800 GT and, although we ended up volt-modding the thing, it had already reached "good" clocks. It's just one more hurdle to tackle when voltage control isn't supplied from the factory.

Don't jump to conclusions and assume I meant the X1900s overclocked better as a whole. I don't really know if consumers saw higher % overclocks with X1900s vs. the 78-7900 series, but I do know that supplying the ability to adjust the voltages didn't exactly restrict overclocking.

If ATi was serious about their cards not being able to overclock, they would have done away with the Volterra chip to save costs. That's not the case. Either they're being friendly by keeping it there as an option for consumers to pursue, or they're wasting money. One is quality service, the other is poor spending management. Either way, as a consumer, I like it there. And other consumers do as well, given the success people have had overclocking the 4800s.

Thus, when I hear that nVidia is making cuts to their cards and installing cheaper components from the get-go, I start to wonder if things like buzzing will be even worse, or if, should I decide to overclock the card, I'll be more limited than I was. I don't always game with headphones, but I do play less-stressful games, so the buzzing may or may not be an issue for me. But I can't help but bite my cheek when I see a larger company with a quality product cutting it down so much just to match the price of competition that isn't making cuts on quality components. I'd much rather they had kept it slightly more expensive so I'd know that the product I'm getting really is worth more than the latter.

chizow, you're unlikely to comprehend other people's contemplations such as this because it seems you're firm in the belief that people will point out flaws, limitations, or negative characteristics in nVidia's hardware in an attempt to promote the competition. That has not been the case for me. If I don't like something that nVidia did, it's because it made me doubt that I was getting a truly "better" product from them. Perhaps I shouldn't be concerned with hit-and-miss things like the buzzing, but for some reason I am, especially when I know nVidia has the potential to provide great quality. Likewise with the greater limitation on user overclocking. Sure, they offer factory overclocks...for a premium. But being able to overclock on my own if I wish, with my only limit being thermals, is a benefit I hate to see disappear from a product I'm buying.

Way off topic, I know, and I apologize. As for the GTS 250 renaming, again, I get why they're doing it. And, indeed, if there are differences between a 9800 GTX+ and a GTS 250, I can see the sense in a rename. Albeit, I don't think categorizing it as part of the GT* 200 series is accurate, but hey, it's not my product.

The latest Anandtech articles still show a 9800 GTX+ being a great bang for buck product, and I'm glad nVidia had an architecture that has been working so well for so long.
 

garritynet

Senior member
Oct 3, 2008
416
0
0
If you can buy stuff online, you can read reviews online. The people who shop in stores are not going to find a 9800GTX and a GTS 250 on the shelf together anyway.

They renamed their card so that it matches up with its place in the hierarchy. That's it. We all know if the second number is an 8 it must be pretty powerful, right? 4800->5800->6800->7800->8800->9800->280. So if they stop calling it the 9800 and call it the 250, no one will think it's a top-of-the-line card anymore. Easy, right?

If you don't like that they're releasing a new card with an old GPU, ask yourself why they would invest in new tech for a $149 entry-level performance card.

Also: why does everything have to be a "debacle" or an "attack"? I mean, seriously? An attack?
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
as once again, the G92 GTS replaced the G80 GTS. How is this hard to understand?
Didn't the 9800GTX replace the 8800GTS G92, just like the 9800GT replaced the 8800GT? Also, it's got to be in your list just for looking like the 8800GTS G92! :laugh:

http://www.guru3d.com/news/geforce-gts-250-launch/

The GTS 250 graphics cards are NVIDIA's 'new' card series to address the 125-175 USD market.
$175 is coming very close to some GTX260 and 4870 1GB AR prices.

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
One thing I want to be clear on. I never said that the 78-7900 series didn't overclock. They did, pretty well in fact, as you've shown.

All I said was that compared to the X1900 counterparts, it was more difficult due to not being able to control voltages at a whim. Before he got the X1950 XTX, my brother had the 7800 GT and, although we ended up volt-modding the thing, it had already reached "good" clocks. It's just one more hurdle to tackle when voltage control isn't supplied from the factory.

Don't jump to conclusions and assume I meant the X1900s overclocked better as a whole. I don't really know if consumers saw higher % overclocks with X1900s vs. the 78-7900 series, but I do know that supplying the ability to adjust the voltages didn't exactly restrict overclocking.
Sure doesn't sound like it. You made a direct comparison between your experiences and made it sound as if the competing Nvidia parts at the time didn't overclock as well without voltmods. I linked some experiences that clearly refuted that, but in any case, I stand by what I said: AMD parts typically do not overclock as well as Nvidia parts. For every AMD example you provide, there are going to be multiple exceptions. With Nvidia parts it's clearly the opposite. You might not like hearing it, but it's well-established reality, sorry.

If ATi was serious about their cards not being able to overclock, they would have done away with the Volterra chip to save costs. That's not the case. Either they're being friendly by keeping it there as an option for consumers to pursue, or they're wasting money. One is quality service, the other is poor spending management. Either way, as a consumer, I like it there. And other consumers do as well, given the success people have had overclocking the 4800s.
Or perhaps the 4870's power demands have such low fault tolerance that a high-end PWM is absolutely necessary in order to maintain stability? As far as I can tell, the lower-end 4830 and 4850 do not have, nor require, such a PWM. The 4870's higher power demands also require the high-end VITEC inductor, which, rumor has it, may be upgraded to accommodate the RV790's expected higher power requirements.

Thus, when I hear that nVidia is making cuts to their cards and installing cheaper components from the get-go, I start to wonder if things like buzzing will be even worse, or if, should I decide to overclock the card, I'll be more limited than I was. I don't always game with headphones, but I do play less-stressful games, so the buzzing may or may not be an issue for me. But I can't help but bite my cheek when I see a larger company with a quality product cutting it down so much just to match the price of competition that isn't making cuts on quality components. I'd much rather they had kept it slightly more expensive so I'd know that the product I'm getting really is worth more than the latter.
Rofl, again, seems as if you've been drinking the red kool-aid for far too long. The buzzing on video cards, particularly high-end parts, is vendor-agnostic, as has been confirmed numerous times. Theo Valich wants to claim it's because of ATI's use of the same Volterra PWM you keep referring to for regulating voltage, but he somehow manages to overlook the fact that the GTX 280 he isolates and dissects uses the very same Volterra chip...... which leaves the inductor coils or solid-state caps, parts both cards share, as the most likely causes.

As for Nvidia making cuts to their cards and filling poor consumers like you with a sudden sense of dread, you do realize the 55nm 260 just uses a different PWM chip, right? The NCP5388. Perhaps a decision spurred, more than anything else, by a remark from one of the AICs: "the difficulty in purchasing Volterra has resulted in limits to mass production."

I'd say it's safe to take the tin foil hat off now. I asked earlier, but what have you seen or read to make you think the 55nm GTX 260s overclock worse than their 65nm Volterra-PWM counterparts? From everything I've read, they overclock better, with expectations they'll start hitting GTX 285 overclocks as soon as the B3 cores start phasing in.

chizow, you're unlikely to comprehend other people's contemplations such as this because it seems you're firm in the belief that people will point out flaws, limitations, or negative characteristics in nVidia's hardware in an attempt to promote the competition. That has not been the case for me. If I don't like something that nVidia did, it's because it made me doubt that I was getting a truly "better" product from them........some other FUD
I've never once claimed Nvidia graphics cards were flawless; they're just clearly superior to ATI's. I clearly understand and comprehend people's contemplations about product concerns and have no problem clearing up any "misconceptions" or "harmless bias", as demonstrated by the vast majority of my posts and as I've done above. Who knows, maybe you've actually learned something.

As for that bit about buzzing you keep commenting ignorantly about, don't look down, your bias is showing. :)

Way off topic, I know, and I apologize. As for the GTS 250 renaming, again, I get why they're doing it. And, indeed, if there are differences between a 9800 GTX+ and a GTS 250, I can see the sense in a rename. Albeit, I don't think categorizing it as part of the GT* 200 series is accurate, but hey, it's not my product.

The latest Anandtech articles still show a 9800 GTX+ being a great bang for buck product, and I'm glad nVidia had an architecture that has been working so well for so long.
Looks like a great addition to the GTX 200 product line, and it should certainly clear up any confusion in the market with regards to the 8 and 9 series, what with them having bigger numbers and all. :)
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: SSChevy2001
Didn't the 9800GTX replace the 8800GTS G92, just like the 9800GT replaced the 8800GT? Also, it's got to be in your list just for looking like the 8800GTS G92! :laugh:

http://www.guru3d.com/news/geforce-gts-250-launch/
No, it didn't, or it would've been called the 9800GTX and not the 8800GTS. The 9800GT replaced the 8800GT months later and coincided with a die shrink to 55nm. As for looking like the G92 GTS... it's the same full shroud Nvidia has used for the last 2 years, why wouldn't it look similar?

$175 is coming very close to some GTX260 and 4870 1GB AR prices.
Yep, but I guess we'll see who's right on price, especially given the GTS 250 will have prices cut by rebates as well.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
lol, chizow, if calling me biased helps you find worth in your posts, so be it.

Again, if I made it sound as if nVidia couldn't overclock as well in my original post regarding the matter, it's only because my assumption was correct: you're firm in the belief that people will point out flaws, limitations, or negative characteristics in nVidia's hardware in an attempt to promote the competition. All I did was make the differentiation that when it came to overclocking on those parts, ATi had a feature that complemented such actions, contrary to your asinine belief that they "can't do it".

I could dissect the rhetorical arguments you make - even go so far as misrepresenting them as you do by inserting fictitious ramblings. But there's absolutely no point. You're set in your ways; I'm wearing a "tin foil hat" just because I point out the dilemma in choosing between the products. Indeed, an act that biased people need not concern themselves with.

You've claimed nVidia's products are better than ATi's. For every price segment, I assume. That's all I need to know.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: josh6079
lol, chizow, if calling me biased helps you find worth in your posts, so be it....

.....I could dissect the rhetorical arguments you make - even go so far as misrepresenting them as you do by inserting fictitious ramblings..... I'm wearing a "tin foil hat" just because I point out the dilemma in choosing between the products. Indeed, an act that biased people need not concern themselves with......

You've claimed nVidia's products are better than ATi's. For every price segment, I assume. That's all I need to know.
Pointing out your bias wasn't to help me find worth in my posts, it was just to point it out to you, in case you hadn't noticed. ;)

Again, you'd have to be ignorant, or intent on spreading misinformation, to continually repeat claims that are provably false, i.e. that buzzing is somehow related to inferior PWMs used only on Nvidia parts.

There's a reason I presented arguments based on factual data from widely available public resources to counter your nonsensical fearmongering, so please, dissect and refute my points, if you can.

Maybe it's not safe to take off the tin foil hat just yet, given that various 4870s have started using the same "inferior" Onsemi NCP5388 PWM.

And yes, when I claim Nvidia's products are better than ATI's, I'd consider any anomalies the exception and not the rule. :)
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
OK, I finally read this thread since I was bored. :D
So here's what I think (not that it matters... LOL).

1: Although Nvidia re-rebranding a part is definitely annoying, it really is no big deal.
Anyone looking to purchase that part can easily look up a review of its performance before buying and know exactly what performance they're getting, regardless of the part's name.

2: Expecting a review to cover all of a part's features in exchange for free review hardware is just fine IMO.
Why not? It shows what features the consumer is getting with the hardware, and the consumer reading the review is the only one who can decide whether those features matter enough to them to buy the hardware.

3: Now, telling review sites EXACTLY what software to benchmark, and ONLY that software, when reviewing the hardware is just plain misleading IMO.

I mean, saying you'll give free hardware for reviews, and that those reviews must include benchmarks that really show off your product's best performance/features, is just fine as long as the review sites are free to benchmark any other software they feel is relevant.
After all, the hardware will be used for more than just a few titles, and whether performance in any particular title matters will certainly vary from one consumer to the next.

But saying the benchmarks can ONLY be run on software the hardware maker specifies is downright dishonest, whether or not that software is popular.
It's CLEARLY a biased review then, since the hardware can be "tuned" for just those titles, so the review numbers don't give as clear a picture of TOTAL performance across all kinds of titles as they would otherwise.




And just to touch a bit on the specific HardOCP situation:
If they agreed to bench certain features in exchange for free review hardware and failed to honor their part of the agreement, then it's their own fault no matter how you spin it, IMO.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: MTDEW
OK, I finally read this thread since I was bored. :D
So here's what I think (not that it matters... LOL).

1: Although Nvidia re-rebranding a part is definitely annoying, it really is no big deal.
Anyone looking to purchase that part can easily look up a review of its performance before buying and know exactly what performance they're getting, regardless of the part's name.

2: Expecting a review to cover all of a part's features in exchange for free review hardware is just fine IMO.
Why not? It shows what features the consumer is getting with the hardware, and the consumer reading the review is the only one who can decide whether those features matter enough to them to buy the hardware.

3: Now, telling review sites EXACTLY what software to benchmark, and ONLY that software, when reviewing the hardware is just plain misleading IMO.

I mean, saying you'll give free hardware for reviews, and that those reviews must include benchmarks that really show off your product's best performance/features, is just fine as long as the review sites are free to benchmark any other software they feel is relevant.
After all, the hardware will be used for more than just a few titles, and whether performance in any particular title matters will certainly vary from one consumer to the next.

But saying the benchmarks can ONLY be run on software the hardware maker specifies is downright dishonest, whether or not that software is popular.
It's CLEARLY a biased review then, since the hardware can be "tuned" for just those titles, so the review numbers don't give as clear a picture of TOTAL performance across all kinds of titles as they would otherwise.




And just to touch a bit on the specific HardOCP situation:
If they agreed to bench certain features in exchange for free review hardware and failed to honor their part of the agreement, then it's their own fault no matter how you spin it, IMO.

:thumbsup: excellent synopsis
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
Originally posted by: BladeVenom
Someone's drunk too much of the green Kool-Aid.

More like chugged it. :)

If HOCP agreed to bench something, they should. The only thing I don't like is when a company dictates exactly which games to bench... when they do that, they're obviously hiding something or haven't gotten to optimize for certain games... which is irrelevant IMO.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: MTDEW
OK, I finally read this thread since I was bored. :D
So here's what I think (not that it matters... LOL).
I stopped reading as soon as you started making sense. Well said, and agreed. ;)

Originally posted by: thilan29
Originally posted by: BladeVenom
Someone's drunk too much of the green Kool-Aid.

More like chugged it. :)

If HOCP agreed to bench something, they should. The only thing I don't like is when a company dictates exactly which games to bench... when they do that, they're obviously hiding something or haven't gotten to optimize for certain games... which is irrelevant IMO.
Rofl, if given the choice, why drink anything but the best? :beer:;)