GTX 295


Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I was going to step up to the 295 from my GTX 280 this week, but ended up cancelling the trade-up. I game at 1920x1200. After looking at benches and then looking at the titles I play, I realized the card is just not worth having, especially considering it will go the way of the 9800GX2 in a couple of months.

Also, after playing the retail Cryostasis and seeing how underwhelming the graphics and PhysX in that game are, and seeing that a GTX 280 cannot run that game at its maximum settings and keep it playable, I took PhysX to be yet another stab by Nvidia to get you to waste your money on crap.

The state of GPUs these days is great: if you own a GTX 260 / 4870 1GB or better card and game at 1920x1200, you really don't need to upgrade at all. Crysis does not justify it, and it still is not smoothly playable at 1920x1200 with 4xAA on a 295.

Wait for the next-gen cards if you want a new card, and even then there is no need to upgrade unless you start to game at 2560x1600. These are good days for gamers.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: apoppin

none taken
Those of us who work hard for our HW find it hard to justify buggy new MBs and DDR3 that is rather OVERpriced at a premium for early adopters like you, who will probably upgrade again when i do this Spring

Still not getting any free hardware from your website that you advertise in your sig?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin

You are getting more inattentive to what i am writing all the time and attempting your old tricks of putting words in my mouth for your strawman arguments
:thumbsdown:

What 200MHz difference? Your quad-core example runs at 3.2GHz .. my dual core runs over 4.0GHz .. 4.2GHz is a 25% core speed difference!!


You're the one more inattentive here, Apoppin - the comments you're answering are based on links to a comparison of a 3GHz E8400, a 3.2GHz QX9770, and a 3.2GHz i7. The FS review uses a 3.2GHz Quad compared to his 3GHz quad. The 200MHz references are VERY relevant here, despite your attempts to distract.

Furthermore, can you link us to evidence that the scaling improvements from CPU OCing are even close to linear in the GPU-limited scenarios people buy 4870X2s and GTX295s for?
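
For anyone who wants to sanity-check the clock arithmetic being argued here, a quick Python sketch (the clock speeds are the ones quoted above; the linear-scaling assumption is only an upper bound, not a measurement):

quad = 3.2  # GHz - the QX9770 / FS review quad
dual = 4.2  # GHz - the overclocked dual core in the argument above
print(f"4.2GHz vs 3.2GHz: {(dual - quad) / quad:.1%} faster")   # 31.2%
print(f"3.2GHz vs 4.2GHz: {(dual - quad) / dual:.1%} slower")   # 23.8% (the "25%" figure)

# Even if CPU-bound FPS scaled perfectly linearly with clock, the
# 3.0GHz -> 3.2GHz gap in the FS comparison would be worth at most ~7%,
# and in GPU-limited games the real gain is smaller still.
print(f"3.0GHz -> 3.2GHz ceiling: {(3.2 - 3.0) / 3.0:.1%}")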

Originally posted by: apoppin
i think yours is more unbalanced than my system which IS getting a superfast penryn QuadCore - this week!
- did you miss that statement deliberately or just ignore it because it does not fit in with your own preconceived ideas?
Did you miss going to the benchmarks I linked twice that show an i7 destroys a quad-core Penryn at your chosen resolution of 19X12, clock for clock?

Besides which, I never said anything about my own system in this thread, so why are you??

Originally posted by: apoppin
Nvidia *stresses* the importance of GPU over CPU [period!!!]
- now to repeat, i AM getting a QuadCore q9550 Penryn that is faster than your own stock i7 in gaming - this week!:p
:Q
1. I am not NVIDIA.
2. I'll link some NVIDIA managers to this thread to see if they'd like to correct me, but I'm pretty sure they'll agree that:
a. For 2 way SLI a 3GHz Quad like the OP has is PLENTY fast, in direct contradiction to your advice.
b. If you're going to play with the big boys in 3 and 4 way configs, you need to match up your system to the graphics with current high end CPUs and graphics.
Your Penryns and 19X12 monitor aren't good matches for 3-way CF, strictly speaking - definitely not a "review ready" rig for a tech site.

Originally posted by: apoppin
Half Life and Oblivion .. OLD games that run great on 8800GTX; what are you trying to show anyone by repeating your tired old benches with a STOCK CPU and also ancient reviews from last August? No worries, i will update it for you as i do a NEW comparison of Dual Core vs. Quad-core

The point of the HL and Oblivion SLi benches at the 3.2GHz stock quad is to show that he will get great scaling with his 3GHz Quad, just like the 3.2GHz Quad did for Firing Squad. I was disproving your advice that his 3GHz Quad is a "waste" for a GTX295, and referencing non-multithreaded games so you couldn't evade by saying you were mistaken and had posted about the dual core at 3GHz.

Originally posted by: apoppin
- it has always been easy for me to disprove what you say. i stick to facts. i don't think Focus Group is of much use here
:rose:

You haven't disproved any of what I've said.

You haven't provided a single link to data on a review site that proves anything you've said, or disproves anything I've said.

You've only said I'm wrong without proof, and I posted links that back everything I've said.

Not good for a review site owner, my friend. Your advice to the OP was clearly wrong, and no amount of backpedaling here will help that.

This thread is about a GTX295 being a good choice for the OP with his 3GHz Quad. I linked to FS benches that show great scaling for GTX 260 and GTX 280 SLi across games on a 3.2GHz quad (a comparable setup).

Please link to an independent review site that backs your claim the GTX295 is a "waste", or retract and apologize.




 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin

You are getting more inattentive to what i am writing all the time and attempting your old tricks of putting words in my mouth for your strawman arguments
:thumbsdown:

What 200MHz difference? Your quad-core example runs at 3.2GHz .. my dual core runs over 4.0GHz .. 4.2GHz is a 25% core speed difference!!


You're the one more inattentive here, Apoppin - the comments you're answering are based on links to a comparison of a 3GHz E8400, a 3.2GHz QX9770, and a 3.2GHz i7. The FS review uses a 3.2GHz Quad compared to his 3GHz quad. The 200MHz references are VERY relevant here, despite your attempts to distract.

Furthermore, can you link us to evidence that the scaling improvements from CPU OCing are even close to linear in the GPU-limited scenarios people buy 4870X2s and GTX295s for?

Originally posted by: apoppin
i think yours is more unbalanced than my system which IS getting a superfast penryn QuadCore - this week!
- did you miss that statement deliberately or just ignore it because it does not fit in with your own preconceived ideas?
Did you miss going to the benchmarks I linked twice that show an i7 destroys a quad-core Penryn at your chosen resolution of 19X12, clock for clock?

Besides which, I never said anything about my own system in this thread, so why are you??

Originally posted by: apoppin
Nvidia *stresses* the importance of GPU over CPU [period!!!]
- now to repeat, i AM getting a QuadCore q9550 Penryn that is faster than your own stock i7 in gaming - this week!:p
:Q
1. I am not NVIDIA.
2. I'll link some NVIDIA managers to this thread to see if they'd like to correct me, but I'm pretty sure they'll agree that:
a. For 2 way SLI a 3GHz Quad like the OP has is PLENTY fast, in direct contradiction to your advice.
b. If you're going to play with the big boys in 3 and 4 way configs, you need to match up your system to the graphics with current high end CPUs and graphics.
Your Penryns and 19X12 monitor aren't good matches for 3-way CF, strictly speaking - definitely not a "review ready" rig for a tech site.

Originally posted by: apoppin
Half Life and Oblivion .. OLD games that run great on 8800GTX; what are you trying to show anyone by repeating your tired old benches with a STOCK CPU and also ancient reviews from last August? No worries, i will update it for you as i do a NEW comparison of Dual Core vs. Quad-core

The point of the HL and Oblivion SLi benches at the 3.2GHz stock quad is to show that he will get great scaling with his 3GHz Quad, just like the 3.2GHz Quad did for Firing Squad. I was disproving your advice that his 3GHz Quad is a "waste" for a GTX295, and referencing non-multithreaded games so you couldn't evade by saying you were mistaken and had posted about the dual core at 3GHz.

Originally posted by: apoppin
- it has always been easy for me to disprove what you say. i stick to facts. i don't think Focus Group is of much use here
:rose:

You haven't disproved any of what I've said.

You haven't provided a single link to data on a review site that proves anything you've said, or disproves anything I've said.

You've only said I'm wrong without proof, and I posted links that back everything I've said.

Not good for a review site owner, my friend. Your advice to the OP was clearly wrong, and no amount of backpedaling here will help that.

This thread is about a GTX295 being a good choice for the OP with his 3GHz Quad. I linked to FS benches that show great scaling for GTX 260 and GTX 280 SLi across games on a 3.2GHz quad (a comparable setup).

Please link to an independent review site that backs your claim the GTX295 is a "waste", or retract and apologize.

Or what ?
- you might have to do research and actually learn something?

:)

i don't even think you have any clue about what i am saying :p

i recommended a Nvidia Card to the OP - the same one he says he is getting
- he seems to think GTX295 is ALSO overkill

- for HIS system

do not make sweeping generalizations from what i did not say

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
So now lets go over this point by point:

Originally posted by: nRollo
Originally posted by: apoppin

You are getting more inattentive to what i am writing all the time and attempting your old tricks of putting words in my mouth for your strawman arguments
:thumbsdown:

What 200MHz difference? Your quad-core example runs at 3.2GHz .. my dual core runs over 4.0GHz .. 4.2GHz is a 25% core speed difference!!


You're the one more inattentive here, Apoppin - the comments you're answering are based on links to a comparison of a 3GHz E8400, a 3.2GHz QX9770, and a 3.2GHz i7. The FS review uses a 3.2GHz Quad compared to his 3GHz quad. The 200MHz references are VERY relevant here, despite your attempts to distract.

Furthermore, can you link us to evidence that the scaling improvements from CPU OCing are even close to linear in the GPU-limited scenarios people buy 4870X2s and GTX295s for?

A useless comparison to an unrelated i7 that somehow "puts down" my system

Originally posted by: apoppin
i think yours is more unbalanced than my system which IS getting a superfast penryn QuadCore - this week!
- did you miss that statement deliberately or just ignore it because it does not fit in with your own preconceived ideas?
Did you miss going to the benchmarks I linked twice that show an i7 destroys a quad-core Penryn at your chosen resolution of 19X12, clock for clock?

Besides which, I never said anything about my own system in this thread, so why are you??
You brought up *my* system as though somehow it was inferior to the i7. Overclocked, it is not inferior at all. You said mine was unbalanced. i said yours is.
Originally posted by: apoppin
Nvidia *stresses* the importance of GPU over CPU [period!!!]
- now to repeat, i AM getting a QuadCore q9550 Penryn that is faster than your own stock i7 in gaming - this week!:p
:Q
1. I am not NVIDIA.
2. I'll link some NVIDIA managers to this thread to see if they'd like to correct me, but I'm pretty sure they'll agree that:
a. For 2 way SLI a 3GHz Quad like the OP has is PLENTY fast, in direct contradiction to your advice.
b. If you're going to play with the big boys in 3 and 4 way configs, you need to match up your system to the graphics with current high end CPUs and graphics.
Your Penryns and 19X12 monitor aren't good matches for 3-way CF, strictly speaking - definitely not a "review ready" rig for a tech site.
My advice was very specific to the OP, based on what he said. My own rig is fine and perfectly matched imho. i do not buy "bleeding edge" HW just to make my video cards look good, as you said you do. i aim for a balanced gaming experience, which my PC provides at my chosen resolution.
Originally posted by: apoppin
Half Life and Oblivion .. OLD games that run great on 8800GTX; what are you trying to show anyone by repeating your tired old benches with a STOCK CPU and also ancient reviews from last August? No worries, i will update it for you as i do a NEW comparison of Dual Core vs. Quad-core

The point of the HL and Oblivion SLi benches at the 3.2GHz stock quad is to show that he will get great scaling with his 3GHz Quad, just like the 3.2GHz Quad did for Firing Squad. I was disproving your advice that his 3GHz Quad is a "waste" for a GTX295, and referencing non-multithreaded games so you couldn't evade by saying you were mistaken and had posted about the dual core at 3GHz.
You linked to '05 Source Engine and '06 Oblivion. Try harder.
Originally posted by: apoppin
- it has always been easy for me to disprove what you say. i stick to facts. i don't think Focus Group is of much use here
:rose:

You haven't disproved any of what I've said.

You haven't provided a single link to data on a review site that proves anything you've said, or disproves anything I've said.

You've only said I'm wrong without proof, and I posted links that back everything I've said.

Not good for a review site owner, my friend. Your advice to the OP was clearly wrong, and no amount of backpedaling here will help that.

This thread is about a GTX295 being a good choice for the OP with his 3GHz Quad. I linked to FS benches that show great scaling for GTX 260 and GTX 280 SLi across games on a 3.2GHz quad (a comparable setup).

Please link to an independent review site that backs your claim the GTX295 is a "waste", or retract and apologize.

You also got that wrong. i am not the "owner" .. i am an 'editor'; ABT has many great writers and reviewers.
Finally .. you are saying that my advice to the OP to get a GTX285 is wrong??

You pick on a few of my words - out of context - and build an entire strawman edifice

i made no claim that GTX295 is a waste .. i simply gave specific advice to the OP based on my experience .. which is for him to buy a Nvidia card. Do you think my suggestion of GTX285 is a waste for him?

:confused:

i think it is utterly ridiculous that anyone should have to apologize for their opinion
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin

Or what ?
- you might have to do research and actually learn something?

:)

i don't even think you have any clue about what i am saying :p

i recommended a Nvidia Card to the OP - the same one he says he is getting
- he seems to think GTX295 is ALSO overkill

- for HIS system

do not make sweeping generalizations from what i did not say

More personal attacks, but still no links to independent review sites to back your claims that the GTX295 is a waste for the OP. :(

Allow me to demonstrate how to prove a point without off-topic personal attacks:

You said:
Originally posted by: apoppin
You really don't get it?

he has a slow CPU .. :p

the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia
:rose:

This is not even close to true, so I post links to data that shows scaling on a 3GHz quad like the OP has, proving it's not a "waste":

Ruh roh. Scaling at COD4 for GTX260 and GTX280 SLi at COH

Hmmm scaling at HL2 as well

Scaling at ETQW

Scaling at Crysis

Scaling at Assassin's Creed

Scaling at Grid

So now that we've proven (again) that the OP would get SLi scaling with his 3GHz processor, at his resolution, what else do people buy multi-GPU for? Oh yeah - AA!

Hmmm, multiple GTX280s seem to be able to run 8XAA better too!

Or how about 3D Vision should he choose to go with the ultimate in immersion at some point?

Yep, looks like SLi is working to enhance 3D Vision as well

So I've shown that SLi will give him scaling and run AA and stereovision better, and you say it's a "waste".

Based on what? Your intuition? Link us up, so that we may see the basis of your claims.

I won't hold my breath waiting.
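
As a footnote to the links above: "scaling" in these benches just means the percentage FPS gain the second GPU delivers over one card. A minimal sketch, with invented FPS numbers since the FS figures aren't reproduced in this thread:

def sli_scaling(single_fps, dual_fps):
    # Percentage gain the second GPU delivers over a single card.
    return (dual_fps - single_fps) / single_fps

# Hypothetical numbers for illustration only (not from the FS review):
# 45 FPS on one GPU vs 80 FPS on two is ~78% scaling - a big win, but
# short of the 100% a perfectly linear second GPU would add.
print(f"{sli_scaling(45, 80):.0%}")  # 78%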


 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin

Or what ?
- you might have to do research and actually learn something?

:)

i don't even think you have any clue about what i am saying :p

i recommended a Nvidia Card to the OP - the same one he says he is getting
- he seems to think GTX295 is ALSO overkill

- for HIS system

do not make sweeping generalizations from what i did not say

More personal attacks, but still no links to independent review sites to back your claims that the GTX295 is a waste for the OP. :(

Allow me to demonstrate how to prove a point without off-topic personal attacks:

You said:
Originally posted by: apoppin
You really don't get it?

he has a slow CPU .. :p

the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia
:rose:

This is not even close to true, so I post links to data that shows scaling on a 3GHz quad like the OP has, proving it's not a "waste":

Ruh roh. Scaling at COD4 for GTX260 and GTX280 SLi at COH

Hmmm scaling at HL2 as well

Scaling at ETQW

Scaling at Crysis

Scaling at Assassin's Creed

Scaling at Grid

So now that we've proven (again) that the OP would get SLi scaling with his 3GHz processor, at his resolution, what else do people buy multi-GPU for? Oh yeah - AA!

Hmmm, multiple GTX280s seem to be able to run 8XAA better too!

Or how about 3D Vision should he choose to go with the ultimate in immersion at some point?

Yep, looks like SLi is working to enhance 3D Vision as well

So I've shown that SLi will give him scaling and run AA and stereovision better, and you say it's a "waste".

Based on what? Your intuition? Link us up, so that we may see the basis of your claims.

I won't hold my breath waiting.

waiting for what?

you are fixating on my single word "waste" :p
- what kind of "waste" ?
i will answer for you .. it is the practical DIFFERENCE between X2 and your sandwich at 19x12 with a 3.0Ghz CPU

Here is my advice to the OP in correct time sequence:

what *might* hold him back with either card is his q6600 at only 3.0Ghz
--it might not really make any practical gaming difference running a 295 over an x2

then he says:

I decided that I'm going to go with the GTX 285. I will probably upgrade my mobo down the line, and at that time I will just pick up another GTX 285 for SLI.

i then agree with him and his options for SLi later [which certainly could include GTX295 and a faster CPU] - AFTER he says that, i reply .. and you clearly missed the ":D"
the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia
it is very specific and very much in line with my continuing to compare the *differences* between GTX295 and X2 at 19x12 with a 3.0GHz QC. That is when you took it way off-topic with snide comments about my system - which IS getting upgraded this week and will likely be faster than yours.

You are the master nit-picker - but you are not good at it because of your hostility to me and your inability or unwillingness to understand - or even read correctly - what i am really saying .. worst of all imo, you have not shown anything but your willingness to "sell" Nvidia
:rose:

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
Originally posted by: nRollo
Originally posted by: apoppin

Or what ?
- you might have to do research and actually learn something?

:)

i don't even think you have any clue about what i am saying :p

i recommended a Nvidia Card to the OP - the same one he says he is getting
- he seems to think GTX295 is ALSO overkill

- for HIS system

do not make sweeping generalizations from what i did not say

More personal attacks, but still no links to independent review sites to back your claims that the GTX295 is a waste for the OP. :(

Allow me to demonstrate how to prove a point without off-topic personal attacks:

You said:
Originally posted by: apoppin
You really don't get it?

he has a slow CPU .. :p

the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia
:rose:

This is not even close to true, so I post links to data that shows scaling on a 3GHz quad like the OP has, proving it's not a "waste":

Ruh roh. Scaling at COD4 for GTX260 and GTX280 SLi at COH

Hmmm scaling at HL2 as well

Scaling at ETQW

Scaling at Crysis

Scaling at Assassin's Creed

Scaling at Grid

So now that we've proven (again) that the OP would get SLi scaling with his 3GHz processor, at his resolution, what else do people buy multi-GPU for? Oh yeah - AA!

Hmmm, multiple GTX280s seem to be able to run 8XAA better too!

Or how about 3D Vision should he choose to go with the ultimate in immersion at some point?

Yep, looks like SLi is working to enhance 3D Vision as well

So I've shown that SLi will give him scaling and run AA and stereovision better, and you say it's a "waste".

Based on what? Your intuition? Link us up, so that we may see the basis of your claims.

I won't hold my breath waiting.

waiting for what?

you are fixating on the word "waste" :p
- what kind of "waste" .. the practical DIFFERENCE between X2 and your sandwich at 19x12 with a 3.0Ghz CPU

here is my advice to the OP in time sequence:

what *might* hold him back with either card is his q6600 at only 3.0Ghz
--it might not really make any practical gaming difference running a 295 over an x2

then he says:

I decided that I'm going to go with the GTX 285. I will probably upgrade my mobo down the line, and at that time I will just pick up another GTX 285 for SLI.

i agree with him and his options for SLi later [which could include GTX295] - AFTER he says that, i reply .. and you clearly missed the ":D"
the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia

you are the master nit-picker - you are not good at it because of your hostility to me and your inability or unwillingness to understand what i am really saying .. worst of all imo, you have not shown anything but your willingness to "sell" Nvidia
:rose:

I've proved my points, Mark.

The OP would have a better gaming experience with a GTX295 on his current rig than he would with the single card you recommend.

Higher average framerates, better AA performance, better stereo performance - all add up to "not a waste".

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin
Originally posted by: nRollo
Originally posted by: apoppin

Or what ?
- you might have to do research and actually learn something?

:)

i don't even think you have any clue about what i am saying :p

i recommended a Nvidia Card to the OP - the same one he says he is getting
- he seems to think GTX295 is ALSO overkill

- for HIS system

do not make sweeping generalizations from what i did not say

More personal attacks, but still no links to independent review sites to back your claims that the GTX295 is a waste for the OP. :(

Allow me to demonstrate how to prove a point without off-topic personal attacks:

You said:
Originally posted by: apoppin
You really don't get it?

he has a slow CPU .. :p

the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia
:rose:

This is not even close to true, so I post links to data that shows scaling on a 3GHz quad like the OP has, proving it's not a "waste":

Ruh roh. Scaling at COD4 for GTX260 and GTX280 SLi at COH

Hmmm scaling at HL2 as well

Scaling at ETQW

Scaling at Crysis

Scaling at Assassin's Creed

Scaling at Grid

So now that we've proven (again) that the OP would get SLi scaling with his 3GHz processor, at his resolution, what else do people buy multi-GPU for? Oh yeah - AA!

Hmmm, multiple GTX280s seem to be able to run 8XAA better too!

Or how about 3D Vision should he choose to go with the ultimate in immersion at some point?

Yep, looks like SLi is working to enhance 3D Vision as well

So I've shown that SLi will give him scaling and run AA and stereovision better, and you say it's a "waste".

Based on what? Your intuition? Link us up, so that we may see the basis of your claims.

I won't hold my breath waiting.

waiting for what?

you are fixating on the word "waste" :p
- what kind of "waste" .. the practical DIFFERENCE between X2 and your sandwich at 19x12 with a 3.0Ghz CPU

here is my advice to the OP in time sequence:

what *might* hold him back with either card is his q6600 at only 3.0Ghz
--it might not really make any practical gaming difference running a 295 over an x2

then he says:

I decided that I'm going to go with the GTX 285. I will probably upgrade my mobo down the line, and at that time I will just pick up another GTX 285 for SLI.

i agree with him and his options for SLi later [which could include GTX295] - AFTER he says that, i reply .. and you clearly missed the ":D"
the GTX is kind of a waste paired with a Q6600 @ 3.0GHz
- so is X2 but a $100 less "waste"
:D

i'd just get 285 if i were in his position wanting to go with Nvidia

you are the master nit-picker - you are not good at it because of your hostility to me and your inability or unwillingness to understand what i am really saying .. worst of all imo, you have not shown anything but your willingness to "sell" Nvidia
:rose:

I've proved my points, Mark.

The OP would have a better gaming experience with a GTX295 on his current rig than he would with the single card you recommend.

Higher average framerates, better AA performance, better stereo performance - all add up to "not a waste".
What are you talking about, Brian? NO ONE is saying the GTX295 is not faster than the GTX285 :p
- or that there is no difference between them with a Q6600
- where did this nonsense even come from?
[rhetorical - you set it up as a Strawman again; i was commenting *specifically* on X2 vs 295 with Q6600]
:roll:



And yours is not better advice than the OP's current plans; they are awesome and include a potential MB/CPU upgrade and future SLi
- are you saying he would not have a better gaming experience with a GTX285 over his 4870? That was my interim recommendation for him also.
:confused:

again .. my word *WASTE* ..
directed specifically at the performance difference between a 4870X2 and a GTX295 paired with a 3.0GHz Q6600 - at 19x12 resolution

what *might* hold him back with either card is his q6600 at only 3.0Ghz
--it might not really make any practical gaming difference running a 295 over an x2

i stand by the above ... you have the burden of proof to show the difference as being "practical" in his current PC


All you proved is your agenda and your inability to comprehend what i am posting and your unwillingness to accept what i explain.
- do i need to bring out the puppets or shall i link you to a comic book that can explain it better to you?

i am getting very frustrated with your feeble and annoying attempts to twist everything i am saying. i have been talking about a very specific situation between two very powerful cards where the higher-performing card's extra performance may be wasted - at 19x12 with a 3.0GHz Q6600.
:rose:


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin

All you proved is your agenda and your inability to comprehend what i am posting and your unwillingness to accept what i explain.
- do i need to bring out the puppets or shall i link you to a comic book that can explain it better to you?

i am getting very frustrated with your feeble and annoying attempts to twist everything i am saying. i have been talking about a very specific situation between two very powerful cards where the higher-performing card's extra performance may be wasted - at 19x12 with a 3.0GHz Q6600.
:rose:




You still aren't providing any links to info that backs up your claims, and a 3GHz Quad with a 19X12 monitor probably describes 90% of the potential buyers for a GTX295 or 4870X2.

Very few of us have 25X16, and I'm unaware of any evidence that OCing the CPU makes any huge difference in dual-GPU performance at 19X12 4X/16X and up.

Since you haven't provided any, I guess we'll all have to assume I am right.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin

All you proved is your agenda and your inability to comprehend what i am posting and your unwillingness to accept what i explain.
- do i need to bring out the puppets or shall i link you to a comic book that can explain it better to you?

i am getting very frustrated with your feeble and annoying attempts to twist everything i am saying. i have been talking about a very specific situation between two very powerful cards where the higher-performing card's extra performance may be wasted - at 19x12 with a 3.0GHz Q6600.
:rose:




You still aren't providing any links to info that backs up your claims, and a 3GHz Quad with a 19X12 monitor probably describes 90% of the potential buyers for a GTX295 or 4870X2.

Very few of us have 25X16, and I'm unaware of any evidence that OCing the CPU makes any huge difference in dual-GPU performance at 19X12 4X/16X and up.

Since you haven't provided any, I guess we'll all have to assume I am right.

i don't think anyone assumes you are right here :p
- who is the "we" you speak of?

What is it you'd like me to show you?
What claims am i really making that you disagree with ??
:confused:

Here is my SOLE "claim" - in this thread - which is now being repeated for the nth time for all to see because you appear to be either obtuse or something else i am unsure of:

encore . . .

The GTX295 won't make much practical difference over a 4870-X2 at 19x12 when either card is paired with a Q6600 at 3.0GHz.

Any OTHER claims about what *i* said ... are really coming from you - with a volume of unrelated links to obfuscate what i have been saying consistently
- you clearly twist facts and torture the truth to suit yourself and your agenda

:rose:




 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
You're both right to a degree: a 3GHz Quad will still absolutely bottleneck the faster multi-GPU solutions, as Apoppin is stating. nRollo, you should know this first-hand, having just moved from a C2Q to a Core i7, and from the well-known Guru3D benchmarks you link to. Also, nRollo is right that there's a pretty significant boost in some games from Core i7 over C2D, even at the same clock speeds.

Yes, you will still see a benefit from the 2nd GPU, as nRollo is claiming, especially at higher resolutions and/or with AA; however, the difference between a fast single GPU and the fastest multi-GPU may be very little due to CPU bottlenecking, even at reasonably high resolutions and with AA.

Here are some examples to clearly show this:

PCGH Multi-GPU SLI and CF Comparison with Core i7 @ 3.6GHz
Important to note here: max FPS is ~78 at 1920 with 4xAA, and all 4 configurations listed are still CPU bottlenecked. Only at 2560 with 4xAA do they show any difference due to VRAM/bandwidth/GPU, as the bottleneck shifts from the CPU to the GPU. Unfortunately they didn't include a fast single GPU like the 280 or 285.

PCGH with multi-GPU and single-GPU with a C2D @ 3.6GHz
Right away you'll notice the maximum FPS is much lower than with the Core i7 even at the same clock speed - only 61FPS - showing that the fastest CPU possible is absolutely required for the fastest multi-GPU solutions. The multi-GPU setups are CPU bottlenecked regardless of resolution or AA, but clearly do show an advantage over the single GPU. The lower the resolution or AA, however, the closer the single GPU is, with only a 12% gain from GTX 280 to GTX 295 at 1680 4xAA and only a 20% gain at 1920 4xAA. Still a gain, but clearly very poor scaling due to CPU bottlenecking, which is further emphasized in the next link.

PCGH with multi-GPU and GTX 285 single-GPU with a C2D @ 3.6GHz
Here we see just how little difference there may be between the fastest single GPU and multi-GPU if you are CPU bottlenecked. The GTX 285 is actually faster than the GTX 295 and the other multi-GPU solutions at 1680 with 4xAA. Also note that max FPS holds constant at 61FPS, as that's the limit of the CPU. It simply will not produce more frames, so a faster GPU or multi-GPU will go to waste at lower resolutions and settings. You see gains with 8xAA, but again, they're lower than they would be with a faster CPU because of the lower max FPS overhead. Even at 1920 with 4xAA, the GTX 285 OC is within 4FPS, or 6%, of the GTX 295.

Now here's where the comparison comes full circle with the first Core i7 link. The differences between single-GPU and multi-GPU throughout would be much greater if the Core i7 were used, extending the max frame overhead to 78 FPS, where the single GTX 285 might not continue scaling as much once it hits a GPU bottleneck. So do you need the fastest CPU in order to benefit from multi-GPU over single-GPU? Not at higher resolutions or with AA; however, you will certainly see less or even no gain if your CPU is too slow.
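
chizow's argument boils down to a one-line model: the frame rate you actually see is roughly the minimum of what the CPU can feed and what the GPU can draw. A rough sketch of that idea - the two CPU caps are taken from the PCGH links, but the GPU throughput numbers are made up for illustration:

# Toy bottleneck model: delivered FPS = min(CPU cap, GPU throughput).
def delivered_fps(cpu_cap, gpu_fps):
    return min(cpu_cap, gpu_fps)

# The 61 and 78 FPS caps come from the PCGH links above; the per-card
# GPU throughput numbers are invented for illustration.
cpus = [("C2D @ 3.6GHz", 61), ("Core i7 @ 3.6GHz", 78)]
gpus = [("GTX 285", 65), ("GTX 295", 95)]

for cpu_name, cap in cpus:
    for gpu_name, fps in gpus:
        print(f"{cpu_name} + {gpu_name}: {delivered_fps(cap, fps)} FPS")

# With the 61 FPS cap both cards deliver 61 FPS - the 295's extra
# throughput is wasted. The 78 FPS cap lets the 295 pull ahead
# (78 vs 65 FPS), which is exactly the pattern in the PCGH numbers.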


 

Mr. Lennon

Diamond Member
Jul 2, 2004
3,492
1
81
Well, picked up a GTX 285 @ $321 after CB and rebates. Now I just need to return this POS 4870 1GB to Newegg. I will definitely be upgrading my system sometime in the summer; by then I'll either sell this card for whatever is kicking ass then, or just buy another to SLI.

Thanks everyone for the large amount of information presented in this thread!
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Zeppelin2282
Well, picked up a GTX 285 @ $321 after CB and rebates. Now I just need to return this POS 4870 1GB to Newegg. I will definitely be upgrading my system sometime in the summer; by then I'll either sell this card for whatever is kicking ass then, or just buy another to SLI.

Thanks everyone for the large amount of information presented in this thread!

Believe me, it has been fun :p
- i think you are going in the exact right direction and it is what i would do with a Nvidia solution. i personally love my 280GTX and i highly recommend them

Best of all, you have kept your options open for maximum upgrading potential and you will not have WoW issues any longer; i had to play both Hellgate and WoW on my 280 instead of the X2/X-3 because they would not run properly with multi-GPU on the DX10 pathways.

Originally posted by: chizow
You're both right to a degree: a 3GHz Quad will still absolutely bottleneck the faster multi-GPU solutions, as Apoppin is stating. nRollo, you should know this first-hand, having just moved from a C2Q to a Core i7, and from the well-known Guru3D benchmarks you link to. Also, nRollo is right that there's a pretty significant boost in some games from Core i7 over C2D, even at the same clock speeds.

Yes, you will still see a benefit from the 2nd GPU, as nRollo is claiming, especially at higher resolutions and/or with AA; however, the difference between a fast single GPU and the fastest multi-GPU may be very little due to CPU bottlenecking, even at reasonably high resolutions and with AA.

i have zero doubt that there is a benefit from a 2nd GPU, as my own published testing has already confirmed it with a C2D e4300 @ 3.33GHz; you DO get a boost, but not like you get with an e8600 @ 3.33GHz, and nowhere near the benefit from upping the clock to 4+ GHz.
- in fact, the ONLY thing i stated might be wasted would probably be the *difference* between a GTX295 and a 4870-X2 - paired with a Q6600 at 3Ghz @ 19x12 resolution.

Rollo simply made up things i did not say and then argued against his own fake arguments - he *claims* much that i did not say and refuses to look at any explanation from me
:|

the other thing i objected to was comparing a stock i7 to a stock Penryn - clearly the i7 is faster clock for clock .. even in gaming
BUT ... OCing the i7 is an expensive gamble with MBs that seem to have a lot of issues with certain configurations
- i would never do that for my own PC, and the current state of the HW makes it a crapshoot to really get a faster i7 for gaming without spending even more major bucks for DDR3
.. i already know my e8600 clocks well over 4GHz and i expect my new q9550 to hit close to [or hopefully over] 4GHz.

:rose: