The Planets align in December


ronnn

Diamond Member
May 22, 2003
3,918
0
71
Thanks to the OP! I never really thought about it, but I can likely afford to wait. :beer:

I do agree with the general idea of waiting for DX10 before buying for it. Besides, there's no new game right now that makes me really want to upgrade.

edit: Crysis could change that, as Far Cry certainly had me upgrading from my 9700 Pro.
 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Unreal Tournament 3 is supposed to be extremely scalable. They say it will even run on an old machine (I don't know at what settings, though).

My single-core AMD 4000+ with an X1650 XT and 2GB of OCZ Platinum RAM runs the Colin McRae: DiRT demo at high settings without a problem. Granted, that's not at Ultra settings, but I didn't see a whole lot of difference in image quality with that setting anyway. It was kind of funny to see my machine slow to a crawl at Ultra, though. I guess I was getting less than 5 frames per second. lol

The REAL problem is people getting huge LCDs with resolutions of 12000x11000 that require video cards from hell to make them run at that resolution.
 

LightningRider

Senior member
Feb 16, 2007
558
0
0
Yeah, agreed. I only have a 19" LCD, so I game at 1280x1024; I think the 8800GTS 320MB will be fine for me for a while.

Although I just set up a new computer for a friend at work, and the widescreen monitor is really sick.
 

ConstipatedVigilante

Diamond Member
Feb 22, 2006
7,670
1
0
Am I the only one here who can stand less than max settings? I sure like 'em, but upgrading so frequently is extremely expensive. I'm thinking that I'll get an Opty 165 and overclock to milk Socket 939 for what it's worth, maybe get better RAM (nForce4 underclocks 4x512MB to 333MHz, which is not good with ValueSelect), and wait till the DX10 cards get cheaper (around G92).
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: apoppin

Originally posted by: Zstream
My take on the situation

If you are at 3800+ X2 specs right now then wait till December; if you have a C2D now then wait till spring of '08.

why a 3800+? ... it is slow even for an x1950xt

Originally posted by: Zstream
Let me explain my reasoning:

1. Alan Wake, Crysis, Unreal 2007 and games based on the above three titles will make any current hardware cry. We all read the same article saying that Crysis was running on an 8800-series card, but we also heard that the 2900XT is faster in the game.

they will make the g90 cry too ... it isn't going to be 2x faster than g80 ... and sli will make up for that deficit ... i know xfire will, with a cheap 2nd 2900xt

Originally posted by: Zstream
Given the current status of the cards: they are hot, expensive, and so far offer very little performance in DX10. No matter how badly the game is coded, we should be noticing better numbers than these.

the GTS is not hot and there is no guarantee g90 won't be hotter and require even more power ... depending on what r660 brings in performance ... nvidia will do whatever it takes to stay competitive or ahead. And it is pure speculation that today's DX10 performance indicates anything about full DX10

Originally posted by: Zstream
2. Quad core Intel chips are hot and a decent price, but currently no game uses all four cores. When Alan Wake does arrive I could see loads of cheap quad core chips selling. They could be AMD or Intel. I refer to #1 when I state that everything runs hot now and my room feels like a toaster. I will use Intel or AMD based on performance, price and temp. By the time games actually use all cores the current chips will be sub-par.

Wrong. Supreme Commander uses all 4 cores ... more to follow shortly ... by q4 '09 over half of PCs will be quad core
-and ... best of all my current MB will allow a drop-in Penryn

Originally posted by: Zstream
If your system is running fine, why upgrade? Save your $$ and buy something much better a few months down the road. To whomever says "why not just upgrade now"... well, upgrading is a hard thing to do. If you time it right your PC will last years; if you do not, then you squander a bunch of money.

of course, if you are *satisfied*, you are not reading this thread

and as usual ... the people with the upgraded systems justify 'why' they upgraded and the people with the older systems are justifying why they don't spend the money. Whatever you do for yourself is OK with me


The 3800+ X2 is not slow for an X1950 XT; where do you get your numbers, man? I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited more than anything. Everyone should know this by now... or so I thought.

Who goes for SLI? SLI or Crossfire with two older-generation GPUs is never a SMART idea; if you do not believe me, then ask around.

The GTS is hot, as are the X1900 series and beyond. Anything at or hovering around 70C is hot.

Supreme Commander benchmarks prove that full usage of quad core is not even remotely the case; I believe the difference with quad core CPUs is about 3%.


 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Zstream
I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited

seems like you are more full of BS than helpful info, Zstream; who cares about 12x10 anymore? .. big LCDs are cheap and very demanding on a CPU, and that A64-3800+ IS slow for a modern GPU ... are you not keeping up? what is in your rig?

no, i don't believe your general comment about SLI/xfire .. AND GPUs are often severely discounted in middle-of-life, never mind "older generation" ... and i don't need to "ask around" to support your uninformed opinion.

"Anything at or hovering around 70C is hot" is a pretty silly statement, even for you. Wrong, so wrong.

and you are 100% clueless about SC ... you said "currently no game uses all four cores". --Wrong ... SC does use all 4 ... more games to arrive shortly.

try educating yourself before making irresponsible attacks ... you are so far, 0 for 4
 
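An aside on the exchange above: the two posters are really disagreeing about where the bottleneck crossover sits. GPU load grows with pixel count, while per-frame CPU work (game logic, driver overhead) is roughly resolution-independent. A minimal Python sketch of that arithmetic, with purely illustrative cost constants (none of these numbers are measurements):

# Back-of-the-envelope: why raising resolution shifts the bottleneck to the GPU.
# The two cost constants are illustrative assumptions, not benchmark data.

RESOLUTIONS = [(1280, 1024), (1600, 1200), (1920, 1200), (2560, 1600)]

GPU_NS_PER_PIXEL = 8.0    # assumed GPU shading cost per pixel per frame (ns)
CPU_MS_PER_FRAME = 12.0   # assumed CPU cost per frame (ms), ~independent of resolution

for w, h in RESOLUTIONS:
    pixels = w * h
    gpu_ms = pixels * GPU_NS_PER_PIXEL / 1e6    # GPU time per frame (ms)
    frame_ms = max(gpu_ms, CPU_MS_PER_FRAME)    # the slower side sets the frame rate
    side = "GPU" if gpu_ms > CPU_MS_PER_FRAME else "CPU"
    print(f"{w}x{h}: {pixels / 1e6:.2f} MP, ~{1000 / frame_ms:.0f} fps, {side}-limited")

On these made-up constants, 1280x1024 sits right at the CPU/GPU crossover, which is roughly the shape of the claim being fought over: below it the CPU matters most, above it the GPU does.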

VERTIGGO

Senior member
Apr 29, 2005
826
0
76
<div class="FTQUOTE"><begin quote>Originally posted by: Harmattan
I'm actually counting on this since I've timed my PC refresh towards Nov-Dec-Jan. My big wish is for less power-draw than my current system. I WILL NOT buy another PSU (been through 3 on this rig). I'd say I'd like to hold out for R700, but that would be laughable considering the R600 delay debacle.

If the planets in fact do align, of course, and there aren't any delays...</end quote></div>

Dude, you have 1000W of power. You aren't even close to your limit. Why are you even anxious about power right now??
 

Harmattan

Senior member
Oct 3, 2006
207
0
0
Because I will most likely be going SLI again. My TT Purepower is running at its limit powering one of my 8800 GTXs in SLI. If power requirements for G90 go up any more from G80, I'll need to get a single 1000W+ PSU.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: coolpurplefan

The REAL problem is people getting huge LCDs with resolutions of 12000x11000 that require video cards from hell to make them run at that resolution.

what is that, like a 1000" jumbotron?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Harmattan
Because I will most likely be going SLI again. My TT Purepower is running at its limit powering one of my 8800 GTXs in SLI. If power requirements for G90 go up anymore from G80, I'll need to get a single 1000W+ PSU.

No you won't. You have more than enough power for SLI in that system.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: MarcVenice
Bryan, why spend decent money on a stopgap if you could buy a DX10 card now for $100 more and enjoy it till at least December, when something better 'should' come out? I bet the $100 is worth the extra performance, and the DX10 capabilities you're gonna need for Crysis and such. For the at least six months you're gonna use it, the $100 extra is worth it. Also, by then, if a new generation of DX10 cards comes out, DX9 is gonna be worth jack sh!t, but your 8800GTS will still sell for a decent amount. Who knows, by then it might very well be possible that your 8800GTS runs games just fine on high settings and you don't need a new card at all. $150 is a good price for an X1950 XT, but considering all that, do you not think the extra $100 would be worth it?

Because I'm a cheap bastard and I think that $149 for an X1950 XT is a smokin' deal! Besides, $250 seems like a fortune to spend on a graphics card.
 
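MarcVenice's argument is really depreciation arithmetic: what matters is not the sticker price but the net cost over the months you own the card. A minimal sketch, assuming hypothetical resale values (the $60 and $150 figures are placeholders chosen for illustration, not market quotes):

# Effective cost comparison for bryanW1995's choice: a $149 X1950 XT stopgap
# vs. a ~$250 8800GTS 320MB, both resold after the same six-month window.
# The resale values are hypothetical assumptions, not market prices.

MONTHS_OWNED = 6

def monthly_cost(purchase_price, assumed_resale):
    """Net cost per month of ownership: outlay minus recovery, spread over use."""
    return (purchase_price - assumed_resale) / MONTHS_OWNED

x1950xt = monthly_cost(149, 60)    # assumes DX9 resale value falls hard
gts_320 = monthly_cost(250, 150)   # assumes the DX10 card holds value better

print(f"X1950 XT:     ~${x1950xt:.0f}/month")
print(f"8800GTS 320:  ~${gts_320:.0f}/month")

Under those assumed resale values the two options land within a couple of dollars a month of each other, which is essentially MarcVenice's point; swing the DX9 resale value lower and the pricier card wins outright.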

pcslookout

Lifer
Mar 18, 2007
11,959
157
106
I think $250 to $299 is the maximum video cards should cost, and it's well worth even that price. Any higher and the price is just ridiculous.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
That's what I was saying, acanthus; he prolly has the G90 and the G92 mixed up. In fact, the G92 is probably gonna sit between the 8800GTS 320MB and the 8600GTS.

And yeah, I wouldn't EVER spend more than $300 on a video card, because past that amount you're paying for bragging rights that won't do you much good in games, at least not enough to justify the extra money. This is all relative, though; if you earn $100k+ a year, who gives a toss about $500? :p
 

VinDSL

Diamond Member
Apr 11, 2006
4,869
1
81
www.lenon.com
Originally posted by: Zstream
I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited...

Sorry, but I take offense to that statement!

I'm twice as "full of BS" as apoppin... ask him (or anybody else who knows us)! :D
 

VERTIGGO

Senior member
Apr 29, 2005
826
0
76
Originally posted by: Harmattan
Because I will most likely be going SLI again. My TT Purepower is running at its limit powering one of my 8800 GTXs in SLI. If power requirements for G90 go up any more from G80, I'll need to get a single 1000W+ PSU.

Have you tested it? My OC'd 165 (1.6V), HD 2900 XT, 6 HDDs, etc. system maxes out at 520W. Adding another 2900XT would make that something like 650-700W, and I'm positive that they suck more juice than a G80 (that's good for you).
 
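The disagreement over whether Harmattan needs a bigger PSU reduces to one line of arithmetic: measured draw plus the estimated draw of the added card, against the PSU rating. A minimal Python sketch using the numbers already in this thread (the per-card increment is an assumption, flagged below):

# PSU headroom check for the SLI question above: measured full-load draw
# plus an estimated figure for the added card, against the PSU rating.
# The 180 W incremental figure is an assumption consistent with
# VERTIGGO's own 650-700 W projection, not a measurement.

PSU_RATING_W = 1000     # Harmattan's stated PSU capacity
MEASURED_DRAW_W = 520   # VERTIGGO's measured full-load draw
SECOND_CARD_W = 180     # assumed incremental draw of a second HD 2900 XT

projected = MEASURED_DRAW_W + SECOND_CARD_W
headroom = PSU_RATING_W - projected

print(f"Projected draw: {projected} W of {PSU_RATING_W} W "
      f"({projected / PSU_RATING_W:.0%} load, {headroom} W headroom)")

At roughly 70% load the PSU is still inside a comfortable operating range, which is the substance of the "you have more than enough power" replies.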

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: VinDSL
Originally posted by: Zstream
I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited...

Sorry, but I take offense to that statement!

I'm twice as "full of BS" as apoppin... ask him (or anybody else who knows us)! :D

i just can't agree ... your PoV is 'unique' and, imo, welcome

btw, what if the planets somehow miss the speculative "alignment" that the OP suggests ... what if the performance increase isn't so great and prices are 50% higher then ...

it sounds like the OP is living in a "perfectly-imagined" future fantasy ...
--i just prefer to live mine, right now

What then?

and why wait? IF you are *satisfied* with performance, then you are not reading these arguments .. the rest of you are *also* justifying your waiting while performance deteriorates with every new game that is released ... either upgrade now or upgrade later ... big decision

i don't have to think about it anymore ... and my future upgrade path is clear ... Penryn and a 2nd 2900xt or sell whatever my current card is in six months for $200 and drop in another for $200-$250 more

it is my hobby
 

Demoth

Senior member
Apr 1, 2005
228
0
0
Actually, I am not confusing the G90 and G92. Until something to the contrary comes from Nvidia directly, not rumors from other web sites, I will consider the G92 the official high-end release for November. This is what was told to Nvidia investors by their own PR person.

http://www.bit-tech.net/news/2...says_g92_to_near_1tf/1

I also consider a die shrink to 65nm, the ability to use PCIe 2.0, and 1 teraflop of processing power to be more than a simple G80 revision.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: VinDSL
Originally posted by: apoppin
What then?

and why wait?

it is my hobby

Heh!

Given your current deep-pocket daffiness, I hate to show you this, but...

"Thermalright HR-03/R600 - Release the full potential in your HD-2900XT" :)

he-he, it looks just like my Thermalright CPU cooler ... how much more weight can my MB take before she cracks in half?
:Q

what daffiness? ... i just upgraded before you did - probably for half what you will finally pay [knowing your predilection to go for the 'best'] ... and i got to experience a lot of great HW, and my knowledge of benchmarking has deepened .. plus i get to explore first hand the GTS vs the XT ... the ONLY regret i have is getting Vista now

you guys that *love* Vista are masochistic ... once i am done benchmarking with her, she gets an early temporary retirement [without activating it!!] till i get a game that actually REQUIRES it ... hopefully after SP1 is released ... i FEEL for the nvidia and AMD guys that have to write drivers for this beautiful bloatware

i consider this experiment and upgrade ... priceless ... considering ONE of those $400 cards gets refunded ... it was just shy of a grand - counting what i sold in FS/T


 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: apoppin
Originally posted by: Zstream
I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited

seems like you are more full of BS than helpful info, Zstream; who cares about 12x10 anymore? .. big LCDs are cheap and very demanding on a CPU, and that A64-3800+ IS slow for a modern GPU ... are you not keeping up? what is in your rig?

no, i don't believe your general comment about SLI/xfire .. AND GPUs are often severely discounted in middle-of-life, never mind "older generation" ... and i don't need to "ask around" to support your uninformed opinion.

"Anything at or hovering around 70C is hot" is a pretty silly statement, even for you. Wrong, so wrong.

and you are 100% clueless about SC ... you said "currently no game uses all four cores". --Wrong ... SC does use all 4 ... more games to arrive shortly.

try educating yourself before making irresponsible attacks ... you are so far, 0 for 4

OK, re-read my statement and then reply. I stated anything above 1280x1024; I hope you are aware of resolutions higher than that...

If you had any clue, you would know about the plenty of reviews that test SC at 4 and 2 cores. Alas, you do not, and are so far full of it.

My three rigs are better than your own.

I have three rigs: an E6300 @ 3.2GHz, a 3800+ X2 @ 2.8GHz, and a Q6600, each with an 8800-series card or my 2900XT. All three are using Viewsonic 20.1" displays.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Zstream
Originally posted by: apoppin
Originally posted by: Zstream
I swear you're more full of BS than anyone here. At resolutions higher than 1280x1024 you start to be GPU limited

seems like you are more full of BS than helpful info, Zstream; who cares about 12x10 anymore? .. big LCDs are cheap and very demanding on a CPU, and that A64-3800+ IS slow for a modern GPU ... are you not keeping up? what is in your rig?

no, i don't believe your general comment about SLI/xfire .. AND GPUs are often severely discounted in middle-of-life, never mind "older generation" ... and i don't need to "ask around" to support your uninformed opinion.

"Anything at or hovering around 70C is hot" is a pretty silly statement, even for you. Wrong, so wrong.

and you are 100% clueless about SC ... you said "currently no game uses all four cores". --Wrong ... SC does use all 4 ... more games to arrive shortly.

try educating yourself before making irresponsible attacks ... you are so far, 0 for 4

OK, re-read my statement and then reply. I stated anything above 1280x1024; I hope you are aware of resolutions higher than that...

If you had any clue, you would know about the plenty of reviews that test SC at 4 and 2 cores. Alas, you do not, and are so far full of it.

My three rigs are better than your own.

I have three rigs: an E6300 @ 3.2GHz, a 3800+ X2 @ 2.8GHz, and a Q6600, each with an 8800-series card or my 2900XT. All three are using Viewsonic 20.1" displays.

yes, i AM aware of resolutions above 12x10 .. i am also testing my HD2900XT vs an 8800GTS OC at 16x12

So now i have gone from being "full of BS" to just "clueless and so far full of it"? :p
you have a NASTY habit of calling posters with an opposing view "names" .. and it will stop, very soon ... i guarantee it

OK, Mr. Clue-full - argue with this: Supreme Commander Uses *all* 4 Cores
 
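For what it's worth, both sides of the "uses all 4 cores" fight can be right at once. Whether threads land on all four cores and whether that produces real speedup are different questions, and Amdahl's law connects them. A minimal Python sketch using Zstream's ~3% figure (everything else is the textbook formula, not Supreme Commander data):

# Reconciling "SC uses all 4 cores" with "only ~3% faster on quad than dual":
# Amdahl's law. A game can schedule threads on every core and still gain
# almost nothing if only a small fraction of the per-frame work is parallel.
# The 3% figure is Zstream's number from above; the formula is standard.

def speedup(p, n):
    """Amdahl's law: speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

target = 1.03  # quad roughly 3% faster than dual
p = 0.0
while speedup(p, 4) / speedup(p, 2) < target:
    p += 0.001

print(f"Parallel fraction implied by a 3% quad-vs-dual gain: ~{p:.0%}")
print(f"Best-case speedup on 4 cores at that fraction: {speedup(p, 4):.2f}x")

On that model, a ~3% quad-over-dual gain implies only about 11% of the frame workload scales past two cores; "uses all 4 cores" and "barely faster" are perfectly compatible.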

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
In 6 months, you will hear news of other "planet alignment" events coming soon, like Nehalem and R700. Combine that with the possibility of delays throwing off the Penryn + G90 "alignment", and the waiting game can be played forever. Six months is a long time to wait in the computer hardware world; I'd sure hate being stuck with a slow rig for that long. You'd be missing out on some great deals on current HW that would offer more bang/$ than the future "dream system" you're waiting for.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Yep, the waiting game can be played forever, but that doesn't change the fact that many midrange buyers are looking for DX10 cards. This makes little sense right now, as last-gen midrange is killing current cards in current games.

If you want the best, now is always a good time to buy. If you want a long-lasting midrange card, it may be best to wait for DX10 - though if SM2.0 or 3.0 is any indication, it could be another generation or two before we see any.

Hope I am wrong and some game comes out that forces me to upgrade!