This is crazy if true **Update: 40K 3DMarks possible???**


Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Specs like these are definitely useful... just look at what HDR does to Far Cry performance... and we'll soon find out what HDR does to HL2 performance. And look at those screenshots from the sequel to Morrowind... all that stuff will definitely benefit from extra pipelines and core speeds and memory speeds. Plus, screen resolutions of 1600x1200 are not as uncommon as they used to be, and some people even want to use higher resolutions...
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Wow, you are dense. SLI was never said to be future-proof. It was never said to bring value. It simply provides the best all-around performance in any given generation of graphics cards. We've gone over this ad nauseam. Get that through your head. SLI is not a value. SLI is simply the best at any given point in time. And odds are good that people with a 6800 Ultra/GT SLI setup right now would buy an SLI setup of the next generation if they had the chance. Some people are not interested in buying a computer based upon value for the dollar. These are folks who will pay large sums of money to get the best in the consumer marketplace, even if it doesn't offer an absolutely massive performance gain. Don't generalize your computing experience to every single person that buys hardware. NV is pushing SLI and clearly sees a market there, and people are buying SLI and enjoying living on the cutting edge of technology.

I really don't care. SLI is worthless, not worth the money, but hey, if you're part of that small percentage that's willing to pay double the money and not get double the performance, go ahead; a smart person wouldn't pay the extra for SLI. You are the dense one if you think SLI is actually worth it. Let me guess, you're the kind of person that buys 2 GB over 1 GB of RAM, or a 9800XT over a 9800Pro, or a 6800 Ultra over a 6800GT. Sorry, but I don't make stupid choices and I will continue to dislike SLI.


There would be a lot of angry people.
Oh well, I have no pity for them... I'll be laughing, lol.
BTW BouZouki, are you Greek?
/ontopic

Yep.
 

sparkyclarky

Platinum Member
May 3, 2002
2,389
0
0
Originally posted by: BouZouki
Wow, you are dense. SLI was never said to be future-proof. It was never said to bring value. It simply provides the best all-around performance in any given generation of graphics cards. We've gone over this ad nauseam. Get that through your head. SLI is not a value. SLI is simply the best at any given point in time. And odds are good that people with a 6800 Ultra/GT SLI setup right now would buy an SLI setup of the next generation if they had the chance. Some people are not interested in buying a computer based upon value for the dollar. These are folks who will pay large sums of money to get the best in the consumer marketplace, even if it doesn't offer an absolutely massive performance gain. Don't generalize your computing experience to every single person that buys hardware. NV is pushing SLI and clearly sees a market there, and people are buying SLI and enjoying living on the cutting edge of technology.

I really don't care. SLI is worthless, not worth the money, but hey, if you're part of that small percentage that's willing to pay double the money and not get double the performance, go ahead; a smart person wouldn't pay the extra for SLI. You are the dense one if you think SLI is actually worth it. Let me guess, you're the kind of person that buys 2 GB over 1 GB of RAM, or a 9800XT over a 9800Pro, or a 6800 Ultra over a 6800GT. Sorry, but I don't make stupid choices and I will continue to dislike SLI.


There would be a lot of angry people.
Oh well, I have no pity for them... I'll be laughing, lol.
BTW BouZouki, are you Greek?
/ontopic

Yep.


Actually, no, I'm not a member of that market. However, if a person has the money and wants to spend it on their hobby, who the fvck are you to tell them that what they're spending their money on is worthless? And since when is "worthless" the equivalent of "not worth the money"? Perhaps in some fantasy world you live in there is a connection between the two, but in the real world, something being not worth the money for the majority of the market does not make it worthless. If you want the absolute best, you'll pay for it. Simple as that. Bleeding-edge technology also bleeds the pocketbook, but that doesn't make it anything less than the best. Crawl back into your hole, troll.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I really don't care. SLI is worthless, not worth the money, but hey, if you're part of that small percentage that's willing to pay double the money and not get double the performance, go ahead; a smart person wouldn't pay the extra for SLI.

That's a very closed-minded statement.

SLI is not worthless... it does increase performance.
Not worth the money... ok... maybe not to you, or me, or a lot of other people, but it is to some people.
Not get double the performance... ok... maybe it's not double, but it was never advertised as double the performance. And if nVidia's fastest single card is 100% and ATI's fastest card is 120%, but TWO of nVidia's fastest cards are 150%... and someone wants to run 2048x1536 with 8XAA and 16XAF... well... I would assume two would do a better job than one.
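The arithmetic behind that comparison is easy to sketch out. Note that the 100%/120%/150% figures here are the post's hypothetical numbers, not measured benchmarks:

```python
# Illustrative only: relative performance figures are the post's
# hypothetical numbers, not real benchmark results.
single_nv = 1.00   # fastest single nVidia card (baseline, 100%)
single_ati = 1.20  # fastest single ATI card, per the post (120%)
sli_nv = 1.50      # two nVidia cards in SLI, per the post (150%)

# SLI scaling efficiency: performance delivered per card added
scaling = sli_nv / (2 * single_nv)          # 0.75 -> 75% efficiency
# SLI's lead over the fastest single card on the market
gain_over_ati = sli_nv / single_ati - 1.0   # 0.25 -> 25% faster

print(f"SLI scaling efficiency: {scaling:.0%}")
print(f"SLI lead over fastest single card: {gain_over_ati:.0%}")
```

So even at only 75% scaling, the dual-card setup still comes out 25% ahead of the fastest single card in this hypothetical, which is the whole of the argument being made here.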

As far as being smart or not... I don't think it's smart to pay over $300 for a set of 5.1 speakers. Does that mean you're a stupid sh!thead for buying Klipsch Promedia 5.1 Ultra speakers? No... it means sound quality is more valuable to you than it is to me. And to someone else, being able to run HL2 at 1600x1200 with 4XAA and 8XAF with an average of 97 FPS rather than 58 FPS in one area, 113 FPS rather than 77 FPS in another area might be more valuable to them than it is to you.

You're also forgetting that a couple of 6800 Ultras in SLI are more than likely bottlenecked by the rest of the system... so when you can upgrade to a dual-core 3.0 GHz Athlon 64, the performance gains of SLI might be larger than they are now with a single-core 2.6 GHz processor.
 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
My whole argument was simply that the next line of cards will be the better buy, and I stand by that.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
You're also forgetting that a couple of 6800 Ultras in SLI are more than likely bottlenecked by the rest of the system... so when you can upgrade to a dual-core 3.0 GHz Athlon 64, the performance gains of SLI might be larger than they are now with a single-core 2.6 GHz processor.

/nitpick
The whole point of SLI is to improve FPS in GPU-intensive areas; your CPU won't help you there. As long as you have a 3500+ or a 3.4GHz P4 or above, your CPU will not noticeably bottleneck your GPUs.
The only places that are CPU-limited are already going at over 100 FPS, so a new CPU will not really help you.
/nitpick
 

sparkyclarky

Platinum Member
May 3, 2002
2,389
0
0
Originally posted by: BouZouki
My whole argument was simply that the next line of cards will be the better buy, and I stand by that.

And SLI using the next gen of NV (or possibly ATI) cards will still trump the performance of a single card of the next gen. "Better buy" doesn't tend to matter to the absolute highest end of the market (the people purchasing the fastest components available). Sure, for you and me SLI might not be a wise purchase, but to the balls to the wall performance enthusiast it easily could be.

Not to mention that "better buy" is highly dependent upon your system specs and what games you play/what you do with your computer. You're arguing based upon something that fluctuates (a notion of value to the individual consumer per dollar) and I'm arguing based upon a static fact - that in most cases, 2 video cards linked in SLI will give the highest gaming performance, allowing for satisfaction within the target audience of SLI.
 

Tab

Lifer
Sep 15, 2002
12,145
0
76
Originally posted by: malak
Originally posted by: zakee00
OK, let's just end this now: the whole point of making faster video cards is to enable game developers to use more advanced effects/more realistic graphics. An X800 isn't going to cut it for the next few years... therefore they release a new card. Welcome to the computer industry.

That's ridiculous. The X800 is enough. My 9700 Pro lasted 2 years, running even new games today. An X800 should last another 2 years fine.

And let's talk about those more advanced effects. Shader Model 3.0, for instance: how many games have been developed for that yet? AoE4? That's it? This card wouldn't add any new tech that developers could take advantage of, so who cares? Sure it's faster, but all that matters is in benchmarks; for any games being developed now it's still too much. With everyone switching to LCDs that have low refresh rates, those high FPS counts don't matter at all. Current-gen cards can push the FPS higher than our monitors support.

LCDs have low refresh rates? What? My Dell 2005FPW runs at 60, same as CRT monitors...

They all push out more FPS than our monitors support; that's because sometimes there isn't that much being graphically displayed that is important. In a different program it'll be much less...
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
I am very interested in how they know it is 5x as fast. What kind of processor is this that isn't limiting this GPU at all?

Additionally, if it is coming in May, I find it hard to believe that they will be using 90nm tech. Intel, with all its power and might, is having enough trouble with 90nm, and I'm sure AMD has had their fair share of problems.

This should remain a rumor, because some of these specs are way out there.

I don't remember the calculation, but imagine what latency that memory would be at. We are hitting ~1.2 GHz with 1.6ns memory. Jeez. What is this going to be, 0.8ns? (I don't think it scales linearly.)
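For reference, the ns rating of DDR chips is just the reciprocal of the real (pre-doubling) clock, so it scales inversely with clock speed rather than linearly. A quick sketch, using the post's ~1.2 GHz figure and the rumored 1.8 GHz effective clock:

```python
def ns_rating(effective_mhz):
    """Approximate access-time rating (ns) of DDR memory running at a
    given effective (double-pumped) clock in MHz.
    Real clock = effective / 2; period in ns = 1000 / real clock."""
    return 2000.0 / effective_mhz

# ~1.2 GHz effective on 1.6ns-rated chips, as the post says:
print(f"{ns_rating(1200):.2f} ns")  # ~1.67 ns
# The rumored 1.8 GHz effective clock would need roughly:
print(f"{ns_rating(1800):.2f} ns")  # ~1.11 ns
```

So the rumored clock would need roughly 1.1ns chips rather than 0.8ns; the rating drops as the reciprocal of the clock, which is why it doesn't look linear.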

-Kevin
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
I am very interested in how they know it is 5x as fast. What kind of processor is this that isn't limiting this GPU at all?

Why do so many people have trouble understanding this?
OK, in the Valve VST, what is limiting your FPS?
Your graphics card. That is going to be what all games are like in the near future, using advanced GPU features (pixel shaders, etc.). There is nothing coming out that kills your CPU performance besides more advanced physics, which all current mid-to-top-end processors can handle.
WE DON'T NEED FASTER PROCESSORS FOR GAMES, WE NEED FASTER VIDEO CARDS!
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: zakee00
You're also forgetting that a couple of 6800 Ultras in SLI are more than likely bottlenecked by the rest of the system... so when you can upgrade to a dual-core 3.0 GHz Athlon 64, the performance gains of SLI might be larger than they are now with a single-core 2.6 GHz processor.

/nitpick
The whole point of SLI is to improve FPS in GPU-intensive areas; your CPU won't help you there. As long as you have a 3500+ or a 3.4GHz P4 or above, your CPU will not noticeably bottleneck your GPUs.
The only places that are CPU-limited are already going at over 100 FPS, so a new CPU will not really help you.
/nitpick

I agree... but things are not standing still... newer games will of course have steeper system requirements. And I'm fairly confident that there will be a time when a 3500+ isn't capable of feeding two 6800 Ultras with everything they can process.
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: zakee00
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?

I agree with you to a point - but I certainly wouldn't want to use a 1GHz CPU for anything that lists that 1GHz as a "requirement". :) (Hell, even my Athlon XP feels a bit slow these days...)
 

RGebhart

Member
Nov 11, 2004
96
0
0
Originally posted by: malak
Originally posted by: zakee00
Originally posted by: malak
I don't know why they'd bother with something that good. Playing a new game (even Doom 3 on Ultra) on that card would be like playing Quake 2 on an X850XT.

That's probably the stupidest thing I've ever heard.
Did you say the same thing before the X800/6800s came out?
I doubt that's right though... 700MHz core? That's a HUGE improvement. 512MB of memory is cool.

Maybe you missed the last part; it says 3x the performance of an X800XT. An X800XT is not 3x the performance of a 9800XT, so what I said makes sense. Not to mention the fact that the release date of this rumor says Q2 '05, which would be just 6 months after the X800XT release, and about 2 years before any game can really push it.

The 700MHz core isn't the big thing; it's the 1.8GHz memory clock.

 

RGebhart

Member
Nov 11, 2004
96
0
0
Originally posted by: malak
Originally posted by: zakee00
Originally posted by: malak
I don't know why they'd bother with something that good. Playing a new game (even Doom 3 on Ultra) on that card would be like playing Quake 2 on an X850XT.

That's probably the stupidest thing I've ever heard.
Did you say the same thing before the X800/6800s came out?
I doubt that's right though... 700MHz core? That's a HUGE improvement. 512MB of memory is cool.

Maybe you missed the last part; it says 3x the performance of an X800XT. An X800XT is not 3x the performance of a 9800XT, so what I said makes sense. Not to mention the fact that the release date of this rumor says Q2 '05, which would be just 6 months after the X800XT release, and about 2 years before any game can really push it.

The 700MHz core isn't the big thing; it's the 1.8GHz memory clock.


The X800XT was released in June/July '04, so it will be around 1 year after. It just wasn't available in huge quantities.


 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: RGebhart
Originally posted by: malak
Originally posted by: zakee00
Originally posted by: malak
I don't know why they'd bother with something that good. Playing a new game (even Doom 3 on Ultra) on that card would be like playing Quake 2 on an X850XT.

That's probably the stupidest thing I've ever heard.
Did you say the same thing before the X800/6800s came out?
I doubt that's right though... 700MHz core? That's a HUGE improvement. 512MB of memory is cool.

Maybe you missed the last part; it says 3x the performance of an X800XT. An X800XT is not 3x the performance of a 9800XT, so what I said makes sense. Not to mention the fact that the release date of this rumor says Q2 '05, which would be just 6 months after the X800XT release, and about 2 years before any game can really push it.

The 700MHz core isn't the big thing; it's the 1.8GHz memory clock.


The X800XT was released in June/July '04, so it will be around 1 year after. It just wasn't available in huge quantities.

It still isn't...
 

SonicIce

Diamond Member
Apr 12, 2004
4,771
0
76
Originally posted by: SynthDude2001
Originally posted by: zakee00
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?

I agree with you to a point - but I certainly wouldn't want to use a 1GHz CPU for anything that lists that 1GHz as a "requirement". :) (Hell, even my Athlon XP feels a bit slow these days...)

I think 1GHz is a little too slow. I have an Athlon 64 2800+ and a Radeon 9000 that beats my friend's 9800XT with an Athlon 2200+.
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: SonicIce
Originally posted by: SynthDude2001
Originally posted by: zakee00
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?

I agree with you to a point - but I certainly wouldn't want to use a 1GHz CPU for anything that lists that 1GHz as a "requirement". :) (Hell, even my Athlon XP feels a bit slow these days...)

I think 1GHz is a little too slow. I have an Athlon 64 2800+ and a Radeon 9000 that beats my friend's 9800XT with an Athlon 2200+.

That was basically my point. ;)
 

ohnnyj

Golden Member
Dec 17, 2004
1,239
0
0
I think we will always need faster GPUs and CPUs for the foreseeable future. You aren't going to be running something like the Unreal 3 engine on a pair of 6800 Ultras at 1920x1200 with 4xAA/16xAF. I can see the limitations even on my dual 6800GTs; I can't play at that res with 4xAA in HL2. And future games are going to need a lot more horsepower to crank up the graphics, unless of course they optimize the heck out of the engines.
 

RobotOfHatred

Member
Feb 10, 2005
76
0
0
Meh, that seems a little extreme. I'll just get an X850XT, which should last me a good 2 or 3 years (my MX 440SE lasted me 2 (or 3, I think), and I got it when it wasn't even high-end (cost me $100)).

Plus, when it first comes out the MSRP is gonna be around $500, while stores will be selling them for close to $500000000000000000000000000000000000000000000000000000 plus $20 shipping...
 

zakee00

Golden Member
Dec 23, 2004
1,949
0
0
Originally posted by: SynthDude2001
Originally posted by: zakee00
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?

I agree with you to a point - but I certainly wouldn't want to use a 1GHz CPU for anything that lists that 1GHz as a "requirement". :) (Hell, even my Athlon XP feels a bit slow these days...)

Yeah, but my point was that we didn't even NEED 1GHz processors until 4 years after they came out. So in other words, my 3000+ OCed to 2.65GHz will play all games for the next 3 years, while my 6800 Ultra WON'T.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: humey
A 6800 has only 2.0ns RAM; most GTs and all Ultras have 1.6ns RAM.

I'm not sure if you were referring to my post about softmodding my 6800NU into a 128MB "6800GT" or not, but if XFX can call their 900MHz-mem 6800GT a GT, I can certainly call my softmodded 6800 with a 900MHz mem clock a 128MB GT.
 
SynthDude2001

Mar 19, 2003
18,289
2
71
Originally posted by: zakee00
Originally posted by: SynthDude2001
Originally posted by: zakee00
That time will be a LONG way away: when 6800s can't push out enough FPS to make the game playable.
Just within this last year is when games have started requiring more than 1GHz processors, and how long have those been out? 3, 4 years?

I agree with you to a point - but I certainly wouldn't want to use a 1GHz CPU for anything that lists that 1GHz as a "requirement". :) (Hell, even my Athlon XP feels a bit slow these days...)

Yeah, but my point was that we didn't even NEED 1GHz processors until 4 years after they came out. So in other words, my 3000+ OCed to 2.65GHz will play all games for the next 3 years, while my 6800 Ultra WON'T.

I understand what you're saying (and it makes sense), but I'm just trying to say that by the time we "need" processors of a certain speed, those processors are really too slow already. Yes, they'll work of course...but IMO it won't be an enjoyable gaming experience.

For example, UT2003 (when it came out in October 2002) listed a 733MHz P3 as the CPU requirement. I'm sure you could play it on such a system (indeed, I have a friend who plays it on a 1GHz P3), but I sure as hell wouldn't want to play it on that. :)

Also... if you want to argue that your 2.6GHz A64 will play all games in the next 3 years but the 6800U won't... I don't exactly agree with that either. Just as the CPU will (likely, unless speed increases hit even more of a wall than they already have in the past year) be slow for games a few years out, so will the 6800U. But it'll still play the games, just not with all the "eye candy" and such. Three years ago the top of the line was the GeForce 4 series (the Ti4600 specifically). You can still get by in HL2 (for example) with one of those today. I wouldn't want to, and you'd have to sacrifice the DX9 effects, AA, and probably other stuff, but it won't "not run the game". :) The same idea goes for whatever CPUs were top of the line back then (~2GHz P4s or Athlon XP ~2000+).

I think your general point, though, is that graphics technology has increased at a much greater rate in the last few years than CPUs have, which is absolutely true.