Attn: josh6079, I know you love it...


imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: tuteja1986
I think if Nvidia starts to develop its own CPU, then you will have ATI, Intel, IBM, and AMD tag-teaming to make sure Nvidia doesn't live to see 2010.

You're right in that Nvidia is so devastating to their competition that you do NOT want to be competing against them.

I'd want them out like you do, and all those companies do as well.
Can't exist with Nvidia around.

Or you'll end up like ATI: getting taken over because you can't stand on your own two feet. :thumbsup:

Originally posted by: josh6079
Topic Title: Attn: josh6079, I know you love it...
Topic Summary: ...Inq says 12K 3D Marks for GeForce 8800 ;-)
ROTFL!!!! Oh, man Sable that one got me!!

3DMark, schmeedeeMark, I want to see some Oblivion frames with max settings and HDR+4xAA in there.

If it does have performance free HDR+AA.. you are in for a treat.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
Originally posted by: josh6079
Topic Title: Attn: josh6079, I know you love it...
Topic Summary: ...Inq says 12K 3D Marks for GeForce 8800 ;-)
ROTFL!!!! Oh, man Sable that one got me!!

3DMark, schmeedeeMark, I want to see some Oblivion frames with max settings and HDR+4xAA in there.

If it does have performance free HDR+AA.. you are in for a treat.

It won't be "perfomance free", the current 7 series already does that. Afterall, performance free would imply 0 FPS, therefore giving no performance. Like "smoke free" = no smoke, etc. and currently their FP16 HDR+AA is 0 FPS because you can't do it. ;)

But I get what you're saying in that it shouldn't drag down the frames very much. If you're right, I will be in for a treat. It's about time that Nvidia (and ATI for that matter) gives us something different, and I really do hope these cards are monsters.
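
If you want to put "shouldn't drag down the frames very much" into actual numbers once reviews show up, the arithmetic is just a relative drop. A minimal sketch in Python, with made-up placeholder frame rates (not real G80 or Oblivion figures):

# Rough sketch of the "performance hit" arithmetic being debated above.
# The FPS figures are hypothetical placeholders, not benchmark results.

def relative_hit(fps_baseline, fps_with_feature):
    """Return the percentage of frame rate lost when a feature is enabled."""
    return (fps_baseline - fps_with_feature) / fps_baseline * 100.0

baseline = 60.0      # hypothetical Oblivion FPS at max settings with HDR only
with_hdr_aa = 52.0   # hypothetical FPS with HDR + 4xAA enabled

print(f"HDR+4xAA costs about {relative_hit(baseline, with_hdr_aa):.1f}% of the frame rate")
# A truly "free" feature would print 0.0%; the question is how close G80 gets.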
 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Originally posted by: gersson
I'd have to do SLI 8800GTX to see a difference :D
I'm getting 11,457 3dmark06

LOL, you've got a CPU and 2 GPUs overclocked and only have a 600 watt power supply? Is that a single rail or dual rail PSU?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
since i mentioned the nvidia foray into CPU-making . . . perhaps you want to know about Microsoft:
BREAKING NEWS! - - M$ to get into the chip business!
Microsoft Looks Within to Design and Test Chips
By JOHN MARKOFF
Published: October 19, 2006

MOUNTAIN VIEW, Calif., Oct. 16 - For more than two decades, Microsoft's software and Intel's processors were so wedded that the pairing came to be known as Wintel. But as that computing era wanes, Microsoft is turning to a new source of chip design: its own labs.

The design effort will initially be split between research labs at the company's headquarters in Redmond, Wash., and its Silicon Valley campus here. Tentatively named the Computer Architecture Group, the project underscores sweeping changes in the industry.

One reason for the effort is that Microsoft needs to begin thinking about the next-generation design of its Xbox game console, said Charles P. Thacker, a veteran Microsoft engineer who will head the Silicon Valley group. Voice recognition may also be an area where the research could play a significant role.
. . .

Microsoft is exploring hardware design now in part because of a new set of tools that will make it possible to test ideas quickly, he said. The researchers will employ a system designed by researchers at the University of California, Berkeley, that makes it possible to reconfigure computer designs without the cost of making finished chips.

[much more]
:Q

now Intel should make an Operating System :p
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: coolpurplefan
Originally posted by: gersson
I'd have to do SLI 8800GTX to see a difference :D
I'm getting 11,457 3dmark06

LOL, you've got a CPU and 2 GPUs overclocked and only have a 600 watt power supply? Is that a single rail or dual rail PSU?



You're kidding, right? Don't buy into the PSU myth perpetuated by those who don't know any better.
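
For anyone wondering whether 600 watts is really tight for an overclocked CPU plus two overclocked GPUs, a back-of-envelope budget makes the point. A rough sketch, where every wattage figure is an assumption for a typical 2006-era rig, not a measurement:

# Back-of-envelope power budget for an overclocked SLI rig.
# All component wattages below are rough assumptions, not measurements.

components_watts = {
    "overclocked dual-core CPU": 120,
    "GPU #1 (overclocked)":      110,
    "GPU #2 (overclocked)":      110,
    "motherboard + RAM":          60,
    "drives, fans, USB, misc":    60,
}

total_draw = sum(components_watts.values())
psu_capacity = 600
headroom = psu_capacity - total_draw

print(f"Estimated draw: {total_draw} W of {psu_capacity} W "
      f"({headroom} W headroom, {total_draw / psu_capacity:.0%} load)")
# Under these assumptions a decent 600 W unit has plenty of margin,
# which is the point about the "PSU myth" above.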
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

It's the INQ and you want to know what They know...........
Be expecting a very short list. :D
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: redbox
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

It's the INQ and you want to know what They know...........
Be expecting a very short list. :D


I think the reason the inq has a 50/50 correct/incorrect ratio is because they really do try to be the very first with any information/rumours about anything. That is a dubious place to be. Sometimes you get correct info, and sometimes someone from ATI calls them with BS info just for fun, and the inq. pays the price (not that I think they care). So at this point, I would cut them some slack.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I think the point is, the two are going to have an overdue showdown, and I think I'll hold out for at least the R600 since I've already got an X1900-series card capable of some great physics. Not only that, but if G80 comes out without any competition, the price won't really decrease until there is competition. If G80 beats R600, I'll buy with or without advanced physics processing.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Hey, how about they add this cool feature: not having to reboot to enable/disable SLI? Or just have SLI enabled but give you the option to display on one or two monitors.
 

JackBurton

Lifer
Jul 18, 2000
15,993
14
81
Originally posted by: josh6079
I think they already did that.
They have?! So I can have two monitors connected and just turn SLI on and off on the fly? Or just leave it on and have it display on both monitors or just one? If so, that is GREAT! That was my only remaining concern with SLI. I'll take two 8800s when they come out then. :) Seriously. :)
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: SpeedZealot369
Originally posted by: Wreckage
Games are getting a serious kick in the ass next month. G80, DirectX 10, Wii, PS3, Xbox 360 1080p, Blu-ray, etc.

Keep it coming!

Not to mention Gears of War :thumbsup:


Is that out for PC?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: JackBurton
Originally posted by: josh6079
I think they already did that.
They have?! So I can have two monitors connected and just turn SLI on and off on the fly? Or just leave it on and have it display on both monitors or just one? If so, that is GREAT! That was my only remaining concern with SLI. I'll take two 8800s when they come out then. :) Seriously. :)
Like I said, I think Nvidia and ATI have fixed that issue with SLI and Crossfire in their newer drivers. I believe with SLI you have to use 91.xx or higher to keep from rebooting when switching between SLI and regular mode.

I would ask around just to be sure, but IIRC that's what I was told a while back, and I remember being pretty excited about that as well since dual monitors are a must for me.
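
If you'd rather check than ask around, the comparison itself is trivial. A hypothetical sketch, assuming you've already read the ForceWare version string out of the driver control panel; the 91.xx cutoff is just the one mentioned above:

# Hypothetical sketch: check whether an Nvidia ForceWare version string
# meets the 91.xx-or-higher cutoff mentioned above for reboot-free SLI toggling.

def meets_cutoff(version, cutoff=91.0):
    """Return True if a version string like '91.47' is at or above the cutoff."""
    try:
        return float(version) >= cutoff
    except ValueError:
        return False

for v in ("84.21", "91.47", "93.71"):
    status = "reboot-free SLI switching" if meets_cutoff(v) else "older driver, reboot needed"
    print(v, "->", status)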

 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: keysplayr2003
Originally posted by: redbox
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

It's the INQ and you want to know what They know...........
Be expecting a very short list. :D


I think the reason the inq has a 50/50 correct/incorrect ratio is because they really do try to be the very first with any information/rumours about anything. That is a dubious place to be. Sometimes you get correct info, and sometimes someone from ATI calls them with BS info just for fun, and the inq. pays the price (not that I think they care). So at this point, I would cut them some slack.

Ya, I'm just joshing with ya. As much as some of us hate the INQ, they do give us something to chew on.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: redbox
Originally posted by: keysplayr2003
Originally posted by: redbox
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

It's the INQ and you want to know what They know...........
Be expecting a very short list. :D


I think the reason the inq has a 50/50 correct/incorrect ratio is because they really do try to be the very first with any information/rumours about anything. That is a dubious place to be. Sometimes you get correct info, and sometimes someone from ATI calls them with BS info just for fun, and the inq. pays the price (not that I think they care). So at this point, I would cut them some slack.

Ya, I'm just joshing with ya. As much as some of us hate the INQ, they do give us something to chew on.


They do indeed. And then some. ;)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Originally posted by: redbox
Originally posted by: keysplayr2003
Originally posted by: redbox
Originally posted by: keysplayr2003
And what's more amazing is that the Inq doesn't think that'll be enough to best R600. I wonder what they know.

It's the INQ and you want to know what They know...........
Be expecting a very short list. :D


I think the reason the inq has a 50/50 correct/incorrect ratio is because they really do try to be the very first with any information/rumours about anything. That is a dubious place to be. Sometimes you get correct info, and sometimes someone from ATI calls them with BS info just for fun, and the inq. pays the price (not that I think they care). So at this point, I would cut them some slack.

Ya, I'm just joshing with ya. As much as some of us hate the INQ, they do give us something to chew on.


They do indeed. And then some. ;)

and now it's *CONFIRMED*

:p

http://www.pcwelt.de/news/hardware/video/61081/index.html

More G80 numbers outed, on more CPUs
WANT HARD NUMBERS on G80? Want specs? Well, you can find the specs here at PC Welt, but they are in German. As for numbers, we told you that they would be at about 12,000 give or take a driver revision.

That brings up the question of 'on what?' Those numbers are on a Kentsfield; how will it do on your CPU?

Well, a Conroe 2.93 will score about 10,500, and an FX-62 will hit almost 10,000 on the nose. It is interesting that the quad will hit notably higher with a lower clock but a higher bus.

theInq gets it -- on the nose.
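
To make the CPU scaling in those numbers explicit, here's a quick sketch using the rough figures quoted above (the Inq's round estimates, not measured results):

# Relative 3DMark06 scaling of the quoted G80 scores across CPUs.
# Scores are the rough figures from the article above, not measurements.

scores = {
    "Kentsfield (quad)": 12000,
    "Conroe 2.93 GHz":   10500,
    "Athlon 64 FX-62":   10000,
}

top = max(scores.values())
for cpu, score in scores.items():
    print(f"{cpu}: {score} ({score / top:.0%} of the quad-core score)")
# Roughly a 15-20% spread from the CPU alone, which is why "12K 3DMarks"
# only means much if you know what CPU it was run on.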

disbelievers :p

:roll:

:D