More Rumours of G71 & G80


gotsmack

Diamond Member
Mar 4, 2001
5,768
0
71
Eh, the chips coming out at the end of this year are UNLIKELY to be 65nm; maybe next year's fall chips will be.

This year's chips will be 90 or 80 nm. I believe The Inquirer reported that one of the chip foundries gave the OK for the 80nm process last month or the month before.
 

Philippine Mango

Diamond Member
Oct 29, 2004
5,594
0
0
Am I the only one who believes that these graphics companies are only "beating" the competition by a little bit so that people buy their cards, easing in the "newer cards" when they could easily have released their highest-end part? It just seems like they're artificially creating demand with low supplies, high prices, and beating the competition by only a little, pushing people to upgrade every time a card is released...
 

Fraggable

Platinum Member
Jul 20, 2005
2,799
0
0
So how long until we get dual GPU cores with dual cards on one board (like this) that you can use in Gigabyte's 4-slot PCI-E mobo? I want 16 GPU cores to power my 1280x768 monitor...
 

acole1

Golden Member
Sep 28, 2005
1,543
0
0
Does anyone agree that the ATi/nVidia war is the most entertaining product war ever?

Gotta love good competition! I just wish the prices were lower.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: Ronin
You don't frequent CPUs much, do you? :p (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, AMD has had the edge lately, by some margin, and has for a while now.
That may soon change, however (I predict), with the new Intel Conroe range.

For now it's R580 vs G71!!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? :p (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, AMD has had the edge lately, by some margin, and has for a while now.
That may soon change, however (I predict), with the new Intel Conroe range.

For now it's R580 vs G71!!

LOL. Are the Intel fanboys also telling you to wait for Conroe for the next build?
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? :p (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, AMD has had the edge lately, by some margin, and has for a while now.
That may soon change, however (I predict), with the new Intel Conroe range.

For now it's R580 vs G71!!

Why wait for Conroe when Presler is here?
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? :p (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, AMD has had the edge lately, by some margin, and has for a while now.
That may soon change, however (I predict), with the new Intel Conroe range.

For now it's R580 vs G71!!

You are right, except most people here seem to think the GTX 512 and R580 are the two cards to compare. AMD has had Intel over a barrel lately, and I don't see that changing any time soon. The only interesting part of the CPU scene is choosing between an X2 and an Opteron.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Ronin
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? :p (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, AMD has had the edge lately, by some margin, and has for a while now.
That may soon change, however (I predict), with the new Intel Conroe range.

For now it's R580 vs G71!!

Why wait for Conroe when Presler is here?


Because Conroe is WAY better than Presler. Presler is just a 65nm Smithfield, a.k.a. Prescott.
Conroe is a whole new architecture, however...

unless you were trying to say something else :)
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: Philippine Mango
Am I the only one who believes that these graphics companies are only "beating" the competition by a little bit so that people buy their cards, easing in the "newer cards" when they could easily have released their highest-end part? It just seems like they're artificially creating demand with low supplies, high prices, and beating the competition by only a little, pushing people to upgrade every time a card is released...

Just like democracy. Two parties, same result either way.

 

pyrosity

Member
Dec 20, 2004
42
0
0
How would Nvidia's acquisition of ULi have anything to do with their development of their GPUs? I thought ULi was just a motherboard company...
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: pyrosity
How would Nvidia's acquisition of ULi have anything to do with their development of their GPUs? I thought ULi was just a motherboard company...

That's just another way for NVIDIA to make money off of ATI :p

They get money from all of the ULi CrossFire boards and all of the Xbox 360s.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
ULi is/was a core-logic developer. Chipsets. They designed a southbridge core-logic chip for some ATI mobos and, from what Wreckage just said, I guess for the Xbox 360 as well.
 

pyrosity

Member
Dec 20, 2004
42
0
0
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

I personally think it's a shame that such a promising, innovative independent company got bought out.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

 

TheJollyFellow

Senior member
May 14, 2005
723
0
0
Can you imagine the heatsink that would have to sit on such a powerful quad-core GPU?! If I were them, I'd be focusing on better performance per watt; as someone said, once every 2 months is way too fast for brand-new products. They should be focusing on getting the parts they have to operate at lower power and with fewer requirements. This would also help them down the road, as they could apply those techniques so the quad-core's heatsink and power requirements don't demand a whole other case with 5 power supplies..... :eek:
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,395
8,558
126
Originally posted by: Wreckage
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
1. The timeline is way off.

2. GPUs are already internally 6 or more cores... so calling one "dual-core" would mean nothing more than putting 2 GPUs on one board, which has already been done.

3. The R5XX series is not "bugged"; it's just very complex, and drivers will sort out the issues over time.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ElFenix
Originally posted by: Wreckage
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.

Nonetheless, they pay it.

http://www.nforcershq.com/article3153.html
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.

nV could hand them full use and it wouldn't take care of the missing functionality of the R500. As it appears right now, certain shadowing techniques that work on the GeForce3 still won't be possible on the R600 in the same fashion.