
More Rumours of G71 & G80

eh, the chips coming out at the end of this year are UNLIKELY to be 65nm; maybe next year's fall chips will be.

this year's chips will be 90 or 80 nm. I believe The Inquirer released news that one of the chip foundries gave the OK for the 80nm process last month or the month before.
 
Am I the only one who believes that these graphics companies are only "beating" the competition by a little bit so that people buy their cards while easing in the "newer cards," when they could easily have released their highest-end part? It just seems like they're artificially creating demand by keeping supplies low and prices high and beating the competition by only a little, forcing people to upgrade every time a card is released.
 
So how long until we get dual GPU cores with dual cards on one board (like this) that you can use in Gigabyte's 4-slot PCI-E mobo? I want 16 GPU cores to power my 1280X768 monitor...
 
Does anyone agree that the ATi / nVidia war is the most entertaining product war ever?

Gotta love good competition! I just wish the prices were lower.
 
Originally posted by: Ronin
You don't frequent CPUs much, do you? 😛 (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, as of late AMD has had the edge, and by some margin, and has done for a while now.
That may soon change however (I predict) with the new Intel Conroe range.

For now it's R580 vs G71!!
 
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? 😛 (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, as of late AMD has had the edge, and by some margin, and has done for a while now.
That may soon change however (I predict) with the new Intel Conroe range.

For now it's R580 vs G71!!

LOL. Are the Intel fanboys also telling you to wait for Conroe for the next build?
 
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? 😛 (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, as of late AMD has had the edge, and by some margin, and has done for a while now.
That may soon change however (I predict) with the new Intel Conroe range.

For now it's R580 vs G71!!

Why wait for Conroe when Presler is here?
 
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? 😛 (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, as of late AMD has had the edge, and by some margin, and has done for a while now.
That may soon change however (I predict) with the new Intel Conroe range.

For now it's R580 vs G71!!

You are right, except most people here seem to think the GTX 512 and R580 are the two cards to compare. AMD has had Intel over a barrel lately, and I don't see that changing any time soon. The only interesting decision on the CPU side is choosing between an X2 and an Opteron.
 
Originally posted by: Ronin
Originally posted by: nib95
Originally posted by: Ronin
You don't frequent CPUs much, do you? 😛 (not that it's nearly as bad as it is here)


Not half as entertaining as the GPU wars.
CPU wars are boring because, let's face it, as of late AMD has had the edge, and by some margin, and has done for a while now.
That may soon change however (I predict) with the new Intel Conroe range.

For now it's R580 vs G71!!

Why wait for Conroe when Presler is here?


Because Conroe is WAY better than Presler. Presler is just a 65nm Smithfield, aka Prescott.
Conroe, however, is a whole new architecture.

Unless you were trying to say something else 🙂
 
Originally posted by: Philippine Mango
Am I the only one who believes that these graphics companies are only "beating" the competition by a little bit so that people buy their cards while easing in the "newer cards," when they could easily have released their highest-end part? It just seems like they're artificially creating demand by keeping supplies low and prices high and beating the competition by only a little, forcing people to upgrade every time a card is released.

Just like democracy. Two parties, same result either way.

 
How would Nvidia's acquisition of ULi have anything to do with the development of their GPUs? I thought ULi was just a motherboard company...
 
Originally posted by: pyrosity
How would Nvidia's acquisition of ULi have anything to do with the development of their GPUs? I thought ULi was just a motherboard company...

That's just another way for NVIDIA to make money off of ATI 😛

They get money from all of the ULi CrossFire boards and all of the Xbox 360s.
 
ULi is/was a core-logic developer: chipsets. They designed a southbridge core-logic chip for some ATI mobos, and from what Wreckage just said, I guess for the Xbox 360 as well.
 
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

I personally think it's a shame that such a promising, innovative independent company got bought out.
 
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

 
Can you imagine the heatsink that will have to be on such a powerful quad-core GPU?!?! If I were them, I'd be focusing on getting better performance per watt; as someone said, a brand new product every two months is way too fast, and they should be focusing on getting the ones they have to operate at lower power and with fewer requirements. That would also help them down the road, since they could apply those techniques so the quad-core's heatsink and power requirements don't demand a whole other case with five power supplies..... 😱
 
Originally posted by: Wreckage
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.
 
1. The timing is way off.

2. GPUs already have six or more internal cores... so calling it dual-core would mean nothing more than putting two GPUs on one board, which has already been done.

3. The R5XX series is not "bugged"; it's just very complex, and drivers will sort out the issues over time.
 
Originally posted by: ElFenix
Originally posted by: Wreckage
Originally posted by: pyrosity
Okay, I've read up a good bit on the Xbox 360's hardware, and I've never heard ULi come up. Could anyone provide a link to some information about ULi being associated with the 360? I'll do some independent searching.

NVIDIA makes money off of the 360 because of backwards compatibility with the original Xbox. Nothing to do with ULI.

I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.

Nonetheless, they pay it.

http://www.nforcershq.com/article3153.html
 
Originally posted by: ElFenix
I don't think they even make that. The backward compatibility doesn't work with certain games that use NVIDIA-only features. It simply doesn't make sense that MS would be paying a royalty and not getting full use.

nV could hand them full use and it wouldn't take care of the missing functionality of the R500. As it appears right now, certain types of shadowing techniques that work on the GeForce3 still won't be possible on the R600 in the same fashion.
 