Do you think ATI can outdo Nvidia's 6800U?

Page 3

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
That has nothing to do with the fact that you don't know the system settings or any of the demographics, for that matter.

What if NVIDIA users overclock more?

What if there are more NVIDIA cards out there than ATi?

What if it's some specific stupid error, like IE crashing and auto-reporting it...

What if....
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: Acanthus
That has nothing to do with the fact that you don't know the system settings or any of the demographics, for that matter.

What if NVIDIA users overclock more?

What if there are more NVIDIA cards out there than ATi?

What if it's some specific stupid error, like IE crashing and auto-reporting it...

What if....


Let's say Consumer Reports takes a sampling of car owners to determine which autos are the most reliable, based on problems the owners reported. They find Chevy Malibu owners report fewer problems than Ford Taurus owners.
Ford cries foul.

What if they polled more Ford owners?
What if the Chevy owners didn't report their problems?
What if more Ford owners drive faster?

"What ifs" aren't really a valid basis for invalidating statistical data.


 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Do you think ATI can outdo Nvidia's 6800U?


I think it'll be close, either slightly faster or slightly slower. The 6800U is really the first card Nvidia has got right since pre-NV30.

Anyway, time will tell.
 

g3pro

Senior member
Jan 15, 2004
404
0
0
Having the X800 be as fast would be a tough task, especially with a 12x1 architecture. Keep in mind, guys, that the leap made by the 6800 was enormous.
 

sandorski

No Lifer
Oct 10, 1999
70,697
6,257
126
I really haven't a clue, but I think Nvidia has finally gotten its act together after 2ish years of slacking off. The 6800U is about as much a leap as the 9700Pro was when it came out, which is mighty fine.

We'll soon know if ATI can match or exceed that leap; personally, I hope they both end up about the same. That will keep both on their toes and in a constant battle to outdo each other.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: biostud666
Originally posted by: Shamrock
Robert Heron of TechTV's "The Screen Savers" reviewed the new 6800Ultra, and he said the 6800 is drawing over 100w of power @ 12 amps, but he said he ran it fine on a 420w PSU.

That would be 100w @ 12V right?

So that would be around 9-10 amps @ 12V
and the high-end CPUs (especially Prescott) are around the same, then add another 9-10 amps @ 12V
and then for the hdd/Cd-drives fans etc. maybe max up to 5A?

that would be ~25A @ 12V

But it didn't seem like it used much more power than the NV35/R350, and they can run fine in SFF with 250W PSUs.

I'm puzzled



No, he stressed this point TWICE; he specifically said 12 AMPS and a little over 100 watts of power drawn. I didn't get it either, but if he said it twice, he meant to say it.
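For what it's worth, the figures being argued over can be sanity-checked with basic P = V × I arithmetic. A quick sketch (the rail split and the 110 W CPU / 5 A misc numbers below are guesses for illustration, not measured values):

```python
def amps(watts, volts=12.0):
    """Current drawn from a rail: I = P / V."""
    return watts / volts

# 100 W drawn entirely from the 12 V rail is ~8.3 A, not 12 A --
# so either one of the two quoted figures is off, or the 12 A
# figure includes draw on other rails.
print(round(amps(100), 1))

# biostud666's rough 12 V budget: GPU + high-end CPU + drives/fans.
total = amps(100) + amps(110) + 5.0
print(round(total, 1))  # in the ballpark of his ~25 A estimate
```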
 

caz67

Golden Member
Jan 4, 2004
1,369
0
0
The review I read at Tom's Hardware claims Nvidia recommends a 480W PSU for best results.

Review

At the end of the day, both cards are going to be awesome, and heaps better than the current crop.

So no matter which one you buy, you won't be disappointed.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: reever
Originally posted by: CaiNaM
Originally posted by: TheSnowman
lol Sneaky, that goes for any card.


i disagree.. since r420 is simply an extension of r3xx architecture. a completely new architecture is more likely to improve over time from driver optimization than an existing one... unless you think nv got it completely right out of the gate and there's nothing left to optimize....

Completely new architecture doesn't mean much anymore; the NV40 still shares features with the NV30/35 generation

Features like what? DirectX support? It's built out of transistors?

And doesn't mean much? FFS, it DOUBLED the performance of the NV38 in most areas (even more than that in shader operations). Not only that, but it blew by the 9800XT as well. Now, granted we don't know what ATi has up their sleeve this round, but I wouldn't say that this new architecture "doesn't mean much".


 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: ZimZum
Originally posted by: Acanthus
That has nothing to do with the fact that you don't know the system settings or any of the demographics, for that matter.

What if NVIDIA users overclock more?

What if there are more NVIDIA cards out there than ATi?

What if it's some specific stupid error, like IE crashing and auto-reporting it...

What if....


Let's say Consumer Reports takes a sampling of car owners to determine which autos are the most reliable, based on problems the owners reported. They find Chevy Malibu owners report fewer problems than Ford Taurus owners.
Ford cries foul.

What if they polled more Ford owners?
What if the Chevy owners didn't report their problems?
What if more Ford owners drive faster?

"What ifs" aren't really a valid basis for invalidating statistical data.



Are you nuts? That's the ONLY reason for invalidating statistical data. Anyone who's taken a statistics class knows this. What ifs, in the field of statistics, are called lurking variables.

These variables are correlated with the issue at hand, but not causative. For example, in the summer, the incidence of eating ice cream goes up and the incidence of shark attacks goes up. Does this mean that eating ice cream makes you more prone to a shark attack?

No, it's a lurking variable: heat. The increased heat of the summer months causes more people to eat ice cream and more people to swim in the ocean. Hence, more shark targets.

WHAT IFs are what you must strive to eliminate when gathering statistical data; otherwise the conclusions drawn from the data are useless. You'd be sitting here telling me that eating ice cream makes you more prone to a shark attack.


So to compare to the situation above, the incidence of crashes on Nvidia hardware may be higher simply because more people who have no idea how to use or maintain a PC use Nvidia hardware that came from their OEM, or some such. Or there may be another reason.

There are tons of LURKING VARIABLES, which is why we CAN'T state a direct causative relationship between M$ OCA reports and the stability and reliability of GPU drivers.

Thank you, please stop talking now.
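The shark/ice-cream point is easy to demonstrate numerically. A minimal sketch with made-up numbers: temperature (the lurking variable) drives both quantities, and the two series end up strongly correlated even though neither appears in the other's formula:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
temps = [random.uniform(10, 35) for _ in range(1000)]      # the lurking variable
ice_cream = [2.0 * t + random.gauss(0, 3) for t in temps]  # driven by heat
attacks = [0.5 * t + random.gauss(0, 2) for t in temps]    # also driven by heat

# Strong positive correlation, zero causation between the two series.
print(round(pearson(ice_cream, attacks), 2))
```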
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I would be surprised if the card being released in a week is faster or even as fast. But the card they plan on rushing out in June could very well be as fast, or slightly faster in some instances, if the 600MHz clock rumor comes true.
 

HobartPaving

Junior Member
Oct 13, 2000
16
0
0
Originally posted by: Insomniak

So to compare to the situation above, the incidence of crashes on Nvidia hardware may be higher simply because more people who have no idea how to use or maintain a PC use Nvidia hardware that came from their OEM, or some such. Or there may be another reason.

There are tons of LURKING VARIABLES, which is why we CAN'T state a direct causative relationship between M$ OCA reports and the stability and reliability of GPU drivers.

Thank you, please stop talking now.

much of statistics is based on correlation, not causation. you don't use linear regression to nail down causation; you use it to find a correlation. you can never isolate every variable. okthxbye. :)
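On the regression point: an ordinary least-squares fit only recovers a linear association; nothing in the math says which variable, if either, is doing the causing. A minimal sketch with made-up points:

```python
def ols(xs, ys):
    """Least-squares slope and intercept for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Points generated roughly along y = 2x; the fit reports the
# association, but it cannot tell cause from effect.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = ols(xs, ys)
print(round(slope, 2))  # close to 2
```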

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Nebor
Umm, OCA isn't based on people calling; it's based on error reporting by Windows after a crash. The user can allow or disallow the error report, but I don't see how that would skew the data.

That's the little thing that pops up after an error asking if you'd like to report the problem to Microsoft. Most people probably click no, but for all we know, more ATI than Nvidia people could click no. Who knows? That's why the data can't be accurate.

actually that's a reasonable assumption, as most people who own the high-end stuff would NOT click the button, and frankly it's THESE owners who most likely own ati products, as they've had the highest performance these last 2 years. why do you think nv sells a crapload of gf2-architecture cards?

sure, it's inconclusive, but so is the original statement that ms oca data means anything in this case...
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ronnn
I think Nvidia will win this round, which is really for the best. :beer: But being a sucker, I am still a little disappointed that it isn't 2-7 times as fast.
rolleye.gif
But maybe with the bugs out and pci express and faster processors - we could see those figures come true.

the HL2 benchmarks are about 10x faster....

 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: HobartPaving
Originally posted by: Insomniak

So to compare to the situation above, the incidence of crashes on Nvidia hardware may be higher simply because more people who have no idea how to use or maintain a PC use Nvidia hardware that came from their OEM, or some such. Or there may be another reason.

There are tons of LURKING VARIABLES, which is why we CAN'T state a direct causative relationship between M$ OCA reports and the stability and reliability of GPU drivers.

Thank you, please stop talking now.

much of statistics is based on correlation, not causation. you don't use linear regression to nail down causation, you use it to find a correlation. you can never isolate every variable. okthxbye. :)

Exactly, but that's not what's being discussed here. He's saying the M$ OCA results supposedly show causation of PC problems by Nvidia drivers, when they only show correlation.

Correlation means nothing. Causation does. Therefore, his point is moot and your post is off topic.

kthxbyegg. :) ;)
 

bpt8056

Senior member
Jan 31, 2001
528
0
0
Originally posted by: caz67
The review i read at Tom's Hardware, claims Nvidia recommend 480W PSU, for best results.

Review

At the end of the day, both cards are going to be awesome, and heaps better than the current crop.

So no matter which one you buy, you wont be disappointed.

The reviewer at IGN used a "small" 400W power supply and it ran the 6800U just fine without any hiccups. I believe the 480W power supply requirement is pretty conservative.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
It seems people under NDA are fairly confident that ATi will make a good showing.

Pinch of salt, of course, but I'm expecting near-parity with a 500/500 X800 Pro--at least, for right now, with current drivers. As Cainam said, nV may have more to gain from driver updates (especially with their dual co-issue).

Edit:
HE also showed the difference in Far Cry with PS2 vs PS3; it's like daylight and dark. He made note that it had ZERO performance loss in the process.
If TechTV showed screenshots similar to PCPer, then the IQ difference may not be as dramatic as the original commentary implied.

Edit: The Sequel: I don't think a 12-pipe X800P @ 475/900 has a chance at beating the 6800U, but a higher-clocked 16-pipe X800XT may well maul the 6800U. Pity about PS2.0, thus likely no AA/AF improvements--ATi may well have just cranked out a tweaked RV350, as predicted. 6800U is so impressive not just because it's so fast, but it's so darn feature-complete (SM3.0, programmable 2D core). I wonder if R420's lack of feature improvements means we'll see R500 as soon as this year....
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Originally posted by: bpt8056

The reviewer at IGN used a "small" 400W power supply and it ran the 6800U just fine without any hiccups. I believe the 480W power supply requirement is pretty conservative.

I think you mean liberal or excessive.

If they have not published a specific amperage requirement, then a nebulous suggestion of a "480W" unit is likely based upon the assumption of the crappiest manufacturer with the most overrated output claim.
 
Apr 14, 2004
1,599
0
0
Pinch of salt, of course, but I'm expecting near-parity with a 500/500 X800 Pro--at least, for right now, with current drivers. As Cainam said, nV may have more to gain from driver updates (especially with their dual co-issue).
I am guessing you meant X800 XT? Comparing the X800 Pro to the 6800 Ultra is unfair.

And I am also guessing Nvidia's 480-watt figure assumes more components, like 2 HDDs, 2 opticals, etc. If you made a bare-bones gaming machine you could get by with less.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Where are people getting these insanely high core clocks for the X800 series, anyway?

Generally, with a redesign of this magnitude, and the raw size of the core, speeds like that won't be possible until much later steppings.
 
Apr 14, 2004
1,599
0
0
Where are people getting these insanely high core clocks for the X800 series, anyway?

Generally, with a redesign of this magnitude, and the raw size of the core, speeds like that won't be possible until much later steppings.
I believe these clock speeds have been confirmed by ATI, or at least someone with inside info has posted them somewhere.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: ZimZum
Originally posted by: Acanthus
That has nothing to do with the fact that you don't know the system settings or any of the demographics, for that matter.

What if NVIDIA users overclock more?

What if there are more NVIDIA cards out there than ATi?

What if it's some specific stupid error, like IE crashing and auto-reporting it...

What if....


Let's say Consumer Reports takes a sampling of car owners to determine which autos are the most reliable, based on problems the owners reported. They find Chevy Malibu owners report fewer problems than Ford Taurus owners.
Ford cries foul.

What if they polled more Ford owners?
What if the Chevy owners didn't report their problems?
What if more Ford owners drive faster?

"What ifs" aren't really a valid basis for invalidating statistical data.

But this isn't balanced data collected for the purpose it's being presented for. MS's total error reporting is not fair or balanced in any way; there are thousands of variables you could come up with.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: Pete
It seems people under NDA are fairly confident that ATi will make a good showing.

Pinch of salt, of course, but I'm expecting near-parity with a 500/500 X800 Pro--at least, for right now, with current drivers. As Cainam said, nV may have more to gain from driver updates (especially with their dual co-issue).

Edit:
HE also showed the difference in Far Cry with PS2 vs PS3; it's like daylight and dark. He made note that it had ZERO performance loss in the process.
If TechTV showed screenshots similar to PCPer, then the IQ difference may not be as dramatic as the original commentary implied.

Edit: The Sequel: I don't think a 12-pipe X800P @ 475/900 has a chance at beating the 6800U, but a higher-clocked 16-pipe X800XT may well maul the 6800U. Pity about PS2.0, thus likely no AA/AF improvements--ATi may well have just cranked out a tweaked RV350, as predicted. 6800U is so impressive not just because it's so fast, but it's so darn feature-complete (SM3.0, programmable 2D core). I wonder if R420's lack of feature improvements means we'll see R500 as soon as this year....


Pete, that is EXACTLY what TechTV showed in their review. In the two lower screenshots, of the statue scene, you can tell A LOT of difference between the two pics, but they claim "NO" performance loss? I mean, look at that statue!