STICKY: ATi 5xxx pre-release thread

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: bryanW1995
wrong,

Originally posted by: bryanW1995
Your statement was wrong

Originally posted by: bryanW1995
You're still wrong.

Originally posted by: bryanW1995
I've consistently said that you were wrong

Originally posted by: bryanW1995
you were wrong to compare

At this juncture I am not interested in your opinion, faulty or otherwise, as we have yet to put to rest your repeated accusations that my post is wrong.

Your proof continues to be lacking. Stay on message please.

Here is my original post, quoted below, please provide proof that any portion of it is wrong.

Originally posted by: Idontcare
Originally posted by: Nemesis 1
Why are you putting ATI marketing with AMD marketing .

Uhm...because this happened in 2006?

AMD Completes ATI Acquisition and Creates Processing Powerhouse

ATI is a brand name, like Hummer or XEON, not a company, like GM or Intel.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: bryanW1995
correct me if I'm wrong,...

snip

This thread is supposed to be about the HD 5800 specs, not how your definition of "brand identity" differs from other people's.

It's getting way OT.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: bryanW1995
correct me if I'm wrong, but we're actually talking about a "brand identity" here. A brand identity is the totality of brand associations, including name and symbols, that must be communicated. As I said earlier, you were wrong to compare a brand identity like Xeon with a brand identity like ATI. ATI's brand has much more history/name recognition, was its own company for a long time, and in fact many people probably still don't even know that it's part of AMD now. Xeon is and has always been a subset of Intel; if it were spun off (like GF) it would be completely reliant upon Intel. If Xeon starts to suck in a few years then it will go the way of Itanium and nobody on forums (or anywhere else for that matter) will give a shit. If AMD completely shut down ATI, the managers would be lucky to make it home alive from the public outcry. In fact, it is so unlikely as to be ludicrous to think that they WOULD shut down ATI; if things got too terribly bad they would spin it back off.

Now it's your turn. Please show me how my logic is faulty. Prove to me that ATI was never a standalone company. Show me how Xeon is different from Itanium, New Coke, or Tab. Send a nasty message to Paul Otellini telling him why you can't live without Xeon.

Bryan. W.....T......H...... ??? End it. Please.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Who cares about brand names? We should be talking about performance!

I'm really impressed with these released benches (if they're true) considering all the AA being used. Nothing I hate more than jagged lines in an otherwise beautiful looking game. I'm seriously considering upgrading to a 30" screen now.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Red Storm
Who cares about brand names? We should be talking about performance!

I'm really impressed with these released benches (if they're true) considering all the AA being used. Nothing I hate more than jagged lines in an otherwise beautiful looking game. I'm seriously considering upgrading to a 30" screen now.

Hey thanks, you just jogged my memory on something I wanted to ask about... with these cool Eyefinity setups will come the already well-discussed eyerritating bezel placement situation, which some have proposed eliminating with projector technology (both rear-projection and standard front-screen projection) through careful alignment with imperceptible overlap of adjacent projections.

My question is - do these types of projection-based technologies eliminate much (if not all) of the need for AA and AF, since the pixels are not as sharply defined as they are on LCD/plasma-based screens?

This (lack of need for serious AA/AF when coupled with DLP/LCD projection tech) may make eyefinity setups all the more reasonable in the fps dept. Or so I am thinking/wondering.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,818
1,553
136
Not that anyone will be able to tell the difference (vs. G80) except in very rare situations at 600% zoom, but it looks like Evergreen has finally been able to perfect AF quality.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: alyarb
Originally posted by: Idontcare
eyerritating

you.. didn't

Oh yes eye did, and eye'll do it again if the mood strikes me :laugh:

Originally posted by: HurleyBird
Not that anyone will be able to tell the difference (vs. G80) except in very rare situations at 600% zoom, but it looks like Evergreen has finally been able to perfect AF quality.

Can NOT wait to read AT's review on this chip and the tech behind it. Must be some pretty amazing stuff going on there.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Scholzpdx
Originally posted by: Nemesis 1
Originally posted by: T2k
Originally posted by: Nemesis 1
I am not interested in eyeinfinity. But The Apples open CL for pyphisics on the Cpu and GPU is great stuff. Now if AMD and Intel can figure out away to stop Havok Physics on Their CPUs when NV card is onboard, it would be a pertfect storm .
As it is the NV 300 had better be all NV is hyping as ATI isn't standing still GO ATI.

Aside of the random CaPitAl LetTErS I swear to God I'm trying hard to make something out of these words but they DO NOT COMPUTE...


...seriously, dude: WTF? :shocked:

Funny. The neighbor kids were outside playing, so I called one in. He was 9. He read it with zero problems. On the other hand, he struggled with your comment because it didn't hold up to the standard of perfection as seen through a 9-year-old's eyes. He says you have a problem. LOL

:confused:

Wait, you bring 9-year-old boys into your home to read internet pages?

I'd love to be that kid's dad.

With that kinda thinking! You should be in prison. Just WOW!!!

 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Originally posted by: Idontcare
Originally posted by: Red Storm
Who cares about brand names? We should be talking about performance!

I'm really impressed with these released benches (if they're true) considering all the AA being used. Nothing I hate more than jagged lines in an otherwise beautiful looking game. I'm seriously considering upgrading to a 30" screen now.

Hey thanks, you just jogged my memory on something I wanted to ask about... with these cool Eyefinity setups will come the already well-discussed eyerritating bezel placement situation, which some have proposed eliminating with projector technology (both rear-projection and standard front-screen projection) through careful alignment with imperceptible overlap of adjacent projections.

My question is - do these types of projection-based technologies eliminate much (if not all) of the need for AA and AF, since the pixels are not as sharply defined as they are on LCD/plasma-based screens?

This (lack of need for serious AA/AF when coupled with DLP/LCD projection tech) may make eyefinity setups all the more reasonable in the fps dept. Or so I am thinking/wondering.

Interesting. I only see distance (from your eyes to the screen) being the major factor in AA/AF necessity. I believe CRT monitors don't rely on the sharply defined pixels of LCDs/plasmas either, and I know for certain that AA/AF provided noticeable improvements in IQ on them.

I don't see the big hate for the bezels. From watching the Crysis three-monitor setup, we as gamers aren't losing anything. It looks like the center monitor would produce the same image as a standalone monitor. With the two additional monitors, we get "extra" peripheral vision of the game, which is nice.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: happy medium
So by the looks of it, the 5870 is a little slower than a GTX 295 and about 40% faster than a GTX 285?
So if history repeats itself: the GTX 285 was ~2x the 8800 GTX, the 8800 GTX was ~2x the 7900 GTX, and the 7900 GTX was ~2x the 6800 Ultra.
So the GTX 380 should be ~2x the GTX 285 (which would make it faster than the GTX 295), and should be faster than the 5870 but cost more?
Just gotta see the prices now.

I'm not sure what results you were looking at, but the 5870 looked to be the overall faster part in 8 of the 14 tests, and many of the tests are essentially a wash as to which is the superior part. And the 5870 stretches its legs with 8xAA and/or 2560x1600...I certainly wouldn't label the 5870 anything less than on par with the GTX295 based on these early results.
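
Just to sanity-check the generational math in that quote, here's a quick back-of-the-envelope sketch (Python, using only the hypothetical round ratios quoted above, not measured benchmark results):

# Back-of-the-envelope using only the ratios quoted above
# (hypothetical round numbers, not measured benchmark results).
gtx285 = 1.00                 # baseline
hd5870 = gtx285 * 1.40        # "about 40% faster than a GTX 285"
gtx380 = gtx285 * 2.00        # "~2x the GTX 285", *if* history repeats

print(f"HD 5870 vs GTX 285: {hd5870 / gtx285:.2f}x")   # 1.40x
print(f"GTX 380 vs HD 5870: {gtx380 / hd5870:.2f}x")   # ~1.43x, and only if the 2x pattern holds

In other words, the interesting unknown is whether the 2x-per-generation assumption holds at all, not the 40% figure.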
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: cusideabelincoln
Interesting. I only see distance (from your eyes to the screen) being the major factor in AA/AF necessity. I believe CRT monitors don't rely on the sharply defined pixels of LCDs/plasmas either, and I know for certain that AA/AF provided noticeable improvements in IQ on them.

It's been so long since the CRT age, but was AA/AF a big deal with CRTs? I thought it was more of a problem we brought on ourselves with the transition to LCD screens and their much better defined (and easily resolvable by the naked eye) square pixels. I may be out in left field here though, feel free to tell me so if this is the case.

Originally posted by: cusideabelincoln
I don't see the big hate for the bezels. From watching the Crysis three-monitor setup, we as gamers aren't losing anything. It looks like the center monitor would produce the same image as a standalone monitor. With the two additional monitors, we get "extra" peripheral vision of the game, which is nice.

When I think about it, it bothers me to think I'll be seeing it (the bezel break). But then again, it's quite possible that after playing on such a setup for a while my mind will just blank out the presence of the bezels, piece the info together seamlessly, and I won't be consciously aware of them anymore.

For example, I know there is a door frame that breaks my view between the front window and side window of my car, but I rarely notice or dwell on the thought that my view is obscured there when I'm driving around. It's only when something I'm trying to see is specifically blocked that I notice something is obscuring my view.

So if the pixel-shift bezel-aware compensation works, then it may all become much ado about nothing as well. That's why I want to see this thing in action on a Best Buy demo setup at some point.
 

natty1

Member
Apr 28, 2008
169
0
0
I'm kinda unimpressed by these latest benchmarks. They need to price the 5870 at $299 if they wanna sell lots of cards.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Originally posted by: natty1
I'm kinda unimpressed by these latest benchmarks. They need to price the 5870 at $299 if they wanna sell lots of cards.

I think a 5870 for $299 would be perfect, but new-generation cards with that kind of improvement usually start high, very high, so expect a street price of $399.99 for it.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: natty1
I'm kinda unimpressed by these latest benchmarks. They need to price the 5870 at $299 if they wanna sell lots of cards.

Really? So far it appears to perform on par with a $470 GTX 295 while doing it with only one GPU, so it will most likely have lower power consumption. Oh, and it adds DX11 to boot.

I think they're going to sell quite a few at $399.
 

zod96

Platinum Member
May 28, 2007
2,872
68
91
Agree. The 5870 is pretty much on par with a $500 dual-GPU card, yet the 5870 is $100 less, and it offers better AA and AF and lower power consumption. And before anyone says I'm a fanboy, take a look at my sig :) I love the competition between ATI and Nvidia at the moment; it really benefits us. I've seen some benchmarks and the 5870 pretty much spanks my GTX285 in everything. I'll be selling my GTX285 and picking up a 5870 :) I bet places will have them for around $379 at launch...
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Idontcare
Originally posted by: alyarb
Originally posted by: Idontcare
eyerritating

you.. didn't

Oh yes eye did, and eye'll do it again if the mood strikes me :laugh:

Originally posted by: HurleyBird
Not that anyone will be able to tell the difference (vs. G80) except in very rare situations at 600% zoom, but it looks like Evergreen has finally been able to perfect AF quality.

Can NOT wait to read AT's review on this chip and the tech behind it. Must be some pretty amazing stuff going on there.

Yea, I'm really looking forward to reading the nitty gritty stuff... I get my nerd off on that. :p

The one thing I've seen mentioned a number of times on these forums is Nvidia's superior AF; it looks like AMD will finally address this. I can't tell the difference myself, but some people can. Nice to see they're not just pushing out a higher-performing card and considering that 'enough'; they're at least trying to add some features (Eyefinity... meaningless to me, but I can see how some would be excited) and improve in other areas (AF).

I'll probably upgrade after the initial prices drop, but it's kind of disappointing that games lately seem so uninspired to me... I play AoC as a time waster, not really into it. I bought Drakensang since it was cheap; it's not bad, but not great either. The game I play the most is from 2006 (HOMM5). But I have a hard time resisting new hardware, regardless. Bah.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Originally posted by: Idontcare

It's been so long since the CRT age, but was AA/AF a big deal with CRTs? I thought it was more of a problem we brought on ourselves with the transition to LCD screens and their much better defined (and easily resolvable by the naked eye) square pixels.

Yes, it's a huge deal with any display tech. AA will always be required due to the inherent nature of converting from vector space to screen space.

On top of that, AF tackles a separate class of problem that doesn't relate to AA or resolution.
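
A minimal sketch of that idea (my own toy illustration in Python, nothing to do with how any actual GPU implements AA): point-sampling a vector-space edge once per pixel gives a hard 0/1 staircase, while averaging several sub-samples per pixel gives the graded coverage values that AA is after.

# Toy illustration: rasterizing the edge "y < 0.3*x" with and without supersampling.
def coverage(x_pixel, y_pixel, samples_per_axis):
    """Fraction of sub-samples in this pixel that fall under the edge y < 0.3*x."""
    n = samples_per_axis
    hits = 0
    for i in range(n):
        for j in range(n):
            sx = x_pixel + (i + 0.5) / n   # sub-sample position inside the pixel
            sy = y_pixel + (j + 0.5) / n
            if sy < 0.3 * sx:              # the vector-space edge being converted to screen space
                hits += 1
    return hits / (n * n)

row = 1
print("1 sample/pixel :", [coverage(x, row, 1) for x in range(8)])            # hard 0/1 staircase
print("2x2 supersample:", [round(coverage(x, row, 2), 2) for x in range(8)])  # graded edge coverage

The staircase is there regardless of whether the pixel ends up on a CRT, an LCD, or a projector; softer pixel edges can mask it a bit, but they don't remove it.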
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: zod96
Agree. The 5870 is pretty much on par with a $500 dual-GPU card, yet the 5870 is $100 less, and it offers better AA and AF and lower power consumption. And before anyone says I'm a fanboy, take a look at my sig :) I love the competition between ATI and Nvidia at the moment; it really benefits us. I've seen some benchmarks and the 5870 pretty much spanks my GTX285 in everything. I'll be selling my GTX285 and picking up a 5870 :) I bet places will have them for around $379 at launch...

I'll reserve judgement on whether the 5870 has better AA and AF than Nvidia until reviews show it! :)
 

PUN

Golden Member
Dec 5, 1999
1,590
16
81
Originally posted by: zod96
Agree. The 5870 is pretty much on par with a $500 dual-GPU card, yet the 5870 is $100 less, and it offers better AA and AF and lower power consumption. And before anyone says I'm a fanboy, take a look at my sig :) I love the competition between ATI and Nvidia at the moment; it really benefits us. I've seen some benchmarks and the 5870 pretty much spanks my GTX285 in everything. I'll be selling my GTX285 and picking up a 5870 :) I bet places will have them for around $379 at launch...

Even the HD 4890 spanks the GTX 285 in more than half of the benchmarks under 1920x1200 (Anand's review).