Not buying an ATI card until they get SLI out.

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
I'm tired of hearing about SLI from ATI. Now the R520 is coming, and you still need to buy a Master card to make it SLI.

Who is to say the Master cards aren't going to cost a hefty premium?

There must be something different about the Master card compared to the standard card, which means it's going to cost more.

Can't they just put a jumper on the standard card that says whether it's a master or a slave? Or their upcoming SLI motherboards could have a master slot and a slave slot. Then you could use the same card whether you have one or two.

I guess what's really got me in a rage today is this: if I commit to an R520 when it comes out, who is to say the Master card I'd need to make it SLI isn't going to cost me $100 more than the R520 itself?
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I guess ATi will price the Master cards the same as the standard ones. Making them more expensive would be like shooting themselves in the foot.

EDIT: I agree that Master cards are not the best solution.
 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
But manufacturing volume and demand will be lower for a Master card, since most people won't do SLI, so it will carry a premium price.

I can barely hold off building my new machine any longer.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
I think the only difference is related to BIOS, but don't quote me on that ;)
 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
If it's just the BIOS, then they could easily put a jumper on every card: if it reads 1 the card is the master, if it reads 0 it's the slave. ATI's SLI is looking like a disaster that never needed to happen.
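Something like the sketch below is all the logic it would take. To be clear, the function name and values here are made up just to show the idea; it's not anything ATI has published.

#include <stdio.h>
#include <stdint.h>

/* Hypothetical jumper/strap read -- the function and the bit meaning are
 * invented purely to illustrate the suggestion above. */
static uint32_t read_role_jumper(void)
{
    return 0x1; /* pretend the jumper on the card is set to "master" */
}

int main(void)
{
    if (read_role_jumper() & 0x1)
        printf("Card role: master (owns compositing and display output)\n");
    else
        printf("Card role: slave (renders frames and hands them to the master)\n");
    return 0;
}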

What happened to the ATI fury staff?
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: McArra
Crossfire, guys, don't be picky :D
Yeah, and it requires a Visa Card. ;D

:D Hi HDTVMan... You want one anyway for H.264 and you know it... ;)
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: gsellis
Originally posted by: McArra
Crossfire, guys, don't be picky :D
Yeah, and it requires a Visa Card. ;D

:D Hi HDTVMan... You want one anyway for H.264 and you know it... ;)

Which the GeForce 7 series also has :)

I couldn't care less if they get Crossfire out, but I seriously wish they would release the R520.

-Kevin
 

HDTVMan

Banned
Apr 28, 2005
1,534
0
0
One would think I would want it for H.264, but I think that's a useless feature for me, since I don't watch HD on a PC, nor do I output my PC to my HDTV.

I have a dedicated media server running Windows XP which records HD content and streams it to my IODATA AVEL LinkPlayer 2, out to my 65" HDTV. It holds all my photos, MP3s, DivX, and HD movies. I also have two Xboxes that tie into that media server for DivX, MP3s, photos, and Shoutcast. The LinkPlayer 3 is coming out soon, but my hope is that the Xbox 360 will allow streaming of HD content at release. I have all the convenience of Media Center in the living room without the overpriced, ugly box sitting there.

Funnily enough, my onboard nForce 2 graphics can play HD video (WMV9 and TS) without a hitch on a slightly overclocked Athlon XP 2700+ at 1280x960. I have no desire to run it at 1920x1080, because it's a 19" monitor and the onboard graphics won't go to 1920x1080 using onboard memory.

H.264 acceleration doesn't do you any good. Whether it takes 15% CPU or 90% CPU, the movie still plays. Unless you have a dinosaur of a processor, in which case you really shouldn't be buying a top-end video card anyway. I never could understand pairing a $400 video card with a $50 CPU. I don't see anyone needing to multitask an H.264 movie and, say, a computer game.

That's just me.
 

Saga

Banned
Feb 18, 2005
2,718
1
0
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.
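To give a rough idea of what "identical" means in practice, the matching requirement amounts to a check like the sketch below. The IDs and BIOS strings are invented for illustration; this isn't how the driver actually performs the check.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* Descriptor for an installed card -- values below are made up. */
struct card {
    uint16_t vendor_id;
    uint16_t device_id;   /* must match: same GPU model */
    uint16_t subsys_id;   /* board maker/model */
    char     bios[16];    /* early SLI drivers were picky about this too */
};

static int sli_pair_ok(const struct card *a, const struct card *b)
{
    return a->vendor_id == b->vendor_id &&
           a->device_id == b->device_id &&
           a->subsys_id == b->subsys_id &&
           strcmp(a->bios, b->bios) == 0;
}

int main(void)
{
    /* Same model and board maker, but different BIOS revisions. */
    struct card slot1 = { 0x10DE, 0x0091, 0x1234, "5.70.02.11" };
    struct card slot2 = { 0x10DE, 0x0091, 0x1234, "5.70.02.15" };

    if (sli_pair_ok(&slot1, &slot2))
        printf("Pairing accepted\n");
    else
        printf("Pairing rejected -- one card would need a BIOS flash\n");
    return 0;
}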

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.
 

McArra

Diamond Member
May 21, 2003
3,295
0
0
Originally posted by: Matthias99
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.

Not with the 80.xx drivers ;)

 

BigCoolJesus

Banned
Jun 22, 2005
1,687
0
0
Originally posted by: Matthias99
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.


Yeah, but from a marketing point of view: if someone wants to build a new computer and go the dual video card route, which would be easier for them? (Note that maybe 60-70% of computer users don't have all the knowledge or know-how, so they want it as simple as possible.)

Buy a 7800 GTX now, and then in a few months or a year buy the same 7800 GTX again (it's not that hard to match up the maker)?

Or

Buy an R520 now, and then realize they need a specific type of card (a Master card) that isn't the same as their current card, which (in the minds of a lot of people who already find building a PC worrisome) can make them question whether it will even work?


 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: McArra
Originally posted by: Matthias99
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.

Not with the 80.xx drivers ;)

I believe the 77.76 drivers fixed this. While the cards still need to be the same graphics chipset, they no longer need the same BIOS and whatnot. I am not sure if it allows overclocking yet, though.

One would think I would want it for H.264, but I think that's a useless feature for me, since I don't watch HD on a PC, nor do I output my PC to my HDTV.

Once again, I'll say that the 7 series already supports this.

-Kevin
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: BigCoolJesus
Originally posted by: Matthias99
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.


Yeah, but from a marketing point of view: if someone wants to build a new computer and go the dual video card route, which would be easier for them? (Note that maybe 60-70% of computer users don't have all the knowledge or know-how, so they want it as simple as possible.)

Buy a 7800 GTX now, and then in a few months or a year buy the same 7800 GTX again (it's not that hard to match up the maker)?

Or

Buy an R520 now, and then realize they need a specific type of card (a Master card) that isn't the same as their current card, which (in the minds of a lot of people who already find building a PC worrisome) can make them question whether it will even work?

I just really don't see it as being that much more complex. I would hope that anyone who feels capable of taking apart their computer and upgrading it could grasp the concept of a "master" video card. You want to enable Crossfire, you go buy a "Crossfire Edition" card and hook it up to your current video card. Not exactly rocket science here, guys. :)

Also, at least with SLI right now, you have to match up not just the maker, but often the BIOS version on the cards as well. Is it easier for someone to plug the two cards together, or to have to flash one or both cards (or ship one back to the manufacturer to be flashed)? :p What if you bought your original card locally, and now they don't carry that brand anymore? I mean, you can come up with silly reasons why either approach is bad.

Edit: If NVIDIA has really fixed this, that's great. However, I still don't think ATI's approach is necessarily worse than NVIDIA's wrt multiple graphics cards. At the very least, everyone should wait until the damn thing is available before declaring it a failure.
 

BigCoolJesus

Banned
Jun 22, 2005
1,687
0
0
Originally posted by: Matthias99
Originally posted by: BigCoolJesus
Originally posted by: Matthias99
Originally posted by: Izusaga
I agree that the master and slave card setup is a horrid way to go about this; it simply adds another level of complication, especially when replacing or trading components.

I don't dig this from ATI at all, and until my 6800 I was strictly an ATI buyer.

OTOH, with ATI's master/slave arrangement, the two cards don't have to be identical.

With NVIDIA's current SLI implementation, you need two identical cards running at the same clock speed. This can be troublesome if you plan on upgrading to SLI later, since you need to find the exact model of card you currently have -- and you may need to BIOS-flash one or both of them to make it work.

It's a tradeoff, as almost everything in computing is. I'd wait until the product is actually on the market before jumping all over Crossfire as fatally flawed.


Yeah, but from a marketing point of view: if someone wants to build a new computer and go the dual video card route, which would be easier for them? (Note that maybe 60-70% of computer users don't have all the knowledge or know-how, so they want it as simple as possible.)

Buy a 7800 GTX now, and then in a few months or a year buy the same 7800 GTX again (it's not that hard to match up the maker)?

Or

Buy an R520 now, and then realize they need a specific type of card (a Master card) that isn't the same as their current card, which (in the minds of a lot of people who already find building a PC worrisome) can make them question whether it will even work?

I just really don't see it as being that much more complex. I would hope that anyone who feels capable of taking apart their computer and upgrading it could grasp the concept of a "master" video card. You want to enable Crossfire, you go buy a "Crossfire Edition" card and hook it up to your current video card. Not exactly rocket science here, guys. :)

Also, at least with SLI right now, you have to match up not just the maker, but often the BIOS version on the cards as well. Is it easier for someone to plug the two cards together, or to have to flash one or both cards (or ship one back to the manufacturer to be flashed)? :p What if you bought your original card locally, and now they don't carry that brand anymore? I mean, you can come up with silly reasons why either approach is bad.

Edit: If NVIDIA has really fixed this, that's great. However, I still don't think ATI's approach is necessarily worse than NVIDIA's wrt multiple graphics cards. At the very least, everyone should wait until the damn thing is available before declaring it a failure.


I'm not declaring it a failure, I'm declaring it poor planning. As has been suggested earlier in the forums, it would have made more sense to make every card able to act as master or slave, or to put a BIOS option on the motherboards, than to have two different kinds of cards.