Xbit Article on Eyefinity vs Surround Biased?

Paratus

Lifer
Jun 4, 2004
16,666
13,405
146
Am I being overly sensitive? If not, it's sad to see, as I usually like Xbit's reviews.

http://www.xbitlabs.com/articles/graphics/display/amd-eyefinity-nvidia-surround.html

Example
Just like AMD’s Eyefinity Technology, Nvidia allows tying together three monitors for a better gaming experience. Unlike their competitors, Nvidia offers a different approach. They do not expect the users to invest extra money in adapters with DisplayPort. On the other hand, Nvidia expects you to have three identical displays. This isn't necessarily a huge disadvantage for someone building a new gaming rig from the ground up. But if you happen to have a slightly outdated display you'll most likely have to get rid of it. Chances are that you won’t find the same exact model anymore.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Am I being overly sensitive?

Yea, just a little.

I was more concerned about their 6950 Toxic article saying that somehow a 6950 Toxic at 880 core / 5200 memory was faster than a 6970 at 880 core / 5500 memory, with the same number of shaders.

With the 6970 running faster memory, how is this possible?
Faster no, as fast yes.

Somehow they got a 6950 superchip. :)
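
The paper math backs that up; on the 256-bit bus both cards share, bandwidth scales directly with the effective memory clock. A quick back-of-the-envelope sketch:

Code:
# Back-of-the-envelope GDDR5 bandwidth, assuming the 256-bit bus
# that both the HD 6950 and HD 6970 use.
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits=256):
    # MB/s = MHz * bits / 8 bits-per-byte; divide by 1000 for GB/s
    return effective_clock_mhz * bus_width_bits / 8 / 1000

print(bandwidth_gb_s(5200))  # 6950 Toxic: 166.4 GB/s
print(bandwidth_gb_s(5500))  # 6970:       176.0 GB/s

So the 6970 has about 6% more bandwidth on paper, with everything else at those clocks being equal.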
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Am I being overly sensitive? If not, it's sad to see, as I usually like Xbit's reviews.

http://www.xbitlabs.com/articles/graphics/display/amd-eyefinity-nvidia-surround.html

Example

I'm not even sure what is setting your radar off. The only thing in that quote that I figured you might have issue with is:

Just like AMD’s Eyefinity Technology, Nvidia allows tying together three monitors for a better gaming experience. Unlike their competitors, Nvidia offers a different approach.

I interpreted the "better gaming experience" as being over single-monitor usage, not over AMD's Eyefinity.

Also, the short blurb lists the two limits of the hardware: one requires you to use adapters/dongles, the other identical displays.

Maybe there is something in the article you didn't quote?
 

gorobei

Diamond Member
Jan 7, 2007
3,666
993
136
depends on what kind of bias you are talking about. the card selection and system setup is a little bizarre (6990, 590, 6970, 560 SLI) or lazy in its under-representation; why no xfire setup?

the line about the mandatory displayport adapter is asinine since you can now easily find monitors with native dp inputs (and most of them will be better quality monitors that techies would recommend).

the link to amd's step by step setup is wrong; the correct one is: http://support.amd.com/us/kbarticles/Pages/gpu50-ati-eyefinity-display-groups.aspx
anyone looking for a guide from their link would end up in massive frustration. whether this is intentional is up for debate.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
depends on what kind of bias you are talking about. the card selection and system setup is a little bizarre (6990, 590, 6970, 560 SLI) or lazy in its under-representation; why no xfire setup?

the line about the mandatory displayport adapter is asinine since you can now easily find monitors with native dp inputs (and most of them will be better quality monitors that techies would recommend).

the link to amd's step by step setup is wrong; the correct one is: http://support.amd.com/us/kbarticles/Pages/gpu50-ati-eyefinity-display-groups.aspx
anyone looking for a guide from their link would end up in massive frustration. whether this is intentional is up for debate.

Unless things have changed, nVidia Surround only works with SLI. So the cards (or setup) used are proper in my opinion.

Mentioning the limitation of needing adapters/dongles is very important because we've had posters here who just used three monitors they had, made sure they all supported the same resolution, then found out they couldn't run them all as advertised. I think it's equally important to mention that nVidia requires all the same model of monitor (which to me is more of a hindrance).

I still see nothing wrong with the blurb quoted. Unless there is something more nefarious in the actual article.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
the line about the mandatory displayport adapter is asinine since you can now easily find monitors with native dp inputs (and most of them will be better quality monitors that techies would recommend).

I can see their point if you already own 3 monitors.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
i posted the correct link

the xbit link is:
http://www.amd.com/us/products/technologies/amd-eyefinity-technology/how-to/Pages/how-to.aspx

and anyone trying to find a step-by-step guide on that page will search in vain.

Using that link I found a really useful Java/Flash-based instruction manual with questions such as: how many displays, primary use (gaming/advanced/productivity), and then instructions on how to install it, with images for each device (DVI/VGA/DP) and limitations (you will need a DP adapter).

Did you click the bottom right panel that said
Set Up

Set up and configure your AMD Eyefinity display
Learn more
?

EDIT: Nevermind, I get what you meant. My fault.

EDIT #2: No, I guess I still don't get it. What I just quoted, if that was the Xbit link - it has a helpful guide; otherwise, haha, you lost me somewhere.

EDIT #3: Go straight to the horse's mouth. Link I found in the Xbit article:
http://www.amd.com/us/products/technologies/amd-eyefinity-technology/how-to/Pages/how-to.aspx
So my post applies: there is a very helpful guide on that page; just click the right panel's "Learn More" link for an interactive guide.
 
Last edited:

Paratus

Lifer
Jun 4, 2004
16,666
13,405
146
The following comments from the article capture the bias I saw:

Why is the starting tone of the article so biased towards NVIDIA?

I know the AMD technology allows multiple (3 or 6 depending on the exact card you get) mixed monitors NATIVELY, only requiring possible adapters if your monitors don't have the right inputs, yet their technology is described as "not so rosy" and having "a few problems to overcome".

Also, the way it is written makes it seem that the fact AMD can connect 6 monitors (which NVIDIA can only dream of) is a disadvantage because of the power bills...

The fact that NVIDIA requires TWO graphics cards (or one very expensive flagship card) for more than 2 monitors is written off only as a "slight inconvenience" (which, by the way, increases the power bill more in an otherwise equal 3-monitor setup...).

I am neither an AMD nor an NVIDIA fanboy, but I'd like these reviews to be an objective comparison so I (and other readers with me) can make up my own mind in an informed way.***


WTH is this bullshit? Why don't you put the [PR] tag on this?

You come saying that AMD's approach is expensive because of the monitors, takes too much space, consumes energy... WTF? When did nvidia start giving monitors away for free? Where do I sign up for their "we will pay your energy bills" program?

And you come with an "unfortunately, this is not all yet. You see..." and bash AMD for offering "only" two displays "no matter what" the protocol. "If you 'want' another monitor", you need to buy a DisplayPort adapter... And you say that is a DRAWBACK? (you follow it with "on the positive note...") Why didn't you mention the FLEX brand (and maybe others), which doesn't need that? (ESPECIALLY when you took your time to mention the gtx590 in the nvidia section)

The malicious intentions of the writer are all over the place when you see how he writes the nvidia part: "unlike their competitors... (Nvidia) do not expect the users to invest extra money in adapters". Sure! They expect you to buy ANOTHER GRAPHICS CARD! Does the PR-stunt-man even underline that in the next sentence? (after all, it makes SENSE to do it now) NO! He comes with a "Nvidia expects you to have three identical displays... (which) isn't necessarily a huge disadvantage..." Sure! Then why did you say that AMD's way is expensive because of the monitors, yet play it down when Nvidia asks you to throw away your old one?

That's not all! The unprofessional guy even says that there's "another LITTLE issue or what some MAY EVEN consider a disadvantage" with Nvidia, because they need two GPUs... But don't worry! There's the gtx590, and you can even choose to run an SLI setup... You literally waited for the last sentence to mention that this means investing in ANOTHER graphics card, (possibly ANOTHER) SLI-ready mainboard and possibly (ANOTHER) more powerful PSU (besides the negligible configuration bit).

And I won't even comment on how you started your nvidia part saying that "Unlike their competitors... (nvidia) give you an additional feeling of involvement and supremacy". And I also won't comment on the numbers, like comparing a gtx560 Ti SLI system ($500) with a single HD6970 ($340, both from newegg) and not pointing that out.

Man... Seriously. WTF? Don't you have any shame? How could you spin it SO HARD?

Normally xbit articles strike me as thorough and neutral.

This one seemed....off...
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The following comments from the article capture the bias I saw:

Normally xbit articles strike me as thorough and neutral.

This one seemed....off...

I guess I need to read the whole article. If they don't list the SLI requirement for Surround as a drawback, they are definitely being misleading, in my opinion. That is a huge drawback.
 

gorobei

Diamond Member
Jan 7, 2007
3,666
993
136
Using that link I found a really useful Java/Flash-based instruction manual with questions such as: how many displays, primary use (gaming/advanced/productivity), and then instructions on how to install it, with images for each device (DVI/VGA/DP) and limitations (you will need a DP adapter).

Did you click the bottom right panel that said "Set Up"?

EDIT: Nevermind, I get what you meant. My fault.

EDIT #2: No, I guess I still don't get it. What I just quoted, if that was the Xbit link - it has a helpful guide; otherwise, haha, you lost me somewhere.

EDIT #3: Go straight to the horse's mouth. Link I found in the Xbit article:
http://www.amd.com/us/products/technologies/amd-eyefinity-technology/how-to/Pages/how-to.aspx
So my post applies: there is a very helpful guide on that page; just click the right panel's "Learn More" link for an interactive guide.

if you look at the nvidia step-by-step link in the article, the page is an actual guide like the one i posted. the article's amd link is to the overall eyefinity portal. if you are looking for an actual step-by-step guide like the nvidia one, you will have to do more than a few minutes' worth of searching.
the amd eyefinity portal page is a nice little resource, but not what xbit couched the link as being (an updated step-by-step guide on how to actually set up the eyefinity SLS cluster).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
if you look at the nvidia step-by-step link in the article, the page is an actual guide like the one i posted. the article's amd link is to the overall eyefinity portal. if you are looking for an actual step-by-step guide like the nvidia one, you will have to do more than a few minutes' worth of searching.
the amd eyefinity portal page is a nice little resource, but not what xbit couched the link as being (an updated step-by-step guide on how to actually set up the eyefinity SLS cluster).

I'm not sure I follow you, because using the link in the Xbit article I told you what I found: a very useful interactive guide that offered a lot of information, such as how to set up a 1x3, 1x4, 2x2, 1x3+1, 1x3+2, or 3x2 configuration. It then told you how to connect all the monitors to your card, and if you picked a 4+ monitor option it even asked if you had a secondary GPU (onboard or dedicated); then it led into the CCC instructions.

It was right there on the Xbit link; all I had to click was "Set Up." It took me no longer than 5 seconds - which was me reading the page.

EDIT: I noted that the guide you posted is just the CCC instructions, with no actual instructions on how to physically connect the monitors, what adapter you'd require, what hardware, etc. I found the Xbit link much more useful.
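
EDIT #2: For anyone curious what those grid options work out to, here's a minimal back-of-the-napkin sketch, assuming identical 1920x1080 panels and ignoring bezel compensation (which the guide also covers):

Code:
# Combined desktop size for an Eyefinity single-large-surface (SLS)
# grid of identical panels. Bezel compensation is ignored for simplicity.
def sls_size(cols, rows, panel_w=1920, panel_h=1080):
    return cols * panel_w, rows * panel_h

for cols, rows in [(3, 1), (4, 1), (2, 2), (3, 2)]:
    w, h = sls_size(cols, rows)
    print(f"{cols} wide x {rows} high -> {w}x{h}")
# 3x1 -> 5760x1080, 4x1 -> 7680x1080, 2x2 -> 3840x2160, 3x2 -> 5760x2160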
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
All I read was important info that I'd like to know, like identical vs. non-identical monitors, etc.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
I read what was either bias against AMD or a poorly informed author, both inexcusable IMO.

The biggest advantage of AMD Eyefinity is that it can be activated on virtually any of the currently available graphics cards. However, you will need to purchase an additional active DisplayPort adapter, which is quite expensive.

I highly doubt that the author is unaware that when using a DisplayPort-capable monitor with the AMD solution, the DisplayPort adapter is not required. In the above paragraph the author mentions one of the biggest advantages of the AMD solution simply to assault it with misinformation/bias/an agenda. Whatever you call it, it reflects very poorly on the integrity of the article/site. I saw a tactical misrepresentation of information throughout the article that was used to undermine AMD's solution while championing nVidia's. We can do without this sort of propaganda in our review sites.
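
For what it's worth, the underlying rule is simple (as I understand it): these Radeons can only drive two legacy (DVI/HDMI/VGA) outputs at once, so any display beyond the second must run over DisplayPort, either natively or through an active adapter that supplies its own clocking. A rough sketch of that logic, with made-up names, just to illustrate:

Code:
# Hypothetical illustration of the Eyefinity output rule on HD 5000/6000
# cards: at most two displays on legacy outputs, the rest must be DP.
def active_adapters_needed(monitor_inputs):
    # monitor_inputs: one of "dp", "dvi", "hdmi", "vga" per monitor
    legacy = [m for m in monitor_inputs if m != "dp"]
    # Two legacy displays can use the card's own clock generators;
    # the rest need an active DP adapter with its own clocking.
    return max(0, len(legacy) - 2)

print(active_adapters_needed(["dvi", "dvi", "dp"]))   # 0 - no adapter needed
print(active_adapters_needed(["dvi", "dvi", "hdmi"])) # 1 - one active adapter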

Xbit is crossed off my list of spots to check in the future.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
All I read was important info that I'd like to know, like identical vs. non-identical monitors, etc.

Actually, even this is misleading. I have 3 non-identical monitors and have used them in both an AMD and an Nvidia setup. I think they mean identical resolution... which seems like it would be about the only way to run this setup. I could not imagine running surround on different resolutions.

I have 2 Gateways and an HP... no problems on either setup.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I think they mean identical resolution... which seems like it would be about the only way to run this setup. I could not imagine running surround on different resolutions.

I've seen people run one 1920 x 1200 center and two 1600 x 1200 on the sides. That wouldn't suck, especially if you had some PVA or IPS 4:3 monitors.
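
Rough math on that mix (just a sketch; the group runs at the panels' shared 1200-pixel height):

Code:
# Mixed surround setup: the panels share a 1200-pixel height,
# so the group spans the full combined width.
panels = [(1600, 1200), (1920, 1200), (1600, 1200)]  # left, center, right
width = sum(w for w, h in panels)
height = min(h for w, h in panels)
print(f"{width}x{height}")  # 5120x1200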
 
Feb 19, 2009
10,457
10
76
Yea, just a little.

I was more concerned about their 6950 Toxic article saying that somehow a 6950 Toxic at 880 core / 5200 memory was faster than a 6970 at 880 core / 5500 memory, with the same number of shaders.

With the 6970 running faster memory, how is this possible?
Faster no, as fast yes.

Somehow they got a 6950 superchip. :)

Review sites often use recycled numbers from previous reviews if they include the same cards. Obviously drivers affect performance a great deal, especially the recent batch, with improvements of 10-20% here and there in a lot of games.

Edit: That was a good review, with a big variety of games; even if some are old, more is better in general, as long as newer DX11 titles are included. I'm very glad NV has been working hard on their drivers; their multi-res performance and SLI scaling have improved a lot.
 
Last edited:

Bill Brasky

Diamond Member
May 18, 2006
4,345
1
0
Harping on AMD for a $25 adapter was probably not a good idea. LOL. Talk about pissing away credibility...
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Am I being overly sensitive?
Yes you are, because nothing he said was wrong, or showed partiality.
Nvidia does need 3 identical monitors to work together.
And monitor manufacturers do stop making old models of their monitors at some point.

That usually means you can't expect people that upgrade from 1 to 3 monitors to be able to get 3 of the same ones (unless they throw out the old and buy 3 at the same time - which is what happens to a lot of nvidia users that go that route).

It seems fair to mention it, as it's probably a situation that happens a lot (with nvidia guys that go to 3 monitors).

Why is the starting tone of the article so biased towards NVIDIA?

I know the AMD technology allows multiple (3 or 6 depending on the exact card you get) mixed monitors NATIVELY, only requiring possible adapters if your monitors don't have the right inputs, yet their technology is described as "not so rosy" and having "a few problems to overcome".

Also, the way it is written makes it seem that the fact AMD can connect 6 monitors (which NVIDIA can only dream of) is a disadvantage because of the power bills...

The fact that NVIDIA requires TWO graphics cards (or one very expensive flagship card) for more than 2 monitors is written off only as a "slight inconvenience" (which, by the way, increases the power bill more in an otherwise equal 3-monitor setup...).

I am neither an AMD nor an NVIDIA fanboy, but I'd like these reviews to be an objective comparison so I (and other readers with me) can make up my own mind in an informed way.***

^This.... a very biased towards Nvidia article, if anything.
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Crysis2_photo_1.JPG


What is going on with this photo? It looks like the two monitors on the left and middle aren't full screen.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Unless things have changed, nVidia Surround only works with SLI. So the cards (or setup) used are proper in my opinion.

Mentioning the limitation of needing adapters/dongles is very important because we've had posters here who just used three monitors they had, made sure they all supported the same resolution, then found out they couldn't run them all as advertised. I think it's equally important to mention that nVidia requires all the same model of monitor (which to me is more of a hindrance).

I still see nothing wrong with the blurb quoted. Unless there is something more nefarious in the actual article.

All else equal, a single GPU is better than more than one GPU. Fewer hiccups with games. Forcing SLI to get Surround is a fail. NV knows this, which is why I expect them to have at least one option for single-GPU Surround within 2 years... I'd say within 1 year, except that whatever they put out next cycle was probably designed prior to Eyefinity, so it might not have been possible to tack on single-GPU Surround. Yeah, some games might pretty much need dual GPU at high settings on 3 screens, but it shouldn't be forced on people just to get Surround.
 

WMD

Senior member
Apr 13, 2011
476
0
0
Yea, just a little.

I was more concerned about their 6950 Toxic article saying that somehow a 6950 Toxic at 880 core / 5200 memory was faster than a 6970 at 880 core / 5500 memory, with the same number of shaders.

With the 6970 running faster memory, how is this possible?
Faster no, as fast yes.

Somehow they got a 6950 superchip. :)

The 6970 results look wrong. In many cases it performs nearly the same as the stock 6950. Reviews elsewhere put a 6970 10-15% faster than a 6950. Maybe a PowerTune BIOS bug. The results for the stock 6950 and the Toxic look as one would expect.
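
As a sanity check on that 10-15% figure, here's the theoretical shader throughput gap between the stock cards, sketched from the published specs (6950: 1408 shaders @ 800 MHz; 6970: 1536 @ 880 MHz):

Code:
# Theoretical ALU throughput ratio, stock HD 6970 vs stock HD 6950.
shaders_6950, clock_6950 = 1408, 800  # MHz
shaders_6970, clock_6970 = 1536, 880  # MHz

ratio = (shaders_6970 * clock_6970) / (shaders_6950 * clock_6950)
print(f"{(ratio - 1) * 100:.0f}% more shader throughput")  # 20%

Games don't scale linearly with ALU throughput, so 10-15% in practice is about right; results showing the 6970 tied with a stock 6950 do look off.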