Eyefinity - multimonitors from ATI (done right)

MODEL3

Senior member
Jul 22, 2009
528
0
0
For me, ATI promoting 3- or 6-monitor gaming is not mainstream at all...

But maybe for some gamers with money, if they don't mind the bezel space in between...


From Anand:
Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.

I thought that the number of display engines had to scale with the ROPs in ATI's architecture.
So if this is correct, it means that in the high end (16-32 ROP models?) we will have 8 display engines (6 DisplayPorts + 2 DACs?)
And for the low end (4-8 ROP models?) we will have 4 display engines (3 DisplayPorts + 1 DAC?)


Also from Anand:
Any configuration is supported, you can even group displays together. So you could turn a set of six displays into a group of 4 and a group of 2

So this probably means interesting possibilities for various applications, like BD playback or several applications running side by side... (even web browsing with different browsers like FF & IE & Chrome together...)

EDIT*
I changed my mind.
I think the display engines are unrelated to the ROPs.
So the above logic is wrong.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: aka1nas
Originally posted by: MarcVenice
Originally posted by: Idontcare
Marc, what was your impression of the visual presence of the LCD frames bisecting the image horizontally in the 6-screen setup? From the photos it looks annoying to me, but maybe the real-life impression is different.

In my own article I mentioned problems with the HUD, which can sometimes end up at the bezels and stretch across 1-2 screens. That's a bad thing, but AMD said they'd be working on that with game devs.

As for the bezels, the screens shown in London had 7mm bezels, so times two that's 1.4cm. From a distance it looked okay; up close I think it could get annoying. I only flew around for a couple of minutes though. Some press guys quickly agreed that three screens might be the sweet spot, or, well, maybe 4-5 screens in a row. But 3x2 might indeed suck because of the bezels.

I already do triple-screen gaming with SoftTH, and the bezels aren't that big of an issue. You start tuning them out after a few minutes of gaming. For some games (i.e. space shooters, flight sims, mech games) it just looks like part of a cockpit and blends in just fine.

Edit: Not sure how ATI will handle this, but SoftTH lets you optionally program in the bezel width in pixels so that the image doesn't appear to "jump" from display to display. I.e. I have 3x 1920x1200 displays, so my combined resolution is 5760x1200. If I add the bezel padding (say 120 pixels), then my effective resolution would be 6240x1200.

Well...

http://www.brightsideofnews.co...a-single-computer.aspx

The caveat of avoiding screen overlapping is very interesting. According to AMD people we spoke with, the driver is making a 120 pixel vertical and horizontal adjustment, e.g. rendering each display as 1920+120 pixel horizontally and 1200+120 pixel vertically [2040x1320 pixels], achieving near-perfect alignment.

By rendering those 120 pixels extra, the resolution load was significantly increased for this 24 display setup [12240x5280], i.e. the next-generation GPU from AMD has to render 64.63 million pixels in order to show perspective-corrected 55.29 million pixel image.

That answers that.
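
A quick back-of-the-envelope check of the numbers in the two posts above. Note the padding models differ slightly: the SoftTH example pads each bezel edge at every seam, while the article describes AMD rendering each panel 120 pixels larger in each dimension. A minimal sketch (the variable names are mine; the figures come from the posts):

# SoftTH-style padding: 3x 1920x1200 panels side by side,
# 120 px of padding per bezel edge, two bezel edges per seam.
panels, w, h, pad = 3, 1920, 1200, 120
seams = panels - 1
print(panels * w + seams * 2 * pad, h)   # 6240 1200

# AMD's 24-display demo per the article: each 1920x1200 panel is
# rendered as (1920+120) x (1200+120) = 2040x1320, in a 6x4 grid.
cols, rows = 6, 4
render_w, render_h = cols * (w + pad), rows * (h + pad)
print(render_w, render_h)                # 12240 5280
print(render_w * render_h / 1e6)         # ~64.63 Mpixels rendered
print(cols * w * rows * h / 1e6)         # ~55.30 Mpixels displayed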
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Holy shit! :Q

I've used all kinds of display setups, from single to multi to extra-large (40"+) single displays, but this is what I've been waiting for! I cannot wait to play my favorites across 3+ displays (there's more to gaming than FPShooters; this will be huge for other genres)!

 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Lonyo
Well...

http://www.brightsideofnews.co...a-single-computer.aspx

The caveat of avoiding screen overlapping is very interesting. According to AMD people we spoke with, the driver is making a 120 pixel vertical and horizontal adjustment, e.g. rendering each display as 1920+120 pixel horizontally and 1200+120 pixel vertically [2040x1320 pixels], achieving near-perfect alignment.

By rendering those 120 pixels extra, the resolution load was significantly increased for this 24 display setup [12240x5280], i.e. the next-generation GPU from AMD has to render 64.63 million pixels in order to show perspective-corrected 55.29 million pixel image.

That answers that.

Nice, looks like they have really done their homework on this implementation.

 

n7

Elite Member
Jan 4, 2004
21,281
4
81
This is just amazingly awesome.

I now have reason to be very interested in a next gen AMD card.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
Originally posted by: fleshconsumed
I wonder if this works in both D3D and OpenGL, or if it matters at all. Would be nice to get 360 degree FOV in FPS games with three 24" monitors. No one will ever be able to sneak up on me again. Too bad most online servers restrict FOV settings.

Considering AMD had a 24-display demo running on Linux, you can assume this works with both OpenGL and DirectX. This is gonna be sweet :).
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: Sylvanas
Originally posted by: fleshconsumed
I wonder if this works in both D3D and OpenGL, or if it matters at all. Would be nice to get 360 degree FOV in FPS games with three 24" monitors. No one will ever be able to sneak up on me again. Too bad most online servers restrict FOV settings.

Considering AMD had a 24-display demo running on Linux, you can assume this works with both OpenGL and DirectX. This is gonna be sweet :).

It's exposing it to the OS as a single large display, so it shouldn't be dependent on the 3d API.
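
Since the group shows up to the OS as one big desktop, anything that simply asks for the screen size should see the combined surface. A minimal sketch of what I mean (assuming a 3x1 group of 1920x1200 panels; no Eyefinity-specific API involved):

import tkinter as tk

# With an active display group, the OS-reported screen size is the
# combined surface (e.g. 5760x1200), not a single panel's resolution.
root = tk.Tk()
print(root.winfo_screenwidth(), root.winfo_screenheight())
root.destroy()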
 

uclaLabrat

Diamond Member
Aug 2, 2007
5,632
3,045
136
Considering that a lot of Nvidia fans were touting PhysX as a tangible (albeit underutilized) advantage over buying an AMD card (along with CUDA, I suppose), do you think that Eyefinity will provide a much broader reason to favor one brand over another? You don't have to worry about developer support with Eyefinity, unlike PhysX, although it will be more effective once Samsung develops those thin-bezel displays.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: uclaLabrat
Considering that a lot of Nvidia fans were touting PhysX as a tangible (albeit underutilized) advantage over buying an AMD card (along with CUDA, I suppose), do you think that Eyefinity will provide a much broader reason to favor one brand over another? You don't have to worry about developer support with Eyefinity, unlike PhysX, although it will be more effective once Samsung develops those thin-bezel displays.

Unless Nvidia comes out with the same thing (minimum of 3 displays per card), I am absolutely sold on ATI.
I had a 7800GT + 7200GS for 3 monitors a few years ago, then I upgraded to an HD4850 and went back to dual, and then I switched to a mobo with a single PCIe x16 slot.
I was considering buying a secondary PCI card to give me the extra monitor back, but now that ATI have shown this, I am very, very tempted to get an HD5850 as an upgrade, even though I don't really need it. It's just the perfect card, assuming the price (in the UK) isn't too high.

I can live without PhysX if I get triple (or more) monitors from a single card.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: uclaLabrat
Considering that a lot of Nvidia fans were touting PhysX as a tangible (albeit underutilized) advantage over buying an AMD card (along with CUDA, I suppose), do you think that Eyefinity will provide a much broader reason to favor one brand over another? You don't have to worry about developer support with Eyefinity, unlike PhysX, although it will be more effective once Samsung develops those thin-bezel displays.

Agreed, I will likely switch to a 58xx-based card from my current setup if NV doesn't come out with something similar soonish. I actually use CUDA quite a bit for distributed computing apps, though (hopefully BOINC will get OpenCL support soon).
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
From AMD's Press Release:
This unique AMD innovation gives PCs the ability to seamlessly connect up to six ultra high definition displays in a variety of portrait and landscape configurations giving viewers a stunning new perspective on their PC experience. ATI Eyefinity is powered by one AMD graphics card for up to 12 times 1080p high-definition resolution, which approaches eye-definition optical clarity.

ATI Eyefinity technology brings AMD closer to delivering true eye-definition experiences, where the display of a virtual environment is so detailed that it seems optically real to the human eye. Using ATI Eyefinity technology in a single PC, it is now possible to power displays with a combined theoretical resolution of 268 megapixels, roughly equivalent to the resolution of a 90 degree arc of what the human eye sees. For reference, today's average 19 inch LCD display typically has an image quality of only slightly more than 1 megapixel.

Info for AMD VISION Technology:

VISION Technology
PREMIUM VISION Technology
ULTIMATE VISION Technology

http://www.techpowerup.com/103...Buying_Experience.html
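
The press release's "12 times 1080p" figure checks out if you assume each of the six outputs drives a 2560x1600 panel (the maximum per-display resolution AMD was quoting for Eyefinity at the time; that assumption is mine):

# Six 2560x1600 displays vs. one 1920x1080 display
print(6 * 2560 * 1600 / (1920 * 1080))   # ~11.85, i.e. roughly "12x 1080p"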
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,022
136
Methinks a 9-screen setup with dual GPUs would be the sweet spot. Your center view is clear of bezels and you've got more GPU per screen to help push those pixels.

I've been holding out on LCDs... but there's no real way to mount 9 24" CRTs together and make it look good (or is there?).

I've seen 4x HDTVs seamlessly joined on display since 2006, like 4x 40" at 3840x2160. Maybe now someone will actually sell something like that. I'd be fine with 4x 24" at 3840x2400 for, say, $1k?
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Kyle is flipping out in happiness. Also suggests "monster" performance increases. This could be that can of whoop ass.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Also, in the screenshots posted they were only showing off a flat array of panels. You would probably want to angle them a little bit if you're doing 3 displays, especially if you are increasing the FOV. This minimizes the bezel disturbance by quite a bit.

Here's an example of my current SoftTH setup to demonstrate:
http://2.bp.blogspot.com/_mpbD.../s1600-h/IMG_1429a.jpg

Even numbers of monitors will end up looking not so great, as there will always be a bezel line right in the center of where you usually look. If you have an odd number of displays, you really won't notice the bezels.
 

WelshBloke

Lifer
Jan 12, 2005
33,072
11,250
136
Originally posted by: aka1nas
Also, in the screenshots posted they were only showing off a flat array of panels. You would probably want to angle them a little bit if you're doing 3 displays, especially if you are increasing the FOV. This minimizes the bezel disturbance by quite a bit.

Here's an example of my current SoftTH setup to demonstrate:
http://2.bp.blogspot.com/_mpbD.../s1600-h/IMG_1429a.jpg

Even numbers of monitors will end up looking not so great, as there will always be a bezel line right in the center of where you usually look. If you have an odd number of displays, you really won't notice the bezels.

I was always too lazy to set up SoftTH :(
Now I won't have to :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: ronnn
Kyle is flipping out in happiness. Also suggests "monster" performance increases. This could be that can of whoop ass.

I always love to read Kyle's "bottom line" reports; he cuts straight to the chase and puts it in words I can relate to and understand :thumbsup:

The Bottom Line

It has been a good long while since I walked into a GPU demo that truly gave me a "Wow!" feeling. In fact the experience even surpassed "Wow!" It was more like, "Holy sheepdip Batman, that kicks ass!" I was simply blown away by the Eyefinity experience. ATI Radeon Eyefinity technology really made me excited about gaming again. Technology demos are sometimes fun to see, but hardly ever do I walk out of a demo feeling like I just saw something that would impact my computing experience. Without a doubt, Eyefinity is ready to deliver a tangible computing experience benefit now...not 6 months from now when some promised-for-years game finally makes it to the market.
 

CurseTheSky

Diamond Member
Oct 21, 2006
5,401
2
0
I haven't heard people getting THIS excited about something AMD / ATI has done in a while. I'm hoping this is the peak of a great comeback for them (with HD4000 / Phenom II being, IMO, the beginning).
 

WelshBloke

Lifer
Jan 12, 2005
33,072
11,250
136
Originally posted by: CurseTheSky
I haven't heard people getting THIS excited about something AMD / ATI has done in a while. I'm hoping this is the peak of a great comeback for them (with HD4000 / Phenom II being, IMO, the beginning).


I am almost expecting ATI to fuck this up. They always promise but don't always follow through.

Still, fingers crossed eh?

 

CurseTheSky

Diamond Member
Oct 21, 2006
5,401
2
0
Originally posted by: WelshBloke
Originally posted by: CurseTheSky
I haven't heard people getting THIS excited about something AMD / ATI has done in a while. I'm hoping this is the peak of a great comeback for them (with HD4000 / Phenom II being, IMO, the beginning).


I am almost expecting ATI to fuck this up. They always promise but don't always follow through.

Still, fingers crossed eh?

Heh, at least they showed a working demo.

I'd be more concerned if they "paper launched" this - gave everyone the details, raved about everything it could do, and spent another 6-12 months working out the bugs and cutting features before it saw the light of day (meanwhile, Intel and/or nVidia could release competing technology and steal their thunder).

Hopefully slim bezel LCDs will be out shortly. I won't be jumping on this myself, but having it as a possibility is always nice.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: CurseTheSky
I haven't heard people getting THIS excited about something AMD / ATI has done in a while. I'm hoping this is the peak of a great comeback for them (with HD4000 / Phenom II being, IMO, the beginning).

You should be exiled to backwoods Louisiana for 6 months for mentioning Phenom II and HD 4xxx in the same sentence. ATI is the only decent thing about AMD right now. The 4xxx series is extremely competitive and was developed on a shoestring budget; Phenom II is not competitive in anything over $200. You might compare it to the 3xxx series in that it is certainly better than the shit that came before it, but don't think for one minute that AMD will be smiling (or doing much of anything, for that matter) if Bulldozer is as late/shitty as their recent CPUs have been.