Lucid Hydra 200 in 30 days

Page 3

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Last week, while window-shopping P55 boards at my local MicroCenter, I saw a tri-SLI setup with dual-loop water cooling. It was artifact/stutter galore (while running the Street Fighter 4 benchmark, no less).

Yes it had 3 cards instead of 2, and yes it's possible an idiot built that system, but it was more than enough to re-affirm my long-held (and experienced first-hand) opinion on multi-GPU systems.

Now, this latest development in multi-GPU world is definitely something different, but it will have to do VERY WELL to prove itself. Merely adding up performance in several titles won't cut it, IMO.
 

bfdd

Lifer
Feb 3, 2007
13,312
1
0
I have to post again: WHERE ARE THE X58 BOARDS WITH THIS!? If one doesn't show up on the horizon soon, I might have to sell my NIB i7 920, pick up an i7 860, and go P55 if this works out the way it has been touted since we first heard of it.
 

WelshBloke

Lifer
Jan 12, 2005
30,544
8,230
136
Originally posted by: Majic 7
Just saw a post by Kyle at HardForums in which he says Nvidia is talking about blocking Lucid with code on its products. They had better pray that Lucid doesn't work all that well, because I can't see them staying in business for long if they do this.


If it does work really well and ATI sees some terrific gains, I would bet that NV would have to support it or develop their own version.
They may well block mixing and matching vendors (ATI + NV in the same PC), but I'll bet that will come with plenty of its own problems even without any dirty tricks.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Originally posted by: v8envy
Code would look a lot like this:

if( present_ati() || present_lucid_chip() ) {
halt_and_catch_fire();
}
:laugh:
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: ronach
How about a plug-in Hydra PCI-e card so we all don't have to buy new mobos to get into this technology? My sys is no slouch... not the newest... but tough enough to do the deed.

It wouldn't work that way. The chip actually provides PCI-e lanes (the versions they talk about have 24 and 48 lanes). A PCI-e add-in card would limit the entire setup to the 16 lanes of its own slot, shared between two video cards that would otherwise have 16 lanes each...
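The lane arithmetic bunnyfubbles describes can be sketched with made-up but plausible numbers (PCIe 2.0 moves roughly 500 MB/s per lane per direction; the even split of downstream lanes between cards is an assumption):

```python
# Rough sketch of why an add-in Hydra card would starve the GPUs for
# bandwidth compared to an on-board chip. Numbers are illustrative.
PCIE2_MB_PER_LANE = 500  # ~500 MB/s per lane, per direction, for PCIe 2.0

def lanes_per_card(downstream_lanes, num_cards):
    """Lanes each GPU gets if the chip splits its downstream lanes evenly."""
    return downstream_lanes // num_cards

# On-board chip with 32 downstream lanes feeding two GPUs: x16 each.
onboard = lanes_per_card(32, 2)
# Hypothetical add-in card: the whole setup funnels through its one x16 slot.
addin = lanes_per_card(16, 2)

print(onboard, onboard * PCIE2_MB_PER_LANE)  # 16 lanes -> 8000 MB/s per card
print(addin, addin * PCIE2_MB_PER_LANE)      # 8 lanes  -> 4000 MB/s per card
```

Halving each card's link on top of the chip's own overhead is why an add-in board makes little sense.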
 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
Originally posted by: bfdd
I have to post again: WHERE ARE THE X58 BOARDS WITH THIS!? If one doesn't show up on the horizon soon, I might have to sell my NIB i7 920, pick up an i7 860, and go P55 if this works out the way it has been touted since we first heard of it.

This kills the one advantage of the X58: dual x16 PCIe 2.0 slots for multi-GPU setups.
If you can only have 16 lanes going in, why get an X58 setup over a P55?

The only advantage the X58 platform is left with is future CPU developments coming first (and being very expensive), and triple channel RAM (which won't be of much use until those future CPUs - with 6 cores - appear).
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: gigahertz20
Don't get too excited until we see the numbers, but I'm guessing if a big mobo maker like MSI is including the chip on one of their mobos, it must work decently at least.

Intel bought a huge stake in the company.

It's not vaporware.

The claim is 95% scaling across the board, although mixed-vendor support will stay in beta for a while.
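For scale, here is what a 95% figure would mean arithmetically, next to the 20-40% per extra GPU often blamed on bad AFR cases (all numbers hypothetical):

```python
def scaled_fps(single_fps, num_gpus, per_gpu_scaling):
    """FPS if each GPU beyond the first adds `per_gpu_scaling` of one card."""
    return single_fps * (1 + (num_gpus - 1) * per_gpu_scaling)

base = 40.0  # hypothetical single-card FPS
print(round(scaled_fps(base, 2, 0.95), 1))  # 78.0  -- the 95% claim, two cards
print(round(scaled_fps(base, 3, 0.95), 1))  # 116.0 -- three cards
print(round(scaled_fps(base, 2, 0.30), 1))  # 52.0  -- a poorly scaling AFR title
```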
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
3
81
My prediction is it'll allow mixed cards but scale worse than either SLI or CrossFire. For it to scale better, I'd have to entertain the idea that both AMD and nVidia have been sabotaging their own setups to make expensive single cards look more appealing.
 

machineheadg2rr

Junior Member
Apr 11, 2009
17
0
0
We won't know if Hydra is worth it from a price perspective until we get some real benches on it. I think the only people it makes sense for are those with outdated rigs that need upgrading. If your current CPU already provides enough gaming performance, I'm not sure you'd spend $200 on a new mobo. For most users, ditching your current card for a 5850/70 is the way to go.

If Hydra does indeed scale as well as claimed, Nvidia won't be able to keep it from working with their cards. Anyone who needs a multi-card solution would be forced to go AMD. It would take a lot of hor$epower to negate the scaling edge that Hydra may bring.
 

GlacierFreeze

Golden Member
May 23, 2005
1,125
1
0
Originally posted by: AyashiKaibutsu
My prediction is it'll allow mixed cards but scale worse than either SLI or CrossFire. For it to scale better, I'd have to entertain the idea that both AMD and nVidia have been sabotaging their own setups to make expensive single cards look more appealing.

Did you even attempt to read the older article, about how it's supposed to work, that is linked in the new one? I'm guessing not.
 

alyarb

Platinum Member
Jan 25, 2009
2,444
0
76
i don't think he did. you can't "sabotage" a multi-GPU setup any more than they are already doing with AFR.

hydra is not AFR. hydra is asymmetric multiprocessing. even in groups of identical GPUs, the load is dynamic and asymmetric. the memory is not shared so you actually do get larger effective frame buffers with additional cards. Everything is decomposed and segregated, and composited from one frame to the next. bandwidth is conserved.

the problem is that they are not showing any impressive demos. vendor agnosticism is impressive in itself, because it shows the load-balancing technology works purely at the D3D level and has nothing to do with the architecture of the GPUs, but they aren't demonstrating hydra in scenarios where AFR sucks. DMC4 is not a graphically intensive game and scales pretty well with AFR.

there are a few games with utterly abysmal scaling under AFR, and lucid knows damn well which games they are, and they aren't showing us what hydra does for them. they need to show that hydra's technique scales linearly where AFR only gives you 20-40% per additional GPU. they need to show that hydra's scaling does not drop off after the third GPU. they need to give every dumbass with a pair of 295's a glimpse of hope that their second card might someday be utilized.
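A toy model of the distinction alyarb is drawing, with made-up GPU throughput numbers: round-robin AFR is paced by the slowest card, while a Hydra-style asymmetric balancer splits each frame's work in proportion to capability (compositing overhead ignored):

```python
FRAME_WORK = 100.0  # arbitrary units of rendering work per frame

def afr_fps(gpu_speeds):
    """Round-robin AFR: each GPU renders whole frames in turn, so the
    slowest card sets the sustainable cadence (fps = N / slowest frame time)."""
    slowest_frame_time = FRAME_WORK / min(gpu_speeds)
    return len(gpu_speeds) / slowest_frame_time

def balanced_fps(gpu_speeds):
    """Hydra-style asymmetric balancing: each frame's tasks are split in
    proportion to speed, so every GPU finishes together and none sits idle."""
    return sum(gpu_speeds) / FRAME_WORK

matched = [100.0, 100.0]  # two identical cards
mixed = [100.0, 50.0]     # a new card plus an older, slower one

print(afr_fps(matched), balanced_fps(matched))  # 2.0 2.0 -- both scale fine
print(afr_fps(mixed), balanced_fps(mixed))      # 1.0 1.5 -- AFR wastes the fast card
```

In this toy model the mixed pair under AFR is no faster than the fast card alone, which is exactly the kind of case a convincing Hydra demo would need to show.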
 

GlacierFreeze

Golden Member
May 23, 2005
1,125
1
0
Originally posted by: Fox5
Crossfire already comes basically free with most high end motherboards, so I can't really see this being worth it for the ati side of things, and it doesn't look like the premium will be less than the nvidia side of things.

This works way differently than Crossfire or SLI. Read the older articles on it from a year or so ago.
 

OCNewbie

Diamond Member
Jul 18, 2000
7,603
24
81
What if you mix a DX9 card with a DX11 card on a DX11 game? Will this chip know what parts of the code it can send to the DX11 card and what parts to the DX9? I'm guessing nobody here knows the answer yet.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: PCTC2
Originally posted by: geoffry
I take it this thing does nothing for SLI on a stick cards?

Someone will have to try 4x GTX295's or 4x HD 4870X2's and tell us if they can play Crysis at 2560x1600.

don't be ridiculous. that might be able to play crysis at 1366x768... ;)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: yacoub
Originally posted by: alyarb
why are people so obsessed with mixing radeons with geforces anyway?

Because for the first time, multi-GPU becomes a possibility for those of us who only buy one GPU per generation. Now we'd be able to keep our existing GPU in the system while adding the new one. Sometimes NVidia has the best bang-for-the-buck card, sometimes ATi does. So this way, with a board that has Hydra, the brand of your upcoming GPU purchase doesn't matter - take whatever it is, swap it into the primary PCIe slot, and move your existing GPU down to a secondary slot. Now your system has the new hardware generation for primary usage and the added processing power of your old card to assist in games.

Hydra, if it works as we anticipate, will allow you to have the performance of both cards (in most 3D intensive circumstances) instead of just your new card. So you are gaining an extra performance equal (or near enough) to your old card in addition to your new card that would otherwise have simply replaced the old card.

There are a lot of important questions though, including:

Does this intermediary chip add input latency?

Do you only get the hardware features of the least powerful or oldest card in your system? i.e. if you have a DX11 card and a DX10 card, can you use DX11 graphics in a game and still get any use out of the DX10 GPU or would that game then only use your DX11 card?

Does it really work with an ATi and an NVidia card together or only different cards from a single manufacturer?

That AT article stumbled several times regarding Hydra: at first it said the first gen would only combine cards from the same company; later on it said it would combine cards from different companies.

I see drivers being the kicker here. If ATI and Nvidia, who actually make/design/refine the cards, have trouble with xfire and SLI drivers, what's going to happen with Hydra when it tries to combine two completely different architectures? I only see this happening if it gets at least two of the big players (assuming Larrabee eventually gets there) on board with driver design.
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
3
81
Originally posted by: GlacierFreeze
Originally posted by: AyashiKaibutsu
My prediction is it'll allow mixed cards but scale worse than either SLI or CrossFire. For it to scale better, I'd have to entertain the idea that both AMD and nVidia have been sabotaging their own setups to make expensive single cards look more appealing.

Did you even attempt to read the older article, about how it's supposed to work, that is linked in the new one? I'm guessing not.

How it's supposed to work and how well it will work are two different things. My prediction is just a random guess; you shouldn't get too worked up about it.
 

alyarb

Platinum Member
Jan 25, 2009
2,444
0
76
in other words, you know nothing about the technology or the discussion at hand, but expect it to fail. that's like, an on-topic nef.
 

Elias824

Golden Member
Mar 13, 2007
1,100
0
76
Even if this chip only worked with two cards of the same brand and model, it would be amazing. Hardware-level use of multiple cards has been on everyone's want list for a long time, and now everyone keeps saying it's too good to be true.
 

TantrumusMaximus

Senior member
Dec 27, 2004
515
0
0
I remember the article and grew very excited about this technology, and then it was just off the radar for so long. The one thing that seriously screams success is how much money Intel tossed into this. Obviously it has merit. Can't wait for this to come to fruition and get some reviews.
 

WelshBloke

Lifer
Jan 12, 2005
30,544
8,230
136
Originally posted by: bryanW1995
would be really funny if nvidia AND ati both put a block on it. problem solved :)

I still don't see how it would be to anyone's advantage to block it. :confused:

You still have to buy the video cards; if anything, this would drive sales, not hinder them.