What happened to the Lucid Hydra Fuzion board?

Page 3

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
lol IDC



yeah that sounds more like it, the number I saw months ago was ~$40 which would be in line with the lower part
 

AyashiKaibutsu

Diamond Member
Jan 24, 2004
9,306
4
81
I love threads that make the AMD Wreckage-equivalent crew shine bright; in most threads, they're usually pretty good at hiding themselves.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
lol IDC



yeah that sounds more like it, the number I saw months ago was ~$40 which would be in line with the lower part

Key being: that's the low end part. The enthusiast part is right where I said it was.

We don't know whether that's the manufacturing cost or the expected addition to the final retail price either. Consider that low-end motherboards go for as little as $50 and low-end video cards are in the $40 range as well, and suddenly even the entry-level SKU looks like a healthy chip in terms of funding R&D and silicon.

My point still stands -- it's a huge amount of relative expense to fund two relatively minor features (perfect scaling and multi-vendor support in a single system). Which might explain why NV and ATI marketing isn't pushing for a similar product from their end. Not that it's impossible for their engineers to deliver.
 

YearZero

Junior Member
May 9, 2007
17
0
0
I tend to agree with Idontcare.

This product is essentially promising the silver bullet of multi-GPU scaling, and also indirectly claims its engineers are somehow smarter than ATi's and nVidia's at their own technology.

If something sounds too good to be true it usually is, so until we see proper benchmarks in multiple games, I’m of the opinion that this product will fail.

Well, Lucid, with the Hydra, are trying a totally different solution to the problem, so they're not really beating ATI and nVidia's engineers at their own game; they're changing the rules.

Also I for one am holding thumbs for the Hydra as this could truly be something that all PC gamers could benefit from.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Lucid's engineers don't have to be smarter than ATI or NV ones. They have a whole new batch of resources to draw from. Remember that Lucid's solution would add about $80 to the price of a motherboard -- this is far more than mainstream products from NV and ATI cost retail!
Well, Lucid, with the Hydra, are trying a totally different solution to the problem, so they're not really beating ATI and nVidia's engineers at their own game; they're changing the rules.
I’m not sure what you mean there.

To recap, Lucid promises completely transparent and perfect multi-GPU scaling that is vendor agnostic and doesn’t need to be programmed for individual games.

Again, if this were possible while attaining equal or better performance to what IHVs get now, don’t you think ATi/nVidia would’ve done it by now? Both invest huge amounts of resources into constant driver development for SLI/CF scaling. If they could come up with a magic chip that does scaling automatically for all current and future titles, don’t you think they would’ve already done so?

The Lucid chip is essentially SFR that operates at the API level instead of the driver level like nVidia/ATi, and is also a compositing chip. Again, if such a magic chip were possible, it would be trivial for nVidia/ATi to add it to their existing billion plus transistor boards. Indeed, back when CF started, the master cards already had composition chips, which is essentially half of Lucid.
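To picture the "SFR at the API level" idea, here is a toy sketch in Python. This is purely illustrative and not Lucid's actual algorithm: the draw-call names, cost numbers, and greedy balancing scheme are all invented for the example. It just shows the core notion of intercepting a frame's work and dividing it between GPUs before compositing.

```python
# Toy sketch (NOT Lucid's actual algorithm): dividing one frame's draw
# calls across two GPUs by estimated cost. All names and costs are
# made up; a real implementation would intercept DirectX/OpenGL calls
# and composite the partial results back into one frame.

def split_draw_calls(draw_calls, num_gpus=2):
    """Greedy load balancing: biggest calls first, each to the
    currently least-loaded GPU."""
    loads = [0] * num_gpus
    buckets = [[] for _ in range(num_gpus)]
    for call in sorted(draw_calls, key=lambda c: c["cost"], reverse=True):
        target = loads.index(min(loads))  # least-loaded GPU so far
        buckets[target].append(call)
        loads[target] += call["cost"]
    return buckets, loads

# Hypothetical per-frame workload (arbitrary cost units)
frame = [{"name": "terrain", "cost": 40},
         {"name": "characters", "cost": 25},
         {"name": "sky", "cost": 10},
         {"name": "ui", "cost": 5}]

buckets, loads = split_draw_calls(frame)
print(loads)  # both GPUs end up with equal estimated load here
```

Even in this idealized toy, perfect balance only happens when the costs divide up neatly; real frames add inter-call dependencies (shared render targets, post-processing passes) that force serialization, which is one reason "perfect scaling" is such a strong claim.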

I predict at best this product will deliver marginal performance gains (say 10%-20% from a second identical card), and will still need regular driver updates for individual games.
Also I for one am holding thumbs for the Hydra as this could truly be something that all PC gamers could benefit from.
If by some miracle it actually delivers on its promise, it’s highly likely it’ll be shut down by nVidia, and possibly by ATi too. That’ll leave perfect scaling to S3 and Intel, LOL. ;)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I’m not sure what you mean there.

To recap, Lucid promises completely transparent and perfect multi-GPU scaling that is vendor agnostic and doesn’t need to be programmed for individual games.

Again, if this were possible while attaining equal or better performance to what IHVs get now, don’t you think ATi/nVidia would’ve done it by now? Both invest huge amounts of resources into constant driver development for SLI/CF scaling. If they could come up with a magic chip that does scaling automatically for all current and future titles, don’t you think they would’ve already done so?

Alternatively, but still getting at the same point: even if it weren't cost-justified to produce such a product, AMD and NV are not run by fools, and they would have locked up the IP to explicitly prevent someone from coming along and upstaging them like this.

I personally have a couple of patents that were filed on my behalf by my employer at the time as a matter of "carving out a buffer" in the IP space that surrounded the actual implemented manifestation of that IP in products at the time. The concepts are not unique to my then-employer.

If there were low-hanging fruit in the IP space for creating a product as Lucid claims to have done, then I am really astounded that neither AMD nor Nvidia scooped up the patent on it first, even if the intent was just to put that patent on a shelf. At TI we were incentivized to apply for patents: we got $1k if TI legal decided our IP was worth applying for a patent, and another $1k if the patent was awarded. Applying for patents is big business; no one treats it lightly.

So I just find it very very hard to fathom that Lucid's claims will actually work as claimed and yet thousands of the smartest engineers in the industry had overlooked the opportunity to at least file an application covering the IP even if it was done simply as a "move to block".

It happens, I'm not saying it is impossible, but I am saying the likelihood is just so fantastically small that I am applying Occam's razor here and continue to remind myself that extraordinary claims carry the burden of providing extraordinary proof...and to date all we have are claims and actually no proof whatsoever that the claims are valid.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Alternatively, but still getting at the same point: even if it weren't cost-justified to produce such a product, AMD and NV are not run by fools, and they would have locked up the IP to explicitly prevent someone from coming along and upstaging them like this.

Excellent points. However, the Lucid guys somehow got Intel to invest. That seems to imply they found *something* of value to develop.

The patent swamp is exactly why we shouldn't expect any innovation from anyone but the well established players with global cross-licensing agreements. As a nerd hardware enthusiast I still tend to latch on to hope of a plucky David making some noise before Goliath pounds him to jelly with the litigation club. In reality this isn't likely to ever happen precisely for the reasons you bring up.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
A review for those who thought this was vaporware. Mixed results and not linear but it does work. A little speculation about NVidia's role in the delay also. http://www.pcper.com/article.php?aid=815

Awesome! Thanks for adding this link to the thread.

Wow, look at some of these results, now that is pretty cool. Finally some bite to back up their bark.

[Benchmark graph: multi-coj.jpg]


I particularly like what I see here in this following graph in terms of how adding a powerful GPU (GTX 285) as an upgrade to, say, your existing weaker GPU (GTS 250) allows you to essentially use that weaker GPU to boost those min frame rates:

[Benchmark graph: gtx285-coj.jpg]


To me that is "value-add" right there...I'd love to be able to upgrade my GPU but not have to toss my existing one.

Not sure if that value-add represents a value to me personally (varies by person of course) that exceeds the ~$70 price premium a hydra-enabled mobo is expected to garner...but clearly the hardware is very much NOT vaporware as I had characterized it earlier in this thread.

Given that the testing done by PCPer is all DX10 based (or did I miss the DX11 tests?) I wonder if the delay on the MSI Big Bang is DX11 driver related?

Maybe MSI/Lucid were thinking Fermi would be out by now and as such Lucid would have had a chance to verify their drivers work with Nvidia's DX11 hardware before Lucid releases their driver-set?

Since Fermi appears to be coming to market later than most people would have presumed NV had targeted its release date to be (<- that's a politically correct way of saying it's late...), it might just be that Lucid has no choice but to wait for Fermi's release (or more specifically for the drivers to be available) so Lucid can make sure there aren't any anti-Hydra poison pills contained in the new drivers, as there are in the existing drivers for the PhysX stuff.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91

Those guys had this observation to make:

Although there were driver issues that prevented us from completing some benchmarks, we've been informed that the problems will be resolved over time as Lucid's driver development team works to squash bugs with a wider range of hardware and software.

Doesn't this basically confirm MSI's side of the "where is Hydra big-bang?" story in which they claim the delay is caused by Lucid needing more time to get their drivers worked out?

I mean if the drivers have issues today with completing a few carefully controlled in-house benchmarks then how ready can the drivers really be for commercial rollout by MSI in which their customers are going to throw all kinds of configurations and games at the hydra-drivers?

Here is the quote from the PCPer link you provided earlier:
MSI stated to us that some driver issues on Lucid’s side of things were holding back the release, while Lucid clearly told us that the driver was ready to go and that they didn’t know what the holdup was.

To me it would appear that hothardware has called BS on Lucid's claim to PCPer and that MSI is correct and NV has nothing to do with the current delay.
 

Majic 7

Senior member
Mar 27, 2008
668
0
0
I got that impression also IDC. I'm still open to speculation about Nvidia, just because. What's good for Nvidia may mean it's sucking to be us.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
ha ha, I like that, a man that likes to keep his options open! I can appreciate that.

Yeah, regardless of the subject matter, to be sure, what is optimal for NV is rarely going to be good for us consumers, since we are who pays their bills and creates the equity for their shareholders in the first place. Lucid is counting on us to assist them in a similar capacity as well.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Meh, hydra looks disappointing. The scaling is not as good as they promised, and the additional cost of the hydra chip is excessive. It'd be a reasonable alternative to SLI at the high end, but anyone else should just go for a crossfire motherboard and put that $70 toward faster video cards.

I guess it is one of those 'the best at any cost' solutions, but if it has even more driver problems than SLI or Crossfire, what's the point? And anyone spending that much on a motherboard doesn't have much reason to use a last gen video card with a next gen.
BTW, won't there be image quality differences? The AF and AA algorithms differ from ATI to nvidia, as well as between card generations. There may be a difference in gamma or some other output settings too.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Meh, hydra looks disappointing. The scaling is not as good as they promised, and the additional cost of the hydra chip is excessive. It'd be a reasonable alternative to SLI at the high end, but anyone else should just go for a crossfire motherboard and put that $70 toward faster video cards.

I guess it is one of those 'the best at any cost' solutions, but if it has even more driver problems than SLI or Crossfire, what's the point? And anyone spending that much on a motherboard doesn't have much reason to use a last gen video card with a next gen.
BTW, won't there be image quality differences? The AF and AA algorithms differ from ATI to nvidia, as well as between card generations. There may be a difference in gamma or some other output settings too.

For those who are willing to drop $400 not once, but twice, on the best graphics card, the extra $70-80 would be decently worth an extra 15-25% in performance. If the performance gains are only 10% or so, it might not be as cost effective. Don't forget that many flagship MBs are already $300-400, so an extra $50 or so isn't a huge deal. I don't see this feature on sub $200 MBs.

Personally, I am cautious about the performance too. We need to see some actual results and samples distributed for review before I hold my breath for one actually being for sale.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
The Pcper article hinted that it was a combination of nV pressure and driver issues that delayed the product.

Finally...we see some performance numbers...I really hope this is for real. I would love to give this a shot.

Does anyone know if 3 cards could work?