[tomshardware] Pioneering The Multi-GPU Future With Oxide Games

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
A little over a month ago, the tech community was abuzz over an article about running Ashes of the Singularity from Oxide Games in DirectX 12 with an AMD GPU and an Nvidia GPU in the same system. That early evaluation covered a number of current top-tier graphics cards as well as a pairing of top-tier cards from a few years ago. Although the test served as an example of the benefits of such a configuration, it raised a number of questions from the community, such as why integrated graphics weren't tested and what happens when you pair two very different graphics cards: a brand-new GPU with one from a previous generation, say, or a lower-tier card with a higher-tier card.

I recently had the opportunity to speak with Dan Baker, one of the co-founders of Oxide Games and the creator of the Nitrous Engine, to get answers to those questions and learn much more about the engine and DirectX 12.

Read more
http://www.tomshardware.com/news/oxide-games-dan-baker-interview,30665.html
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
People always ask why games don't look like Pixar films when currently available computer hardware is hundreds of times faster than the machines Toy Story was rendered on in the 1990s. Although the graphics in those movies don't have the same level of texture detail, the 3D objects in them are crisp and free of jagged edges.

When the team at Oxide sat down to start building the Nitrous engine, the team wondered if it were now possible to apply these rendering techniques in real time. Baker said that every industry insider the team talked to loved the idea but didn’t believe that it would be possible to get it working fast enough. In practice, Oxide discovered that today’s graphics hardware is, in fact, capable of such calculations.

Very cool.
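
For anyone wondering what "applying film rendering techniques in real time" actually means, here's a toy CPU-side sketch of the decoupled, object-space shading idea as I understand it (my own illustration with made-up resolutions, not Oxide's code). Shading is evaluated once per object-space texel, and the screen pass only filters those results, which is why edge quality and shading cost stop being tied together:

```cpp
// Toy sketch of decoupled object-space shading: shade once per
// object-space texel, then let every screen pixel merely FILTER those
// results. Resolutions and the shading function are made up.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

constexpr int kShadeRes = 64;   // object-space shading resolution (texels)
constexpr int kScreenRes = 256; // screen resolution (pixels)

// An arbitrary "expensive" shading function evaluated in object space (u,v).
float Shade(float u, float v) {
    return 0.5f + 0.5f * std::sin(20.0f * u) * std::cos(20.0f * v);
}

// Bilinear lookup into the pre-shaded object-space buffer.
float SampleShaded(const std::vector<float>& buf, float u, float v) {
    float x = u * (kShadeRes - 1), y = v * (kShadeRes - 1);
    int x0 = (int)x, y0 = (int)y;
    int x1 = std::min(x0 + 1, kShadeRes - 1);
    int y1 = std::min(y0 + 1, kShadeRes - 1);
    float fx = x - x0, fy = y - y0;
    float top = buf[y0 * kShadeRes + x0] * (1 - fx) + buf[y0 * kShadeRes + x1] * fx;
    float bot = buf[y1 * kShadeRes + x0] * (1 - fx) + buf[y1 * kShadeRes + x1] * fx;
    return top * (1 - fy) + bot * fy;
}

int main() {
    // Pass 1: shade every object-space texel once, regardless of screen size.
    std::vector<float> shaded(kShadeRes * kShadeRes);
    for (int y = 0; y < kShadeRes; ++y)
        for (int x = 0; x < kShadeRes; ++x)
            shaded[y * kShadeRes + x] =
                Shade(x / float(kShadeRes - 1), y / float(kShadeRes - 1));

    // Pass 2: "rasterize" -- each screen pixel only filters the shaded
    // buffer. Shade() is never called here, so going from 1080p to 4K
    // changes filtering cost, not shading cost.
    double total = 0.0;
    for (int y = 0; y < kScreenRes; ++y)
        for (int x = 0; x < kScreenRes; ++x)
            total += SampleShaded(shaded, x / float(kScreenRes - 1),
                                  y / float(kScreenRes - 1));
    std::printf("%d shading evals drove %d screen pixels (avg %.3f)\n",
                kShadeRes * kShadeRes, kScreenRes * kScreenRes,
                total / (kScreenRes * kScreenRes));
}
```

In the Nitrous case the buffers and the shading obviously live on the GPU, but the division of labor is the same, and filtering in texture space is also what tames the shading aliasing.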
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Baker said the hard part for his team was wrapping their heads around how to use the new API, but once they figured out that part, multi-GPU support just worked.

He said they are no longer dependent on GPU driver updates, which removes some of the fragility of working with multiple graphics cards. He believes this will pave the way for broad multi-GPU support across a variety of games, regardless of driver support.
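
To make the "no driver profiles" point concrete, here is roughly what that looks like on the application side. This is a bare sketch of DX12's explicit multi-adapter model that I put together, not Oxide's code; it assumes the standard DXGI/D3D12 headers and linking against d3d12.lib and dxgi.lib:

```cpp
// With explicit multi-adapter, the application enumerates every GPU itself
// and creates an independent device on each one -- vendor doesn't matter,
// and no SLI/Crossfire driver profile is involved. Error handling trimmed.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One ID3D12Device per physical adapter: an AMD card, an Nvidia
        // card, and an Intel iGPU can all end up in this list together.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"Adapter %u: %s\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // From here the engine schedules work across `devices` itself (AFR,
    // split-frame, etc.), copying shared resources between adapters
    // explicitly instead of relying on a driver profile to do it.
}
```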
This is going to be huge if it becomes the norm across the industry. SLI/Crossfire is fundamentally broken right now. I've had two 290s and sold them off simply because the support is laughable. It's not that Crossfire profiles weren't there for the games I played; it's the performance and frame-time inconsistencies. From what I've heard over at /r/Nvidia, SLI isn't doing much better.

Secondly, being able to mix AMD/NV GPUs is going to make us look back in amazement and wonder how we ever tolerated an environment where that wasn't the case. It's such a no-brainer, pro-consumer thing to do.

Lastly, since the responsibility of multi-GPU support is being shifted from NV/AMD to the game devs, that does make a part of me slightly skeptical just how widespread the adoption will be.

I mean, avant-garde game devs like Oxide, who are exceptionally technically proficient, will power ahead. I expect companies like DICE to be on the same level.

That being said, there are plenty of big companies who are known for crappy game code (Bethesda, Ubisoft, need I go on?). How would they implement this? They can barely create games in which single-dGPU performance is good enough.

So a part of me wonders if everything will change. I think in games where the devs know what they are doing, dual-GPU (or more!) support will be like a dream instead of the broken hack it is today. But the distance between the haves and the have-nots could explode, if you'll excuse the Dickensian terminology. The fault, unlike in Dickens's novels, would rest firmly with the have-nots. But still, I'm not sure we'll see a brave new world where multi-GPU support is essentially fixed.

Likely we'll see more polarisation/fragmentation where it either works better than ever or not at all/exceptionally poorly.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
That being said, there are plenty of big companies who are known for crappy game code (Bethesda, Ubisoft, need I go on?). How would they implement this? They can barely create games in which single-dGPU performance is good enough.

So a part of me wonders if everything will change. I think in games where the devs know what they are doing, dual-GPU (or more!) support will be like a dream instead of the broken hack it is today. But the distance between the haves and the have-nots could explode, if you'll excuse the Dickensian terminology. The fault, unlike in Dickens's novels, would rest firmly with the have-nots. But still, I'm not sure we'll see a brave new world where multi-GPU support is essentially fixed.

Likely we'll see more polarisation/fragmentation where it either works better than ever or not at all/exceptionally poorly.

This is my fear as well. Nvidia will have to spin up their SLI-works department to send an army of coders to the devs to do it for them or it won't happen.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
If you read the article carefully, he made it clear that multi-GPU isn't very practical when the GPUs are not of similar performance. That usually means that when you upgrade to something newer and faster, the old card won't be worth pairing with the new one. And since not every game is DX12 or designed to use multiple vendors, it still doesn't make much sense to plan for deliberately. You are still best off using two or more of the same card when going multi-GPU.
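
A quick bit of back-of-the-envelope arithmetic shows why. These numbers are my own, assuming an ideal split-frame scheme that divides work in proportion to each GPU's speed; real sync and copy overhead would only make it worse:

```cpp
// Why mismatched pairs disappoint: under an ideal proportional work split,
// the speedup over the faster card alone is (fast + slow) / fast.
// Hypothetical numbers, not figures from the article.
#include <cstdio>

double PairSpeedup(double fastGpu, double slowGpu) {
    return (fastGpu + slowGpu) / fastGpu; // ideal proportional split
}

int main() {
    std::printf("equal pair:        %.2fx\n", PairSpeedup(1.0, 1.0));  // 2.00x
    std::printf("half-speed buddy:  %.2fx\n", PairSpeedup(1.0, 0.5));  // 1.50x
    std::printf("old card (0.25x):  %.2fx\n", PairSpeedup(1.0, 0.25)); // 1.25x
}
```

So pairing your new card with one half its speed buys you at most 50% more performance, and a card a quarter as fast is barely worth the trouble even before overhead eats into it.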
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
That OSR technique is definitely the most exciting thing I read in that article.

We can finally live the childhood dream of playing Toy Story!

I wish someone like Nvidia or AMD would just freaking pay the licensing fee and give us a Toy Story demo like that Unreal Kite demo. The industry hype of "WE CAN FINALLY DO TOY STORY REALTIME!" would be worth whatever it costs to whoever sponsors it.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
This is my fear as well. Nvidia will have to spin up their SLI-works department to send an army of coders to the devs to do it for them or it won't happen.

I agree. If Nvidia is too lazy to create SLI profiles for games today, even with the benefit of increased revenue from selling more cards, then what incentive does a developer have when they get nearly zero benefit?

A developer may think they'll get more sales since the game runs on a wider range of hardware, and people who invested in a multi-GPU system might be more inclined to buy a game that can utilize their hardware. But that seems like a very niche case and probably not worth the effort. This is sad, since my experience with Crossfire on my 290Xs and 6950s was mostly very positive.

Unless DX12 makes it VERY easy for a developer, I think we'll continue to see fewer and fewer games support it.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
We can finally live the childhood dream of playing Toy Story!

I wish someone like Nvidia or AMD would just freaking pay the licensing fee and give us a Toy Story demo like that Unreal Kite demo. The industry hype of "WE CAN FINALLY DO TOY STORY REALTIME!" would be worth whatever it costs to whoever sponsors it.

I don't care about rendering Toy Story in real time, but I like the idea of not having severe aliasing thanks to this technique. They talked about it as if it's going to be used in future game engines.