
GTX X2 coming?

I moved this thread from CPUs, as it belongs in the video forum. I forgot to leave a comment in your post first.
 
Originally posted by: jaredpace
Wonder if they'll put it on a single PCB or go the sandwich route?

Good question... considering the memory bus size and the size of the die itself, it might be better to go sandwich a la the GX2?
 
Originally posted by: thilan29
Originally posted by: jaredpace
Wonder if they'll put it on a single PCB or go the sandwich route?

Good question... considering the memory bus size and the size of the die itself, it might be better to go sandwich a la the GX2?

Would 55nm open up single-PCB possibilities? nVidia is experienced with the sandwich cards, though. Seems they'd go with what they know. It will be a good card in any case. One thing I love about nV is the decent scaling. Should be interesting...

 
nVidia's got better scaling because they bought up 3Dfx and their SLI technology, and they've had a lot of time to tweak it. ATI had to do their implementation by hand.🙂
 
Originally posted by: thilan29
Originally posted by: jaredpace
Wonder if they'll put it on a single PCB or go the sandwich route?

Good question... considering the memory bus size and the size of the die itself, it might be better to go sandwich a la the GX2?

They're already using a 14-layer PCB due to the routing necessary for the 448-bit memory bus. Even though the GTX 280 is built on the same PCB with a 512-bit memory bus, I doubt there's enough room left over to add the lines necessary for tying a second GPU into power, memory, and the PCIe slot, plus adding an NVIO chip and its traces. My bet is the sandwich style as well.
 
I am surprised it has taken this long for something to compete on the top end. There must be some hold up on the 55nm chips.
 
Originally posted by: FalseChristian
nVidia's got better scaling because they bought up 3Dfx and their SLI technology, and they've had a lot of time to tweak it. ATI had to do their implementation by hand.🙂

Umm, no. 3Dfx only ever used SFR (Split Frame Rendering). nVidia and ATI both use AFR. nVidia tried SFR with their cards years ago and, for whatever reason, decided not to use it. So actually, all nVidia got out of the deal was being able to call two cards pushing one monitor SLI. You're right about the reason nVidia used to be farther ahead in scaling, though. They did start way sooner than ATI, but mostly just because they had more money to sink into the technology, I'm pretty sure. BTW, with a lot of games these days, the ATI cards are scaling as well as the nVidia cards, and sometimes better.
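For readers unfamiliar with the two modes being discussed: in AFR, each GPU renders whole frames in turn; in SFR, every frame is carved into regions and each GPU renders one region. A minimal sketch of the two assignment schemes (function names and the fixed 50/50 band split are illustrative; real drivers balance the split dynamically):

```python
# Illustrative sketch of the two multi-GPU load-balancing schemes
# discussed above. Function names and parameters are hypothetical.

def afr_assign(num_frames, num_gpus=2):
    """Alternate Frame Rendering: whole frames alternate between GPUs."""
    return [frame % num_gpus for frame in range(num_frames)]

def sfr_assign(frame_height, num_gpus=2):
    """Split Frame Rendering: each frame is divided into horizontal
    bands, one band per GPU (here a naive equal split)."""
    band = frame_height // num_gpus
    return [(gpu * band, min((gpu + 1) * band, frame_height))
            for gpu in range(num_gpus)]

# AFR: frames 0, 2, 4 go to GPU 0; frames 1, 3, 5 go to GPU 1.
print(afr_assign(6))     # [0, 1, 0, 1, 0, 1]
# SFR: a 1080-line frame split into two 540-line bands.
print(sfr_assign(1080))  # [(0, 540), (540, 1080)]
```

The practical difference: AFR scales throughput but not latency per frame, while SFR can cut per-frame time but needs the workload split balanced between unequal regions, which is what made it harder to tune.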
 
Originally posted by: myocardia
Umm, no. 3Dfx only ever used SFR (Split Frame Rendering). nVidia and ATI both use AFR. nVidia tried SFR with their cards years ago, and for whatever reason, decided not to use it.

SFR might see a resurgence... Lucid Hydra anyone?
 
Regardless of whether they do it or not, I'm not sure that many people are going to want to pay for it! No doubt, if they managed to make a 280X2, it would be one badass card (heck... a 260X2 would be a nice piece of hardware too). I sure as hell wouldn't want to see the price tag on a 280X2, though! A 260X2 could be fairly reasonable (or as reasonable as you get for top-end hardware).

That being said, at the rate these cards are growing (size-wise), they are going to have to start building parts of them in 5.25" enclosures and connecting them to the main "card" via cables.
 
Originally posted by: Wuzup101
That being said, at the rate these cards are growing (size-wise), they are going to have to start building parts of them in 5.25" enclosures and connecting them to the main "card" via cables.


Wuzup: Hey Zap, cute little computer tower you have there.
Zap: Nah, that's my computer.
Wuzup: Really? Then what is the big tower next to it?
Zap: That's my video card.

😛
 
Originally posted by: Zap
Originally posted by: myocardia
Umm, no. 3Dfx only ever used SFR (Split Frame Rendering). nVidia and ATI both use AFR. nVidia tried SFR with their cards years ago, and for whatever reason, decided not to use it.

SFR might see a resurgence... Lucid Hydra anyone?

That is a lot more advanced than SFR... it doesn't split the frame, it splits the individual calculations.
 