Where Are The Gigabyte GA-N680SLI-DQ6 Motherboards?


AstroDogg

Member
Feb 22, 2007
Originally posted by: Arctucas
Justin,

The cooling may be somewhat better, but with the CrazyCool backplate it seems some modification is necessary for most aftermarket coolers. As to the 3rd PCI slot, I can't really think what I would use it for. And anyway, who needs four Ethernet ports? By the way, I just checked ZZF; the DQ6 is up to $369.99 USD (in stock).

As far as the backplate cooler goes, since the heatsinks are not connected the way they appeared in an earlier photo of the early-release board, the cooler certainly would not move heat to the rear from the northbridge. It looks to me like, with the CrazyCool firmly attached to the board, it could be used as the backplate support, since the only reason for a backplate is to keep the board from warping and breaking its internal connections under the weight of a massive cooler hanging off the front side. Of course, this depends on exactly how the cooler is mounted; I'm guessing flat against the back of the board, but some sort of heat riser makes more sense because of the solder points in this area.
You can stick a PCIe x1 or x4 card in an x16 slot. If you're running SLI with two dual-slot cards, you only have one available PCI slot, one PCIe x1, and one PCIe x16 that's only good for a short x1 card, because a long card in that slot would restrict airflow to the first video card. Good thing it has onboard Gigabit LAN; that takes care of one card (one port would have been sufficient, lol). Now the only thing I can think of using the PCIe x1 slot for is an EIDE controller card, or buying a bunch of SATA-to-IDE converters for the terabyte of IDE drives I have. Isn't it silly that a board with so much drive capacity needs an add-on card to support a few more drives?
I guess Creative is waiting out the PCIe transition; why haven't they made a PCIe card? At least this board keeps a spot for one PCI card. Enthusiast board?? Hmmm. Never met an enthusiast who was happy with onboard sound. Channel separation is vital to online gaming, especially first-person shooters, something not talked about much in the computer audio industry for some reason.
 

Flankerspanker

Junior Member
Feb 22, 2007
A nice preview article and, despite the delays, worth waiting for. I found the information I was looking for on Page 4:

"Although the board is touted as being 'Quad-Core optimized', we did not see it. "

and Page 9:

"While most people would not be too worried about this or might simply raise the multiplier to reach high CPU speeds with the QX6700, we consider it to be a problem when the boards will not reach at least 333FSB. The reason for this is that the upcoming 1333FSB (Quad Pumped) quad core processors will require the 333FSB base. When these hit the market we have to wonder if these boards will perform as advertised."

Since my only interest in the 680i chipset is its advertised quad-core 1333FSB support, the preview gives me exactly the information I was looking for. High praise to the article for including this, especially when eight pages of forum discussion were unable to make it so clear! I'm surprised more people weren't asking about it.
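
(For anyone who wants the back-of-the-envelope math behind that 333FSB figure, here's a quick sketch of my own; the article only gives the numbers, so the little helper below is just my illustration, not anything from the preview.)

```python
# The "FSB" ratings thrown around here are quad-pumped: four transfers
# per clock, so the marketing number is 4x the actual base clock.
def effective_fsb(base_mhz: float) -> float:
    """Effective (quad-pumped) FSB in MT/s for a given base clock in MHz."""
    return base_mhz * 4

print(effective_fsb(333))  # 1332 -> roughly the 1333FSB the new quad cores need
print(effective_fsb(266))  # 1064 -> roughly the QX6700's 1066FSB rating
```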

The only part I found a bit odd was:

"Gigabyte engineering is working to address this shortcoming, although if it is being caused by the chipset itself they may not have much success."

...if we know the Striker is already passing the mark with the same chipset, then why express such a suspicion that it's the chipset?

Inflated prices are happily not a showstopper for me, but this seemingly innocuous comment is keeping me from immediately choosing the Striker (it's as if the author doesn't even trust the result he got with it himself?!), or any other 680i board, at this time... I'll wait a bit longer and see how this quad-core 1333FSB business sorts out.

Thanks to all for the help and advice!
 

AstroDogg

Member
Feb 22, 2007
OK, slow down, Flankerspanker, and look at the specs of the chip Gary was forced to use here first. I think you're under the impression that this is a 1333FSB chip, when in fact it is a 1066FSB chip:
http://processorfinder.intel.com/details.aspx?sspec=sl9ul
Correct me if I'm wrong, Gary, but it is also an engineering sample, which means it has a good chance of using silicon that was only available for the chips in full production at that time. With chip advancements come refinements in the processing of the silicon; by production time the silicon will be purer and could be of better quality. That is the very short and simple version, without going into the advances and production refinements of the circuitry that lies inside and underneath the chip.
Even though the Striker has gotten over the 333 "mark", it did so with a quad-core 1066FSB CPU, not a 1333FSB one, and 410 still isn't impressive, not with an unlocked CPU. It still is not as high as the Intel chipsets reach with the same processor. We were expecting these boards to go over 2000FSB out of the box with some good cooling solutions; 410 is still a long way from 500. The P965 and 975X from Intel are making this chipset look, well, rather poor. You would think all this money, time, and effort on NVIDIA's part would pay off in something that could at least marginally beat out a chipset that is over a year and a half old. Ya know!
I don't expect the quad-core to overclock as high as a dual-core, but the quad-core should at least be at the 480 mark, and I'd like to see the dual-core hitting 525-550 air-cooled.
Gary is simply stating a fact: if the same processor can easily make that speed on the Intel chipsets while the 680i boards are having a hard time hitting 1333 with a quad-core, then there is doubt about the performance of the chipset.
This may be more a remark of speculation, or just plain letdown. I'm sure that all the boards with the 680i chipset are going to run the 1333 processors when they come out; how much they will overclock on the 680i chipset is the major concern. If they only clock to 410 then the chipset is a waste of money, but if the 1333 quad-core can easily get over the 500MHz mark then I'd be satisfied. The QX6700 being tested here is no 1333 quad-core; don't assume that it is the same as, or even on par with, one.
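To put rough numbers on why 410 doesn't impress me (my own quick math, nothing Gary published):

```python
# Headroom over the stock base clock, as a percentage. My own illustration,
# using the numbers from this thread: 333MHz stock base for a 1333FSB part,
# boards topping out around 410MHz, and 500MHz as the enthusiast target.
def headroom_pct(reached_mhz: float, stock_mhz: float) -> float:
    return (reached_mhz / stock_mhz - 1) * 100

print(round(headroom_pct(410, 333)))  # 23 -> the Striker's current ceiling
print(round(headroom_pct(500, 333)))  # 50 -> where I'd want these boards to be
```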
Does this make sense and shed some light on your concerns?
 

TheBeagle

Senior member
Apr 5, 2005
Well, Mr. Astro Man, it is indeed unfortunate that you took offense at my general comments to Mr. Murder Man. We all have our opinions about these matters; and obviously, as we can all observe, you have quite a few opinions.
 

Flankerspanker

Junior Member
Feb 22, 2007
Originally posted by: AstroDogg
I'm sure that all the boards with the 680i chipset are going to run the 1333 processors when they come out.

Pardon me - you're sure? Or the author of the preview is sure, and I've just misunderstood/misread him?

Because from the comment:

"we consider it to be a problem when the boards will not reach at least 333FSB. The reason for this is that the upcoming 1333FSB (Quad Pumped) quad core processors will require the 333FSB base. When these hit the market we have to wonder if these boards will perform as advertised"

...it appears that the people who have actually had a chance to test this board sound a good deal less confident than you do about this precise issue, regardless of the processor they were testing.

Have I misunderstood something?
 

AstroDogg

Member
Feb 22, 2007
No, of course I'm not absolutely, positively sure. But if they don't, NVIDIA and the other manufacturers are going to lose a great deal of money in recalls, and possibly lawsuits, if their products don't do what they say they will do. Like I stated before, if it doesn't support it, then return it to the manufacturer for a replacement. I'm sure enough that I'd bet my current game box on it.
 

justinburton

Member
Feb 5, 2007
I definitely need the extra PCI slot. I have dual 8800GTX cards, watercooled, one physics card, and one sound card. I will be adding a 7900GTX in the middle PCIe slot once I get the 680i-DQ6. My other 8800GTX is just sitting in my closet waiting to be installed, along with my 7900GTX as a physics card. NVIDIA will announce the new physics slot very soon, and I hope the 680i board can run the new physics.
 

AstroDogg

Member
Feb 22, 2007
Originally posted by: justinburton
I definitely need the extra PCI slot. I have dual 8800GTX cards, watercooled, one physics card, and one sound card. I will be adding a 7900GTX in the middle PCIe slot once I get the 680i-DQ6. My other 8800GTX is just sitting in my closet waiting to be installed, along with my 7900GTX as a physics card. NVIDIA will announce the new physics slot very soon, and I hope the 680i board can run the new physics.

Justin, I don't understand; isn't your physics card PCIe x16? How are you going to put four x16 cards on a board that only has three x16 slots? Maybe you should buy my Quad Royal board (lol) that I'm planning on giving to my mom, or just wait for the MSI P6N Diamond, which also has four PCIe x16 slots. With watercooled GPUs, four x16 slots spaced every other slot apart would work perfectly. It also advertises Quad SLI support.

I myself have two eVGA SuperClocked 7900GTX cards and one PCIe x1 IDE/SATA RAID controller card. I'm still running an Audigy 2 ZS because Creative never came out with a PCIe X-Fi card. It runs my 7.1 system just fine; I wish the channel separation were a little better, but it's 50x better than the first Audigy card, so I have lived with it. Still better than any onboard audio I've heard. I'm sure going to miss my second monitor if I can't put a long card in the middle x16 slot. Maybe I'll be forced to switch to watercooling after a short time. I started on one; I bought all the waterblocks, but I still need the pump, reservoir, and radiator, and I would need two extra GPU blocks now.
 

AstroDogg

Member
Feb 22, 2007
OK, never mind; I must have had a momentary brain cloud. Correct me if I'm wrong: you're going to use the 7900 as the physics card, run the two 8800s in SLI, and the only other card will be the PCI sound card.
I was under the impression that you had (or were buying) a dedicated physics card and were going to use the 7900 for a second monitor, with the two 8800s in SLI.

After a further look at the MSI P6N Diamond, it looks like this setup won't be possible on it either with SLI at 2 x16.

The final feature that stood out as relatively unique were the four PCI-Express x16 slots - this has been done already by Gigabyte, so it's not truly unique, but it's an interesting feature nonetheless. The PCI-Express lanes are configurable between x8-x8-x16-x8 and x16-x8-x16 using a digital switch similar to the one used on the P4N Diamond motherboard.
http://www.bit-tech.net/news/2007/01/09/msi_p6n_diamond_will_have_on-board_x-fi/

According to this, there just aren't enough PCIe lanes available to support SLI at 2 x16 plus a second video card at x8 and a physics card at x8, like on the Gigabyte Quad Royal. So it makes little sense to me to put four x16 slots onboard and call it an enthusiast board if it can only run one card at x16 and three at x8, or two at x16 and one at x8, putting us right back to a three-slot x16 board and rendering the fourth x16 slot useless. Unless it will still run an x1 card, but I doubt it (further updates on that board may or may not confirm this).
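
Adding up the lanes from that bit-tech quote makes the problem obvious (a quick sketch of my own; the slot modes are the two from the quote above, and I'm assuming only one mode can be active at a time):

```python
# Lane totals for the P6N Diamond's two advertised slot configurations.
configs = {
    "x8-x8-x16-x8": [8, 8, 16, 8],
    "x16-x8-x16":   [16, 8, 16],
}
for name, lanes in configs.items():
    print(f"{name}: {sum(lanes)} lanes over {len(lanes)} slots")  # 40 either way

# What Justin's setup would need: SLI at 2 x16, a second video card at x8,
# and a physics card at x8 -- more lanes than either mode provides.
print("needed:", 16 + 16 + 8 + 8)  # 48
```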
 

AstroDogg

Member
Feb 22, 2007
Originally posted by: Arctucas
I have followed this thread from the beginning, eagerly anticipating Gary's review. It appears that Gigabyte's 680i offering is not much different from all the others (except for the backup BIOS and four Ethernet ports). I am not particularly impressed with this board, so I bought a Striker ($338.55 USD shipped).

Arctucas: Can you please tell us where you found the Striker at this price?
 

justinburton

Member
Feb 5, 2007
AstroDogg,

Here is my plan: two 8800GTXs in SLI at x16 each, with one 7900GTX between them running at x8 for physics/an extra monitor. Also one sound card in a PCI slot and one Asus PhysX card in the other PCI slot. So I will have two video cards, two physics cards, and one sound card. The reason for this is that the PhysX card only works with certain games, and the new "Havok" physics that NVIDIA is developing will be used in different games. All of this physics talk is not too related to this thread, but you can research the "physics wars" on your own. The war is between PhysX, NVIDIA Havok, and ATI Triple-Play physics. Which will win, or will they all agree on a single solution?

Basically all of my slots will be filled except one PCI slot.
 

AstroDogg

Member
Feb 22, 2007
@ justinburton

OK, thanks for the heads-up; I didn't realize a card could run physics and still use its secondary output for an additional monitor.
 

justinburton

Member
Feb 5, 2007
Np. Actually, I am not totally sure about the third card being used for physics yet, but NVIDIA will announce the reason for the third slot very soon. I am positive the third card can be used for another monitor and for DualView mode. With NVIDIA's present drivers, you cannot run a second monitor while running SLI, unless you disable SLI, restart your computer, and then switch back to SLI when you play games. Lame. You would think with the power of two video cards you could run two monitors.
 

AstroDogg

Member
Feb 22, 2007
Originally posted by: justinburton
you cannot run a second monitor while running SLI, unless you disable SLI, restart your computer, and then switch back to SLI when you play games. Lame. You would think with the power of two video cards you could run two monitors.
OK, it gets kind of confusing for both the speaker and the listener when it comes to talking about multi-card setups. SLI takes two cards, with a total of four monitor outputs, and renders them down to one usable output, so it is easy to call both of those cards your first video card and call a third card your second video card. So I'm taking what you wrote to mean you cannot invoke the third card as a second desktop display, to watch apps like Motherboard Monitor, a running clock/alarm, or Ventrilo (to see who is logging in or out) while playing your 3D game on the two SLI cards connected to your first display, the way it can be done on a single video card with two outputs. With $1200 to $1600 worth of video cards, I'd be a little more than pissed off not to be able to monitor some of my background apps on my second desktop display while playing my favorite game. Damn, I can do that on my $26-shipped 9800SE 256MB 8-pipeline card with an Arctic Ice cooler from eBay. Or do you mean you cannot do this on one of the other outputs of the two SLI cards?
 

justinburton

Member
Feb 5, 2007
You are correct. Basically, my 7900GTX will not be used to its full ability: only x8 PCIe speed, running one 19-inch monitor, and handling future NVIDIA physics. I might as well have gotten your $26 9800SE. But I actually got my dual watercooled 8800GTXs and the 7900GTX for about $400.
 

TheBeagle

Senior member
Apr 5, 2005
Hello Gary. Thanks VERY much for the new BIOS update. I likewise expect this new board to arrive today (the 28th), and will certainly put the new BIOS to good use. BTW, can you tell us what types of changes/improvements the final F3 BIOS has implemented? Again, thanks for all your efforts on this new board. TheBeagle :)
 

justinburton

Member
Feb 5, 2007
Gary, ur the best. We all owe you a beer. If you ever come to San Jose, CA let me know. If everyone on this forum got Gary a beer, that would give him a lifetime supply of beer.
 

TheBeagle

Senior member
Apr 5, 2005
Hello Murder Man. As per my previous posting in response to yours (2/24/07), the new GA-N680SLI-DQ6 board works GREAT with the Zalman 9700. However, you just need to cut the four corners off the plastic Zalman bottom mounting plate (include enough of the plastic plate to securely hold the bottom mounting nuts), and then utilize the provided screws to securely attach the upper mounting frame to the motherboard. If you just take a little extra time, you can shave down one of those corners, including the retaining nut, and it will fit very nicely into the open mounting hole in the underside of the Gigabyte Crazy-Cool heatsink. It all works like a charm. Hope that provides some assistance to you. Enjoy. TheBeagle :)
 

Murderlove

Junior Member
Feb 24, 2007
Hello Beagle,

You indeed kept your word; thanks for that. I'm thrilled that the 9700 fits this mobo. Makes it even harder to resist :) I'll make sure to follow your info on mounting the CPU cooler. But I think I'll wait before buying it. Something tells me to wait, and that's not meant in a negative way towards this mobo; on the contrary. But for now, my Abit AW9D-Max provides everything I need.