
Nvidia GTX 460 SLI vs. Intel P67 Express Chipset

Gogger

Junior Member
Aug 31, 2011
3
0
0
desktoppers.door44.com
I recently ordered a Dell Alienware (my second in two years) and went as all out as I could for 3D rendering. The machine gets here and renders like a dog! The same scene, same software as on my old R1 Alienware (Win7, 12GB RAM, 2 TB HD, 875W PSU, Nvidia GTX 295 dual-GPU on one card) rendered in 22 minutes. The new R3 Alienware (Win 7 Pro, 16 GB RAM, 2 TB HD, 875W PSU, dual 1GB GTX 460s in SLI) rendered in 2 hours and 23 minutes. YUK!

I just spent (not exaggerating) EIGHT hours straight on the phone with Dell tech support and various agents going round in circles. In the end it turns out that the Intel P67 Express chipset on the motherboard runs the PCIe 2.0 slots at x16 for one card, but cuts the PCIe 2.0 lanes to x8 each when two cards are installed, effectively making it perform as if you had one card anyway. I FINALLY got them to understand and acknowledge this (they tried to blame e-on software's VUE 9.5 Infinite 3D program). I had swapped my GTX 295 into the new rig and it took 2 hours to render. I tried just one GTX 460 and it was still poor. I put the GTX 460 in my old Alienware and it rendered in 23 minutes, so clearly it was not a bad-card issue.

So in the end, and why I am writing: does anyone know anything about this? Is it correct to state that the P67 chipset hobbles a dual video card setup to the equivalent of a single video card via bus management? And WHY did the single-card GTX 295 (with dual GPUs) fail so miserably on the new R3 Alienware then? And yes, we did the driver dance: I uninstalled VUE and reinstalled it all, as well as the video drivers, to no avail. We ran a dozen different benchmarks. Eventually some benchmarks turned out pretty decent, but ultimately, for my $2600 machine, I should have gotten better performance. I call tomorrow for a return and *possibly* an exchange for a machine that will actually work as advertised. Don't get me wrong, I LOVE my Alienware!!!! I just got a configuration nightmare from the web site configurator.

Anyone concur or disagree with the P67 vs. dual video card conflict? Any light shed on this would be appreciated.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
I don't know what this bug might be, but the difference between x16+x16 vs. x8+x8 is less than 1%.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
effectively making it perform as if you had one card anyway. I FINALLY got them to understand and acknowledge this (they tried to blame e-on software's VUE 9.5 Infinite 3D program).
This is not true. The performance hit between PCIe 2.0 x16 and x8 is about 1-2%, while between x16 and x4 it is about 7-8% on a GTX 480!

I had swapped my GTX295 into the new rig and it took 2hrs to render. I tried just one GTX460 and it still was poor. I put the GTX460 in my old Alienware and rendered in 23 minutes so clearly there was not a bad card issue. And WHY did the single card GTX295 (with dual GPUs) fail so miserably on the new R3 Alienware then?
Clearly! That likely means you have other problems:

1) Did you install the proper Intel chipset drivers?
2) Does your motherboard have the latest BIOS installed?
3) Does your CPU properly scale to its intended clock speed under load? (Use CPU-Z to check.)
4) Are you sure none of your components are overheating / thermal throttling? Check CPU/GPU temps with HWMonitor.
5) Do you have CUDA enabled/installed? (I am not sure if this helps for rendering with the software you use, but you can check online.)
6) What CPU is in the rig that renders in 23 minutes vs. the one that does it in 2 hours (the new one)?

Did you check all the settings for multi-core / multi-threading are enabled in this rendering software you use?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
746
126
Ok I did a little bit of searching for you:

http://forum.cgpersia.com/f27/vue-9-5-multicore-processing-36219/

1. It seems this VUE software doesn't fully utilize all threads/cores in a processor (i.e., very inefficient software). You can check this with Windows Task Manager, All CPU Meter, or Core Temp.

2. Check your VUE settings.

It seems this software is NOT suitable for rendering movies, not even on a 2600K CPU. It's simply too intensive for consumer hardware on anything beyond 30-second to 1-minute clips.

From the link above, by one poster with a 2600K and 24 GB of RAM:
"I consider "Final" to be the only choice, most of the renders you see on Quadspinner blog are rendered on final only too, with render time between 1 and 6 hours for 1920x1080 renders"
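One rough way to check the "doesn't utilize all cores" point yourself is to compare how a CPU-bound stand-in task scales with worker count, then watch whether the real renderer shows a similar pattern in Task Manager. This is a minimal, hypothetical Python sketch (not specific to VUE; `burn` is just a placeholder for one render tile):

```python
import multiprocessing as mp
import time

def burn(n: int) -> int:
    # CPU-bound stand-in for one tile of a render job.
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed_run(workers: int, jobs: int = 8, size: int = 200_000) -> float:
    """Seconds to finish `jobs` identical CPU-bound tasks with `workers` processes."""
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        pool.map(burn, [size] * jobs)
    return time.perf_counter() - start

if __name__ == "__main__":
    serial = timed_run(1)
    parallel = timed_run(mp.cpu_count())
    print(f"1 worker: {serial:.2f}s  {mp.cpu_count()} workers: {parallel:.2f}s")
```

If a well-parallelized dummy job like this speeds up several-fold on your CPU but the renderer's core graphs stay mostly idle during a render, the software is leaving cores unused and the GPUs are not the bottleneck.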

What about any other rendering software that you use? Did you see any performance improvement?
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,266
70
91
In the end it turns out that the Intel P67 Express chipset on the motherboard runs the PCIe 2.0 slots at x16 for one card, but cuts the PCIe 2.0 lanes to x8 when two cards are installed, effectively making it perform as if you had one card anyway.
 

Gogger

Junior Member
Aug 31, 2011
3
0
0
desktoppers.door44.com
RussianSensation, thanks for replying and checking on things. I spent over 8 hours straight (not exaggerating) on the phone with Alienware tech support and they ran gazillions of tests and various benchmarks. In the end the bottom line was that I wasn't getting better performance than my 16 month old Alienware, so I sent the new one back. I did order a new computer with a single 1.5 GB Nvidia GTX 580. I kept getting conflicting information as to whether the Intel chipset was a problem with two video cards in SLI. I went with what I could observe and acted accordingly as time was running out for the return.

I simply didn't have time left to do hours and hours of troubleshooting, and felt I shouldn't have to with a new, high end system right out of the box. Normally I don't mind tinkering with the guts of my PCs but I simply did not have the luxury of time. (I would have thought Dell might offer an extended return period whilst figuring out what was going on. Seems they'd just rather it be returned.) Just wanted to say THANKS for your efforts!
 

BladeVenom

Lifer
Jun 2, 2005
13,540
16
0
If you're doing 3d rendering, wouldn't you be better off with one of their Quadro workstation cards, instead of a gaming card?
 

Gogger

Junior Member
Aug 31, 2011
3
0
0
desktoppers.door44.com
Answer: I'm not a fan of ATI video cards. Been there done that.

New: I got my new Alienware R3 with a 1.5GB GTX 580 and performance-wise it is everything I wanted but didn't get with the one that started this thread. The GTX 460s would roar to life if I even just moved the mouse cursor (it seemed), but the GTX 580 is whisper quiet even under heavy rendering. With my old Alienware and the new one working the render farm together, an image that took 65 minutes to render on the old one alone now takes just 20 minutes. AWESOME! I couldn't be happier with my 580!!!

:: end ::
 

blackened23

Diamond Member
Jul 26, 2011
8,548
1
0
To answer your question, the P67 and Z68 chipsets do not support x16/x16 SLI/CrossFire. I'm surprised that the tech support drones you spoke to weren't aware of this. The maximum supported is dual x8; only the X58 chipset supports dual x16.

There's an exception: some motherboard makers charge a premium for the Nvidia NF200 chip, which allows dual x16 on any chipset, but it's really not worth it. Dual x8 carries only a 1-2% penalty compared to dual x16.
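The reason the dual-x8 penalty is so small is that even eight PCIe 2.0 lanes still offer a lot of bandwidth. A quick back-of-the-envelope calculation, using the PCIe 2.0 figures of 5 GT/s per lane with 8b/10b line encoding (80% efficiency), sketches it out:

```python
# Approximate one-way PCIe 2.0 bandwidth.
GT_PER_LANE = 5.0           # gigatransfers per second, per lane
ENCODING_EFFICIENCY = 0.8   # 8b/10b encoding: 8 data bits per 10 line bits

def pcie2_bandwidth_gbs(lanes: int) -> float:
    """Usable one-way bandwidth in GB/s for a PCIe 2.0 link."""
    return GT_PER_LANE * ENCODING_EFFICIENCY * lanes / 8  # 8 bits per byte

print(pcie2_bandwidth_gbs(16))  # x16 -> 8.0 GB/s
print(pcie2_bandwidth_gbs(8))   # x8  -> 4.0 GB/s
```

4 GB/s per card is still far more than most rendering workloads push over the bus once textures and scene data are resident in video memory, which is why benchmarks show dual x8 within a percent or two of dual x16.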

The reason for your performance penalty is confusing; it shouldn't be because of the dual x8 configuration. Multiple benchmark sites have shown the performance impact to be nearly nonexistent. Do keep us updated if you figure it out. Sounds like the issue may lie elsewhere.

edit: just noticed your most recent reply...oops.
 
