R600 Installed

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
Sticking by the 300W numbers. Once confirmed, I think we can all agree... that's scary.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
:roll: We've been over this before. It's impossible for the R600 to pull 300 watts.

PCIe slot - Provides 75 watts maximum
6 pin PCIe power connector - Provides 75 watts maximum
8 pin PCIe 2.0 power connector - Provides 150 watts maximum

That's a total of 300 watts maximum. No computer component has ever been or will ever be designed to draw the absolute limit of power that can be supplied to it.
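For anyone following along, the budget being added up there is just the three per-source ceilings summed (a quick sketch; the 75/75/150 figures are the spec limits cited above):

```python
# Maximum deliverable power (watts) per source, per the PCIe limits above
PCIE_SLOT = 75   # PCIe x16 slot
SIX_PIN   = 75   # 6-pin PCIe power connector
EIGHT_PIN = 150  # 8-pin PCIe 2.0 power connector

total_budget = PCIE_SLOT + SIX_PIN + EIGHT_PIN
print(total_budget)  # 300
```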
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SexyK
Sticking by the 300W numbers. Once confirmed, I think we can all agree... that's scary.

Imagine Quad-Crossfire :shocked:
 

Modular

Diamond Member
Jul 1, 2005
5,027
67
91
Originally posted by: Creig
:roll: We've been over this before. It's impossible for the R600 to pull 300 watts.

PCIe slot - Provides 75 watts maximum
6 pin PCIe power connector - Provides 75 watts maximum
8 pin PCIe 2.0 power connector - Provides 150 watts maximum

That's a total of 300 watts maximum. No computer component has ever been or will ever be designed to draw the absolute limit of power that can be supplied to it.

8-pin x2?

Or does that break the 2.0 power specifications?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
No, that would work. It's possible, but every picture we've seen so far has shown either 2 6-pin connectors or 1 6-pin and 1 8-pin.

In one of the threads here, there was a quote saying that the R600 would work with 2 6-pin connectors, the same as the 8800GTX. But doing so would disable the factory overclocking option in the driver.

Whether or not this is true is anybody's guess.
 

XMan

Lifer
Oct 9, 1999
12,513
49
91
Sheesh.

I don't care how fast it is, 14 inches is waaaaay too long. I don't think I could fit that in my case without removing my hard drive bays.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: XMan
Sheesh.

I don't care how fast it is, 14 inches is waaaaay too long. I don't think I could fit that in my case without removing my hard drive bays.

This has been discussed multiple times. You are looking at the OEM version. The retail version will be much shorter.
 

KeithTalent

Elite Member | Administrator | No Lifer
Administrator
Nov 30, 2005
50,231
118
116
Length & wattage are completely irrelevant to me. Show me benchmarks and give me a price, as that is all I care about.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
No SLI-like connectors?

Well, at least it doesn't have a dongle...

Two GTS 640 MB's would be perfect if nVidia knew how to get their own SLI working properly.

So far, 2007 has been nothing but troubleshooting and waiting, it seems.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
I'm deeply concerned by the amount of emphasis being placed on R600's number crunching ability rather than its pixel shading capability.

Exactly why do we enthusiasts care that R600 crossfire can push a teraflop?
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Now that it's up top, I don't mind bumping to say: this thread is worthless w/o benchmarks ::::Delete:::

 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: KeithTalent
Length & wattage are completely irrelevant to me. Show me benchmarks and give me a price, as that is all I care about.

QFT. Length and wattage are on the priority list but occupy a place much further down the list than performance and price.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: josh6079
No SLI-like connectors?

Well, at least it doesn't have a dongle...

Two GTS 640 MB's would be perfect if nVidia new how to get their own SLI working properly.

So far 2007 has been nothing but troubleshooting and waiting it seems.

The 2 cards used for "Stream Computing" do not use Crossfire. Two 8800GTS 640MB's are quite nice for the most part now; just a few application issues with SLI are interfering with my happiness.

@Matt2, over at XS they drool over Folding@Home etc so some enthusiasts do care.

@Creig the GPGPU based on the XTX can have up to 4GB of VRAM so I can see that hitting ~270W possibly.
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
While we're talking about old news, I found these specs of the R600...

65nm
64 Shader pipelines (Vec4+Scalar)
32 TMU's
32 ROPs
128 Shader Operations per Cycle
800MHz Core
102.4 billion shader ops/sec
512GFLOPs for the shaders
2 Billion triangles/sec
25.6 Gpixels/Gtexels/sec
256-bit 512MB 1.8GHz GDDR4 Memory
57.6 GB/sec Bandwidth (at 1.8GHz)
WGF2.0 Unified Shader
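Those derived throughput figures do at least hang together with the listed clocks. A quick sanity check (the 10 FLOPs/cycle per pipe is my assumption: 5 ALUs per Vec4+Scalar pipe, each counted as a 2-FLOP MADD):

```python
core_hz = 800e6    # 800MHz core clock
mem_hz  = 1.8e9    # 1.8GHz effective GDDR4 data rate
bus_bits = 256     # memory bus width

# 128 shader ops/cycle * 800MHz = 102.4 billion shader ops/sec
shader_ops_per_sec = 128 * core_hz

# 64 pipes * 5 ALUs * 2 FLOPs (MADD) * 800MHz = 512 GFLOPs
gflops = 64 * 5 * 2 * core_hz / 1e9

# 32 ROPs * 800MHz = 25.6 Gpixels/sec
gpixels = 32 * core_hz / 1e9

# 256-bit bus / 8 bits-per-byte * 1.8GHz = 57.6 GB/sec
bandwidth_gb = bus_bits / 8 * mem_hz / 1e9
```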
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Creig
:roll: We've been over this before. It's impossible for the R600 to pull 300 watts.

PCIe slot - Provides 75 watts maximum
6 pin PCIe power connector - Provides 75 watts maximum
8 pin PCIe 2.0 power connector - Provides 150 watts maximum

That's a total of 300 watts maximum. No computer component has ever been or will ever be designed to draw the absolute limit of power that can be supplied to it.

Actually, you just confirmed that it "is" technically possible for an R600 to pull 300W. You added it all up for us and everything. Not saying it does pull 300, but it could.

And I bolded all the presumptuous remarks. You cannot know what "will ever" come out in the future. Why do you think power bricks are sometimes considered? Because manufacturers fear that the product will pull more power than the system can provide.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
If it uses 300W, doesn't that bode well for its performance (more than likely anyway)?

AMD recently stated that it will only use 200W. This news doesn't make sense.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: shabby
While we're talking about old news, I found these specs of the R600...

65nm
64 Shader pipelines (Vec4+Scalar)
32 TMU's
32 ROPs
128 Shader Operations per Cycle
800MHz Core
102.4 billion shader ops/sec
512GFLOPs for the shaders
2 Billion triangles/sec
25.6 Gpixels/Gtexels/sec
256-bit 512MB 1.8GHz GDDR4 Memory
57.6 GB/sec Bandwidth (at 1.8GHz)
WGF2.0 Unified Shader

I doubt it will be 65nm... AMD hasn't even got processors on 65nm yet, AFAIK. I would think they would concentrate on getting those out the door first before putting a video card on a 65nm process. 80nm is more likely.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: keysplayr2003
Originally posted by: Creig
:roll: We've been over this before. It's impossible for the R600 to pull 300 watts.

PCIe slot - Provides 75 watts maximum
6 pin PCIe power connector - Provides 75 watts maximum
8 pin PCIe 2.0 power connector - Provides 150 watts maximum

That's a total of 300 watts maximum. No computer component has ever been or will ever be designed to draw the absolute limit of power that can be supplied to it.

Actually, you just confirmed that it "is" technically possible for an R600 to pull 300W. You added it all up for us and everything. Not saying it does pull 300, but it could.

It's a good thing you're not an engineer with thinking like that. There are variances between cards (some draw slightly more, some slightly less), variances in heat that can cause increased/decreased resistance, etc.

For the same reason nobody runs a system with a PSU that puts out only enough to run the components in the case, no company will design a component that pulls every last watt available to it. Every item in your computer has a certain "safety margin" built into it. Why do you think it is that video card companies put out a high "recommended" PSU requirement, yet the actual recorded draw during private testing is always lower?

Originally posted by: keysplayr2003
And I bolded all the presumptuous remarks. You cannot know what "will ever" come out in the future. Why do you think power bricks are sometimes considered? Because they are fearful that the product will pull more power than the system can provide.

I do know for a fact that nobody will ever design a component to draw max power available. That's just asking for instabilities. It's the same reason power bricks used to be considered. When video card draw was outstripping available PSU power they didn't want to run the risk of system instability. So adding a power brick would alleviate that problem. And I guarantee that the amount of juice that would have been supplied by the system+brick would not be exactly equal to the system draw. They would have given a healthy amount of free overhead.
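To put rough numbers on that headroom argument (purely illustrative; the 80% derating factor is a generic engineering rule of thumb, not an ATI/AMD figure):

```python
# Illustrative only - design to a fraction of the rated maximum,
# never to the ceiling itself.
available = 75 + 75 + 150      # watts: slot + 6-pin + 8-pin ceilings
derating = 0.80                # hypothetical safety-margin factor
design_ceiling = available * derating
print(design_ceiling)          # 240.0
```

So even with the full 300W theoretically deliverable, a sane design would target something comfortably below it.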