"2560/1920 is 25%? It's 33.3%, or one GPC, or 640 SPs."
1920 has 25% fewer SPs than 2560; equivalently, 2560 has 33% more SPs than 1920.
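Both figures describe the same 640-SP gap, just measured from different bases; a quick check in Python (numbers straight from the posts above):

```python
# Same gap, two bases: 2560 vs. 1920 shader processors (SPs).
big, small = 2560, 1920
print(f"{(big - small) / big:.1%} fewer SPs going from 2560 down to 1920")   # 25.0%
print(f"{(big - small) / small:.1%} more SPs going from 1920 up to 2560")    # 33.3%
```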
That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling.
Sure, more hot air is produced, but again, it's not going to make any real difference to the ambient temperature of the room. 350 to 400 W is nothing, especially with effective cooling. You need a 1.5 kW heater to heat a 5x5 room properly in the winter, and that's a device designed specifically to heat.
Wattage is only important when looking at the electric bill. 100 W isn't even a big difference, especially for a GPU that won't be at 99-100% utilization most of the time.
So a difference of 100 W is insignificant in every single way, especially when we are talking between, say, 300 W and 400 W. If it's 100 W vs. 200 W, you do have to take into account the PSU, the cables, whether it has 6-pin or 8-pin connectors, and so on.
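For scale: essentially all of a GPU's board power ends up as heat in the room, and 1 W is about 3.412 BTU/h. A quick table using the wattages thrown around in this thread:

```python
# How much heat a GPU dumps into the room: essentially all board power
# becomes heat. 1 W = 3.412 BTU/h (standard conversion).
WATT_TO_BTU_PER_HR = 3.412

for watts in (100, 300, 400, 1000, 1500):
    print(f"{watts:>5} W -> {watts * WATT_TO_BTU_PER_HR:>6.0f} BTU/h")
# 100 W is ~341 BTU/h; the 1.5 kW heater mentioned above is ~5118 BTU/h.
```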
Are the HBM2 yields that bad that they are forced to pump 1.35V into them, and not the 1.2V that Hynix claims? https://www.skhynix.com/eng/product/dramHBM.jsp
This was found by http://imgur.com/a/ImPz1
Everyone just ignoring this? More and more leads point to Vega FE having some major features disabled for now.

SK Hynix claims 1.2V for 4-Hi stacks only. Vega FE uses 8-Hi stacks, and more stacks/layers obviously need more voltage.

It's because of AVFS (Adaptive Voltage & Frequency Scaling, new with Polaris) and ACG (Advanced Clock Gating, new with Vega). ACG is completely disabled as of now, and AVFS isn't working properly.
Coming from a guy who works for AMD.
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11421717#post11421717
Translation:
https://translate.google.com/translate?hl=de&sl=de&tl=en&u=https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11421717#post11421717
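On the 1.35V-vs-1.2V point: if that reading is accurate, the extra memory power is easy to ballpark, since dynamic (switching) power scales roughly with voltage squared. A rough sketch; the 20 W baseline is my own illustrative guess, not a measured Vega figure:

```python
# Ballpark of the extra HBM2 power from 1.35V vs. the 1.2V spec,
# using the usual P ~ f * V^2 scaling for dynamic (switching) power.
v_spec, v_seen = 1.20, 1.35
scale = (v_seen / v_spec) ** 2
print(f"dynamic power scales by ~{scale:.2f}x")      # ~1.27x

base_hbm_watts = 20.0   # illustrative guess for two stacks at 1.2V, NOT a measured figure
print(f"~{base_hbm_watts * scale:.1f} W at 1.35V")   # ~25.3 W
```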
The alleged leaks for RX had the same memory clock speeds, so AMD may be cranking the voltage just to ensure all of the launch parts are stable. They overvolted Polaris at launch so we may see the same here. I think we'll see a lot of people being able to undervolt and get performance improvements once again.
"That is just silly, hyperbole extravaganza spam. You can have a GPU that consumes 1000 W and stays cool if you have beefy enough cooling. […] Wattage is only important when looking at the electric bill."
Sorry, you have NO clue what you're talking about. 1000 W will keep a room cool with beefy cooling? LOL? Where does that 1000 W go, then? It doesn't matter how cool you keep your GPU (other than slightly lower power draw the cooler it runs): whether it's using 200 W, 400 W, or 1000 W, that energy converts to heat, and just about every bit of that heat goes into the room. Period. Sure, 50-100 W isn't much of a difference, but every bit of it still goes toward heating up the room. The only "insignificance", if you can call it that, is to the power bill.
25% less or 33% more, depending on which card you use as base.
"Sure, more hot air is produced, but again, it's not going to make any real difference to the ambient temperature of the room. […]"
I beg to differ. In a 10x12 room, after an hour or two, the room would be 4-5+ degrees (Fahrenheit) warmer with my old overclocked GTX 460 and Core 2 Duo running full tilt for that duration.
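For what it's worth, a crude closed-box estimate backs that up. All assumptions here are mine, and a sealed, perfectly insulated room is an upper bound; real rooms leak heat and level off far lower:

```python
# Upper-bound estimate: ~400 W of PC heat into a 10x12x8 ft room,
# assuming a sealed, perfectly insulated box (real rooms leak heat,
# so actual rises are far smaller and level off).
FT3_TO_M3 = 0.0283168
air_volume_m3 = 10 * 12 * 8 * FT3_TO_M3        # ~27.2 m^3
air_mass_kg = air_volume_m3 * 1.2              # air density ~1.2 kg/m^3
CP_AIR = 1005.0                                # J/(kg*K), specific heat of air

power_w = 400.0                                # assumed CPU + GPU load
k_per_hour = power_w * 3600.0 / (air_mass_kg * CP_AIR)
print(f"~{k_per_hour:.0f} C/h (~{k_per_hour * 1.8:.0f} F/h) if no heat escaped")
```

Even if only a few percent of that heat stays in the air, a 4-5 F rise over an hour or two is entirely plausible.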
Everyone too busy arguing about the efficiency of air conditioners to discuss real points like this.
No one expected AMD to offer 16GB from two stacks, so it's a reasonable assumption that these stacks may not be very efficient. Don't we typically see lower-capacity memory chips clock better? It's very possible the 8GB RX Vega has faster memory.
Why is that?
Are you in the camp that thinks there is new silicon for RX?
My 980 Ti, pulling 300 W+ right now in the middle of summer in an upstairs bedroom, already gets my ambient room temps up over 80F after a few hours of gaming. I can't imagine drawing any more power than that; I would literally melt. This is, of course, in south Texas, where summers get well over 100F for weeks at a time.
I was okay with higher power draw at 1080 Ti-level performance, because it would still be a perf/watt improvement over my 980 Ti, but if RX Vega is going to hit 400 W at 1600 MHz, it's a no sale now.
I just updated the first post with some stuff I felt was worth updating. Any other solid reviews/benchmarks I should add?
I've never been a big "power and efficiency" person for desktop GPUs either.
Same performance but 50W more and $50 less? I'm probably buying the $50 less.
I have a big case and don't mine; the CPU is on an AIO and only doing normal boost (no OC).
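The electric-bill side of that tradeoff is easy to put numbers on. A rough sketch, with the usage hours and electricity rate being my own assumptions; plug in your own:

```python
# Running cost of a card that draws 50 W more, vs. a $50 price gap.
extra_watts = 50
hours_per_day = 4        # assumed full-load gaming time
usd_per_kwh = 0.12       # assumed electricity rate

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
usd_per_year = kwh_per_year * usd_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr -> ${usd_per_year:.2f}/yr")     # 73 kWh -> $8.76/yr
print(f"~{50 / usd_per_year:.1f} years to burn through the $50")  # ~5.7 years
```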
It's sort of disheartening that Vega is showing less performance than the much lower-power 1080, because that doesn't bode well for the APU and mobile markets. But I suppose it's to be expected, since AMD just doesn't have NV's R&D budget.
Thanks guys, saved me the trouble of responding. I know it's difficult for some people to understand that climate can vary wildly depending on where you live, but central AC does not keep up with the heat output of my PC (overclocked 4770K and 980 Ti @ 1500 MHz) combined with the 104F+ temps we've had in July. I've basically had to quit gaming during the day because the room becomes unbearably hot after about 45 minutes.
Back to the whole point of my post: 300 W is my hard limit on a GPU, because I don't want to have to buy a window AC unit for my room just to drive higher frame rates. I was willing to deal with a little more power at 1080 Ti levels of performance, because I could cap FPS at 60 in some of the games I play (WoW, Hearthstone, HotS, etc.) and keep the heat generation under control. But if RX Vega is going to need 400 W and LN2 to sustain 1600 MHz like FE does, it's a no go. I would pay $200+ more for a G-Sync and GTX setup just to NOT have to deal with insane power consumption. First-world problems and all, and I may very well end up getting a window unit for my room anyway, because has anyone here ever tried to sleep in a room that's 85F+ at night? It's awful.
Yes, this. It's become obvious that whatever frequency a 14nm chip is designed for, it typically has a sweet spot. This conversation will get a lot more exciting once we know actual RX Vega performance and can start to discuss perf/watt. For now, these abstract power consumption figures are basically meaningless to me.
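Once reviews land, the perf/watt comparison is one line of arithmetic. A skeleton with made-up placeholder figures (none of these are real Vega or 1080 results):

```python
# Perf/watt skeleton to fill in once real RX Vega reviews land.
# Every number here is a placeholder, not a measurement.
cards = {
    "GTX 1080":    {"fps": 100.0, "board_power_w": 180.0},
    "RX Vega (?)": {"fps": 100.0, "board_power_w": 350.0},
}
for name, c in cards.items():
    print(f"{name:12s} {c['fps'] / c['board_power_w']:.3f} fps/W")
```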
You should be fine with at least some version of Vega, unless this information is completely off.
I'm still holding out hope for the miracle of FE being different silicon than RX Vega. The money saved from going with Vega and FreeSync would pay for my window unit, after all, if power consumption isn't insane.