
AMD Radeon HD 7950 Gets Turbo

If AMD does nothing, you'll say they are falling behind. When they do something to preempt Nvidia, they're "reactive," with a negative connotation. If Nvidia were doing this, you'd say they were being proactive. :sneaky:

Next time Nvidia reacts to spoil an impending AMD launch, I'm going to quote you.


.....oh and GPU boost is stupid I wish it would go away.

GPU Boost was very pro-active thinking. It's not surprising you would think it's stupid. More default performance while staying below a TDP threshold, delivering a balance of performance, watt efficiency, acoustics and thermals -- indeed sounds stupid! Good Grief.
 
It's fine in the way that "don't open the hood take it to your dealer" cars are fine. Companies make a lot of money selling those types of cars.

Edit: And when it comes to cars I don't mind it at all, but I could see how a car tinkerer type of person would not have a great opinion of them.

GPU Boost was very pro-active thinking. It's not surprising you would think it's stupid. More default performance while staying below a TDP threshold, delivering a balance of nice acoustics and thermals -- indeed sounds stupid! Good Grief.
 
Uh, how is it stupid to want to be able to manually tune? So I should wear bell bottoms if the market commands it? I'm not convinced I'm the stupid one in this exchange; you'll have to add more explanation.

Market decides -- not vocal posters that offer stupid.
 
GPU Boost was very pro-active thinking. It's not surprising you would think it's stupid. More default performance while staying below a TDP threshold, delivering a balance of performance, watt efficiency, acoustics and thermals -- indeed sounds stupid! Good Grief.
Fail. The GPU would be just as efficient without the nanny BIOS, and would be way, way easier to tweak. It's good for the majority of people who are not tech savvy though, which seems to be what Nvidia is aiming for this generation. I for one don't like it, but understand why they did it.
Market decides -- not vocal posters that offer stupid.
I see. So having control of the GPU is stupid. Got it.
 
Can't wait for 2015 when we can benchmark race our TabletComputeSmartCloudVideoPhones! Market decides.
 
Fail. The GPU would be just as efficient without the nanny BIOS, and would be way, way easier to tweak. It's good for the majority of people who are not tech savvy though, which seems to be what Nvidia is aiming for this generation. I for one don't like it, but understand why they did it.

I see. So having control of the GPU is stupid. Got it.

No, having more control of the GPU is welcomed. Having more performance while staying below a TDP threshold, while offering a nice balance of performance, watt efficiency, thermals and acoustics is welcomed as well.
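
For what it's worth, the boost idea itself is not complicated. Roughly it amounts to something like this (a toy sketch with made-up numbers, not NVIDIA's actual firmware logic):

```python
# Toy sketch of the GPU Boost idea: step the clock up as long as the
# estimated board power stays under the cap. All numbers are invented.

TDP_CAP_W = 170.0        # made-up power target
BASE_MHZ = 1006          # made-up base clock
STEP_MHZ = 13            # made-up boost step
MAX_MHZ = 1110           # made-up top boost bin

def estimated_power(clock_mhz: float) -> float:
    """Pretend power model: power rises roughly with clock (voltage held constant)."""
    return 140.0 + 0.09 * (clock_mhz - BASE_MHZ)

def boost_clock() -> int:
    clock = BASE_MHZ
    while clock + STEP_MHZ <= MAX_MHZ and estimated_power(clock + STEP_MHZ) <= TDP_CAP_W:
        clock += STEP_MHZ
    return clock

print(boost_clock())  # settles at the highest step that still fits under the cap
```

The argument in this thread is really about who gets to turn the knobs (voltage, cap), not whether a loop like that is useful.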
 
You seem to suffer from the delusion that NVIDIA and AMD have to launch at the same time?

Not at the same time, but reasonably close to each other. That's not a delusion, it's history. Look back at how it was in the past before telling me I suffer from some delusion. In the tech world things move so fast that being 6-8 months behind isn't winning much, imo. The company still makes $, but technologically speaking it's already late. What if AMD launched all of their cards 6-8 months behind? Wouldn't you claim they were late?

Once a competitor launches a new generation/refresh first, you are late every day after that because your product is no longer up-to-date, unless your competitor is way behind (2900XT vs. 8800GTX).

1) 8500 (August 2001) vs. GF3 Ti500 (Oct 2001) - very close launch but NV technically won this round since the vanilla GF3 launched way before the 8500
2) GF4 (Feb 2002) - AMD has no response - lost this round completely
3) 9700Pro (June 2002) vs. FX5800U (Jan 2003) --> 7 months behind: NV badly lost this generation (behind and miles slower)
4) 9800Pro (April 2003) vs. FX5900U (May 2003) - very close launch
5) 9800XT (Sept 2003) vs. FX5950U (Oct 2003) - very close launch
6) X800XT AGP (May 2004) vs. 6800U (April 2004) - very close launch
7) X850XT PE (Dec 2004) vs. nothing from NV - NV lost this round
8) X1800XT (Oct 2005) vs. 7800GTX 256mb (June 2005) - AMD is way behind, and subsequently lost this round.
9) X1900XTX (Jan 2006) vs. 7900GTX (March 2006) - very close launch
10) X1950XTX (Aug 2006) vs. nothing from NV - NV lost this round
11) 2900XT (May 2007) vs. 8800GTX (Nov 2006) - AMD is way behind, and badly lost this round
12) 3870 (Nov 2007) vs. 8800GT (Oct 2007) - very close launch
13) 4870 (June 2008) vs. GTX260/GTX280 (June 2008) - launched the same month!
14) 4890 (April 2009) vs. GTX275 (April 2009) - launched the same month!
15) 5850/5870 (Sept 2009) vs. GTX470/480 (March 2010) - NV is way behind, but they brought good performance, which "saved" their face
16) 6970 (Dec 2010) vs. GTX570 (Dec 2010) - launched the same month!
17) 7970 (Jan 2012) vs. GTX680 (March 2012) - fairly close launch.
Source

And now the <$400 Kepler desktop GPU market

HD7750/7770 = Feb 15, 2012 --> No response from NV for 6 months...not expected until Sept-October (so 8 months late)
HD7850/7870 = Mar 5, 2012 --> No response from NV for 6.5 months (assuming GTX660Ti/660/650 all launch Aug 16).

So you are saying NV isn't late with its 28nm Kepler desktop roll-out? This isn't like car model years where a major redesign happens every 5-7 years. If NV fans are waiting 6-8 months to buy <$400 Kepler, that doesn't change anything. Should we start rumours that the HD8000 series is 6 months away? NV had to use the feature-obsolete, power-hungry, and VRAM-starved 560/560Ti/570/580 to compete with the 7850/7870 all this time, and had no counter at all to the 7750/7770 on the desktop. That's 6-8 months late. Delivering a better product 6-8 months later is not an accomplishment of any kind, especially since you have seen the exact prices, specs and performance of your competitor, and the 28nm node matures over time, making it cheaper and easier to release faster cards.
 
Yeah, you did, and it speaks for itself. Market decides.


The market doesn't like it much either, especially since they know Nvidia is limiting overclocking. The 670/680 target the enthusiast market, so why limit the enthusiast market? They are great cards, don't get me wrong, but locking out voltage control was not a good move.
 
So you are saying NV isn't late with its 28nm Kepler desktop roll-out? This isn't like car model years where a major redesign happens every 5-7 years. Being 6-8 months late in tech circles is a big deal.

Then again, did you not read the conference call notes and transcript? They said they were still supply constrained and will continue to be throughout Q3. If you're supply constrained, would you rather sell the chip that makes the most money or the chip that makes the least money? Nvidia's 28nm launch is definitely behind AMD's, but it could be a situation right now where it doesn't make sense to steer wafers allocated to GK104 towards GK106 and GK107 when there aren't enough wafers to go around and GK104 makes the most money.
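
To put the allocation argument in plain arithmetic (every number below is invented purely for illustration, not NV's actual cost structure):

```python
# Rough illustration of the wafer-allocation argument with made-up numbers:
# if 28nm wafers are scarce, the higher-profit-per-wafer die wins the allocation.

WAFER_COST = 5000.0  # hypothetical cost per 28nm wafer, USD

# (good dies per wafer, ASP per chip, non-wafer cost per chip) -- all invented
chips = {
    "GK104": (100, 120.0, 20.0),   # big die, sold in $400+ cards
    "GK107": (400, 25.0, 5.0),     # small die, sold in cheap cards
}

for name, (dies, asp, other_cost) in chips.items():
    profit_per_wafer = dies * (asp - other_cost) - WAFER_COST
    print(f"{name}: ~${profit_per_wafer:,.0f} profit per wafer")

# With these invented numbers GK104 returns ~$5,000/wafer vs ~$3,000 for GK107,
# so a supply-constrained NV would keep feeding GK104 first.
```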

Their financials tell a tale different than what many people are trying to paint: their consumer GPU division was up 15% last quarter in a market that was relatively flat from Q1. I think they know what they're doing.
 
This is a good point. This tells me that the 660Ti is a better performer than we think. Otherwise, price cuts for the 7950 again. They'll have to cut prices anyway if they ever expect to sell another one. IMHO.

They did cut prices, MSI and Gigabyte have 7950s closing in on $300 with rebates. Just installed the Gigabyte.
 
Not a huge fan of boost, but not a big deal.

I hope reviewers review with the 7950's available at ~$300 which come OC'd out of the box as well as the 7950B.

It does not make sense to compare the 660Ti against only the reference 7950 when the price of the OC'd models is the same as or less than the reference cards.

A while back, nVidia overnighted GTX 460 FTW cards to review sites prior to their availability in shops (availability was non-existent for stretches after this tactic) to snuff the 68xx reviews. A number of posters were upset with that, and a few vocal folks defended it vigorously. I got the impression Anandtech sided with the idea that nVidia's approach in that case was inappropriate, after reading the future reviews done here.

So while I took issue with that type of tactic from nVidia and with the review sites that gave in to it, given that the OC'd 7950 variants are out and available to the consumer at superior clockspeeds -- for less than reference models in some cases -- I don't think it makes sense to compare the 660Ti to anything other than the non-reference OC'd 7950.
 
It isn't a party without ionusx here, the web's only known news ninja.

TechPowerUp's review was awesome.
The HD 7950 BIOS 2.0 absolutely dominated the GTX 680 in Crysis 2 @ 720p :lol:
[chart: Crysis 2 @ 1280x800]

problem Nvidia???

 
Interested in how AA will be explained in the GTX660 reviews. You can bet nVidia outlined exactly how reviewers are to approach this subject, given that, left without instruction, the reviews are going to show a huge disadvantage for the 660 when AA is applied.

I hope we don't get sung praises of another inferior AA method... TXAA.
 
Really? And this is based on?


The whole GPU Boost concept was so that Nvidia could lock down the voltage on the cards. So the card's performance was based on the TDP threshold. If you get a better-binned chip, then you overclock higher. It just seems to me that Nvidia wants everyone to play the silicon lottery, and Nvidia needs to relearn the definition of enthusiast.
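
If it helps, here's the silicon-lottery point in toy form: with the voltage locked and a fixed power cap, a lower-leakage chip runs out of power budget later, so it boosts higher. All numbers are invented, not any vendor's real binning data:

```python
# Why binning matters under a locked voltage and fixed power cap (toy model).

POWER_CAP_W = 170.0
VOLTAGE = 1.175          # locked; you can't raise it to push further
MAX_BOOST_MHZ = 1202     # arbitrary top bin so the loop can't run away

def board_power(clock_mhz: float, leakage_w: float) -> float:
    # dynamic power ~ f * V^2 (arbitrary scaling constant), plus static leakage
    return 0.095 * clock_mhz * VOLTAGE ** 2 + leakage_w

def max_boost(leakage_w: float) -> int:
    clock = 1006
    while clock + 13 <= MAX_BOOST_MHZ and board_power(clock + 13, leakage_w) <= POWER_CAP_W:
        clock += 13
    return clock

print("good bin :", max_boost(leakage_w=20.0))   # boosts further before hitting the cap
print("poor bin :", max_boost(leakage_w=35.0))   # burns budget on leakage, stops sooner
```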
 
So wait, some people are getting upset over a free additional 5~6% increase in performance (at default settings)? I'd be pretty stoked if I was an "average user" and the card I bought 4 months ago just got faster. True, we aren't average by any means here and benefit very little, but still how could this not be a good thing (besides that this is how the card should have shipped to begin with)?

This... People are getting something for nothing... why is this a bad thing?
 
Overclocking will get anyone the exact same results, which makes flashing to this BIOS kind of stupid and not at all worth the small risk of doing so. At least they didn't re-release the 7950 as a GHz Edition product and raise prices.

I agree. What makes this BIOS even worse is that it bumps the voltage up to 1.25V using GPU Boost. That's worse (higher) than just about any high-quality after-market 7950 card needs. If AMD partners discontinue those 7950 cards, suddenly they'll be pushing 1.25V on a Tahiti chip just to hit 925mhz. That's a huge step back for the 7950 since it'll get hotter and more power hungry and still not do as well as the MSI TF3 or Giga Windforce 3x 7950 cards. Not sure what AMD is thinking with this one. 1.25V for 925mhz is so conservative, I don't even know what to say to that. Even the HD6970 ran at 1.175V at 880mhz on 40nm. Maybe the AIBs will launch better-binned 7950 after-market cards with much lower voltages. We'll have to see. So far I am not impressed, since AMD is just overvolting the 7950 while 900mhz 7950 cards can be had for $310 with good voltages. Also, the GPU Boost seems to be erratic in AT's review.
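
A quick back-of-the-envelope on the 1.25V point, assuming dynamic power scales roughly with f*V^2 and that a good after-market 7950 reaches 925mhz near 1.10V (that voltage is my assumption, not a measured figure):

```python
# Rough dynamic-power comparison at the same 925MHz clock.
clock_mhz = 925
v_boost_bios = 1.25    # voltage the new boost BIOS reportedly applies
v_aftermarket = 1.10   # assumed voltage for a well-binned after-market card

ratio = (clock_mhz * v_boost_bios ** 2) / (clock_mhz * v_aftermarket ** 2)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power at the same clock")  # ~29%
```

Under that assumption the boost BIOS is paying roughly a quarter to a third more dynamic power for the same 925mhz, which is why it looks hotter and more power hungry than the after-market cards.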
 
That's a huge step back for the 7950 since it'll get hotter and more power hungry and still not do as well as the MSI TF3 or Giga Windforce 3x 7950 cards.


Remember that Gigabyte and MSI use a custom BIOS by default. They will tweak the voltages based on their own specs. This would apply only to the reference models, which I believe will be phased out as everyone switches to aftermarket cooling.

Example: the voltage on my TF3 7950 is 1.035V.
 