
HD6770 specifications possibly leaked: 1280 SPs, 900 MHz, 256-bit

If the two Barts parts are $200 and $250 respectively, then where does that place the 6850, 6870, and 6970?

The 6850 can be $349, the 6870 $449, and the 6970 at $599-699. Nothing wrong with that pricing, considering the 6850 will be faster than the 5870 and the 6870 will be faster than the GTX 480, with the 6970 competing with itself. Once NV brings out a refresh of Fermi, AMD can simply lower the pricing and still remain competitive. With no competition from NV for the next 3 months, why launch your brand new cards priced so low in the first place?

Last time, the 5770 came out priced higher than the 4870 with lower performance... so why are people expecting a $199 card with near-5870 performance when AMD is able to sell 5870s for $350? They could easily sell Barts XT for $250. At this point, of course, pricing is just speculation.

When there was no competition from NV, ATI priced the 5870 at $379 and then raised it to $399, vs. $299 for the previous high end, the 4870. Both AMD and NV are just as guilty of increasing MSRPs the minute there is no competition (the 8800 GTX and GTX 280 come to mind).
 
But can this be done without creating the noise/heat/power beast that the GTX 480 is? I'm not sure if this is an 'only time will tell' thing - are these chips pushing the limits of the 40nm process?

The 900MHz clock for the 6770 might denote an improvement in efficiency, rather than just an ability to run at higher clocks. Lots of chips can run at appreciably higher clocks than they are marketed at. They just turn into power monsters.
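
For context on why that happens: dynamic power in CMOS scales roughly as P ∝ C·V²·f, so a clock bump that also needs a voltage bump costs disproportionately more power. A toy sketch of that scaling (the ratios below are made up for illustration):

```python
# Rough CMOS dynamic-power scaling: P ~ C * V^2 * f.
# The frequency/voltage ratios below are illustrative, not measurements.

def relative_power(f_ratio, v_ratio):
    """Dynamic power multiple for a given frequency and voltage ratio."""
    return f_ratio * v_ratio ** 2

print(relative_power(1.20, 1.00))  # +20% clock, same voltage: 1.20x power
print(relative_power(1.20, 1.10))  # +20% clock, +10% voltage: ~1.45x power
```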
 
But can this be done without creating the noise/heat/power beast that the GTX 480 is? I'm not sure if this is an 'only time will tell' thing - are these chips pushing the limits of the 40nm process?

Sound and heat should be lower, as AMD is using vapor chambers on the higher-end cards. Power shouldn't be too high either; remember, the HD 5970 at stock settings still used less power and made less heat/noise than the GTX 480.
 
If AMD has any sense at all, it should eat as much market share as it can before Kepler arrives, so as to reduce the potency of the CUDA/PhysX ecosystem. Simply matching NV's price/perf isn't enough to steal market share. GTX 460 768MB cards briefly dropped to $140 AR before settling in the $150-160 AR zone, and GTX 460 1GB cards have been seen for under $190 AR for a while now. They may drop some more in 30 days' time. Now and in the recent past, the GTX 470 has seen a lot of deals at the ~$250 mark, and it still has a month to drop in price some more.

Barts Pro at $189 would be sorta okay ($180 would be much better at taking market share), but $200+ is letting NV off the hook, imho. Similarly, if AMD wants to kill NV's market share, it should price the 6770 at ~$220 or less if it lands exactly in between a 5850 and a 5870 (not sure if it really will approach a 5870 or not; if it does, then a higher price is justified).
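
To put a rough number on that 'in-between' logic, here's a toy interpolation sketch; the prices and performance figures are placeholders, not real benchmarks:

```python
# Toy sketch: interpolate a price-parity point for a card that lands
# between two existing parts on performance. All numbers are placeholders.

def interpolated_price(perf_lo, price_lo, perf_hi, price_hi, perf_new):
    """Linearly interpolate price between two reference cards."""
    t = (perf_new - perf_lo) / (perf_hi - perf_lo)
    return price_lo + t * (price_hi - price_lo)

hd5850 = {"perf": 100, "price": 260}  # placeholder street price
hd5870 = {"perf": 125, "price": 350}  # placeholder street price

# If the 6770 lands exactly halfway between the two:
perf_6770 = (hd5850["perf"] + hd5870["perf"]) / 2
parity = interpolated_price(hd5850["perf"], hd5850["price"],
                            hd5870["perf"], hd5870["price"], perf_6770)
print(f"price-parity point: ${parity:.0f}")  # ~$305 with these placeholders
# Anything well below parity (e.g. ~$220) undercuts the incumbents on $/perf.
```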

The HD6670 and HD6650? 🙂

I fully agree with this statement. AMD has a chance to change the graphics ecosystem in its favor. The last time AMD had an advantage (the X2 days), Mr. Ruiz stalled 65nm CPU development to maximize short-term profits. If AMD does this again, they deserve to be whipped by a future revitalized Nvidia.

It was recently reported that TSMC's 40nm node is now yielding what 55nm did. AMD can price lower and still increase profits.
 
I don't keep track of the highest-end parts much, but I disagree about the lower end. The lowest prices after rebate are more like:

You might be right. My numbers came from a quick look at what was selling near the lower end of the price spectrum on Newegg, and I ignored rebates... because I hate them so much 🙂
 
But can this be done without creating the noise/heat/power beast that the GTX 480 is? I'm not sure if this is an 'only time will tell' thing - are these chips pushing the limits of the 40nm process?

Quite likely the 6870 will consume more power than a 5870 and run hotter. Early shots of the cards show 8+6-pin power connectors, and the die size will go up as well.

Unlike the 480, the 5870 uses significantly less power and runs a good deal cooler, so AMD has room to increase power consumption and heat output. The 480 is basically at its limit: it consumes very near 300W, in many cases exceeds 300W with a small overclock, and runs extremely hot.

So I think AMD has room to push 40nm further, whereas NV has already hit the limits of what it can do with its chip.
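
For reference, the ~300W figure lines up with the PCIe power-delivery spec: 75W from the slot, 75W per 6-pin connector, and 150W per 8-pin connector. A quick sketch of the in-spec budget implied by a connector layout:

```python
# In-spec PCIe power budget implied by a card's auxiliary power connectors.
# Per the PCIe spec: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
SLOT_WATTS = 75
CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}

def board_power_budget(aux_connectors):
    """Slot power plus the ratings of the auxiliary connectors."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(board_power_budget(["6-pin", "6-pin"]))  # 225 W (e.g. a 5870's 2x6-pin)
print(board_power_budget(["8-pin", "6-pin"]))  # 300 W (e.g. the 480's 8+6-pin)
```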
 
Because tweaking, respinning the different layers, and refreshing existing parts only make things worse, right? I mean, that's what happened with the FX 5900, correct?
I'm not sure we're in the same scenario here, though. There's a lot more to fix with the GF100 chips than there was with the FX 5800.

Anyway, Barts XT looks impressive, hopefully it'll be sold at a decent price. The one thing I'm wary of is the high clock speed, which is fine if the card can still overclock like crazy.
 
I'm not sure we're in the same scenario here, though. There's a lot more to fix with the GF100 chips than there was with the FX 5800.

This may or may not be true, and I doubt either of us knows which is the case. But even if you were right, it would only validate my point further. If there is more to "fix", then there is more room for improvement. Regardless, I no longer think Nvidia will respin GF100 for the consumer market. If they come out with a refreshed line before 28nm (other than fully unlocked GF104 and GF106 cores), it'll likely be a newer, beefed-up GF104.
 
Because tweaking, respinning the different layers, and refreshing existing parts only make things worse, right? I mean, that's what happened with the FX 5900, correct?

Ignoring heat issues, Nvidia is still at the power limit with GF100. Yes, they could do better with GF104. But Nvidia still has some serious limits on what they can do until they get power usage under control.
 
There's a difference between what one might expect to happen (i.e., AMD just barely beats NV's price/perf and calls it a day) and what AMD should do if it wants to eat up market share (price the HD6xxx aggressively to hurt the CUDA/PhysX ecosystem). I can see AMD going either way. Either way they make a profit, since Barts may not even cost that much more to make than Juniper, as I detailed in another thread, especially if TSMC's 40nm yields have gone way up since fall 2009.

AMD may also split the difference: price the 67xx more aggressively but pad its profit on the 68xx and 69xx GPUs, since NV likely has nothing competitive in those ranges anyway.

It bears mentioning that the 5770 had DX11; the 4870 did not.

Also, NV was in a different competitive situation back then, with no DX11 parts. GTX 460s have been selling as their prices have fallen, and the GTX 470 and 480 are DX11 cards. There is some actual competition for AMD this time, at least at the 67xx level.

The 6850 can be $349, the 6870 $449, and the 6970 at $599-699. Nothing wrong with that pricing, considering the 6850 will be faster than the 5870 and the 6870 will be faster than the GTX 480, with the 6970 competing with itself. Once NV brings out a refresh of Fermi, AMD can simply lower the pricing and still remain competitive. With no competition from NV for the next 3 months, why launch your brand new cards priced so low in the first place?

Last time, the 5770 came out priced higher than the 4870 with lower performance... so why are people expecting a $199 card with near-5870 performance when AMD is able to sell 5870s for $350? They could easily sell Barts XT for $250. At this point, of course, pricing is just speculation.

When there was no competition from NV, ATI priced the 5870 at $379 and then raised it to $399, vs. $299 for the previous high end, the 4870. Both AMD and NV are just as guilty of increasing MSRPs the minute there is no competition (the 8800 GTX and GTX 280 come to mind).
 
This may or may not be true, and I doubt either of us knows which is the case. But even if you were right, it would only validate my point further. If there is more to "fix", then there is more room for improvement. Regardless, I no longer think Nvidia will respin GF100 for the consumer market. If they come out with a refreshed line before 28nm (other than fully unlocked GF104 and GF106 cores), it'll likely be a newer, beefed-up GF104.
I can't say I agree with the bolded part. My point was that there wasn't as much to fix with the FX 5800 (mainly AA performance, plus that POS cooler, IIRC) as there is to fix with GF100 (power consumption, plus increased performance all around to compete with Cayman). The biggest problem is that they've already pretty much maxed out the capabilities of 40nm in my eyes, and they're just going to have to start cutting the fat, and quickly. I agree that a beefed-up GF104 is the most plausible, but even then, it doesn't seem like it'll be competitive with the rumored performance of the Cayman parts, never mind Antilles. Of course, time will tell.
Ignoring heat issues, Nvidia is still at the power limit with GF100. Yes, they could do better with GF104. But Nvidia still has some serious limits on what they can do until they get power usage under control.
Exactly.
 
MrK6, but at the same time, how can Fermi, even refreshed, compete with a brand new HD6000 series in performance? The performance difference between video card generations is generally very large. It will take until Kepler before the HD6000 series can be challenged on the high end, unless HD6000 is underwhelming. If NV can actually compete with the HD6000 series with just a refresh of Fermi, then HD6000 is a huge failure. There should be almost no competition between a brand new generation and an old one. NV's only strategy at this point is lowering prices and bundling more games with their cards.
 
@ Russian: Considering the much larger die size of NV's cards, I'd say it's safe to assume they cost more to make, even though none of us really know. Assuming that, NV is stuck between a rock and a hard place: they will be forced to lower prices. I agree a revision can't stand up to a new architecture, but they have to make some sort of money. They can either sell their cards for little to no profit, or be beaten horribly on performance. It's looking bad for NV, especially if Kepler is late or arrives at the same time as the 7xxx series.
 
MrK6, but at the same time, how can Fermi, even refreshed, compete with a brand new HD6000 series in performance? The performance difference between video card generations is generally very large. It will take until Kepler before the HD6000 series can be challenged on the high end, unless HD6000 is underwhelming. If NV can actually compete with the HD6000 series with just a refresh of Fermi, then HD6000 is a huge failure. There should be almost no competition between a brand new generation and an old one. NV's only strategy at this point is lowering prices and bundling more games with their cards.

If AMD sticks to its annual cycle, Kepler will not be competing with HD6xxx, but rather HD7xxx.
 
If AMD sticks to its annual cycle, Kepler will not be competing with HD6xxx, but rather HD7xxx.

Sounds to me like 3 possibilities exist.

1) AMD's generational leaps will be much less drastic in performance, i.e. 4890 --> 5870 (+50-60%), rather than 9800XT -> X800XT (2x) -> X1800XT (2x), as a result of a much quicker release schedule (12-13 months).

2) If #1 is not true and AMD continues 70-100% performance increases every year, while NV will take 2x longer to get 2x the performance increase, NV will have to resort to AMD's CPU strategy and compete on price rather than performance.

3) NV will take a lot longer than before (i.e., 18-24 months, rather than 12-15 months) to roll out a new generation, but performance increases between its generations will be more than 2x. But then this strategy will leave them competing on price once again between generational releases with half-assed refreshes.

Something has got to give here, unless NV also switches to the small-die strategy.
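
To see how those cadences compound, here's a toy calculation; the gain and cadence figures are taken from the scenarios above, not predictions:

```python
# Toy compounding of generational performance gains under two cadences.
# Gain/cadence figures come from the scenarios above; purely illustrative.

def perf_multiple(per_gen_gain, months_per_gen, horizon_months=48):
    """Performance multiple after horizon_months, one step per generation."""
    generations = horizon_months // months_per_gen
    return per_gen_gain ** generations

amd_style = perf_multiple(1.7, 12)  # ~1.7x every 12 months -> 1.7^4 ~ 8.4x
nv_style = perf_multiple(2.0, 24)   # 2.0x every 24 months  -> 2^2  = 4.0x
print(f"Over 4 years: AMD-style ~{amd_style:.1f}x, NV-style ~{nv_style:.1f}x")
```

Under those assumptions, scenario #2's conclusion follows: a slower cadence that merely doubles per generation falls well behind an annual ~70% cadence, leaving price as the only lever.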
 
Why is it I feel that this thread is getting derailed into an "armchair CEO" thread again? :hmm:

Sounds to me like 3 possibilities exist.

1) Either AMD's generational leaps will be much less drastic in performance, i.e. 4890 --> 5870 (+50-60%), rather than 9800XT -> X800XT (2x) -> X1800XT (2x).

2) If #1 is not true and AMD continues 70-100% performance increases every year, while NV will take 2x longer to get 2x the performance increase, NV will have to resort to AMD's CPU strategy and compete on price rather than performance.

3) NV will take a lot longer than before (i.e., 18-24 months, rather than 12-15 months) to roll out a new generation, but performance increases between its generations will be more than 2x. But then this strategy will leave them competing on price once again between generational releases with half-assed refreshes.

Something has got to give here, unless NV also switches to the small-die strategy.
 
Why is it I feel that this thread is getting derailed into an "armchair CEO" thread again?

Because being an armchair moderator is so much better.

A 100% linear discussion in real life would be incredibly dull; a little deviation is food for thought, too.

On topic: if these specs are right, what kind of minimum CPU would be needed to NOT bottleneck these cards?
 
Good Lord, can we just wait and see what the things do? LOL.

No! It's fun to speculate! 🙂 Plus, I've been dying for the last couple of weeks since I sold my 5850 in preparation for the HD6xxx launch. My old 6800XT was supposed to tide me over, but it died on me suddenly. In any case, with the death of my old GeForce, the best GPU in the house is currently a freakin' IGP. D: I need a single-GPU tri-monitor solution sooner rather than later, but I can't bear to buy an HD5xxx card when I know I can just wait a month and get a better GPU with UVD 3.0 and (probably) better efficiency. The wait is killing me, and these leaks are getting really interesting... Barts XT can't come soon enough!
 