[BitsAndChips] 390X ready for launch - AMD ironing out drivers - Computex launch


Cloudfire777

Golden Member
Mar 24, 2013
1,787
95
91
Everyone assumes that HBM is really expensive, but I haven't seen any actual evidence of that. The truth is that everything people are saying about pricing, yields, and configurations for HBM is almost pure speculation.

From the prices of the R9 370X and 390/390X I have in front of me here, HBM does not look very expensive, no.
 
Feb 19, 2009
10,457
10
76
That's good news for AMD, they are going to have to price the lineup very aggressively due to the presence of the 970 at ~$300. We all know the 980 and Titan X prices are inflated, so those aren't the metric for perf/$ that AMD needs to compete against.

The 970 is also the best selling performance part. With NV's new game bundle with TW3 and Batman, AMD needs to beat Maxwell on every metric.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I hope the new generation will support HDMI 2.0 and H.265.

I'm sure they will. With the proper codecs installed, even my aging 680s have partial support for H.265, bringing CPU usage down to a mere 10% when watching a 4K video.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just because you justify every AMD product with walls of text "proving" their performance per dollar advantage does not mean the consumer has to accept that, or will not use other metrics as well to make their purchase decision.

Maybe you should read my post again and come up with a better rebuttal to my points. The first part of my post has nothing to do with AMD's price/perf; it's clearly talking about NV's changing marketing strategy over the last 5 years. It's obvious you haven't read my posts, as I have recommended countless NV GPUs in the last 13 years. There have been plenty of AMD GPUs that I think were not worth buying at all, almost regardless of their price/performance. Recently I slammed the 285, and I slammed the 7970 on release for many months. I also skipped the HD 2000 and 3000 series and, wait for it... I put my money where my mouth is and bought GTX 470s and overclocked them to 750MHz because they beat overclocked 5850s in DX11 games (with tessellation), which meant their inferior perf/watt was irrelevant. But you must not remember any of that, or me recommending GTX 480s 7 months after launch, or GTX 460 overclocking, etc. :rolleyes:

Getting back to perf/watt.

You seem to forget the facts: AMD had superior perf/watt for ages, starting from the day the HD4850/4870 launched all the way until the last Kepler card dropped on the desktop (Kepler took 2.5-9 months to launch top-to-bottom). Even though the 670/680 beat the 7950/7970 in perf/watt, NV still used its outdated stack of Fermi cards to compete at the lower end for 6+ months thereafter without any problems. If perf/watt were such a critical factor, NV would have been on the ropes ever since AMD's HD 4800 series.

From the day the HD4850 launched until the day the GTX680 arrived, AMD did not gain much total market share, if any. I am not going to pull up the charts again as I've done it 10X. So AMD had both perf/watt and price/perf and still didn't make a dent in overall market share over the last 5-6 years. So is your point that people only now started to care about perf/watt, or is it because NV is in the lead?

When AMD basically beat NV in nearly every metric, from VRAM to price/perf to perf/mm2 to double precision to perf/watt, did NV sit at 24% market share? :confused:

You also missed my other point. Just like 5 years ago, NV still makes 100W, 200W and 250W GPUs. Get the point? NV never stopped making flagship cards that use 250W+ of power. That means NV isn't some Greenpeace company that makes 150W GPUs and nothing else. So how does NV get people to upgrade more often to a lackluster card like the 960, or market a card barely faster than a 780Ti for $550 when in the past NV always sold that tier in the $200-300 band? Perf/watt marketing, non-stop. People on these forums are still too butt-hurt to accept that the GTX680/980 are GTX460/560Ti successors, because they don't want to admit they are paying double for what was always a mid-range $250-300 card. No one wants to publicly admit they overpaid for something, so how does one justify the purchase? Had NV purposely delayed the GTX480/580, launched a $499 GTX460 as the next-gen GTX480 flagship, then followed up with a $550 GTX560Ti as the next-gen flagship, and only afterwards released the real flagship GTX580 for $1K, how would that make you feel? Welcome to the Kepler + Maxwell strategies.

Have you never noticed that no reviewer in the world performs a perf/watt vs. price-premium cost-benefit analysis? Why is that? They just say this card uses X fewer watts of power so its price premium is justified (and they blatantly ignore price/perf). Just try to pull that type of analysis in the finance world, where you need to back up your conclusion with financial proof.

<The premium it costs to buy a more efficient product - say a more power efficient TV, dryer/washing machine, videocard, hybrid vehicle, etc.>

vs.

<the time it would take to break-even and start re-couping the cost savings vs. a less efficient one>

Just because most of the market thinks the premiums are worth it doesn't mean squat, because most people can't do finance, can't calculate TCO, and buy based on emotions and what makes them feel good. It takes FAR more work to justify your buying decision with math that is free of emotional bias.

Perfect example is the popularity of the Toyota Prius:

"It turns out the additional $5,473 required for the privilege of owning a Prius instead of an Insight can buy a lot of fuel. At today's fuel prices, the actual monetary savings earned by the Prius' edge in fuel economy is miniscule, working out to a paltry $70 per year. Paying off the Prius' extra tariff in sticker price with the savings in fuel purchases would require more than 75 years."
http://www.edmunds.com/honda/insight/2010/comparison-test1.html

One can easily find hundreds of thousands of consumers buying Toyota Priuses to save on fuel costs when it'll take 10-20+ years to break even against a $16,000 Honda Fit. They'll use all kinds of illogical reasons why the Prius saves them money on fuel, but they should just say they bought a Prius because they wanted to / it made them feel good inside. From a financial standpoint the Prius is a questionable buy in most cases, but it's a marketing success. The Prius marketing and feel-good-about-yourself factor works, but for someone who breaks things down mathematically, perf/watt or miles per gallon can ALL be measured to provide an exact monetary benefit. I've done the math to figure out if perf/watt is financially beneficial enough to warrant the price premiums; have you? I bet most consumers in the world have not.
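To make that break-even math concrete, here is a minimal sketch (in Python) of the calculation being described. The only figures used are the $5,473 premium and ~$70/year fuel savings from the Edmunds quote above; the function itself is just the generic payback formula, not a full TCO model.

```python
# Minimal break-even sketch: how long until an efficiency premium pays for itself.
# Inputs for the example are the Prius figures from the Edmunds quote above.

def breakeven_years(price_premium, annual_savings):
    """Years needed for the yearly savings to recoup the upfront premium."""
    if annual_savings <= 0:
        return float("inf")  # never breaks even
    return price_premium / annual_savings

print(breakeven_years(5473, 70))  # ~78 years, i.e. "more than 75 years"
```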

Some highly technical people on our forums have done the math and have even measured the increase in their room temperature from using 600W of GPUs vs., say, 300W of GPUs. Most other folks are probably buying a more efficient card because they are told it's cooler, quieter and overall better by review sites (you know, the same outlets that get marketing dollars for reviews that push sales! duh!), but they haven't got a clue how to prove whether it's really better or not. Also, you can have a card using 150W of power that runs hotter and louder than a card using 500W of power.
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,503
136
Then Nvidia came out with the GTX 900 series, which is amazingly efficient: the GTX 980 draws almost half of what the GTX 770 uses and is way stronger in performance.

Actually, the 980 uses more power than the 770, despite what Nvidia lists as the TDPs. I think it was already talked about in this thread, but you probably didn't read those pages. Maxwell was still a very impressive step up in performance per watt, just not as much as Nvidia would like you to believe. After-market 980s, which most reviews use for performance testing, draw a bit more still (right around 200W) because they boost higher than the reference card.

[Image: power-gaming-FIXED_w_600.png]
 
Last edited:

showb1z

Senior member
Dec 30, 2010
462
53
91
It's not just perf/W by itself that's such a big selling point these days.
I sold my GTX970 after their scam got revealed and am using my old 570 until the R9 300 comes out. It sure gets toasty in here now, even while playing games that don't really put that much of a load on it.
That comfort is worth the extra price of admission imo (if performance is similar).
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm sure they will. With the proper codecs installed, even my aging 680s have partial support for H.265, bringing CPU usage down to a mere 10% when watching a 4K video.

Maybe I am wrong, but it seems people overestimate the demands of 4K video playback on products released in the last 3 years.

"Earlier in the day we had witnessed the Mac Pro playing back 16 streams of 4K in a multicam clip, it can shift data." ~ Source

[Image: mac_pro_machine_specs.jpg]

[Image: mac_pro_machine_speed_test.png]


AMD's cards can even process 4x 4K live streams for live broadcasts in real time.

In Final Cut Pro X, 18 simultaneous effects were applied in real time to a 4K video without dropping a single frame. I think modern systems today can already handle 4K video fairly easily.

To your other point: the 980 series can run cooler and quieter than the R9 290X series, but it's SKU-specific. That's the main issue that tarnished the entire R9 200 series: most people believe ALL R9 200 series cards run hot and loud, no matter what the product is.

Gigabyte G1 980 = 43 dBA
Sapphire Tri-X 290X = 37 dBA

You can easily buy a 970/980 that is hotter and louder than an R9 290X. For example, how many people are using reference Titan-style blower-cooled 970/980s? I've seen a lot of people rocking these, even in SLI. Reference 970/980/Titan X cards are all hotter and louder than Sapphire Tri-X 290Xs or MSI Lightning 290Xs.

----

It's not just perf/W by itself that's such a big selling point these days.
I sold my GTX970 after their scam got revealed and am using my old 570 until the R9 300 comes out. It sure gets toasty in here now, even while playing games that don't really put that much of a load on it.
That comfort is worth the extra price of admission imo (if performance is similar).

It's hard to believe your point considering:

Asus DCUII 570 = 161W avg / 181W peak
Asus Strix 970 = 161W avg/ 179W peak

But let's go with your point anyway since it's bound to come up again:

1. Who buys $300-600 GPUs regularly but can't afford AC in the summer? Sounds like a strange mix of priorities imho.

2. What about ignoring the benefits of the PC heating up the room in the winter for those living in northern climates? If we are going to talk about how running a more power efficient rig helps someone in UAE, it's not as clear cut for someone living in Canada/Norway.

3. If we are going to talk about minor differences of 50-100W of power usage heating up a room so that it goes from manageable to unbearable, how small is your apartment/room? At that point, the power usage of your monitor and every light bulb would probably have a bigger impact than a GPU swap. But here is the interesting part --- a LOT of people who keep talking about perf/watt run highly overclocked i5/i7 systems (some of them 6-core i7s), which means that while YOU may have a point, a lot of other gamers just use perf/watt as some metric to shout about.

4. Electricity costs. Do we even need to go there? The GTX970 costs $320 on Newegg and the R9 290 is $240. It'll take a long time before the 970 even breaks even on electricity costs in North America (see the rough sketch below). By that time I bet both cards will be obsolete or already sold off by the gamer to upgrade to something better.
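As a rough illustration of point 4 (the sketch mentioned above): the $320/$240 prices are from the post, while the 80W load delta, 3 hours of gaming per day and $0.12/kWh rate are assumptions picked for the example, so treat the result as a ballpark only.

```python
# Ballpark electricity break-even for GTX 970 ($320) vs. R9 290 ($240).
# The power delta, daily gaming hours and electricity rate are assumptions.

price_premium  = 320 - 240   # $ extra for the 970 (prices from the post)
power_delta_kw = 0.080       # assumed extra draw of the 290 while gaming, in kW
hours_per_day  = 3           # assumed gaming time per day
usd_per_kwh    = 0.12        # assumed North American electricity rate

annual_savings = power_delta_kw * hours_per_day * 365 * usd_per_kwh
print(f"~${annual_savings:.0f} saved per year")                      # ~$11/year
print(f"~{price_premium / annual_savings:.1f} years to break even")  # ~7.6 years
```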

That's good news for AMD, they are going to have to price the lineup very aggressively due to the presence of the 970 at ~$300. We all know the 980 and Titan X prices are inflated, so those aren't the metric for perf/$ that AMD needs to compete against.

The 970 is also the best selling performance part. With NV's new game bundle with TW3 and Batman, AMD needs to beat Maxwell on every metric.

You know that's not going to happen, because AMD doesn't have the brand value and their drivers will be blamed due to GW. The TW3 coupon runs out by May 31st, which means that by the time the R9 300 series launches it probably won't be a factor. But GWs, that will remain for years to come. With GWs alone NV can gain 30-50% more performance and ensure CF scaling is 0% for months.
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Maybe I am wrong, but it seems people overestimate the demands of 4K video playback on products released in the last 3 years.

"Earlier in the day we had witnessed the Mac Pro playing back 16 streams of 4K in a multicam clip, it can shift data." ~ Source

[Image: mac_pro_machine_specs.jpg]

[Image: mac_pro_machine_speed_test.png]


AMD's cards can even process 4x 4K live streams for live broadcasts in real time.

In Final Cut Pro X, 18 simultaneous effects were applied in real time to a 4K video without dropping a single frame. I think modern systems today can already handle 4K video fairly easily.

To your other point: the 980 series can run cooler and quieter than the R9 290X series, but it's SKU-specific. That's the main issue that tarnished the entire R9 200 series: most people believe ALL R9 200 series cards run hot and loud, no matter what the product is.

Gigabyte G1 980 = 43 dBA
Sapphire Tri-X 290X = 37 dBA

You can easily buy a 970/980 that is hotter and louder than an R9 290X. For example, how many people are using reference Titan-style blower-cooled 970/980s? I've seen a lot of people rocking these, even in SLI. Reference 970/980/Titan X cards are all hotter and louder than Sapphire Tri-X 290Xs or MSI Lightning 290Xs.

----

Running in pure software, my CPU usage was peaking at right around 70% but with the right codecs installed and properly configured to use hardware decoding, that figure dropped to 10%

It's also important to note that my onboard HD4000 IGP was not able to offload any of the decoding, but my 680s can. The Mac Pro is capable of 16 threads as well as hardware decoding on the AMD GPUs, so it's not surprising it can handle such a feat.

It IS demanding without a properly configured setup, and with the majority of PC buyers getting off-the-shelf systems with nothing more than an IGP that can't offload H.265, there will be issues. Bear in mind, my 70% figure is on an overclocked i7. Someone buying a Pentium or an i3 without a GPU that can offload the decoding doesn't stand a chance of smooth playback.

http://forums.anandtech.com/showthread.php?t=2427763&page=3

That's the thread where another member and I were experimenting with a 4K video and various configurations.

But anyway, I think we are getting a bit off topic here.
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
I'm going to quote myself and make a couple enhancements since I seemed to have lost you at some point during the entire 1 sentence of my post.

I hope that clears it up.
my bad for jumping the gun :wub:

personally, I really don't care about perf/w as long as it is within a certain range. if nv and amd got gpus that are so close in perf/$, I would still go amd. just so amd can stay in the gpu business. I do not want the gpu market to become the same as the cpu one.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Running in pure software, my CPU usage was peaking at right around 70% but with the right codecs installed and properly configured to use hardware decoding, that figure dropped to 10%

That's awesome. :thumbsup: What player are you using for 4K video playback?

4K playback, multi-monitor support and low power usage might make the 750/750Ti the best HTPC cards once new cards launch and these go on clearance sale for $50. ;)

----------

Back to R9 300 series. In the olden days, I think ATI/AMD had a chance, but today I don't. With GWs code in AAA games, I don't see how AMD's cards stand a chance. Titan X is 66% faster than a 290X in Project CARS' recent testing. That means for R9 390X to be only 5% faster than the Titan X, it has to beat R9 290X by nearly 75%. o_O

[Image: gamegpu.ru Project CARS 2015 GPU benchmark chart (pc_2560.jpg)]


On average, Titan X is only 43-48% faster than the 290X and here it's basically getting 20% extra out of nowhere. The 980 is 34% faster than the 290X. How does one explain that? This game was made for NV cards, just like many other games coming out in the last 24 months. Pretty much that means NV is getting a 15-20%+ performance advantage from software / developer relationship partnerships alone. That means even if AMD closed the 15-20% hardware gap we saw with 280 vs. 4870 and 480/580 vs. 5870/6970, they need another 15-20% to compensate for GWs. I don't see that happening. Even if they did, NV can just release a 1250-1275MHz GM200 6GB with after-market coolers, heck even a 1.4GHz MSI Lightning/EVGA Classified GM200 6GB. They can also release a 970Ti and a faster-clocked 980. For that reason I think AMD lost this generation. Most of 2015's highest anticipated games are all GWs and NV will have that 'automatic' 15-20% software advantage and of course SLI scaling at or near launch.
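For clarity, the arithmetic behind that "nearly 75%" figure is just chained relative speedups. A quick sketch using only the numbers from the paragraph above (the 66% Project CARS gap and the hypothetical 5% lead over Titan X):

```python
# Relative speedups multiply: if Titan X = 1.66x a 290X, and a 390X needs to be
# 1.05x a Titan X, then the 390X needs 1.66 * 1.05 = ~1.74x a 290X.
titanx_vs_290x   = 1.66   # from the Project CARS numbers above
target_vs_titanx = 1.05   # hypothetical 5% lead over Titan X

required_vs_290x = titanx_vs_290x * target_vs_titanx
print(f"390X would need to be ~{(required_vs_290x - 1) * 100:.0f}% faster than a 290X")  # ~74%
```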
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I was just using Windows Media Player

my bad for jumping the gun :wub:

personally, I really don't care about perf/w as long as it is within a certain range. if nv and amd got gpus that are so close in perf/$, I would still go amd. just so amd can stay in the gpu business. I do not want the gpu market to become the same as the cpu one.

I think that's what we were trying to say... Perf/w isn't a primary concern for most people looking at high(er) end gaming GPUs.
 
Feb 19, 2009
10,457
10
76
But GWs, that will remain for years to come. With GWs alone NV can gain 30-50% more performance and ensure CF scaling is 0% for months.

Now you're seeing the point I've been raising for a while now.

GW is a killer feature against AMD because it cripples performance and disables CF for months after release. These GW devs have shown they do NOT optimize for AMD at all during development; they eventually release an AMD optimization patch 3-4 months after launch, and suddenly AMD performance skyrockets back up to where it should have been. Nothing AMD can do about it besides throwing more $$ around to bribe developers to be ETHICAL.

We know they can't throw as much $ as NV at devs because they aren't in good shape, so that can't happen.

So yes, it doesn't really matter if 390X smashes Titan X by 20%. In GW titles, it will be crippled and broken until the devs decide to optimize for AMD.

As it is, it's a major selling point for gamers to buy an NV GPU, often at inflated prices. It ensures they play all newly released games well, without caring whether it's GW or not.
 
Last edited:

showb1z

Senior member
Dec 30, 2010
462
53
91
It's hard to believe your point considering:

Asus DCUII 570 = 161W avg / 181W peak
Asus Strix 970 = 161W avg/ 179W peak

But let's go with your point anyway since it's bound to come up again:

1. Who buys $300-600 GPUs regularly but can't afford AC in the summer? Sounds like a strange mix of priorities imho.

2. What about ignoring the benefits of the PC heating up the room in the winter for those living in northern climates? If we are going to talk about how running a more power efficient rig helps someone in UAE, it's not as clear cut for someone living in Canada/Norway.

3. If we are going to talk about minor differences of 50-100W of power usage heating up a room so that it goes from manageable to unbearable, how small is your apartment/room? At that point, the power usage of your monitor and every light bulb alone would probably have a bigger impact than a GPU swap. But here is the interesting part --- A LOT of people who keep talking about perf/watt run highly overclocked i5/i7 systems (some of them 6-core i7s) which means while YOU may have a point, a lot of others gamers just use perf/watt as some metric to shout about.

4. Electricity costs. Do we even need to go there? GTX970 costs $320 on Newegg and R9 290 is $240. It'll take a long time before 970 even breaks even on electricity costs in North America. By that time I bet both cards are obsolete/will be sold by the gamer to upgrade to something better.

Those points are somewhat different in Europe. AC in homes is a rare thing here and electricity costs are higher. And yes my computer room is small.

I don't have graphs with my room temperatures to prove this, but I'm pretty sure about it. Just holding my hand over the exhaust of my card, the difference is very noticeable.

Either way, my point was that many people do take heat into account. If you go to the gaming subs on reddit, where people are less knowledgeable about hardware, they all think AMD cards are space heaters.
And to be clear, I'm not one of them, to me perf/$ is by far the most important metric. If I had to buy right now, I'd get a 290 for sure. Unfortunately AMD was slow dropping their prices when GM204 released.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I still think some posters on our forum use the term rebrand incorrectly. If you take Hawaii XT made on 28nm at TSMC, but manufacture it at GloFo and add HBM, that's not a rebrand. The chip not only has different transistors, but entirely different perf/watt and memory sub-system characteristics. How can you call an R9 380X with 15% more performance and 80-100W less power vs. the 290X a rebrand? (Just using hypothetical performance and power figures here.) The point is, a rebrand is taking the same chip and at most bumping the GPU and memory clocks and releasing it at a lower price --> for example GTX680 --> 770 or HD7970 --> HD7970GHz. Even the minor changes from GTX480 -> 580 already make it not a rebrand. Similarly, 470 -> 570 is not a rebrand.



But you said 370X ~ 780, and the 780 < the $240 R9 290 and is even slower than an R9 280X. Sure, from a perf/watt standpoint it's probably a huge leap, but selling 780-level performance in the $200-250 range isn't exactly mind-blowing when the R9 290 regularly sells for $240. I guess compared to the turd that is the 960, it will be a huge win for those who treat power usage as a key priority to get 780-level performance at 140W below $250. As for the $550 MSRP of the 980, that price has always been questionable, as the card was overpriced from day 1 imho. That $500-550 price for a 980 is laughable when cards like the Gigabyte GTX970 Windforce go on sale for $295.

Right now the 980 is 8-15% faster than a 290X, maybe 25% if we use reviews full of GW titles, but it costs 70% more than a 970 and 80-90% more than an R9 290X. I have a 1000W Platinum PSU, so I don't really care if a 980 uses 200W or 300W. For me the fair price of a 980 today is $399 tops based on its performance and features. Therefore, if R9 380X ~ 980, it should really be priced at $379-399, or otherwise we are still in overpriced mid-range land.

Another way of looking at it: the HD7970GHz was $549, and when AMD released the R9 200 series, that level of performance became available at $299 in the R9 280X. That means AMD should really aim for the R9 380 to be priced at $299 with R9 290X-level performance.

On the performance level I would like to see:

R9 380 = $299, as fast as an R9 290X reference card
R9 380X = $399, as fast as a GTX980 reference card
R9 390 = $549, 15% faster than a GTX980 reference card (so 87-89% as fast as the Titan X for nearly half the price)
R9 390X = $699, as fast as the Titan X
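As a rough perf/$ sanity check on the top of that wish list (a sketch only: the performance ratios are the hypothetical ones listed above, and the $999 Titan X MSRP is the one figure not taken from the list):

```python
# Perf/$ of the hypothetical R9 390/390X pricing above, relative to a $999 Titan X.
# Performance numbers are the wish-list hypotheticals, not measurements.
titan_x = {"price": 999, "perf": 1.00}

lineup = {
    "R9 390X": {"price": 699, "perf": 1.00},  # "as fast as the Titan X"
    "R9 390":  {"price": 549, "perf": 0.88},  # "87-89% as fast as the Titan X"
}

base = titan_x["perf"] / titan_x["price"]
for name, card in lineup.items():
    ratio = (card["perf"] / card["price"]) / base
    print(f"{name}: ~{ratio:.2f}x the perf/$ of a Titan X")
# R9 390X: ~1.43x, R9 390: ~1.60x
```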

Wait for DX12. I have a feeling it will help level the playing field.
 

coercitiv

Diamond Member
Jan 24, 2014
7,503
17,935
136
I don't have graphs with my room temperatures to prove this, but I'm pretty sure about it. Just holding my hand over the exhaust of my card, the difference is very noticeable.
You've just been given solid data showing your old gfx card consumes just as "less" as a new 970, and you're still thinking about proving a room temperature delta between the two cards?

It's spring, and heating up lately. You don't have AC. Your card power consumption is ~160W while gaming.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
I don't have graphs with my room temperatures to prove this, but I'm pretty sure about it. Just holding my hand over the exhaust of my card, the difference is very noticeable.

Either way, my point was that many people do take heat into account. If you go to the gaming subs on reddit, where people are less knowledgeable about hardware, they all think AMD cards are space heaters.
And to be clear, I'm not one of them, to me perf/$ is by far the most important metric. If I had to buy right now, I'd get a 290 for sure. Unfortunately AMD was slow dropping their prices when GM204 released.

You can compare how much one product or another will heat your room simply via the amount of power it consumes. All energy will eventually become heat. Doesn't matter how it is cooled or what temperature the card itself is at, the energy will eventually become heat and the heat will eventually spread out into the room and beyond.

Heat is the afterlife of everything.

My crossfire 290s do noticeably change the temperature of my computer room after long gaming sessions. My 670s did as well of course, as I stated all energy becomes heat, but not nearly as much.
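To put a number on that: since essentially all of a card's electrical draw ends up as heat in the room, the extra heating is just the power delta times the session length. A minimal sketch below; the wattages and session length are illustrative assumptions, not measurements of any particular setup.

```python
# Extra room heating from a hungrier GPU setup = power delta x gaming time.
# The wattages and session length here are illustrative assumptions only.

W_TO_BTU_PER_HR = 3.412   # 1 W dissipated continuously ~= 3.412 BTU/hr

setup_a_watts = 500       # assumed draw of a hotter dual-GPU setup
setup_b_watts = 300       # assumed draw of a more frugal setup
session_hours = 4

delta_w = setup_a_watts - setup_b_watts
print(f"~{delta_w * W_TO_BTU_PER_HR:.0f} BTU/hr extra while gaming "
      f"(~{delta_w * session_hours / 1000:.1f} kWh of extra heat per {session_hours}h session)")
# ~682 BTU/hr extra, ~0.8 kWh per 4-hour session
```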
 

showb1z

Senior member
Dec 30, 2010
462
53
91
You've just been given solid data showing your old gfx card consumes just as "less" as a new 970, and you're still thinking about proving a room temperature delta between the two cards?

It's spring, and heating up lately. You don't have AC. Your card power consumption is ~160W while gaming.

I've been given solid data about 2 cards, neither of which is the same as the cards I used. Nor do they run at the same clocks I've run them at.

It's entirely beside the point anyway; maybe it's all in my head, but it was just an anecdote to illustrate that heat matters to some people. And I'm pretty sure the heat difference between a 970 and a 290 is undisputed.

You can compare how much one product or another will heat your room simply via the amount of power it consumes. All energy will eventually become heat. Doesn't matter how it is cooled or what temperature the card itself is at, the energy will eventually become heat and the heat will eventually spread out into the room and beyond.

Heat is the afterlife of everything.

My crossfire 290s do noticeably change the temperature of my computer room after long gaming sessions. My 670s did as well of course, as I stated all energy becomes heat, but not nearly as much.

Yeah, I know. Personally I don't even care; I expect high-end GPUs to consume a lot of energy. I have an 850W PSU and I used to run 570s in SLI. I'm fine with all that.
 
Last edited:

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Back to R9 300 series. In the olden days, I think ATI/AMD had a chance, but today I don't. With GWs code in AAA games, I don't see how AMD's cards stand a chance. Titan X is 66% faster than a 290X in Project CARS' recent testing. That means for R9 390X to be only 5% faster than the Titan X, it has to beat R9 290X by nearly 75%. o_O

[Image: gamegpu.ru Project CARS 2015 GPU benchmark chart (pc_2560.jpg)]


On average, Titan X is only 43-48% faster than the 290X and here it's basically getting 20% extra out of nowhere. The 980 is 34% faster than the 290X. How does one explain that? This game was made for NV cards, just like many other games coming out in the last 24 months. Pretty much that means NV is getting a 15-20%+ performance advantage from software / developer relationship partnerships alone. That means even if AMD closed the 15-20% hardware gap we saw with 280 vs. 4870 and 480/580 vs. 5870/6970, they need another 15-20% to compensate for GWs. I don't see that happening. Even if they did, NV can just release a 1250-1275MHz GM200 6GB with after-market coolers, heck even a 1.4GHz MSI Lightning/EVGA Classified GM200 6GB. They can also release a 970Ti and a faster-clocked 980. For that reason I think AMD lost this generation. Most of 2015's highest anticipated games are all GWs and NV will have that 'automatic' 15-20% software advantage and of course SLI scaling at or near launch.

Project CARS releases on May 7th, so I would wait it out before judging AMD cards. There is a good chance that AMD will have a driver this month for The Witcher 3 and Project CARS. BTW, can you stop with your exaggeration and hyperbole? It's getting irritating. Nvidia is invincible. Nvidia is god. Nvidia is king of marketing. Nvidia will kill AMD by starting a price war.

BTW, when you talk of AIB GM200s, do you think AMD partners will be twiddling their thumbs with the R9 390X? I expect some wickedly fast cards from Sapphire, MSI, and ASUS. It's expected that there will be TOXIC, LIGHTNING, and MATRIX versions of the R9 390X. So let's see how they fare against the Classified / Lightning / Matrix GM200. Till then, I suggest you please tone down your Nvidia chest thumping. :rolleyes:
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
[Image: gamegpu.ru Project CARS 2015 GPU benchmark chart (pc_2560.jpg)]


On average, Titan X is only 43-48% faster than the 290X and here it's basically getting 20% extra out of nowhere. The 980 is 34% faster than the 290X. How does one explain that? This game was made for NV cards, just like many other games coming out in the last 24 months. Pretty much that means NV is getting a 15-20%+ performance advantage from software / developer relationship partnerships alone. That means even if AMD closed the 15-20% hardware gap we saw with 280 vs. 4870 and 480/580 vs. 5870/6970, they need another 15-20% to compensate for GWs. I don't see that happening. Even if they did, NV can just release a 1250-1275MHz GM200 6GB with after-market coolers, heck even a 1.4GHz MSI Lightning/EVGA Classified GM200 6GB. They can also release a 970Ti and a faster-clocked 980. For that reason I think AMD lost this generation. Most of 2015's highest anticipated games are all GWs and NV will have that 'automatic' 15-20% software advantage and of course SLI scaling at or near launch.

To give some contrast, here is an AMD-sponsored title:
[Image: gamegpu.ru DiRT Rally GPU benchmark chart (drt_1920.jpg)]


BOOM! SLI, CrossFire, all fair and square. Works even though it is a beta and an AMD-sponsored game.

If AMD did what NV is doing, we would have a burning crusade here.
 

stuff_me_good

Senior member
Nov 2, 2013
206
35
91
Those points are somewhat different in Europe. AC in homes is a rare thing here and electricity costs are higher. And yes my computer room is small.

I don't have graphs with my room temperatures to prove this, but I'm pretty sure about it. Just holding my hand over the exhaust of my card, the difference is very noticeable.

Either way, my point was that many people do take heat into account. If you go to the gaming subs on reddit, where people are less knowledgeable about hardware, they all think AMD cards are space heaters.
And to be clear, I'm not one of them, to me perf/$ is by far the most important metric. If I had to buy right now, I'd get a 290 for sure. Unfortunately AMD was slow dropping their prices when GM204 released.
I don't know where in Europe you live, but even in Spain it's relatively cold during winter. So only the summer can be hard for gaming if you have no AC or good ventilation.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
The 390X could be twice as power hungry as the 290X and twice as loud as the GTX 480, but if it's as fast as rumors place it and as cheap as rumors place it, then I guess I'll just have to suffer through it.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Either way, my point was that many people do take heat into account. If you go to the gaming subs on reddit, where people are less knowledgeable about hardware, they all think AMD cards are space heaters.
And to be clear, I'm not one of them, to me perf/$ is by far the most important metric. If I had to buy right now, I'd get a 290 for sure. Unfortunately AMD was slow dropping their prices when GM204 released.

For comedy's sake I asked people on a relevant imgur post what they thought the difference between the 290X's and 970's power consumption was and how quickly the savings would cover the price difference.

From those answers, it seems the core gaming demographic is running tri/quad-fire 10+ hours a day. :rolleyes:

I just wonder how people who are unemployed enough to game that much, yet tight enough on cash to worry about the price of power instead of just loading up on quad Titans or something, justify not selling one or two of the cards to cover their games. Or food, for that matter.

Lest this turn into 10 time zones of Russian goodpost, I'll cut to the chase: in the popular consciousness, once marketing digs its claws in, the perceived difference in power consumption is considerably greater than the actual difference.
 
Last edited:

Azix

Golden Member
Apr 18, 2014
1,438
67
91
Back to R9 300 series. In the olden days, I think ATI/AMD had a chance, but today I don't. With GWs code in AAA games, I don't see how AMD's cards stand a chance. Titan X is 66% faster than a 290X in Project CARS' recent testing. That means for R9 390X to be only 5% faster than the Titan X, it has to beat R9 290X by nearly 75%. o_O

[Image: gamegpu.ru Project CARS 2015 GPU benchmark chart (pc_2560.jpg)]


On average, Titan X is only 43-48% faster than the 290X and here it's basically getting 20% extra out of nowhere. The 980 is 34% faster than the 290X. How does one explain that? This game was made for NV cards, just like many other games coming out in the last 24 months. Pretty much that means NV is getting a 15-20%+ performance advantage from software / developer relationship partnerships alone. That means even if AMD closed the 15-20% hardware gap we saw with 280 vs. 4870 and 480/580 vs. 5870/6970, they need another 15-20% to compensate for GWs. I don't see that happening. Even if they did, NV can just release a 1250-1275MHz GM200 6GB with after-market coolers, heck even a 1.4GHz MSI Lightning/EVGA Classified GM200 6GB. They can also release a 970Ti and a faster-clocked 980. For that reason I think AMD lost this generation. Most of 2015's highest anticipated games are all GWs and NV will have that 'automatic' 15-20% software advantage and of course SLI scaling at or near launch.

AMD needs to pick key titles, but their help in these games does not negatively affect Nvidia the way Nvidia's does AMD. A shame really, as far as looking good goes. But games like The Witcher 3 should have been AMD, for the sake of the game and for our sake. The next Deus Ex will be AMD with TressFX (and DX12 too). Unfortunately The Division won't be AMD, since it's a Ubisoft game, and it will likely suffer for it.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
That's awesome. :thumbsup: What player are you using for 4K video playback?

4K playback, multi-monitor support and low power usage might make the 750/750Ti the best HTPC cards once new cards launch and these go on clearance sale for $50. ;)

----------

Back to R9 300 series. In the olden days, I think ATI/AMD had a chance, but today I don't. With GWs code in AAA games, I don't see how AMD's cards stand a chance. Titan X is 66% faster than a 290X in Project CARS' recent testing. That means for R9 390X to be only 5% faster than the Titan X, it has to beat R9 290X by nearly 75%. o_O

[Image: gamegpu.ru Project CARS 2015 GPU benchmark chart (pc_2560.jpg)]


On average, Titan X is only 43-48% faster than the 290X and here it's basically getting 20% extra out of nowhere. The 980 is 34% faster than the 290X. How does one explain that? This game was made for NV cards, just like many other games coming out in the last 24 months. Pretty much that means NV is getting a 15-20%+ performance advantage from software / developer relationship partnerships alone. That means even if AMD closed the 15-20% hardware gap we saw with 280 vs. 4870 and 480/580 vs. 5870/6970, they need another 15-20% to compensate for GWs. I don't see that happening. Even if they did, NV can just release a 1250-1275MHz GM200 6GB with after-market coolers, heck even a 1.4GHz MSI Lightning/EVGA Classified GM200 6GB. They can also release a 970Ti and a faster-clocked 980. For that reason I think AMD lost this generation. Most of 2015's highest anticipated games are all GWs and NV will have that 'automatic' 15-20% software advantage and of course SLI scaling at or near launch.

Yet people still act like GW is a good thing and not Nvidia's temper tantrum over not getting any console contracts.