Are we really biased?


BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't own that card so I can't test it. What I can test is the 7970, and since the 12.11 driver I too have stuttering and data similar to Tech Report's. I also know most people don't have a problem with stutter, and clearly a single card is not as bad (about 50% less severe in my tests).

I agree with him that AMD has made the problem worse, but most people haven't noticed because they aren't susceptible to it. My data and perception match Tech Report's, not because of today's review, but because of my own testing.

His review isn't relevant to you; it's relevant to me because I am part of a minority who can't play with stuttering that severe. My max acceptable appears to be around 6ms swings before it becomes a problem. Everyone has a threshold, presumably slightly different, so it might be worth finding yours.
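To put a number on a threshold like that, here is a minimal sketch of one way to compute frame-time swings and 99th-percentile frame times from a capture log. The file name and one-timestamp-per-line format are placeholder assumptions; adjust the parsing for whatever capture tool you actually use.

```python
# Minimal sketch: quantify frame-time "swings" from a captured frame-time log.
# Assumes a hypothetical "frametimes.csv" containing one cumulative millisecond
# timestamp per rendered frame, one per line; real tools (FRAPS etc.) write
# slightly different formats, so adapt load_frame_times() accordingly.

def load_frame_times(path="frametimes.csv"):
    with open(path) as f:
        stamps = [float(line.strip()) for line in f if line.strip()]
    # Convert cumulative timestamps into per-frame durations (ms).
    return [b - a for a, b in zip(stamps, stamps[1:])]

def stutter_stats(frame_times, swing_threshold_ms=6.0):
    # A "swing" is the change in frame time between consecutive frames.
    swings = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    ordered = sorted(frame_times)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]  # 99th-percentile frame time
    return {
        "avg_fps": 1000.0 / (sum(frame_times) / len(frame_times)),
        "99th_percentile_ms": p99,
        "max_swing_ms": max(swings),
        "swings_over_threshold": sum(1 for s in swings if s > swing_threshold_ms),
    }

if __name__ == "__main__":
    ft = load_frame_times()
    print(stutter_stats(ft, swing_threshold_ms=6.0))
```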
 

sandorski

No Lifer
Oct 10, 1999
70,787
6,346
126
AMD is clearly the best company in the universe; anyone who disagrees is biased.

Facts:

AMD CPUs are more than 10 times as fast as Nvidia CPUs.

AMD GPUs are more than 10 times as fast as Intel GPUs.

10 + 10 = 20

Therefore, AMD is 20 times better than Intel plus Nvidia.

I'm not biased, I just trust the math. Math can't lie.

:D:D
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Recognising one's bias is very hard.

AMD currently has a problem with its cards. They benchmark well but they stutter. Thus they fail to produce a good perception of motion for some (maybe many) users. It's not bad enough for all users to find it unplayable, but those who perceive it find the experience very poor.

NVidia has the problem that it can't manufacture its top card. The mid-range card is not as quick as AMD's card and it was very late to market. It produces lower frames per second although it seems to match the image quality. However, NVidia's cards do not stutter, and hence all users are able to perceive smooth motion so long as the average frame rate at least exceeds 30 fps, which it definitely does.

So herein lies the rub. AMD has a history of driver problems, still has a serious one, and Nvidia has a reputation that is simply better. This generation AMD has produced a lemon; it's just that the way tech sites measure motion is wrong. There are better ways, and everyone will have to change to accommodate the way AMD has broken it.

But for a lot of users who don't perceive the problem AMD is clearly better, and for them it is. I would actually recommend people buy an AMD card and play with it on a variety of games at sub-60fps and see if they think it's smooth. If it is, you got a better card for less. But if you think it's stuttering or just not quite smooth, then send it back immediately and pay more for Nvidia's inferior card. At this point you know you perceive micro-stutter and will need to choose based on it in the future.

Understanding both sides of this is important to understanding your own bias in this discussion. Just because you don't see stuttering does not mean someone else won't.

Bright Candle, you say you're a successful professional competitive gamer; if so, it should be fairly easy to attract sponsorship from a graphics card manufacturer. When I was a tech journalist I got all my hardware for free for the vague promise of some positive coverage. Tech PR companies dish out hardware to a journalist who asks for it in exchange for coverage or a positive review, and they sponsor competitive gamers.
If you haven't got a sponsor you should get one. Do you see less stuttering in Arma 2?
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Recognising one's bias is very hard. It can be done, but it often only happens when someone you know and care about disagrees strongly with you. You won't find that in a forum.

When it comes to graphics cards I think it's important to understand that fps is a proxy measure for what we actually want. In-game settings are also a proxy for what we want. Our ideal card produces photo-realistic quality with zero artefacts and does so at a frame rate where no one can tell it's anything but the real world. FPS as an average measure is crude at best for measuring perceived motion, and no one really tests the quality of image reproduction in comparison to rendering exactly as it should be.

AMD currently has a problem with its cards. They benchmark well but they stutter. Thus they fail to produce a good perception of motion for some (maybe many) users. It's not bad enough for all users to find it unplayable, but those who perceive it find the experience very poor.

NVidia has the problem that it can't manufacture its top card. The mid-range card is not as quick as AMD's card and it was very late to market. It produces lower frames per second although it seems to match the image quality. However, NVidia's cards do not stutter, and hence all users are able to perceive smooth motion so long as the average frame rate at least exceeds 30 fps, which it definitely does.

So herein lies the rub. AMD has a history of driver problems, still has a serious one, and Nvidia has a reputation that is simply better. This generation AMD has produced a lemon; it's just that the way tech sites measure motion is wrong. There are better ways, and everyone will have to change to accommodate the way AMD has broken it.

But for a lot of users who don't perceive the problem AMD is clearly better, and for them it is. I would actually recommend people buy an AMD card and play with it on a variety of games at sub-60fps and see if they think it's smooth. If it is, you got a better card for less. But if you think it's stuttering or just not quite smooth, then send it back immediately and pay more for Nvidia's inferior card. At this point you know you perceive micro-stutter and will need to choose based on it in the future.

Understanding both sides of this is important to understanding your own bias in this discussion. Just because you don't see stuttering does not mean someone else won't.

You keep saying this, but all the data I have seen doesn't say this. In fact, in the one piece of video footage we have of a side-by-side comparison, it seems like stuttering is more prevalent on an SLI system.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Everyone is biased in almost every product they buy. Very few approach spending with complete objectivity. Those who bloviate the most about how unbiased they are remind me of that Shakespeare quote: "The lady doth protest too much, methinks."

I prefer Nvidia - even if at times certain cards might have slightly lower performance or a slightly higher price vs ATI/AMD. I've owned four ATI cards and tried and returned two AMD cards over the years (HD 5850, HD 7850). I get a better, smoother gaming experience from NV. Of course that's an anecdotal observation that doesn't necessarily apply to anyone else (and I won't claim that it should - try both brands and see which is best for you and your very particular hardware combination and game selection). Yeah, I am biased towards Nvidia much in the same way I like Mazda automobiles. At least I'll admit it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If it wasn't obvious enough: you are utterly BIASED. And I would actually use another word as well.

That's your opinion, but here is what really happened with your post:

You clearly provided misleading graphs, and if I didn't call you on it, would someone else have? You just attacked me because I was the first person to correct the misinformation, false causation, and market share trends in your entire post. You not only attributed notebook GPU market share to desktop GPUs, you also presented it as if desktop discrete GPU market share declined at the same pace as notebook GPU market share did, which of course is also incorrect.

Yet your market share graph shows a dramatic decline and you attributed it to desktop products, citing similar market share of ~35-36%, implying it's "similar", so why am I even pointing it out? The fact that the notebook and desktop GPU market share numbers are similar now has nothing to do with a precipitous decline in market share, as the desktop and notebook GPU markets behaved completely differently. In fact, your post completely failed to address what really happened: AMD dropped just 2% in the desktop, and 17% in notebooks. You didn't mention this at all.
http://jonpeddie.com/press-releases...arter-to-quarter-growthamd-and-intel-dropped/

By showing notebook GPU market share declining dramatically from above 50% to the mid-30s, you are misrepresenting what happened in the desktop discrete GPU market. AMD's desktop discrete GPU market share was never above 50% three quarters ago. They continued to have much lower market share than NV as early as Q1 2012; this isn't news. So again, you misrepresented what's happening in the desktop discrete GPU space and got called on it by me, because I follow the industry and can pick up on BS immediately when I see it.

Furthermore, AMD "voluntarily" giving up market share is a very real explanation for at least part of the lost market share in the notebook space. They simply had no $ to secure more notebook GPU design wins. This is how the GPU industry works. Nvidia pumped up OpEx due to increased Kepler implementation costs and other manufacturing-related charges the company faced during the Kepler GPU family ramp-up. This is at least one key reason NV was able to gain 300+ design wins and actually deliver on those contracts. NV even cited a lack of focus on mobile parts as a reason why they didn't get as many Fermi notebook design wins, and they planned to spend a lot more $ this round on securing them. This type of strategic/marketing/OpEx spending was planned in advance, and NV executed perfectly.

AMD conceding market share in the notebook discrete GPU space can therefore be partly explained by a lack of financial ability to invest in notebook GPU design wins. I never said this is the only reason, although you implied AMD's product performance is the only reason, since you failed to provide alternative explanations. What I linked is not coming from me, but from AMD's management and professionals in the industry. I can't help you if you want to deny this information and instead call me biased. If you have a problem, consult AMD's management and the industry analysts reporting on this news. Sometimes companies simply cannot spend as much marketing $ as is required, or afford custom design wins that are too expensive given the company's financial circumstances. I am neither agreeing nor disagreeing with AMD's strategy; I have simply pointed out alternative explanations for why so much market share was lost, backed up by actual industry sources. You have done nothing of the sort and instead provided a market share graph.

If you took the time to look at market share and what really happened: the AIB market for GPUs grew from 14.76 million units to 17.54 million in the seasonal transition from Q2 to Q3. AMD's GPU shipments actually increased from 5.95 million to 6.26 million, while NV grew from 8.75 to 11.23 million. AMD obviously lost a ton of market share in a seasonally growing Q3 because they couldn't deliver on those design wins, not only because their products were inferior, which is the ONLY insinuation you made. This is why I pointed out the flaw in your post, given that industry sources and management's own comments provide at least one other explanation. The market share data is all public information:
http://jonpeddie.com/press-releases...rd-shipments-seasonally-up-from-last-quarter/
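The share arithmetic follows directly from those unit numbers; here is a quick sketch using only the figures quoted above (Python, purely illustrative):

```python
# Recompute market share from the Jon Peddie unit shipments cited above
# (millions of AIB units; the small remainder belongs to other vendors).
q2 = {"AMD": 5.95, "NVIDIA": 8.75, "total": 14.76}
q3 = {"AMD": 6.26, "NVIDIA": 11.23, "total": 17.54}

for label, quarter in (("Q2", q2), ("Q3", q3)):
    for vendor in ("AMD", "NVIDIA"):
        share = 100.0 * quarter[vendor] / quarter["total"]
        print(f"{label} {vendor}: {quarter[vendor]:.2f}M units, {share:.1f}% share")

# AMD shipped more units in Q3 than in Q2 yet still lost share, because the
# overall AIB market grew faster than AMD's own shipments did.
```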

Countless years of my posting have shown that I recommend AMD and NV products based on price/performance and overclocking, that I change my recommendations when drivers or performance in modern games change, and that I adjust my recommendations based on a gamer's specific uses (high resolutions, SLI vs. CFX, etc.). I have no problem recommending NV cards when they offer great value or would work better for someone. I cannot say the same about you since the time you joined this forum; I don't recall you ever saying a single favourable thing about AMD cards or recommending them. You can call me biased all day if you want, but I am simply providing all the details in an objective way to members of this sub-forum regarding the post you made. It's amusing that you call me biased when you intentionally misrepresented/omitted information to prove a point and then used the lame excuse that notebook and desktop GPU market shares are similar to make it seem you did nothing misleading at all. You had better be prepared for scrutiny on a forum full of technical people who follow the industry. Calling them biased will do nothing to add credibility to your post.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
So you are saying FX5200-5900, GTX450/550Ti, and GeForce 7 sold well because they were great products? Your graph does not disprove that many NV buyers are sheep. You can admit it or not, but there are many NV cards that were absolute turds and remained so for their entire useful lives; the FX5000 series is probably the most famous of all. Of course ATI/AMD has many of those too. Both companies have sheep, but time and time again it has been shown that even when NV produces worse products, people still buy them. Even when NV is late to market by 6-8 months, people still wait to give them their $, completely ignoring the opportunity cost of gaming on a slow GPU despite AMD providing a reasonable option. The same rarely happens with AMD cards. If AMD flops a generation, by far the majority of AMD owners would jump ship to NV. If AMD were 6-8 months late, there is no way the majority of AMD enthusiasts on this forum would wait that long. Yet we see this repeated time and time again by many people who continue to buy only NV cards.



I think most people on this sub-forum care only about desktop discrete GPU parts. You just linked a notebook dGPU market share graph, assuming no one else would notice. It's just too bad you left the original source link available for everyone to see; I went there and noticed the chart shows notebook market share only. So now I am asking how relevant this is to anything that goes on here regarding desktop HD7000 vs. GTX600 discussions. I'll go a step further though.

Bias is not just favouring one brand over another no matter what (FX5900/GeForce 7 series), but intentionally posting misleading information without disclosing pertinent information that explains the other side of the story. As a perfect example, in your post you selectively used a notebook GPU market share graph to insinuate that "Because [mobile] NV GPUs sell better and AMD is losing market share, then because more people on our forum recommend [desktop] AMD cards and call NV buyers sheep, then AMD members on this forum must be biased because they can't admit or see the great value of NV's [desktop] products."

It's logical to conclude then that NV's GPUs must be superior on the whole (for those here who weren't aware that your graph has nothing to do with desktop parts), or otherwise why would AMD be losing so much market share? It can't be that millions of NV buyers are sheep, or maybe there is another good explanation?

If you wanted to be objective, you would have posted the source article that explains why a lot of that market share was lost - and if you had, readers on our forum would have known that your claimed cause, the quality/performance NV offers, has little to do with why this market share was lost. AMD voluntarily gave up market share to NV because they couldn't afford to secure those design wins.

"In a bid to cut costs, Advanced Micro Devices claims it is turning down certain low-volume deals that require it to invest into implementation of its products. While such approach leads to a significant decrease of market share, it naturally means leaner financial structure of the whole company. Nvidia is now the No. 1 supplier of notebook GPUs (based on data from Mercury Research provided by Nvidia) because of AMD’s reluctance to help integrate its Radeon Mobility products based on the recent architecture. The policy of cutting implementation and other costs has reduced the company’s operating expenses from circa $610 million to about $450 million per quarter this year. For a struggling company, $160 million in cash is a significant amount of money."

It's probable that some of that market share was lost because NV's Kepler parts are superior for the mobile market in terms of performance/watt, but it appears you intentionally omitted a significant part of what that graph depicts.

In summary:

1) You managed to depict notebook dGPU market share as desktop dGPU market share;
2) You didn't link the source article, which explained reasons other than performance or price/performance, but instead assumed it had everything to do with NV's cards being superior;
3) Market share and sales data alone do not prove whether one product is superior to another. Plasma vs. LCD/LED is a perfect example of why an inferior product can be vastly more popular.

-----------------

Some people might try to claim that the VC&G sub-forum is AMD-biased, but what have AMD GPUs provided in the last 4 years? Very good price/performance, overclocking/enthusiast features (dual BIOSes, safe BIOS flashing, the price/performance of the HD4000/5000/6000 series) and, prior to the HD7000 series, superior performance/watt since 2008.

As was already mentioned, most people who have recommended AMD GPUs over the years continue to focus on these qualities and would have no problem switching sides at any point. I can't say the same about certain NV users.

This generation was no exception, as the HD7900 series was hardly recommended until the prices dropped, new drivers were released, performance improved and game bundles followed. At the same time, how can anyone be blamed for recommending 28nm HD7770-7870 cards when NV took 6-8 months to launch their respective competitors? Were we supposed to tell people to buy slower, more power-hungry 40nm NV parts?

If anything, this generation has made it more evident who the fanboys are. Bias was blaming AMD for ripping off consumers while ignoring that NV delivered the least impressive generational increase ever with the GTX680, and that the GTX280 depreciated worse than the HD7970 did. Bias was defending NV locking voltage control as a great measure for enthusiasts, to save them from RMAs. Bias was shifting the goal posts from not caring about performance/watt for 3 generations to it being the most important factor this round. Bias was discussing the amazing overclocking of GTX460/470/560/560Ti parts while largely ignoring it for the HD7000 series, calling it luck of the draw. Bias was claiming that AMD drivers were very poor, while ignoring that Fermi drivers needed at least 6 months to get up to speed. Bias was more or less blaming AMD for the high prices of this generation while ignoring that NV publicly admitted it prioritized its mobile customers and, as a result of wafer shortages, delayed its sub-$300 desktop GPU roll-out by 6-8 months.

As such, NV just as much ripped us off by under-delivering with the 680, and is at least partially responsible for allowing AMD to maintain higher prices on the HD7900 for 2.5 months before the 670/680 launched, and thereafter on sub-$300 desktop parts by being MIA for 6+ months. Bias was shown again by ignoring NV's ridiculous prices for the 8800GTX Ultra or GTX280. If we are going to be fair, even the GTX480 was more of a rip-off than the HD7970: it was 6 months behind the HD5870 with only 20% more performance but 35% higher cost ($499 vs. $369), while the HD7970 was 20% faster than the GTX580 and 22% more expensive ($549 vs. $449). There are many more examples of double standards exhibited by Team Green, and yet they call our VC&G forum biased?

If anything, many AMD "biased" members have remained consistent by focusing on price/performance and overclocking, with, I would say, a larger weighting assigned to higher resolutions (>1920x1200) for more expensive GPUs. Many pro-NV members here just shift the goal posts every generation to whatever metric is winning at a given time. You can count on one thing: if NV cards are losing, everything will shift to driver quality and PhysX, guaranteed.

For many AMD owners this round, perhaps the biggest trump card of all was bitcoin mining. Sure, PhysX sounds nice, but NV users ignoring bitcoin mining was a real eye-opener. Who can argue that someone is biased because they got $500-$1,000 of GPUs for free, or nearly free? That, to me, beat just about anything NV had on the table this generation. That's not bias, but saving $ to get a very similar gaming experience.

Damn! That was a good post.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
You guys can't let a single thread exist without turning it into the same thing every single thread on this board turns into.


:rolleyes:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
People voted with their wallets. Period.

Yup, LCDs/LEDs have 90% market share and are inferior to plasma televisions in viewing angles, black levels, colour accuracy, and response time/motion blur. You'd have to go way down the list before the first LED is even considered for the best overall TV of 2012. Again, your logic that consumers on average are smart and buy the 'best' products is flawed. There are many other variables in why some products sell better than others: some attributable to performance, others to price, perceptions, consumer awareness, marketing, etc. Sales and market share alone do not tell us about the quality or performance of the actual product. I think it's pretty absurd to even try to make such an argument. My guess is you have never worked in the consumer products industry or in marketing, or gone to business school; otherwise you would never have made such an elementary connection that high sales = great product. It can be true, and in other cases it may not be true. Have you heard of Crocs?

For instance, there are so many variables in high-end GPUs that might matter for niche gamers such as those on enthusiast forums:
1) Do these consumers overclock?
2) Do these consumers know what bitcoin mining is?
3) Do these consumers play games with high MSAA and mods?
4) Do these consumers use their GPUs outside of games? If so for what programs? Do these programs benefit from double precision performance, CUDA, OpenCL, etc.?
5) Do these consumers use more than 1 GPU?
6) Do these consumers play at higher resolutions or use multiple monitors?

Many average consumers don't do any of the above. But most importantly, you still didn't address how a severe 17% decline in notebook GPU market share is in any way comparable to a 2% loss for desktop GPUs over the same time frame. Yet you made the correlation anyway. :rolleyes:
 

thilanliyan

Lifer
Jun 21, 2005
12,062
2,275
126
For many AMD owners this round, perhaps the biggest trump card of all was bitcoin mining. Sure, PhysX sounds nice, but NV users ignoring bitcoin mining was a real eye-opener. Who can argue that someone is biased because they got $500-$1,000 of GPUs for free, or nearly free? That, to me, beat just about anything NV had on the table this generation. That's not bias, but saving $ to get a very similar gaming experience.

QFT!! Never before (at least not that I can remember) could mainstream GPU purchasers actually have their cards pay for themselves. That alone has kept me with AMD for the past 2 generations. For me, the overall gaming experience is pretty similar between both camps, so why not go for the one that actually pays for itself? If nVidia made cards that were better at bitcoin mining I would have switched to them this round, since then I would not have had to modify my GPU water block to fit the odd Tahiti GPU.

You can scream drivers (debatable IMO, since I have had more bad experiences with nV than AMD) or PhysX all day long, but those are not enough to trump my cards making a profit for me. And it IS a profit for me... 2 6950s and a 7950 all paid for... now it's just beer money :D
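For anyone wanting to sanity-check whether a card would really pay for itself, here is a rough sketch of the payback arithmetic. Every number below is a hypothetical placeholder, not real mining data; plug in your own hash rate, difficulty, coin price and power cost.

```python
# Rough payback-time sketch for GPU mining of the kind described above.
# All inputs are hypothetical placeholders; mining revenue depends on hash rate,
# network difficulty and coin price, all of which change constantly.

def payback_days(card_cost, revenue_per_day, watts, kwh_price):
    power_cost_per_day = watts / 1000.0 * 24 * kwh_price   # $ of electricity per day
    net_per_day = revenue_per_day - power_cost_per_day
    if net_per_day <= 0:
        return float("inf")  # never pays for itself under these assumptions
    return card_cost / net_per_day

# Illustrative (made-up) numbers only:
days = payback_days(card_cost=400.0,      # price paid for the GPU, $
                    revenue_per_day=3.0,  # assumed mining revenue, $/day
                    watts=250,            # GPU power draw while mining
                    kwh_price=0.12)       # electricity price, $/kWh
print(f"Card pays for itself in roughly {days:.0f} days under these assumptions")
```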
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
My max acceptable appears to be around 6ms swings before it becomes a problem. Everyone has a threshold, presumably slightly different, so it might be worth finding yours.

How do you play any games at all then? I am pretty sure keeping swings under 6ms is nearly impossible with GTX680 SLI in modern games at high settings.

While I don't disagree with your finding that CF exhibits more stutter than SLI, it is also true that others do not notice single-GPU micro-stutter. I don't think we should start generalizing right away that all AMD GPUs stutter or that NV GPUs are free of stutter. Depending on the game and resolution, things can be completely reversed.

For example, if someone is gaming on multiple monitors, GTX680/680 SLI actually stutters more than HD7970/HD7970 CF in a game like BF3. I am using TR once again to be consistent with their HD7950 vs. GTX660Ti review.

[charts: BF3 average FPS and 99th-percentile frame times]

Tech Report review

Or Crysis 2, which runs with less stutter on HD7970 CF than on GTX680 SLI.

[charts: Crysis 2 average FPS and 99th-percentile frame times]


It seems both companies suffer from micro-stutter depending on resolution and game. What I have doubts about is that at the same 60 fps in a game, GTX680 SLI "feels smoother" than an HD7970 at 60 fps. Are you insinuating that GTX680 SLI's multi-GPU stutter is less than a single HD7970's, assuming you set all the in-game settings so that both can run at 60 fps?

My HD6950 @ 6970 and HD7970 stutter less than my GTX470s did. Based on my experience, I can't notice a stutter difference between single NV and AMD GPUs (unless of course one GPU is chugging at low frames compared to the other), but I do notice that single GPUs feel smoother at lower frame rates than my 470s did at higher frame rates. To me, multi-GPU setups still have more inherent stutter, regardless of brand. Also, it's still odd to see the GTX660Ti producing higher frames per second than the 950MHz HD7950 in Skyrim at 2560x1440 with MSAA. Many other people have noted this already, and I tried to find any other review online that had the GTX660Ti beating the HD7950 boost in Skyrim at that resolution and simply could not. I find it very odd that TR has the GTX660Ti outperforming the 7950 in this game.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Yup, LCDs/LEDs have 90% market share and are inferior to plasma televisions in viewing angles, black levels, colour accuracy, and response time/motion blur. You'd have to go way down the list before the first LED is even considered for the best overall TV of 2012. Again, your logic that consumers on average are smart and buy the 'best' products is flawed. There are many other variables in why some products sell better than others: some attributable to performance, others to price, perceptions, consumer awareness, marketing, etc. Sales and market share alone do not tell us about the quality or performance of the actual product. I think it's pretty absurd to even try to make such an argument. My guess is you have never worked in the consumer products industry or in marketing, or gone to business school; otherwise you would never have made such an elementary connection that high sales = great product.

For instance, there are so many variables in high-end GPUs that might matter for niche gamers such as those on enthusiast forums:
1) Do these consumers overclock?
2) Do these consumers know what bitcoin mining is?
3) Do these consumers play games with high MSAA and mods?
4) Do these consumers use their GPUs outside of games? If so for what programs?
5) Do these consumers use more than 1 GPU?
6) Do these consumers play at higher resolutions or use multiple monitors?

Many average consumers don't do any of the above. But most importantly, you still didn't address how a severe 17% decline in notebook GPU market share is in any way comparable to a 2% loss for desktop GPUs. Yet you made the correlation anyway. :rolleyes:

I'm starting to suspect that Russian Sensation is a collective of sensational Russians designed by a mad ex-Soviet scientist to wade through the morass of unscientific opinion in the VC&G forum with a magic wand of incisive truth, probably made in an ex-tank factory beyond the Urals. :thumbsup:
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Bright Candle, you say you're a successful professional competitive gamer; if so, it should be fairly easy to attract sponsorship from a graphics card manufacturer. When I was a tech journalist I got all my hardware for free for the vague promise of some positive coverage. Tech PR companies dish out hardware to a journalist who asks for it in exchange for coverage or a positive review, and they sponsor competitive gamers.
If you haven't got a sponsor you should get one.

If you haven't gathered this already, I can confidently say I would never take a sponsor. I'm not interested in going pro in that way.

Do you see less stuttering in Arma 2?

It's a little better in Arma 2. I am happy with 2x 680s down to about 40 fps; with AMD, nothing below 55 (Vsync on). The big difference between the two is that I can run the graphics settings much higher on the 680s, likely because a lower frame rate is acceptable. I still target 60 fps mostly, but I played Rolling Thunder 10 with drops to 45 fps without a problem.

But the real gain in Arma 2 is with ATOC, which is the AA that is applied to trees and bushes. The AMD cards have had a bug for about 1.5 years (I had this on the 5970 as well) where sometimes the trees and bushes show white pixels where anti-aliasing should be applied; we just call it white speckles. It is a horrible effect and it makes the game nearly unplayable, as you can't see through the trees and bushes, which on a map like Lingor (Vietnam-like jungles) is just lethal. So you have to turn ATOC and any AA off, which in Arma looks downright terrible. But the bug doesn't exist on NVidia, so you can run FXAA/SMAA happily and it thus looks considerably better.

It's one game I play a lot where, based on Bit-tech's benchmarking, I was expecting to lose out quite a lot since the NVidia cards perform worse, but in practice it ended up looking better and not having the one artefact that has been consistently annoying me.

Rage also gained massively; it was the difference between 5760x1200 with AA versus 1920x1200 and no AA. The IQ difference between NVidia and AMD was enormous. The Witcher 2 also runs a lot better and with higher settings, at a lower FPS, without the stutter. Honestly, there are a lot of games that both look and run better for me on 680s compared to 7970s, and in every case I had charts just like Tech Report's, and when it wasn't present (like when frame metering worked, for example) the stuttering in the chart was equally reduced to almost nothing.

I have personally found that frame times mostly seem to be good enough for determining stuttering, but I have also seen cases (Far Cry 3) where I don't see the impact in the frame times that I would expect, yet see stuttering on both solutions below 60 fps. So while it seems to be the case that if frame times are inconsistent then the game is stuttering, it is not true that if the chart shows no variance the game is not stuttering.
 

Jacky60

Golden Member
Jan 3, 2010
1,123
0
0
Going pro that way involves doing exactly what you're doing now but receiving free product from one or the other manufacturer.

Re: the ATOC bug, yes it's there, and no, the lazy AMD driver team hasn't fixed it. In Arma 2 the white speckles require you to turn AA to high in the in-game settings (not extra high) and use CCC to enable higher AA, the same way that the default video memory setting means all VRAM is used whereas very high means a max of 1GB; don't ask me why. I get incredible image quality with MSAA/FSAA in multiplayer on those maps, BUT you do have to shop around/mess around unnecessarily to find the right settings. It's never been unplayable, and the benchmarks suggest AMD has a healthy lead. I understand if you find a few games annoyingly 'unplayable' because driver support isn't there; I sent emails to AMD about Arma 2 when I bought my current solution and found driver support lacking. It's pretty good now but still requires workarounds, though Arma (and BIS) games are notoriously buggy when first released and often require the online community to fix them. Put simply, I can run max AA and max AF and am CPU-limited. I would love to experience 680 SLI, as I too looked at the Bit-tech review. Not sure I could go back to the frame rates you're talking about though. ;)
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I am pretty unhappy with both sets of cards. I have games right now that I can't run above medium settings without some pretty awful stutter. I have dual top-end cards; I should be playing on ultra in everything, and it hasn't happened except in a few (mostly console port) games. On the one hand I should love the high detail now being offered, but I loathe the stuttering that seems so prevalent with this generation.

Neither card is quick enough in my opinion, and neither is stutter-free. I have had less pain with NVidia, but it's not exactly perfect and I just want a lot more performance than I currently get. People in the CPU channel often say PCs are fast enough now, but it's certainly not true for GPUs and game graphics.

In addition to more GPU performance, I also want 120Hz monitors with IPS colour quality. I don't like washed-out images and I certainly don't like hitching motion, and I want it all in ultra-widescreen with peripheral vision (a major competitive advantage, frankly). The rush to the bottom and to ultra-small is destroying my chances of getting my hoped-for future PC; I don't imagine I'll want an embedded GPU for a very long time, as the trade-offs just don't make sense.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
Anyway, AMD has shitty drivers...

http://www.youtube.com/watch?v=xagcPDlLBNw


I had a 6950 and a 7970... and I will never ever touch AMD again in my life... I'm not a fanboy but I'm not blind, sorry... AMD is unstable shit in games...



I had an i7 950 @ 4.0GHz with a GTX 480 and then tested a Radeon 7970... and the 7970 was owned by the GTX 480 in Battlefield 3, lol... sure, the drivers were new... it was in February :D


Anyway, AMD is and will be shit for me... don't agree with me? No problem... go bash my post, I don't care...

The true thing is...

AMD's 7000 series cards have better specifications than Nvidia's 600 series cards... more bandwidth, more pixel fillrate, etc... but the drivers suck...


My dream is... an AMD card with Nvidia drivers... that's all.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
On the contrary, my brother's 7950 DC II was running like cream the last time I heard :D
 

Rikard

Senior member
Apr 25, 2012
428
0
0
I have dual top-end cards; I should be playing on ultra in everything, and it hasn't happened except in a few (mostly console port) games. On the one hand I should love the high detail now being offered, but I loathe the stuttering that seems so prevalent with this generation.
After some self-reflection, I have realized I have no bias towards a certain brand or manufacturer, but I do have a bias against multi-GPU solutions. In the beginning it was based on performance per cost, and combining that with the myriad of problems SLI and Crossfire users encounter, I thought it was simply not worth it. With time, however, this develops into a filter that raises a red flag every time multi-GPU solutions are mentioned, and it is all too easy to interpret all negative reports in a way that reinforces that bias. The quote above is an example of such a bias-reinforcing message. I suppose that is similar to how some forumites develop a bias for or against a particular brand too.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
I am pretty unhappy with both sets of cards. I have games right now that I can't run above medium settings without some pretty awful stutter. I have dual top-end cards; I should be playing on ultra in everything, and it hasn't happened except in a few (mostly console port) games. On the one hand I should love the high detail now being offered, but I loathe the stuttering that seems so prevalent with this generation.

Neither card is quick enough in my opinion, and neither is stutter-free. I have had less pain with NVidia, but it's not exactly perfect and I just want a lot more performance than I currently get. People in the CPU channel often say PCs are fast enough now, but it's certainly not true for GPUs and game graphics.

In addition to more GPU performance, I also want 120Hz monitors with IPS colour quality. I don't like washed-out images and I certainly don't like hitching motion, and I want it all in ultra-widescreen with peripheral vision (a major competitive advantage, frankly). The rush to the bottom and to ultra-small is destroying my chances of getting my hoped-for future PC; I don't imagine I'll want an embedded GPU for a very long time, as the trade-offs just don't make sense.

IPS? I thought those monitors had response times that were too slow for you competitive gamers?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
IPS? I thought those monitors had response times that were too slow for you competitive gamers?

Older ones, perhaps. They displayed more ghosting than your average TN panel, so competitive gamers obviously preferred TN.

These days, all panels could be much improved.
 

Whitestar127

Senior member
Dec 2, 2011
397
24
81
Some people here are definitely biased. I mean, come on people, it is not a political party, it is not a religion, it is just a computer component! Smart people look for the best performance per cost that satisfies whatever individual requirements they have, and make an informed decision on buying a product based on that.

:thumbsup:
 
Status
Not open for further replies.