
Who/What do YOU blame for this extreme lull for enthusiasts?


Who is to blame for a boring product lineup?

  • ATI/Nvidia

  • Game Developers/Lack of need

  • Economy

  • Improving APUs from Intel/AMD

  • GPU Fabs (TSMC, GF, whoever)

  • Relatively Cheap and long supported Console Platforms

  • What lull?


I'm just having a hard time with the people saying that an average FPS in the 20s-40s is acceptable. The FPS dips you would no doubt experience during intense, resource-hungry moments would make gaming unacceptable.

If you are running @ 80fps, you can afford to lose 10-15 FPS. Averaging 30? Not so much.

Min. FPS matters as well as the average.
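A toy sketch of the point above (my own hypothetical numbers, not from any benchmark): two runs with similar frame counts can report very different experiences once you look at the worst frame instead of just the average.

```python
# Toy illustration: average FPS can hide dips; minimum FPS exposes them.
# Frame times in milliseconds for two hypothetical runs.
steady = [12.5] * 8                  # ~80 FPS on every frame
spiky = [8, 8, 8, 8, 8, 8, 8, 100]   # mostly fast, one big stutter

def avg_fps(frame_times_ms):
    # Average FPS = frames rendered / total seconds elapsed.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    # Worst single frame, expressed as an instantaneous frame rate.
    return 1000.0 / max(frame_times_ms)

print(round(avg_fps(steady)), round(min_fps(steady)))  # 80 80
print(round(avg_fps(spiky)), round(min_fps(spiky)))    # 51 10
```

The spiky run still averages a playable-looking 51 FPS, but its worst frame is a 10 FPS hitch, which is what you actually feel in the "intense moments" described above.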

Right, but like I said...I played the original Crysis at around 25 FPS and it wasn't unplayable. At the time you couldn't expect to get much more than 30 even with a $500+ card, lol.
 
I'm just having a hard time with the people saying that an average FPS in the 20s-40s is acceptable. The FPS dips you would no doubt experience during intense, resource-hungry moments would make gaming unacceptable.

If you are running @ 80fps, you can afford to lose 10-15 FPS. Averaging 30? Not so much.

Min. FPS matters as well as the average.

You would be surprised to see how some people game; it's mind-boggling:
http://www.youtube.com/watch?v=Ff5KctPOVS8
 
Actually it does...but it doesn't suit your agenda...so you will ignore it.
Only a tool would talk about a difference in IQ (image quality) in games today.

What is next...the only reason we don't hear this in reviews is because NVIDIA paid off ALL the reviewers?

Bottom line: you post false information.

The most recent example of a visual difference between AMD and Nvidia is in this screenshot from Far Cry 3. This could have been fixed in drivers by now, but at release the comparison showed this difference. The Nvidia hardware rendered HDAO a little darker and the flesh tones had more red in them. The building, however, had writing on a sign that was more legible in the Nvidia shot than in the AMD shot. It appears they are standing in slightly different spots, though, so the dynamic weather and different lighting could play a role as well. While the hardware may output the same image, the driver can produce varying effects behind the scenes. The real problem is that in this game, the cloud cover, time of day, position of the shadows from buildings and trees, and the spot the player stands in all change the resulting visuals, so you cannot really get an accurate comparison.

farcry3_hdao.jpg
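As a hypothetical sketch of how you could quantify a comparison like this instead of eyeballing it: compute a per-pixel difference between the two screenshots. The tiny images below are made-up placeholder data; real screenshots would be loaded with an imaging library, and the metric is only meaningful if both shots capture the exact same scene, which (as noted above) they don't here.

```python
# Hypothetical example: images as flat lists of (R, G, B) pixel tuples.
shot_a = [(120, 80, 60), (200, 190, 180), (30, 30, 30)]
shot_b = [(118, 82, 58), (200, 188, 181), (30, 31, 30)]  # slightly darker/redder

def mean_abs_diff(img1, img2):
    # Mean absolute per-channel difference: 0 = identical, 255 = maximally different.
    total = sum(abs(c1 - c2)
                for p1, p2 in zip(img1, img2)
                for c1, c2 in zip(p1, p2))
    return total / (len(img1) * 3)

print(round(mean_abs_diff(shot_a, shot_b), 2))  # 1.11
```

A number like this tells you *how much* two renders differ, but not which one is "correct"; for that you would still need a reference image.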
 
The most recent example of a visual difference between AMD and Nvidia is in this screenshot from Far Cry 3. This could have been fixed in drivers by now, but at release the comparison showed this difference. The Nvidia hardware rendered HDAO a little darker and the flesh tones had more red in them. The building, however, had writing on a sign that was more legible in the Nvidia shot than in the AMD shot. It appears they are standing in slightly different spots, though, so the dynamic weather and different lighting could play a role as well. While the hardware may output the same image, the driver can produce varying effects behind the scenes.

farcry3_hdao.jpg

Bolded part...invalid test sample.
 
Right, but like I said...I played the original Crysis at around 25 FPS and it wasn't unplayable. At the time you couldn't expect to get much more than 30 even with a $500+ card, lol.

I remember playing Crysis 1 at 40+ FPS, but it required me to lower my resolution and settings. I also would get motion sickness after a few minutes of play, but I'd push through it. I could build up a tolerance after a while and last as much as 30-45 minutes at those FPS. That was the last game I played with FPS that poor. I did not know at the time that low FPS played a big part in my motion sickness problems. Far Cry 1 was even worse; I played that at around 30 FPS.
 
Bolded part...invalid test sample.

Duh...did you read what I typed? That's the point...you can't get a really accurate test. While some people say one was darker, you couldn't read the sign on the other, and it was over-brightened. It's impossible to know what's going on. One thing is certain, though: the hardware has no effect on the image with a digital connection.

I remember playing Crysis 1 at 40+ FPS, but it required me to lower my resolution and settings. I also would get motion sickness after a few minutes of play, but I'd push through it. I could build up a tolerance after a while and last as much as 30-45 minutes at those FPS. That was the last game I played with FPS that poor. I did not know at the time that low FPS played a big part in my motion sickness problems. Far Cry 1 was even worse; I played that at around 30 FPS.

I played through at max settings at 1280x1024. It was pretty slow, but it wasn't unplayable. I think that was mostly because of the motion blur.
 
The most recent example of a visual difference between AMD and Nvidia is in this screenshot from Far Cry 3. This could have been fixed in drivers by now, but at release the comparison showed this difference. The Nvidia hardware rendered HDAO a little darker and the flesh tones had more red in them. The building, however, had writing on a sign that was more legible in the Nvidia shot than in the AMD shot. It appears they are standing in slightly different spots, though, so the dynamic weather and different lighting could play a role as well. While the hardware may output the same image, the driver can produce varying effects behind the scenes. The real problem is that in this game, the cloud cover, time of day, position of the shadows from buildings and trees, and the spot the player stands in all change the resulting visuals, so you cannot really get an accurate comparison.

farcry3_hdao.jpg

I don't know which is actually better, but keep in mind: AMD has HDAO and Nvidia uses HBAO, so they do get slightly different results. Those are not the same exact settings.
 
I don't know which is actually better, but keep in mind: AMD has HDAO and Nvidia uses HBAO, so they do get slightly different results. Those are not the same settings.

Those are both using HDAO. The settings are the same between them. The only difference is the drivers in use. However, the standing position, time of day, and weather pattern are not the same, which does change some of what you see too.

Anyway, the real comparison was how the HDAO is displayed in various areas. The shadows aren't exactly the same between them. At least they weren't at the time.
 
Those are both using HDAO. The settings are the same between them. The only difference is the drivers in use. However, the standing position, time of day, and weather pattern are not the same, which does change some of what you see too.

Anyway, the real comparison was how the HDAO is displayed in various areas. The shadows aren't exactly the same between them. At least they weren't at the time.
From what I've read, AMD and Nvidia have different implementations of that setting in their drivers/hardware. Everywhere else I look shows that the two brands do not have the same settings offered in Far Cry 3. I think they just didn't label the picture well.

Granted, HDAO and HBAO are supposed to be the same kind of thing; they just have slightly different implementations of the same idea.

But I agree that they are close enough and I wouldn't quibble over either.

edit: Perhaps the settings are there, but due to the way the drivers are created, you use HBAO for best results on Nvidia, and HDAO for AMD.
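To illustrate "different implementations of the same idea": here is a toy sketch of my own (not the actual HDAO or HBAO algorithms) where two ambient-occlusion estimators both darken a pixel based on nearby depth, but sample at different radii with different falloff, so the same scene comes out shaded slightly differently.

```python
# Toy 1-D "depth buffer"; 0.4 represents a nearby wall, 1.0 is open space.
depth = [1.0, 1.0, 0.4, 0.4, 0.4, 1.0, 1.0]

def ao_narrow(depth, i):
    # Variant A: count occluders among immediate neighbours, strong darkening.
    near = depth[max(i - 1, 0):i] + depth[i + 1:i + 2]
    occluders = sum(1 for d in near if d < depth[i])
    return 1.0 - 0.5 * occluders / max(len(near), 1)

def ao_wide(depth, i):
    # Variant B: same idea, wider sampling radius and gentler falloff.
    near = depth[max(i - 2, 0):i] + depth[i + 1:i + 3]
    occluders = sum(1 for d in near if d < depth[i])
    return 1.0 - 0.25 * occluders / max(len(near), 1)

i = 1  # the pixel right next to the wall
print(ao_narrow(depth, i), ao_wide(depth, i))  # same scene, different shade
```

Both functions implement "ambient occlusion" in spirit, yet neither result is more "correct" than the other, which is the whole problem with judging the Far Cry 3 shots by eye.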
 
I know for a fact you can select HDAO on Nvidia hardware, I had the option on my 670. The difference was that supposedly the Nvidia cards artifacted or had issues rendering HDAO so the recommendation at the time was to use HBAO. This could have been totally rectified with drivers though. The fact remains that the cards themselves output the same digital signal either way. The software is what changes the final image.
 
I know for a fact you can select HDAO on Nvidia hardware, I had the option on my 670. The difference was that supposedly the Nvidia cards artifacted or had issues rendering HDAO so the recommendation at the time was to use HBAO. This could have been totally rectified with drivers though. The fact remains that the cards themselves output the same digital signal either way. The software is what changes the final image.
You may be right, but HBAO is still an Nvidia version of HDAO, and may still be preferred.

I have a hard time seeing any real difference in quality. Slight differences of shade don't make one better, imo. I don't know which they'd rather have.
 
You may be right, but HBAO is still an Nvidia version of HDAO, and may still be preferred.

I have a hard time seeing any real difference in quality. Slight differences of shade don't make one better, imo. I don't know which they'd rather have.

Then you have questions like "what were the artist's intentions when designing the game?" and "what did the developers code into the game as being the more correct color tone?"

We don't know these answers either.
 
Then you have questions like "what were the artist's intentions when designing the game?" and "what did the developers code into the game as being the more correct color tone?"

We don't know these answers either.
That's exactly why I try not to get into these comparisons. It is so hard to know what the intentions are.
 
Then you have questions like "what were the artist's intentions when designing the game?" and "what did the developers code into the game as being the more correct color tone?"

We don't know these answers either.
It's a non-issue because the pics looked the same. The monitor could have been different, etc.
 
It's a non-issue because the pics looked the same. The monitor could have been different, etc.

The monitor has no effect on screenshots. They are captured from the GPU in the same digital format that would be sent to any monitor.

Though I guess the way they look on your monitor may affect how you like the different color levels from the video card, but those are monitor adjustment issues, not video card issues.
 
The monitor has no effect on screenshots. They are captured from the GPU in the same digital format that would be sent to any monitor.

Though I guess the way they look on your monitor may affect how you like the different color levels from the video card, but those are monitor adjustment issues, not video card issues.
At any rate they look the same.
 
You may be right, but HBAO is still an Nvidia version of HDAO, and may still be preferred.

I have a hard time seeing any real difference in quality. Slight differences of shade doesn't make one better imo. I don't know which they'd rather have.

Exactly. Without a "control image" we don't know which is more accurate.
 
All I care about is having fun. Anyone remember when gaming was just pure fun and you knew nothing of frame rates, specs and hardware terms? Prolly not, cause you all are too cool for that, but you guys can keep your PC vs. consoles and 30 FPS vs. 1000 FPS debates, and I'll continue to enjoy all the platforms in whatever performance or efficiency range I can afford.

You guys are just Sunday gamers. Real gamers do not give a crap if the PS4 is weaker than your PC or if it only does 30 FPS, cause the real enjoyment is the gameplay, and even 1000 FPS at Ultra HD resolution will not make [inferior] games more fun. So if you're only driving your hotrod on Sundays, then you're just a Sunday driver.

Please avoid the use of profanity in the technical forums.
-- stahlhart
 
All I care about is having fun. Anyone remember when gaming was just pure fun and you knew nothing of frame rates, specs and hardware terms?

I do, but I also remember that we were a lot younger then too. I think you have to take that into consideration.
 