Far Cry 3 GPU and CPU benchmarks [PCGH.de]

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
There was some review that covered DX9 vs DX10 at more than just Ultra, and DX10 was not always faster. I believe it was the settings below Very High that ran better in DX9. And the game is well known for its stuttering issues in DX10, at least near release, so maybe you played it much later. I certainly remember having to play in DX9 for the best smoothness even though the framerate was a few fps lower than DX10. And I remember the tons of comments and some fixes to help out at that time. You can just google it and see for yourself if you still doubt me.



I never had issues with it. So since you're the one making the claim, I'd love to see a link. Telling people to just google it isn't really supporting evidence. :p
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
The problem is that a high-end Intel dual-core with HT is probably faster than that six-core processor.

At the end of the day IPC is still more important than the number of cores you throw at the problem.

Well, not exactly. The six-core i7-3930K turbos to 3.8 GHz when only 1 or 2 cores are active. The fastest dual-core i3 is 3.4 GHz, so the i7-3930K still beats it even in lightly threaded programs.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Perhaps it was due to early driver issues or something? I don't remember any stuttering issues from when I played it.

Here are some tests for DX9 vs DX10 anyway:
http://www.hardocp.com/article/2008/12/01/farcry_2_dx9_vs_dx10_performance/3
Again, just search and you will see it was a major issue that went on for some time after launch. I think patches and new drivers fixed it for the most part, as it seemed fine when I went back to it a couple of years later. And I already mentioned that DX10 was faster on Very High and Ultra, but below Very High, DX9 was just as fast or faster. I will see if I can find that review later just to show what I mean.

I never had issues with it. So since you're the one making the claim, I'd love to see a link. Telling people to just google it isn't really supporting evidence. :p
You are capable of using google, and it's not my fault some people can't remember the DX10 issues that were there for at least the first few months. ;)
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
Well, not exactly. The six-core i7-3930K turbos to 3.8 GHz when only 1 or 2 cores are active. The fastest dual-core i3 is 3.4 GHz, so the i7-3930K still beats it even in lightly threaded programs.

I also want to add that the i3 does not have Turbo Boost, so the 3.4 GHz i3 won't go higher than 3.4 GHz unless you overclock.
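To put rough numbers on the clock comparison, here's a quick sketch that assumes identical per-clock performance between the two chips, which of course ignores any IPC differences:

```python
# Lightly-threaded clock comparison: the i7-3930K turbos to 3.8 GHz with
# 1-2 active cores, while the fastest i3 runs a fixed 3.4 GHz (no turbo).
i7_turbo_ghz = 3.8
i3_fixed_ghz = 3.4

# Relative clock advantage under the same-IPC assumption
advantage_pct = (i7_turbo_ghz / i3_fixed_ghz - 1) * 100
print(f"i7-3930K clock advantage in 1-2 threaded loads: ~{advantage_pct:.0f}%")
# → ~12%
```

So even on clocks alone, the hex-core keeps roughly a 12% edge in lightly threaded work before the extra cores even come into play.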
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
Nice thread. But I still don't know what MSAA is. I don't have an option for that.
I have the Nvidia control panel, and it gives options for FXAA on and AA at 16x QCSAA. What's this MSAA, and is it a performance hit? Because when I toggle it there's no performance change with AA on or off... gl
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
Nice thread. But I still don't know what MSAA is. I don't have an option for that.
I have the Nvidia control panel, and it gives options for FXAA on and AA at 16x QCSAA. What's this MSAA, and is it a performance hit? Because when I toggle it there's no performance change with AA on or off... gl
Nvidia has confusing nomenclature for its AA types.

Nvidia AA:
2xAA = 2xMSAA
4xAA = 4xMSAA
8xAA = 4xMSAA + 4xCSAA
8xQAA = 8xMSAA
16xAA = 4xMSAA + 12xCSAA
16xQAA = 8xMSAA + 8xCSAA
32xAA = 8xMSAA + 24xCSAA

More info here: http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868-4.html
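The mapping above can be written down as a small lookup, e.g. in Python. The label names and sample pairs are taken straight from the list above; nothing else is assumed:

```python
# Nvidia control-panel AA labels broken down into (MSAA, CSAA) sample
# counts, per the list above.
NVIDIA_AA = {
    "2xAA":   (2, 0),
    "4xAA":   (4, 0),
    "8xAA":   (4, 4),
    "8xQAA":  (8, 0),
    "16xAA":  (4, 12),
    "16xQAA": (8, 8),
    "32xAA":  (8, 24),
}

def describe(label: str) -> str:
    """Expand an Nvidia AA label into its MSAA/CSAA components."""
    msaa, csaa = NVIDIA_AA[label]
    return f"{msaa}x MSAA" + (f" + {csaa}x CSAA" if csaa else "")

print(describe("16xAA"))  # → 4x MSAA + 12x CSAA
```

The takeaway: only the MSAA half of each mode carries the big performance cost; the extra CSAA coverage samples are comparatively cheap.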
 

dragantoe

Senior member
Oct 22, 2012
689
0
76
Wow, I thought I'd be able to handle this game with my PC. Anyone have suggestions for a CPU upgrade?
 

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
Again, just search and you will see it was a major issue that went on for some time after launch. I think patches and new drivers fixed it for the most part, as it seemed fine when I went back to it a couple of years later. And I already mentioned that DX10 was faster on Very High and Ultra, but below Very High, DX9 was just as fast or faster. I will see if I can find that review later just to show what I mean.

You are capable of using google, and it's not my fault some people can't remember the DX10 issues that were there for at least the first few months. ;)



That's not how it works. You make the claim, you back it up. :rolleyes:
 

Spjut

Senior member
Apr 9, 2011
928
149
106
It was a well-known issue, so stop playing dumb and just google it for yourself. :rolleyes:

Well, even if it was well known, it seems to have gotten fixed. So it's still a weird choice to remove the DX10/10.1 support now in its sequel, when it could result in better performance at times.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
More CPU benchmarks:
[image: 4VwfH.png, CPU benchmark chart]
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
The newest 310.64 driver is supposed to increase performance by up to 40% :D
I wonder how they do that...
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
DX10 was faster on Very High and Ultra, but below Very High, DX9 was just as fast or faster. I will see if I can find that review later just to show what I mean.

Pretty sure DX9 couldn't run Very High or Ultra.
 

An00bis

Member
Oct 6, 2012
82
0
0
Why is it we can't ever seem to get info like this in English? Separate CPU and GPU benchmarks are extremely helpful in deciding upgrades.
Because our reviewers are idiots who still use 1920x1200, only show the fps results of the 3DMark 11 benchmark instead of its score, test CPUs in Battlefield 3 in SINGLEPLAYER, and start yelling at the top of their lungs that X GPU OBLITERATES THE COMPETITION when it gets 1 more fps in a game than Y card from the other company.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Pretty sure DX9 couldn't run Very High or Ultra
That would be incorrect, as I am looking at the built-in benchmark right now. And again, if you guys would just learn to google, you could find out these things too.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Must have been the other way around then: DX10 couldn't run the lower settings.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Because our reviewers are idiots who still use 1920x1200, only show the fps results of the 3DMark 11 benchmark instead of its score, test CPUs in Battlefield 3 in SINGLEPLAYER, and start yelling at the top of their lungs that X GPU OBLITERATES THE COMPETITION when it gets 1 more fps in a game than Y card from the other company.

Yep and yep. I can't stand it when people bench CPUs at a high resolution. I really don't grasp why they would do that. It's not like they are stupid, but they act like they are clueless.