
HAWX 2 benchmark program officially released

Thanks for the warning, I was planning to install it tonight and run it on my GTX 460. No thanks, Ubisuck.

Yep, I was meaning to mention the account stuff earlier myself. Way to go Ubisoft! :thumbsdown:

Although I didn't pay too much attention to detail, I don't remember seeing anything to write home about when it came to tess on versus off. But I'm not the sort that cares all that much about things like that anyway...
 
0FPS because it needs to have some intrusive rootkit spyware abomination installed and running.


A lot of people don't like Ubisoft for the Nvidia stuff they pulled.
A lot of people don't like Ubisoft for their DRM stuff.

My predictions? Wow, is this game gonna sell ^-^
 
A lot of people don't like Ubisoft for the Nvidia stuff they pulled.
A lot of people don't like Ubisoft for their DRM stuff.

My predictions? Wow, is this game gonna sell ^-^


Yuk Yuk Yuk 😀

It would probably be what they deserve, if the info is accurate.
 
I meant of the benchmark itself.
I cba to waste 700MB downloading a benchmark for a game I don't care about (since I have a stupid ISP cap), but I would like to see tessellation on vs off pictures, since most of the time it's used pretty terribly.

Just for you. Look at the outlines across the span of the mountains. These shots aren't exactly screaming massive quality improvements, but in-game (i.e. in motion) it is readily noticeable and does improve the visual quality. Now I want to take some Civ 5 screenshots to see if I can tell a difference like this.

TESSELLATION OFF
tessOFF_2.jpg


TESSELLATION ON
tessON_2.jpg


TESSELLATION OFF
tessOFF_1.jpg


TESSELLATION ON
tessON_1.jpg
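The extra mountain detail in the "on" shots comes from tessellation subdividing each coarse triangle into many smaller ones before shading. As a rough back-of-the-envelope sketch of why this gets expensive (my illustration, not tied to HAWX 2's actual shaders; it assumes uniform integer partitioning, where a factor-f patch splits into about f² sub-triangles):

```python
# Rough sketch: uniform tessellation of triangle patches.
# With the same integer factor f on every edge, each patch is split
# into roughly f * f sub-triangles, so geometry cost grows quadratically.
# (Illustrative only; real D3D11 partitioning modes are more involved.)

def tessellated_triangles(patch_count, factor):
    """Approximate triangle count after uniform tessellation."""
    return patch_count * factor * factor

if __name__ == "__main__":
    for f in (1, 4, 16, 64):
        print(f"factor {f:2d}: {tessellated_triangles(1000, f):>9,} triangles")
```

Going from factor 1 to 4 multiplies the workload roughly 16x, and from 16 to 64 another 16x, which is why cranking the factor up costs far more than the incremental visual gain suggests.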
 
In the second set you can definitely see that the top of the mountain looks a lot better.
In the first set the lower line looks a lot more natural too.

(I wasn't doubting it looks better, I was just wondering how well they had used it).
 
In the second set you can definitely see that the top of the mountain looks a lot better.
In the first set the lower line looks a lot more natural too.

(I wasn't doubting it looks better, I was just wondering how well they had used it).

Yeah, like I said, it's the first time I've noticed a difference right away in an actual game, and it does actually look decent. I just took some comparison screenshots of Civ 5 with high vs. low tessellation, and there actually is a fairly noticeable difference there too, but since the game doesn't have a whole lot of animation going on at any one time, it's hard to notice or appreciate it while playing.
 
I have to say, for what it is, it does look pretty.

Apart from Unigine, this is the first time I've actually seen tessellation make a visually decent difference.
 
Phenom II X3 @ 3.73GHz / 240 NB / DDR3-1600 / HD 4850 @ 700 / 1280x1024 / all settings on high.

183 max fps
114 average fps
 
A lot of people don't like Ubisoft for the Nvidia stuff they pulled.
A lot of people don't like Ubisoft for their DRM stuff.

My predictions? Wow, is this game gonna sell ^-^

One thing you need to realize: people on a message board represent a small % of people.
 
Does the camera face the sky during the benchmark or what? What's with the high framerates for a DX11 game?
 
Lonbjerg, you have to admit though that Ubisoft + Nvidia have a history of doing stuff like that. Batman: AA as an example... DX support with Assassin's Creed, etc. etc.

It's more likely that Nvidia and Ubisoft used foul play than that AMD PR just made things up.
 
Lonbjerg, you have to admit though that Ubisoft + Nvidia have a history of doing stuff like that. Batman: AA as an example... DX support with Assassin's Creed, etc. etc.

It's more likely that Nvidia and Ubisoft used foul play than that AMD PR just made things up.

If you are a simple fanboy, with very limited GPU architecture knowledge, who happily eats all AMD's FUD (aka Fuddy false claims... and not the first time), I can see how you could get that idea.

If you are a hardware geek with a strong interest in GPU architecture, the facts, and performance, and care little for PR... you couldn't have reached that conclusion on any valid basis.

Hence it's easy to put people in boxes now... just watch their "argumentation" and it's easy to decide if they belong in the "AMD fanboy" box... or the "hardware geek" box.


No need to characterize Arkadrel in such a derogatory fashion as you have done here. Personal insults are not acceptable.

Moderator Idontcare
 
Lonbjerg, you have to admit though that Ubisoft + Nvidia have a history of doing stuff like that. Batman: AA as an example... DX support with Assassin's Creed, etc. etc.

It's more likely that Nvidia and Ubisoft used foul play than that AMD PR just made things up.

Batman was made by Eidos, not Ubisoft, so your examples don't run very deep.
 
Lonbjerg, you have to admit though that Ubisoft + Nvidia have a history of doing stuff like that. Batman: AA as an example... DX support with Assassin's Creed, etc. etc.

It's more likely that Nvidia and Ubisoft used foul play than that AMD PR just made things up.

Only when you take a one-sided view: Ubisoft and the first HAWX utilized DirectX 10.1.
 
Even the shittiest cards run this benchmark fine; it's okay guys, the sky is not falling yet.



No profanity in the tech subforums. You've already been warned for this.


esquared
Anandtech Forum Director
 
Even the shittiest cards run this benchmark fine; it's okay guys, the sky is not falling yet.

That is not the problem.
The problem is that AMD PR spread a lie.
The lie was that Ubisoft and NVIDIA were BAD... and teased AMD with subpar performance.
That the game code was flawed.
AMD spewed out these lies and their minons on the forums picked up their weapons... and the lie is still being used... KUDOS to AMD PR for getting their fans to spread the lie.

Sad part is that the problem is not in the game code, but the subpar tessellation engine AMD has implemented... but that fact is distorted and hidden behind the AMD lie... guarded fiercely by AMD's minons.

Thus reality loses... and a lie gets twisted into "facts" 🙄
 
Dude, get someone normal to read your post.
AMD "spewing lies", "guarded by minons (sic)", etc. etc.
Right now you're sounding a little unbalanced.



Do you ever learn? If you think a post is not worthy, then use the report button.

Do you feel you need to attack everyone that disagrees with you?
You're off for a week. Next attack, you're gone for two weeks, etc.

Get it?


esquared
Anandtech Forum Director
 
If you are a simple fanboy, with very limited GPU architecture knowledge, who happily eats all AMD's FUD (aka Fuddy false claims... and not the first time), I can see how you could get that idea.

If you are a hardware geek with a strong interest in GPU architecture, the facts, and performance, and care little for PR... you couldn't have reached that conclusion on any valid basis.

Hence it's easy to put people in boxes now... just watch their "argumentation" and it's easy to decide if they belong in the "AMD fanboy" box... or the "hardware geek" box.

So your reasoning is "It's a lie because AMD said it". This is really all I should have to post but I'll humor you.

You don't have the slightest bit of evidence to prove that the performance difference is because of AMD's hardware and not because of the game code, yet you state this like it's a rock-solid, proven fact that should be common knowledge. There's no shortage of evidence to suggest that there's something wrong with the benchmark: similar performance between gigantically different AMD cards, an astronomical difference in performance between Nvidia and AMD cards when every single other benchmark and game shows a difference of one-fifth of what is shown here, and AMD's statement that the benchmark uses a non-optimal rendering path for tessellation.

If you had any real understanding of the issue you would know that it's entirely possible AMD isn't just lying. There absolutely is such a thing as diminishing returns when it comes to tessellation. Eventually your round object won't look more round from increasing the polygon count. This is AMD's argument and it makes sense.
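The diminishing-returns point is easy to quantify with a toy example (mine, not from the thread): the worst-case gap between a unit circle and an inscribed regular n-gon shrinks roughly quadratically in n, so past a certain polygon count the silhouette stops looking any rounder on screen:

```python
import math

def max_circle_error(n):
    """Worst-case radial gap between a unit circle and its inscribed
    regular n-gon (largest at the midpoint of each edge)."""
    return 1.0 - math.cos(math.pi / n)

if __name__ == "__main__":
    for n in (8, 32, 128, 512):
        print(f"{n:4d} sides: max error {max_circle_error(n):.6f}")
```

Each 4x increase in side count cuts the error by roughly 16x; once the remaining error is smaller than a pixel, further subdivision buys nothing visible, which is exactly the diminishing-returns argument.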

Nobody is arguing that AMD's current generation has better tessellation performance than Nvidia's current generation. It came out more than 6 months later; it had better be better at something, right? The subject of debate is whether this difference in tessellation performance matters for the gamer of today.

Do you know which "box" it's easy to put you under? It's not either of the two you mentioned.
 
Just replaced my two Radeons with two Palit GeForce GTS 450s, and HAWX 2 runs very well. I've only tested one of them, the Palit GeForce GTS 450 Sonic, and with max settings (4xAA enabled, etc.) at 1600x1200 I get an average score of 70 fps, and it's very smooth! 🙂
 
That's so stupid it's funny.

That being said, if the game is any good I may pick it up. Given Ubisoft's latest releases, I'm giving that a slim chance.

Knowing Ubi, the game is dumbed down to the point of mind-numbing stupidity, so don't bother. This is the same group of French douches that ruined Ghost Recon, Rainbow Six and Splinter Cell. The latter went from awesome to a poor man's "24". No thank you, Ubi Suck.


The cultural derision is not befitting of a post in the technical forums. Please keep your slurs to yourself, or lob them into P&N.

Re: "French douches"

Moderator Idontcare
 