Assassin's Creed uses dx10.1?


Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: Pelu
Originally posted by: Aberforth
DX10.1 doesn't improve anything but adds mandatory 4x AA, cube map arrays, SM 4.1 and 32-bit FP filtering.

so they push the AA on you now.... is that 4x AA the minimum even if you don't want any, or can you still get 8x AA and 16x AA????

Minimum 4x AA is required for DX10.1- so it cannot be lowered.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Minimum 4x AA is required for DX10.1- so it cannot be lowered.
I believe the minimum requirement is for the hardware, not the software (i.e. the game can request less than 4xAA if it likes).
 

Scotteq

Diamond Member
Apr 10, 2008
5,276
5
0
Originally posted by: happy medium
So all this is great news, right? If you lashed out on a DirectX 10 card, then perhaps not. Although DirectX 10.1 is fully backwards-compatible with DirectX 10 features and hardware, the reverse isn't true. Neither NVIDIA's GeForce 8 series nor ATI's Radeon HD 2x series of GPUs support DirectX 10.1. ATI's new products, the Radeon 3870 and 3850, do support DirectX 10.1, but NVIDIA apparently has no plans to release a DirectX 10.1-capable GPU. Their next product range, codenamed GT200, will support DirectX 11, but as this is due for release before DirectX 11 itself it will be interesting to see how well early products based on this GPU will support the new DirectX technology.

So gamers have a difficult choice to make: go with NVIDIA and be restricted to DirectX 10, or travel the ATI path and get enhanced gaming visuals. And as for those unlucky users who have already upgraded their graphics cards as a prelude to migrating to Vista, and are now facing even more financial outlay to get the full benefits of a simple service pack - you have our deepest sympathy.



Well... I bought my GTX a year ago, it still is one of the best cards on the market, and there *still* is no clear/direct replacement superior enough that I would even consider replacing it. So I'm sure the writers of the above article will understand if I consider their statement to be rather.... asinine...


Oh - by way of detail - DX10/10.1 is an entirely new API set, and is NOT backwards compatible with DX9. What Microsoft did is create a version of DX9 that has a subset of 10's added functionality and also understands WDDM (the new display driver model). This version of DX9 is included in every Vista OS, along with DX10/10.1. So (1) when there's a comparison of DX9 to DX10 (XP - Vista), the comparison is actually between the different versions of DX9. And (2) if the same game/disc runs on both XP and Vista (read: nearly everything), then it isn't really DX10. Rather, it's DX9 with a couple extensions so they can put "10" on the label and sell it.

In light of that, the fact that there are only one or two Vista-exclusive games (which will run on the existing hardware anyhow), and the timeline for nVidia's next set of offerings, I think it is a perfectly reasonable business decision to not bother with 10.1 at all.


 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
There is a patch coming out for Assassin's Creed that removes DX 10.1.

LINKAGE

Quote:

"In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin's Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
There is a patch coming out for Assassin's Creed that removes DX 10.1.

LINKAGE

Quote:

"In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin's Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."

OOPs .. that may well be a costly PR blunder for Ubi/AMD :p

evidently you *don't* want to remove it if you are playing on an AMD card .. notice the "glitches" apparently mostly relate to NVIDIA
could you by chance tell us whether you at UBI are able to reproduce the problems a lot of people are experiencing with nvidia 8800 GPUs?

and we await the answer

and i await Assassin's Creed to become a bargain-bin title quickly and i hope to experience a PROPER IMPLEMENTATION of DX10.1 with GT200 or r700 .. around September would be my time frame .. my current solution is fine for now ...
- micro stutter be damned!



So much for "get in the game" ..
- evidently NOT with AC and not with AMD [properly] yet :p

 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: apoppin
...

and i await Assassin's Creed to become a bargain-bin title quickly and i hope to experience a PROPER IMPLEMENTATION of DX10.1 with GT200 or r700 .. around September would be my time frame .. my current solution is fine for now ...
- micro stutter be damned!



So much for "get in the game" ..
- evidently NOT with AC and not with AMD [properly] yet :p

I don't think the current 38xx 10.1 implementation is wrong. It appears Ubi's SW implementation is broken.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Janooo
Originally posted by: apoppin
...

and i await Assassin's Creed to become a bargain-bin title quickly and i hope to experience a PROPER IMPLEMENTATION of DX10.1 with GT200 or r700 .. around September would be my time frame .. my current solution is fine for now ...
- micro stutter be damned!



So much for "get in the game" ..
- evidently NOT with AC and not with AMD [properly] yet :p

I don't think that current 38xx 10.1 implementation is wrong. It appears Ubi's SW implementation is broken.

did i give a wrong impression?
:eek:

It IS UBI .. clearly .. but they are working to "get into the game" :p
- it looks sloppy, imo

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: MarcVenice
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
If they're doing some kind of effect that requires multiple passes (some sort of blur, I'm guessing) then removing a pass is equivalent to getting it wrong. For example, if you're familiar at all with Newton's method for approximating a square root, then you'd know how dropping an iteration results in a less precise value. This isn't necessarily what's going on, but it's one such way that dropping a pass may cause problems.
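The square-root analogy is easy to put in numbers. Here's a minimal Python sketch (the values are purely illustrative, nothing to do with the game itself):

```python
import math

# Newton's method for sqrt(a): x_{n+1} = (x_n + a/x_n) / 2.
# Each iteration refines the estimate, so dropping one iteration
# leaves a measurably less precise result - the analogy for
# skipping a render pass in a multi-pass effect.
def newton_sqrt(a, iterations):
    x = a  # crude initial guess
    for _ in range(iterations):
        x = (x + a / x) / 2.0
    return x

for n in (2, 3, 4):
    err = abs(newton_sqrt(2.0, n) - math.sqrt(2.0))
    print(f"{n} iterations -> error {err:.1e}")
```

Each extra iteration roughly doubles the number of correct digits; run one fewer and the answer comes back faster but visibly worse, which is exactly the trade a dropped pass makes.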
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ViRGE
Originally posted by: MarcVenice
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
If they're doing some kind of effect that requires multiple passes (some sort of blur, I'm guessing) then removing a pass is equivalent to getting it wrong. For example, if you're familiar at all with Newton's method for approximating a square root, then you'd know how dropping an iteration results in a less precise value. This isn't necessarily what's going on, but it's one such way that dropping a pass may cause problems.

it would *appear* to me that it works fine on AMD HW .. but not properly on NVIDIA HW .. the render-pass elimination simply favors AMD's architecture and glitches nvidia

OOPS .. Ubi needs to get back in the game with the way it is meant to be played - for everyone
:eek:
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: apoppin
Originally posted by: ViRGE
Originally posted by: MarcVenice
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
If they're doing some kind of effect that requires multiple passes (some sort of blur, I'm guessing) then removing a pass is equivalent to getting it wrong. For example, if you're familiar at all with Newton's method for approximating a square root, then you'd know how dropping an iteration results in a less precise value. This isn't necessarily what's going on, but it's one such way that dropping a pass may cause problems.

it would *appear* to me that it works fine on AMD HW .. but not properly on NVIDIA HW .. the render-pass elimination simply favors AMD's architecture and glitches nvidia

OOPS .. Ubi needs to get back in the game with the way it is meant to be played - for everyone
:eek:

How is it glitching nvidia when the problem only exhibits itself in an API mode nvidia hardware doesn't support?

Not sure where you're getting the idea NV hardware is having problems with AC. The game runs magnificently at 1920x1200, max everything, no AA, for me. Relative to its visuals it's probably the best-performing game I've played next to COD4 on the PC.



 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
Originally posted by: apoppin
Originally posted by: ViRGE
Originally posted by: MarcVenice
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
If they're doing some kind of effect that requires multiple passes (some sort of blur, I'm guessing) then removing a pass is equivalent to getting it wrong. For example, if you're familiar at all with Newton's method for approximating a square root, then you'd know how dropping an iteration results in a less precise value. This isn't necessarily what's going on, but it's one such way that dropping a pass may cause problems.

it would *appear* to me that it works fine on AMD HW .. but not properly on NVIDIA HW .. the render-pass elimination simply favors AMD's architecture and glitches nvidia

OOPS .. Ubi needs to get back in the game with the way it is meant to be played - for everyone
:eek:

How is it glitching nvidia when the problem only exhibits itself in an API mode nvidia hardware doesn't support?

Not sure where you're getting the idea NV hardware is having problems with AC. The game runs magnificently at 1920x1200, max everything, no AA, for me. Relative to its visuals it's probably the best-performing game I've played next to COD4 on the PC.

does your GTX run DX10.1?



look at the thread and follow the link Keys gave back to the issues with NVIDIA cards .. i don't have AC so i can't test it :p
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
"In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin's Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."

What I am getting out of this is:

The render pass "itself" is not being done.
The render pass "itself" is costly when done.
Having one less render pass inflates performance.

It doesn't make sense to say that "not" having the render pass is costly.
It does make sense to say that "having" the render pass is costly. Because its more to do.
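To put rough numbers on that reading (the frame times below are made up, purely to illustrate the arithmetic): if the post-effect pass takes a fixed slice of each frame, skipping it shortens the frame time and inflates the frame rate.

```python
# Hypothetical frame-time budget - numbers are illustrative only.
full_frame_ms = 24.0   # frame time with the costly post-effect pass
pass_cost_ms = 4.0     # assumed cost of that render pass
reduced_frame_ms = full_frame_ms - pass_cost_ms

fps_with_pass = 1000.0 / full_frame_ms        # ~41.7 fps
fps_without_pass = 1000.0 / reduced_frame_ms  # 50.0 fps
gain = fps_without_pass / fps_with_pass - 1.0
print(f"skipping the pass: +{gain:.0%} frame rate")
```

With these made-up numbers, the skipped pass alone accounts for a 20% gain - which is why "not doing the work" always looks like a performance win until the image-quality cost shows up.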
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: schneiderguy
If the dx10.1 path is broken why didn't they remove it before they shipped the game? :confused:

That's what patches are for. For fixing things not understood as "broken" until "after" it ships.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
about the nvidia skipping dx10.1 thing.. If nvidia wants to support DX11 they will also need to support DX10.1 -
those things are inclusive. So if nvidia skips it, that means the first DX10.1-capable nvidia cards will also be DX11 cards, not that there will never be an nvidia DX10.1 card.

Regardless, DX10.1 is indeed supposed to improve speed a little bit. But it's a give and take: should they implement it for the speed improvement, or should they tweak the cards for greater speeds regardless of DX level...

I wonder how much of the 20% is DX10.1 and how much is Vista SP1. With all that being said, it is good to see AMD finally getting a break.
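One way to think about splitting that 20%: if the SP1 and DX10.1 effects are independent, their speedups multiply rather than add. With made-up numbers (nothing measured - both contributions are assumptions), a small SP1 gain and a larger DX10.1 gain compound like this:

```python
# Hypothetical decomposition of an observed ~20% gain.
# Both contributions below are assumptions, not measurements.
sp1_gain = 0.05     # assumed Vista SP1 contribution
dx101_gain = 0.143  # assumed DX10.1 contribution

# Independent speedups compound multiplicatively:
combined = (1 + sp1_gain) * (1 + dx101_gain) - 1
print(f"combined speedup: {combined:.1%}")
```

So even a modest SP1 improvement shaves a meaningful chunk off what would otherwise be credited to DX10.1 alone.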
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
about the nvidia skipping dx10.1 thing.. If nvidia wants to support DX11 they will also need to support DX10.1 -
those things are inclusive. So if nvidia skips it, that means the first DX10.1-capable nvidia cards will also be DX11 cards, not that there will never be an nvidia DX10.1 card.

Regardless, DX10.1 is indeed supposed to improve speed a little bit. But it's a give and take: should they implement it for the speed improvement, or should they tweak the cards for greater speeds regardless of DX level...

I wonder how much of the 20% is DX10.1 and how much is Vista SP1. With all that being said, it is good to see AMD finally getting a break.

It would be, but they didn't ... Ubi needs to *remove* the DX10.1 implementation from Assassin's Creed - it is *broken* :(
-although i imagine AMD graphics users will still be delighted with their 3870 and 3870 multi solutions, as it appears to still work for them [right?]

- after they patch it properly, then we will see if there is any real improvement justifying an upgrade to a DX10.1 capable GPU
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ah... so it gets 20% higher FPS because it's broken, not because it is that good... kinda like the water rendering bug in Crysis with nvidia...

Yeah, while DX10.1 was supposed to be slightly faster, 20% seems a bit much... even for it AND the SP1 improvement together it is still way too much...
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Thought this game was part of the other team's program? If the gains turn out to be substantial and the new ATI part on the way whips everything with it, we will hear a lot of NV promo for a yet-to-exist part. Should be fun times. Personally I am very surprised that NV doesn't yet support DX10.1. This may be a tough year on profits.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Just going to ignore the fact that this isn't working right atm, but honestly- there is a reason.

For this entire generation ATi has been blasted rather heavily for its poor AA performance, and in many cases rightly so. They made a design choice of merging their AA and shader hardware together, and it has cost them considerably in overall performance (going over every element of the cards, they SHOULD be faster than they are when utilizing AA). The first thing that popped into my head when I read this was that we finally had a game that was able to properly utilize ATi's chip layout the way it was intended to be used. Overall, I still think that ATi made a bad call in their decision to go the way they did, but in all honesty I fully expect an across-the-board bump in performance roughly comparable to this in all DX 10.1 games that utilize the API for AA effects.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: apoppin
Originally posted by: chizow
Originally posted by: apoppin
Originally posted by: ViRGE
Originally posted by: MarcVenice
How does removing the render pass affect image quality? I'm not too technically savvy, but if there is no difference, then why would they remove it? Weirdness - could this be Nvidia pushing Ubisoft around?
If they're doing some kind of effect that requires multiple passes (some sort of blur, I'm guessing) then removing a pass is equivalent to getting it wrong. For example, if you're familiar at all with Newton's method for approximating a square root, then you'd know how dropping an iteration results in a less precise value. This isn't necessarily what's going on, but it's one such way that dropping a pass may cause problems.

it would *appear* to me that it works fine on AMD HW .. but not properly on NVIDIA HW .. the render-pass elimination simply favors AMD's architecture and glitches nvidia

OOPS .. Ubi needs to get back in the game with the way it is meant to be played - for everyone
:eek:

How is it glitching nvidia when the problem only exhibits itself in an API mode nvidia hardware doesn't support?

Not sure where you're getting the idea NV hardware is having problems with AC. The game runs magnificently at 1920x1200, max everything, no AA, for me. Relative to its visuals it's probably the best-performing game I've played next to COD4 on the PC.

does your GTX run DX10.1?



look at the thread and follow the link Keys gave back to the issues with NVIDIA cards .. i don't have AC so i can't test it :p

No it doesn't, and it doesn't on any NV hardware, since DX10.1 isn't supported. The skipped render pass/20% AA improvement only exhibits itself in DX10.1 mode, which requires 1) Vista, 2) SP1 and 3) ATI HD 3000 series hardware. Any problems related to this DX10.1 glitch wouldn't impact NV hardware, as it simply cannot run that path. I did not see any specific mention of this problem on NV hardware by the devs or in that link.

I have a feeling the 8800-series problems relate to the 174.xx beta drivers, as they are clearly hit-and-miss with AC based on Guru3D forums feedback. Many users report crashes in either DX9 or DX10 or both, but 169.44 runs the game fine. I've had no problems with 174.32, 174.85 and 174.93 in AC or other titles, but 174.74 gave me alt-tab crashes and another gave a BSOD on boot. If I had to guess, that's the reason the 174 series has not been WHQL'd as a unified driver package, as there still seem to be lingering problems with older 8-series parts.

Again, the game runs spectacularly at 1920x1200, max everything, no AA (disabled in menu), 16x AF, between 30-60 FPS with Vsync on. If every game ran as well and looked as good as AC, I wouldn't have anything to complain about.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Thanks for the clarification .. it appears to me now that GeForce owners were just complaining more
- i don't have AC so i can't run it - at all - on anything

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: apoppin

i have 2900 Crossfire and i am not going to run out and buy a 3870x2 PoS just to see this BORING game - not even IF it was exciting, just for added visuals.

Hmm, in one sentence you've claimed the 3870x2 is a piece of crap, that Assassin's Creed is a boring game, and that it's not worth the money.

Yet reviews show the 3870X2 being a very fine card, and reviews show AC being a fine game. Metacritic shows an average of 81% across 76 different critic reviews, and 85% across 363 user votes. So it seems it's you who is not in agreement with the vast majority of people out there. You need to stop trying to make your opinion sound like a fact.

It's also funny to see NV's paid mouthpieces out in full force at any hint of good news for ATi and bad news for NV.

You'll have plenty more free time to bait people on other forums should you keep this up. Need I remind you of the rules?

-ViRGE

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Ackmed
Originally posted by: apoppin

i have 2900 Crossfire and i am not going to run out and buy a 3870x2 PoS just to see this BORING game - not even IF it was exciting, just for added visuals.

Hmm, in one sentence you've claimed the 3870x2 is a piece of crap, that Assassin's Creed is a boring game, and that it's not worth the money.

Yet reviews show the 3870X2 being a very fine card, and reviews show AC being a fine game. Metacritic shows an average of 81% across 76 different critic reviews, and 85% across 363 user votes. So it seems it's you who is not in agreement with the vast majority of people out there. You need to stop trying to make your opinion sound like a fact.

It's also funny to see NV's paid mouthpieces out in full force at any hint of good news for ATi and bad news for NV.
you are right, i did dismiss the game .. and i based that on our OWN thread that said it was VERY repetitious.

i linked to pontifex' thread in PC gaming

i generally am NOT in agreement with the vast majority of people, and i don't care that you perceive me as arrogant when i dismiss reviews as "sheepies"

Now where the HELL did i say my opinion is Fact? :|
You and i both have strong opinions and i generally do not like you personally and i don't care for your reasoning and your posts at all. But i do NOT have to say "this is my opinion" after every goddamn thing i post

it is my fricking opinion - get it?
- i consistently say the 3870x2 is a PoS - just like the Gx2 is another PoS ... and i give my reasons for my own STUPID opinion - which i generally think is better thought out than your cynical and negative ones

As to your reference to "funny to see NV's paid mouthpieces", i think it is "funny" that you are still posting here




the BAD NEWS is that Ubi F#cked-up the DX10.1 implementation of Assassin's Creed and has to patch it out - and ReDo it - it IS funny that you mention "Good news" for AMD - it is not!
-- and i am sick of your insinuations and thinly-disguised personal attacks

EDIT: in fact i would be very PROUD to work for either AMD or NVIDIA .. but then i would realize that i could no longer post here. :p
i like my freedom to say what i want .. and it IS my OWN opinion!

Now, i have always had ATi in my rig .. and i like the 2900xt .. still do, and i currently have HD2900 Crossfire, which is a very nice solution that "keeps up" with and perhaps beats a single old GTX-ultra in my rig. However, AMD is no longer heading in a direction i personally like ... i don't want to stick in an extra GPU to keep up with a single one .. so, consistent with everything else in my life up to this point, my own feelings change and so may my HW - in accordance with my own personal feelings.

Don't confuse it with an "artificial bias" by my being tempted by "HW" .. it is material, transient and ultimately unimportant to my life.. on the other hand, knowledge IS important to me and i will continue to share "my opinion", thought and observations honestly with you - as i hope you do. I just hate to see everything POISONED here by insults of "corporate mouthpiece". IF i ever DO become "corporate" - anyone; you will not see me here anymore.