I do not recall any of this affecting the in-game benchmark though, sadly. The hitches I spotted do not happen in the benchmark; they were repeatable only during gameplay, and only in the larger cities.
Using Flip Model + 3 backbuffers + 4 max device latency + 1 presentation interval will probably deliver a change similar to DXVK.
Sadly I wasn't able to accomplish any decent improvements.
SK_Odyssey_slow_stream-Mod with Flip Model, 3 backbuffers, 4 max device latency, 1 presentation interval. (I also had to set Fullscreen=true, ForceFullscreen=true and OverrideRes=1920x1080 in dxgi.ini for it to work properly.)
(values in parentheses give the difference from the Special K benchmark above)
FPS min: 17 (+4)
FPS avg: 39 (+2)
FPS max: 57 (+3)
CPU min: 17 ms (+2)
CPU avg: 26 ms (-1)
CPU max: 59 ms (-17)
GPU min: 13 ms (+/- 0)
GPU avg: 21 ms (+/- 0)
GPU max: 53 ms (-12)
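For reference, a sketch of how those settings might look in Special K's dxgi.ini. The section and key names here are from memory and may differ between Special K versions, so treat them as assumptions rather than canonical:

```ini
; Sketch only -- section/key names may vary by Special K version
[Render.FrameRate]
BackBufferCount=3       ; 3 backbuffers
PreRenderLimit=4        ; max device latency
PresentationInterval=1  ; present once per refresh

[Render.DXGI]
UseFlipDiscard=true     ; Flip Model presentation

[Display.Output]        ; placement of these three is a guess
Fullscreen=true
ForceFullscreen=true
OverrideRes=1920x1080
```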
For comparison, another benchmark using DXVK:
FPS min: 28
FPS avg: 72
FPS max: 107
CPU min: 9 ms
CPU avg: 14 ms
CPU max: 36 ms
GPU min: 9 ms
GPU avg: 14 ms
GPU max: 33 ms
Overall, I'd say just stick with DXVK though. I've been saying since the game launched that D3D11 is a bottleneck for the engine. I did not think that translating D3D11 -> Vk would be all it took to lessen that bottleneck :P
https://github.com/doitsujin/dxvk/releases
There isn't much new since 1.6.1, but compiled builds can be found via:
https://git.froggi.es/doitsujin/dxvk/-/jobs
Work is underway for DXGI 1.6 support and the newest improvements from the Vulkan 1.2.139.0 SDK, although on Windows AMD supports up to 1.2.131, I believe, with 20.4.2, and NVIDIA is using 1.1.x for their main release-branch drivers (445.98).
(The Vulkan branch is a bit behind the main one, but I think its Vulkan SDK support is at 1.2.135.0 now; that should be merged in ready for 450.x whenever it's finalized.)
But it should still cover the bigger gains from 1.0 and 1.1 anyway. :)
Though while I have heard of and seen NVIDIA comparisons, it's interesting that it still works really well for Origins and Odyssey. AMD's situation is what it is, but NVIDIA has a near 40% gain over D3D11 anyway, partly because the driver has less CPU overhead.
Still, it works, and apparently really well. Hopefully the recently revealed Valhalla actually utilizes D3D12 or Vulkan; with other Anvil games and Ubisoft adopting these, chances are we'll finally get these gains naturally, along with the other improvements from low-level APIs and everything gained since D3D11.x, even if that API has served pretty well over the years.
I'm mentioned in a much more important project :)
Here's hoping whoever is porting Nier Replicant (and its added features and improvements) either does a good job with the PC port directly or actually gets greenlit this time to support and update it properly should there be any issues.
As for Ubisoft, Valhalla using low-level API support is hopefully a thing, and while I don't expect much for Odyssey or Origins, patching out Denuvo would be nice, whether its implementation affects performance in any way or just decreases the startup time.
Probably the best that can be hoped for, unless something critical comes up or they release an updated edition some years from now.
(Stadia uses Vulkan I think so that at least speaks for Valhalla and the Anvil Engine using this API which is a positive thing.)
EDIT: And HDR support but I think Ubisoft handled that pretty competently for these games and I have no idea what bleak looks like with HDR for Nier.
(Black levels, white levels what's there even to do for the gray ones.)
EDIT: Thanks Google I know what Automata looked like and there's probably more to Replicant (Daddy mode?) or Gestalt (Bro mode?) than this one outfit for Kaine...
EDIT: Especially if Steam is doing its +1 score tally for running bans and that situation again. Yay.
While many rural parts of the game run at a fairly smooth 100+ fps on my Windows machine, some of the towns and cities crawl to an unacceptable speed (which for me is anything under ~80 fps).
Using DXVK 1.7 boosted my fps from 64 to 84 fps at one of the most CPU-demanding parts of the game (Euboea, Poseidon Temple sync point, straight after syncing).
64 to 84 fps!
It was like getting a free CPU upgrade on my aging (but healthily overclocked) i7 4790K! This was one of the few games giving me the twitch to upgrade, but I will now see if I can hang on until DDR5 systems launch (2022?).
Steps:
1. I downloaded DXVK 1.7 from doitsujin/dxvk/releases at Github.
2. I used 7-Zip to extract d3d11.dll and dxgi.dll from the x64 folder in the download archive and dropped both of these files into the root of the game installation folder (alongside ACOdyssey.exe).
3. Ran around a bit and let Ikaros fly over the map to iron out the stutters due to shader compilation.
4. Went to Argos, Poseidon Temple sync point, recorded the fps reported by my G-Sync monitor = 84 fps!
5. Running around towns is now noticeably smoother and much more enjoyable.
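Steps 1-2 can also be done from a command line. A rough sketch, using a throwaway directory with empty placeholder DLLs so it runs anywhere; in practice GAME_DIR would be your real Odyssey install folder and the x64 folder would come from extracting the release archive downloaded from the dxvk releases page:

```shell
set -e
WORK=$(mktemp -d)
GAME_DIR="$WORK/game"   # stands in for the Odyssey install folder
mkdir -p "$WORK/dxvk-1.7/x64" "$GAME_DIR"

# pretend these came from extracting the dxvk-1.7 release archive
touch "$WORK/dxvk-1.7/x64/d3d11.dll" "$WORK/dxvk-1.7/x64/dxgi.dll"

# the actual step: drop the two 64-bit DLLs next to ACOdyssey.exe
cp "$WORK/dxvk-1.7/x64/d3d11.dll" "$WORK/dxvk-1.7/x64/dxgi.dll" "$GAME_DIR/"

ls "$GAME_DIR"
```

To revert, simply delete the two DLLs from the game folder; the game falls back to the system's own D3D11/DXGI.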
Removing the DXVK .dll files / dropping them back into the game root directory reproducibly flipped the performance between 55 fps (DXVK .dlls removed) and 84 fps (DXVK 1.7 .dlls present) at this demanding location. There are occasional skipped frames as my machine compiles new shaders into the cache (this decreases the more I play the game), but it's well worth it given the massive boost to performance in crowded areas. The in-game benchmark scores also increased using DXVK, but not by much; 5 fps or so, since the benchmark is nowhere near as CPU-demanding as places such as the Poseidon Temple sync point.
My specs/settings:
4790K @ 4.9 GHz (1.30 V and 64 C under load)
32 GB 2400 MHz CL10 RAM
GTX 1080Ti (driver 442.19)
1440p 165 Hz Gsync monitor
Uplay version of the game
Custom game settings
ACOdyssey.exe set to high CPU priority in Task Manager
FPS capped at 120 in Nvidia control panel.
Uplay client forced closed (not running in background)
Windows 10
It's not my 4C/8T PC that needs an upgrade; Ubisoft just needs to seriously consider changing the backend of their games to reduce CPU overhead.
Hmmmm, I wonder if DXVK might help speed the original Crysis up enough to run at an "acceptable" framerate on my PC (i.e. above 100 fps for a first-person perspective game)... Edit: Crysis had a slight improvement of 3 fps, but not enough once the action heated up. GTA V might have been given a bit of a boost, ~10%, but more testing is needed.
Once again, thank you so much!
The engine that Assassin's Creed uses (Anvil) is D3D12 (on Xbox) / GNM (on PS4) native. On PC they run it through a compatibility layer that runs D3D11. So we have an engine that is D3D12, translating to D3D11 and then D3D11 translated to Vk.
While all of this is going on, talking pigs are flying around remarking how unlikely the whole situation is.
You know Ubisoft's engine sucks when running it through 2 translation layers is faster than running it in its native API :)
Hopefully the upcoming one will use Vulkan or D3D12 natively, so these limits, and whatever limits Ubisoft added with their D3D12 -> D3D11 translation, will be gone.
(Pretty certain Vulkan 1.1.x was developed enough to have been able to handle Odyssey while also allowing Windows 7 and Windows 8 compatibility.)
EDIT: Though part of it might be how the game all but requires a hexa-core CPU or better; otherwise performance becomes pretty wonky, especially in some areas like the cities. Plus you get a bunch of extra threading from how the NVIDIA driver handles command context queues (I believe that's the term) and driver command lists, which also don't scale down for some reason and sit at 8 threads, I think.
The game itself still needs a lot from the CPU though. It's impressive how much DXVK improves things considering it should add a bit of CPU overhead of its own; it might be improving the average and minimums even if the maximum framerate is a tad slower, which is not as bad as lower minimums or a worse average framerate. Plus what it does for smoothness, frame times and such, is also good.
(Latency too and all that.)
Indeed! Hopefully they go native DirectX 12 for Valhalla. I gave it a shot for Crysis and only got ~3 extra frames (77 to 80 fps at the view over the bay at the start of the game), but it still dropped too low to be playable for me once explosions started bringing walls down. GTA V *might* have given me an extra ~10 fps on each scene in the benchmark (e.g. ~100 to 110 fps), but I can't be 100% certain because the game reset all of the graphical settings after putting DXVK in place, so I had to reset those from memory, AND it disabled the NVIDIA-specific AA implementation that I had activated previously. So GTA V requires more testing when I get time for it.
You might be right about that; I didn't focus on the max framerates because I capped my fps at 120 in the Nvidia control panel for this game in the hope of smoothing things out. Indeed, I don't care about high max frame rates, but do care about performance of the minimums in busy areas.
AMD's CPU usage varies, but this also results in a more uneven framerate. Odyssey in particular was almost broken at launch, dropping into the teens for FPS during busier parts of the game world, which, for how many draw calls the game pushes, happens pretty frequently.
There was also an issue with Zen and CPU scheduling, but Microsoft and the newer Windows 10 updates should have improved how this is handled, as should improvements to the power plan AMD uses for Zen 2 processors.
Beyond that, the game itself creates a lot of worker threads, upwards of 16 or so I think, depending on the available CPU cores (logical and physical, with hyper-threading on). It usually isn't too terrible, but it is one of the rare recent games where a quad-core can be a bit of a limit for once, and it's not just about efficiency or clock speed on the primary core, even if that has plenty going on too.
Kaldaien probably had more to say, and that could have helped if Ubisoft hadn't acted on that whole malware report and just thrown out a community ban without verifying those claims. At least this wrapper manages to resolve some issues, and it seems to work just as well on NVIDIA GPUs, kinda proving there's something limiting the game pretty badly.
AMD's lack of D3D11 driver command lists, and their poorly multi-threaded driver situation overall, hinders things. NVIDIA resolved this with one of their previous driver updates, significantly improving DirectX 11 performance, which, for how hard these games push multi-threaded rendering, means a significant performance penalty for AMD in these games.
(And outside of D3D9, which is single-threaded, having this work as well as it does for NVIDIA too points at a pretty big bottleneck of some sort.)
I don't have a full understanding of what the engine is doing; other than pushing way above spec for D3D11 draw calls it seemed like a pretty impressive engine, but it has its issues, so hopefully the next game will see these resolved.
(Guess it's incredibly unlikely Ubisoft would patch in Vulkan or D3D12 support, much as it could help, now that support for these two games is all but over.)
EDIT: Or just use Ghost Recon Breakpoint as a comparison; it's also on a version of the AnvilNext engine, and its Vulkan API update, while initially suffering from a memory leak, still gave a hefty 15-20% performance uplift plus improvements to CPU and GPU utilization overall.
Lots of text aside, yeah, Vulkan (or D3D12) can really help as long as the implementation is good and stable. Again, it's really impressive (and a bit problematic for what that says about the game and D3D11) how it even helps on NVIDIA systems too.
There are posts confirming similar results on newer high-end processors as well, so it's not just a hardware limit or CPU bottleneck either. Odyssey can be a bit finicky to get stable with DXVK apparently, but Origins just seems to work and benefits nicely overall.