I get 60 FPS with everything maxed at 1440p, but it's not consistent - I haven't seen any terrible drops, but it has dipped below 50 more than a few times. But I'm not going to hold my breath.

I got 4K running flawlessly with the SLI bits from Ryse - I have settings maxed out, with the exception of shader quality, which is set to High. I find that the only thing that makes a difference for performance in this game is shader quality. This is the game maxed out at 4K with shader quality also set to Ultra, giving FPS in the 40s. With shader quality set to High, it runs capped at 58 FPS. With shader quality set to Low, it requires about half the GPU power compared to Ultra. And FYI, there has already been a driver released for Kingdom Come: Deliverance...

Since the files aren't encrypted, let's just take a look. For shading quality it seems the difference is mainly SSR and SVOGI, at least between Ultra and Very High. Though SSR is actually disabled, so it's all down to SVOGI, if it's working the way I'm reading the config file. SSR could have been disabled by default, in which case it's all down to the voxel global illumination technique being incredibly expensive. EDIT: And then there's version 5 of CryEngine - it would have been nice to have the game running on that for the fixes and optimizations it brings, new features too, but mainly bug fixes and further optimizations. I'll have to check if there's any clear version info on what the game is actually using, though; I could be wrong.

From what I can see in my testing, there is no visual difference between shader High and Ultra. As is evident from my screenshot with Low shader, there are obviously some changes to lighting there, especially apparent on the trees in the background. But the shader setting appears to be very taxing on the CPU as well - both my CPU and GPUs are maxed out at 40-ish FPS with the setting at Ultra.

Did they use the latest version, I mean... GameGPU just ended up testing the game and they got this; as far as I'm concerned they've never failed me with their results, though I always take them with a grain of salt. Take a look - full performance review (you might want to use Google Translate): I'm saying this because both AMD and now GameGPU show good performance overall in their results. I'm also watching a guy with a 1050 Ti on the latest patch and drivers getting around 30 FPS at High settings, and the game doesn't seem to use much CPU or VRAM either. EDIT: But he also noticed some micro-stuttering while turning the camera, so this might be the same problem that happens on the consoles.

Yeah, global illumination isn't a cheap effect at all, and having settings that extend the effect further could really affect framerate. The effect can also be quite subtle at the higher settings. Plus most of the CryEngine V builds have several further fixes and improvements to the technique which might not be available depending on the engine version used by the game.
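If anyone wants to poke at this themselves, the usual CryEngine route is a user.cfg in the game's root folder, or the in-game console. Here's a rough sketch using the stock engine cvar names - I haven't verified these are exactly what this game's shader setting maps to, so treat the names and values as guesses rather than gospel:

```
-- user.cfg sketch, stock CryEngine cvar names (unverified for this game)
sys_spec_Shading = 3    -- shading spec level; on stock CryEngine 1=Low .. 4=Very High,
                        -- how the game maps its Ultra step on top of that is a guess
sys_spec_Shadows = 3    -- shadow spec, the other likely lever if more headroom is needed
r_SSReflections = 0     -- screen-space reflections; seems to be off by default anyway
e_svoTI_Active = 0      -- toggles the SVOGI voxel GI, the expensive part of Ultra shading
```

If SVOGI really is the culprit, flipping e_svoTI_Active on its own while watching the framerate counter should make it obvious.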
EDIT: That said, it's possible they did something similar to Ark and manually merged in certain code improvements and fixes even if the engine build itself is an older version. Something else I can tinker with in the game, I guess, seeing how it has a console command prompt. It's an impressive effect, but there's probably a diminishing return on framerate versus quality after a while. Fancy voxels or not, extending the distance of the effect way out is going to be hard to actually see; the framerate hit, however, is probably quite a bit more noticeable. EDIT: Not that I'm an expert on CryEngine or anything, of course, but going by the config files, dropping shader quality down a notch isn't going to have any larger impact on overall image quality, and it will reduce the GPU and CPU load a bit.

IMO the global illumination effect with shader set to Ultra definitely isn't worth the performance hit - you'd need an 8700K clocked to 5 GHz and 1080 Ti SLI clocked to at least 2 GHz to get anywhere near 60 FPS with shader at Ultra at 4K, and even then I doubt you'd hold 60. I was just thinking that anyone having performance issues should simply lower shader quality, and possibly shadow quality, until they hit a desirable performance level...

I didn't get to play any yet, but I did launch the game with absolutely everything maxed, including the sliders, at 1440p, and the main menu background ran at ~80 FPS with the GPU at 99% usage. So my guess is that the game will run around those numbers, some lower, some higher. I also think I could get over 100-ish consistently if I lowered some settings, but I'll see if the eye candy is worth the lower frames.

Yeah, and GPU usages seem good too, 99% and 97%. I don't have SLI, but nice find regardless! Also easy to enable if Ryse's bits work out of the box. Incidentally, especially given today's GPU prices, I might consider getting a second-hand 980 Ti if I can find an affordable one. If SLI works, it would give a nice boost.

Elder Scrolls 3: Morrowind is around 5x5 kilometers in size. This game appears to be around 3x3 from what I've found from searching, and that's mainly because this first installment has the first two chapters together, out of three in total. Elder Scrolls 4: Oblivion appears to have been around 7x7, to compare against that. For CryEngine I don't think this is much of an issue; it's well within the maximum allowed map size, I believe, though more densely detailed.
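Squaring those side lengths out makes the gap clearer (assuming the quoted figures are roughly right):

```
Morrowind:  5 x 5 = 25 km^2
This game:  3 x 3 =  9 km^2
Oblivion:   7 x 7 = 49 km^2
```

So Oblivion would be over five times the area of this map even though the sides only differ by a bit over 2x - worth keeping in mind when comparing "map size" numbers.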