Topic-Free Mega Thread - v 1.11.2020

So RT reflections are breaking down for me in a lot of places, look at this glowing mess

Hope that means the issue is resolved, including the potential consequences of such drama

Speaking of reflections, I’ve had such an annoying battle with SSR. After 20 or so hours of play, I finally decided to just leave it disabled. The grain it causes is super annoying and can only be countered by the Psycho setting, but that drops the frame rate too hard.

Okay, so I’ve spent some time trying to “fix” the black floor in 2077, and I think I’ve nailed it for all scenes. Levels worked well, apart from night-time scenes, where a level that’s fine for daytime turns pitch black.

ReShade, HDR10 (ReShade effects don’t work with HDR scRGB; 849 nits, 1.7 tone, 100 nits UI), PD80_Curved_Levels:
Black Point: 27
White Point: 255
Grey X: 0.75
Grey Y: 0.775
Toe Y: 0.06
Toe X: 0.153
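For anyone wondering what the Black/White Point numbers actually do, here’s a simplified sketch of the basic levels remap they build on. This is not PD80’s actual curve code (the Grey/Toe parameters shape a curve on top of this), just the core black/white-point operation, with 8-bit values normalized to 0–1:

```cpp
#include <cassert>

// Simplified levels remap: stretch the range [blackPoint, whitePoint]
// back out to [0, 1], clamping anything below the black point.
// A black point of 27/255 lifts everything at or below level 27 to 0,
// which is what kills the "black floor" without touching highlights.
float applyLevels (float in, float blackPoint, float whitePoint)
{
  float out = (in - blackPoint) / (whitePoint - blackPoint);

  return out < 0.0f ? 0.0f :
         out > 1.0f ? 1.0f : out;
}
```

The trade-off mentioned later in the thread (crushed near-black detail) falls out of the clamp: everything at or below the black point collapses to the same value.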

HDR images; reveals an NPC, but not really a spoiler.

Thanks, it looks much better, but it still crushes some detail in near-black.
https://cdn.discordapp.com/attachments/564036055292968960/787481608101953576/VID_20201213_014913.mp4

Did some more testing, the engine caps out at ~28 threads.

My attempt to spoof the info:

12/13/2020 00:57:57.005: Allocating 52 extra CPU cores
12/13/2020 00:57:57.005: Returning 64 cores, 128 logical

That results in 28 worker threads, which is still better than the original 11 workers, but it can’t be tricked into creating 64 or 128. :slight_smile:
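In other words, the observed behavior boils down to a hard cap, no matter what core count the spoof reports. The 28 figure is empirical from the log above; the min() model is just my guess at the engine’s logic, for illustration:

```cpp
#include <algorithm>
#include <cassert>

// Empirical: Cyberpunk 2077's job system never spawns more than ~28
// workers, even when the spoofed GetSystemInfo reports 64c / 128t.
constexpr int kObservedWorkerCap = 28;

// Hypothetical model of the observed scaling: workers track the
// reported logical processor count until they hit the hard cap.
int workerThreads (int logicalProcessors)
{
  return std::min (logicalProcessors, kObservedWorkerCap);
}
```

Under this model, spoofing anything above 28 logical processors buys you nothing, which matches the log: 128 reported logical cores still produced only 28 workers.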

dxgi.7z (7.6 MB)
dxgi.ini (750 Bytes)

Before

After


Can’t see the file since I’m on my phone, so what I’m saying might not be relevant depending on your example.

The way the game handles its lighting, combined with having so many NPCs, results in lots of areas being underexposed. Usually, games address this by at least adding an artificial ambient light to NPCs to brighten them up a bit, but I guess the game was aiming for a more natural composition. That would have required more deliberate light placement to fix the underexposure in the areas where it’s needed, and that’s a lot of work.
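A rough sketch of the “artificial ambient fill” idea in shader-style C++; the 0.15 fill strength and the shape of the lighting term are made up for illustration, not taken from any actual engine:

```cpp
#include <cassert>
#include <cmath>

struct Color { float r, g, b; };

// Sketch: characters get a small extra ambient term so they stay
// readable in underexposed areas, while the environment keeps the
// "natural" (darker) scene ambient.
Color shade (Color albedo, float sceneAmbient, bool isNPC)
{
  const float fill  = isNPC ? 0.15f : 0.0f; // hypothetical fill strength
  const float light = sceneAmbient + fill;

  return { albedo.r * light,
           albedo.g * light,
           albedo.b * light };
}
```

The per-character fill is cheap and scene-independent, which is exactly why it looks less natural than hand-placed lights: it brightens NPCs uniformly regardless of where the scene’s actual light sources are.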

https://www.youtube.com/watch?v=l5jC97Pt-W0

Lower the SDR/gamma slider value in the HDR widget.

That’s because you’re supposed to use scRGB Passthrough Mode.

Just turn on sRGB Gamma Bypass and set the Max Luminance Level / Paper White in SK’s widget to ~75. That will fix some of the problems, and you can always make minor corrections using the gamma slider.

But don’t turn the game’s HDR off and use SK’s instead, that’s not going to work.
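For context on why ~75 is the suggested ballpark: in scRGB, a pixel value of 1.0 is defined as 80 nits, so a paper-white target in nits maps to a simple linear scale factor. This is just the standard scRGB relationship, not SK’s actual code:

```cpp
#include <cassert>
#include <cmath>

// scRGB defines 1.0f == 80 nits, so scaling scene white to a desired
// paper-white luminance is a single division.
float scRGBScaleForPaperWhite (float paperWhiteNits)
{
  return paperWhiteNits / 80.0f;
}
```

So a paper white of 75 nits lands just a hair under 1.0 in scRGB terms, which is why it behaves like a slightly dimmed SDR white point rather than a drastic change.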

I assume the user is going off of the ReShade preset I posted earlier.

WTF NVIDIA?! :nvidia: :chart_with_downwards_trend:

I’m all for NVIDIA getting into the performance overlay arena, but shouldn’t they learn how to count first?


If they can’t even get the framerate counted in a meaningful way, then I don’t trust their Render Latency metrics either.


I’ve been looking at that overlay NVIDIA draws for 5 minutes now, and it’s said 43 FPS the whole time: 7 FPS lower than reality. The FPS readout here only has that hiccup in it because I took this screenshot with the Windows Game Bar (you have to if you want the NV overlay to show up in a screenshot).


This is comically invalid. How does a world-class GPU vendor suck this much at measuring performance?
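I have no idea how NVIDIA actually computes its number, but methodology matters a lot here. As a hypothetical illustration: averaging per-frame instantaneous FPS gives a very different answer from the correct approach of counting frames over elapsed time:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Correct throughput: total frames divided by total elapsed time.
double fpsFromTotals (const std::vector<double>& frameTimes)
{
  double total = 0.0;
  for (double dt : frameTimes) total += dt;
  return frameTimes.size () / total;
}

// Naive alternative: average the instantaneous 1/dt of each frame.
// A single long frame barely moves this number, while short frames
// dominate it -- so it systematically misreports stuttery output.
double fpsFromPerFrameAverage (const std::vector<double>& frameTimes)
{
  double sum = 0.0;
  for (double dt : frameTimes) sum += 1.0 / dt;
  return sum / frameTimes.size ();
}
```

With nine 10 ms frames and one 110 ms frame, the totals method reports ~50 FPS while the per-frame average reports ~91 FPS; pick the wrong window or the wrong mean and an overlay can sit several FPS away from reality indefinitely.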

Is there any way to kill the DWM when playing DX12 games?
I’ve heard that some games using DX11 don’t support exclusive fullscreen, and killing the DWM reduces input lag.

Also, when using your skif_cyberpunk I’m always getting a bug with black levels.
The only way to get rid of it without SKIF is:

  1. Disable hdr ingame
  2. Disable windows hdr
  3. Run game in sdr
  4. Enable hdr in windows
  5. Enable hdr ingame.
But I can’t do it in SKIF, because I need to have HDR already enabled in-game when I launch it.

Hello,

Are you talking about Cyberpunk? Where can I find this setting?

No, that’s completely ridiculous. D3D12 needs the DWM. There’s no such thing as Fullscreen Exclusive in D3D12, there’s something that behaves kind of like Fullscreen Exclusive, but the DWM’s still calling all the shots.

The DWM’s only a bottleneck in games that don’t use Flip Model. D3D12 games are required to use Flip Model, and Special K converts D3D11 games to use Flip Model because developers are lazy and keep shipping D3D11 games that don’t use it.
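For reference, opting into Flip Model on the application side is basically one field in the swapchain description. This is a Windows/DXGI fragment (not compilable standalone), with the buffer count and flags chosen for illustration:

```cpp
// DXGI swapchain description using Flip Model (works for D3D11 and D3D12).
// The old blit-model effects (DXGI_SWAP_EFFECT_DISCARD / _SEQUENTIAL) force
// the DWM to copy the backbuffer; the FLIP_* effects let it share buffers
// with the compositor instead, removing that copy from the latency chain.
DXGI_SWAP_CHAIN_DESC1 desc = { };

desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;
desc.SampleDesc  = { 1, 0 };                  // Flip Model forbids MSAA swapchains
desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
desc.BufferCount = 3;                         // Flip Model requires >= 2
desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD;
desc.Flags       = DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING;
```

Converting a blit-model D3D11 game amounts to rewriting this description (and resolving MSAA manually, since flip swapchains can’t be multisampled), which is essentially the retrofit Special K performs.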


Think the two-week timeout from ResetEra should be up now. Might want to be careful throwing that around, though; they’re kinda sensitive to that particular little bit, even when it’s from other developers. :smiley:
(Was an interesting thread yesterday about UX development and its problems; fascinating, but it kinda derailed after a while, ha ha.)

Hah, although at times I wonder if it’s not actually more true than just compatibility; older, maybe by this point even legacy, coding practices could also be a big part of why some of these newer, fun, and very useful features end up ignored.

Community is one thing too, of course, but maybe with the new consoles, if this is something on the Xbox, it’ll actually start seeing wider usage now, and over time some of the established, no-longer-entirely-accurate statements can start changing too. :slight_smile:
(I get a lot of stuff wrong myself as well; it’s fun to learn, but it’s kinda difficult to remember, and everything keeps changing. :stuck_out_tongue: )

Interesting about this CPU spoof thing too; I keep seeing three cores / six threads mentioned a lot. I guess that’s inaccurate, but it has to stem from something for Cyberpunk 2077. Well, another good reason to pick up the game and have a look, in addition to actually playing it, once the backlog is a bit lessened.

…As if that’s ever been a thing, but it can at least be reduced, even if actually eliminating it isn’t ever going to happen, ha ha!

EDIT: Also, I wasn’t aware it should be set higher, and I think I’ve been using that value for physical rather than logical cores too, so half of what it should be when Simultaneous Multi-Threading from AMD or Hyper-Threading from Intel is in effect.

…Shouldn’t it just be Multi-Threading?
(Separate Multi-Threading isn’t a thing, is it? Like, nope, these cores first, then these “extra” cores can do their job. Feels a bit redundant to add in “simultaneous”, but oh well.)

Twin threading, I guess, is how it works; AMD’s working on some server variant with three or four threads per core, though I guess that doesn’t transition well to gaming, or possibly to the Windows 10 desktop OS in its current state, handling and scheduling these additional non-physical cores and getting that right. :smiley:

Since you guys were driving me absolutely insane with complaints that you can’t turn HDR off in Cyberpunk 2077 and turn SK’s HDR on… I went and fixed the Cyberpunk engine to allow changing the backbuffer format without crashing :wink:

UPDATE: Made a Plug-In to control AMD CPU thread count



Cyberpunk 2077 Plug-In v 0.0.1

   dxgi.7z (7.6 MB)

  • SDR → HDR support
  • AMD Threading Control

“Your payment is being reviewed because of regulations. We’ll send an email when it’s complete or if we need more information.”

Hmm never seen that before.

Well, good to know, though; if I ever send something over through PayPal for the upcoming compilation, I need to remember to word it better, ha ha.

“Ryu Ga Gotoku”

Apparently they don’t much like the “Yakuza” term with PayPal. :stuck_out_tongue:
(Even if it’s in video game terms.)

EDIT: Now then time to see what else is coming out in December - January 2020 - 2021 here and what I’ve missed.

Valhalla wasn’t too bad, but it got a bit mashed up near the end, and it feels like some of the segments should have been put together differently. There’s no real good ending or conclusion to the game, though the two DLCs are direct follow-ups, same as what Odyssey had, although Odyssey also had a more definitive conclusion, even if, depending on the order things were accomplished in, some of it could feel a bit bungled too. :smiley:

EDIT: Also where did the entirety of the PC GPU hardware go?

Nothing in stock at all when I checked earlier today, whereas last week the Pascal cards were at least somewhat stocked on NVIDIA’s side, same with Vega and Polaris from AMD; now it’s just gone. Interesting, although weird.

Would this work with Intel, or is there any reason to do it with Intel anyway?