Load ReShade before Special K to handle HDR?

Hello,

I am trying the UIDetect feature of ReShade, and it works great, but obviously not in HDR, since I cannot find the RGB colors (I tried taking a screenshot with GeForce Experience in HDR, but then I have no tool to open the HDR image and read the RGB values the shader needs).
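As a workaround sketch (not an existing tool): if you can convert the HDR capture into a floating-point image and load it as a NumPy array (for example via an EXR converted from the JXR that GeForce Experience writes), reading a pixel and quantizing it to 10-bit values is straightforward. The function and the synthetic 2x2 "image" below are hypothetical, just to illustrate the idea:

```python
import numpy as np

def pixel_as_10bit(img: np.ndarray, x: int, y: int) -> tuple[int, int, int]:
    """Clamp a normalized float RGB pixel to [0, 1] and quantize to 10-bit (0-1023)."""
    r, g, b = img[y, x, :3]
    to10 = lambda c: int(round(min(max(float(c), 0.0), 1.0) * 1023))
    return to10(r), to10(g), to10(b)

# tiny synthetic array standing in for a decoded HDR capture
img = np.zeros((2, 2, 3), dtype=np.float32)
img[0, 1] = (1.0, 0.5, 0.25)
print(pixel_as_10bit(img, 1, 0))  # (1023, 512, 256)
```

Note this only gives you the stored values of the capture; whether those match what the shader samples in-game depends on how the swapchain encodes HDR.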

So I thought: I will use Special K, load ReShade as Early so ReShade sees all the "SDR" data, and then Special K enables HDR on top of it.

Sadly I cannot get it to work. I tried ReShade Custom 4.9 and 5.0.2; in both cases, Early or Plugin causes color issues with most ReShade shaders (like back when ReShade didn't support HDR).

I am also trying to use the unofficial Special K ReShade build, but for whatever reason it doesn't load at all (tried the Special K injector and a direct DLL). Any clue about this?

Is it possible?

Thanks

There is no SDR data for ReShade to see; SK's HDR functionality isn't a post-process. If ReShade can't work with 16-bit HDR color after SK outputs a final image, it won't be able to deal with it before either :-\

ReShade still doesn’t support HDR :slight_smile:

The unofficial version is even older, and probably wouldn’t work with whatever you’re using to detect the UI.

I don't think there's a solution to this problem at the moment. ReShade needs HDR support, and with my hands full with other stuff I'm not able to implement it for ReShade right now.

Thanks for your reply! Alright, I thought the load order would make a difference.

I think ReShade supports HDR now, no?

Once upon a time, effects like MXAO would break the colors in HDR (native game HDR, no Special K), but now it works pretty well. It actually still breaks them with Special K (16-bit HDR, it seems?), but not with Elden Ring's native HDR. Unless Elden Ring is using a weird HDR!

My issue with UIDetect is that it takes the 0-255 RGB value of a specific pixel to disable effects when a menu appears, and I guess when HDR is enabled we are talking about 10-bit RGB, 0-1023.

Sadly, I have found zero tools on the internet to read this 10-bit RGB value of a pixel (assuming the UIDetect shader even understands 10-bit).
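If the mapping were a plain linear re-quantization, converting an 8-bit UIDetect value to 10-bit would just be a rescale, sketched below. The big caveat (and likely the real problem): an HDR swapchain is usually PQ- or scRGB-encoded, so the value an SDR-authored UI color actually takes on screen generally will NOT be this simple rescale. The helper name is made up for illustration:

```python
def scale_8_to_10(v: int) -> int:
    """Naive linear rescale of an 8-bit channel value (0-255) to 10-bit (0-1023).
    Only valid if the HDR buffer really stores a linearly re-quantized value,
    which PQ/scRGB encodings do not."""
    return round(v * 1023 / 255)

print([scale_8_to_10(v) for v in (0, 128, 255)])  # [0, 514, 1023]
```

So even with a perfect 10-bit pixel reader, you would still need to know which transfer function the game (or Special K) applies to the UI color before UIDetect samples it.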