Topic-Free Mega Thread - v 1.11.2020

Well, they jammed DirectX 12 into the game… it still seems really demanding, but turn one or two settings down a bit and it shouldn’t be a big issue. :smiley:

Looks like it really pushes the CPU at higher settings, then. I feel like I fall under Recommended because of my CPU, but my GPU is in the enthusiast tier, and I’m sporting 32GB of DDR3 memory. Wondering if my CPU would be a bottleneck for 1440p. I know that when pairing a CPU with a GPU, the game will lean more on the CPU at low resolutions, so it’s interesting to see that kind of CPU requirement there. Unless some of the settings they refer to are actually CPU-bound for the most part, which wouldn’t surprise me.

I personally have that game preordered.

I also have Watch Dogs: Legion pre-ordered, lol.

My old Intel CPU would be a bottleneck for sure; it had quad-channel memory, and according to this stupid chart, dual-channel memory is required for Ultra :slight_smile:

Ubisoft’s got this weird history with memory channels. They constantly feel the need to discuss them as though they’re some kind of magical thing. They’re not, and any game is going to suffer if the user stupidly doesn’t kit their system out with enough DIMMs to use all available memory channels.

How would you know if your CPU / mobo has dual-channel or quad-channel memory?

The max number of memory channels?

NVM, looking at CPU-Z shows the Channel # as dual.

The CPU contains the memory controller, the motherboard has nothing to do with this equation at all.

All modern AMD Ryzen CPUs are dual-channel; the ThreadRipper CPUs may be quad- or octa-channel.
All Intel CPUs are dual-channel until you step up to Xeon or HEDT, where you get quad- or six-channel.

Basically:

If you have to ask, your system is dual-channel, and if you didn’t put at least 2 DIMMs in it, you @#$%'d up and it’s time to go buy another stick of memory.


You would know if your CPU were high-end enough to require 4, 6, or 8 memory channels… they’re not cheap.
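
To put rough numbers on why channel count matters: theoretical peak bandwidth scales linearly with the number of populated channels, so leaving one empty cuts the headline figure proportionally. A quick back-of-the-envelope sketch (the DDR3-1600 figures below are just example numbers, not anyone’s actual kit):

```cpp
#include <cstdio>

int main ()
{
  // Peak bandwidth = channels x transfer rate (MT/s) x bus width (bytes).
  // DDR3-1600 is used as an example; substitute your own kit's specs.
  const double mega_transfers = 1600.0; // MT/s for DDR3-1600
  const double bytes_per_xfer = 8.0;    // one 64-bit channel moves 8 bytes per transfer

  for (int channels = 1; channels <= 4; ++channels)
  {
    double gb_per_sec = channels * mega_transfers * bytes_per_xfer / 1000.0;
    printf ("%d channel(s): %.1f GB/s theoretical peak\n", channels, gb_per_sec);
  }

  return 0;
}
```

Dual-channel DDR3-1600 works out to ~25.6 GB/s; populate only one channel and you’ve halved that before the game even launches.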

I’ve got 4 sticks of memory, 8GB each. So yeah, I didn’t mess up; I’ve always put at least two sticks in my PC builds.

On a side note, I’m getting the weirdest bug in Baldur’s Gate III. Whenever I switch to my main character the screen suddenly becomes bright and insanely lit, but the moment I change characters it dims back down. Is that because my character has some ability to see better?
Edit: Yup, something about my character puts light all around them lol. Well, that’s interesting. Can’t figure out what’s causing it rofl.

And that’s normal.

It’s only stupid OEMs that tend to ship systems to users without enough DIMMs installed to use all memory channels.

Their rationale for doing that is usually that they’re using a cheap motherboard with only 2 DIMM slots, and they don’t want to leave the end-user in a situation where upgrading memory capacity after the fact means throwing DIMMs away.


That’s crazy moon logic though – in order to make upgrading easier, cripple the user until they actually upgrade… uh, no. Just ship better motherboards in your systems :stuck_out_tongue: Then upgrading doesn’t involve removing stuff!

</rant>

Sunlight on Earth is a few billion nits.

Unless you are related to this dumbass,

you should avoid staring directly at it.

This is Dungeons and Dragons, so … yeah.

Infravision is a huge game mechanic in AD&D. Or at least it would be if 99% of metal helmets did not grant the ability. 5 minutes into any Baldur’s Gate 1 or 2 campaign you’ll probably loot a helmet that will give you infravision.

I guess Larian’s taking a slightly different approach here; in Baldur’s Gate 1 and 2 if any character in your party has infravision, then everyone has it.


Dwarves and Elves have innate infravision, it’s only the Human races that suffer night blindness, probably because they evolved the need to survive doing stupid things like staring at the sun as evidenced in my last post :slight_smile:

Yeah, except in my case it’s the player character, the one I created, that gets the lighting applied. But when you switch to another character in combat, the light effect isn’t there anymore.

Edit: oh, the character I made is an Elf. So that would explain it.

Also, on a side note, Baldur’s Gate III has quite an interesting way of applying patches. It basically sticks the entire game inside a !Temp folder, so the game folder doubles in size for a bit while the patch is applied.
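
From the looks of it, that’s the classic copy-patch-swap staging pattern. A minimal sketch of the idea (purely illustrative — the folder names and swap logic here are my guesses, not Larian’s actual patcher):

```cpp
#include <filesystem>

namespace fs = std::filesystem;

// Copy-patch-swap: duplicate the install, patch the copy, then swap it in.
// The duplicate is why disk usage roughly doubles until the patch finishes.
void staged_update (const fs::path& game_dir)
{
  const fs::path staging = game_dir.parent_path () / "!Temp";
  const fs::path backup  = game_dir.parent_path () / "!Old";

  // 1. Duplicate the current install (the "folder doubles" step).
  fs::copy (game_dir, staging, fs::copy_options::recursive);

  // 2. Apply the patch to the copy (placeholder for the real patch step).
  //apply_patch (staging);

  // 3. Swap the patched copy in. The original is only deleted after the
  //    new copy is complete, so an interrupted patch can't corrupt it.
  fs::rename (game_dir, backup);
  fs::rename (staging,  game_dir);
  fs::remove_all (backup);
}
```

The upside is that a failed or interrupted patch never leaves you with a half-updated install; the downside is the temporary doubling of disk usage described above.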

Is this someone’s idea of a joke?

I own a SEGA Saturn… Virtua Fighter is the limit of its 3D capabilities (and probably the best-selling game on the system, too). If you’re going to fake box art, maybe pretend to be Dreamcast? :stuck_out_tongue:

SEGA’s doing a 60-year anniversary promo or whatever it was; it might also be time for another odd reveal on October 19th, when some of these are no longer available.

They put sheep just about everywhere when Catherine got revealed, to name one example.

EDIT:

You also get stuff if you watch their recent video broadcast; I think that’s how it works?


Hmmm… I wonder how long this has been creating weird debug problems? :slight_smile:

D3DKMTDestroyDevice                    = SK_GetProcAddress (SK::DXGI::hModD3D11, "D3DKMTDestroyDevice "); 

SK may have been handing low-level Windows code an invalid function pointer when it tries to clean up. There’s not supposed to be a space at the end of that string.
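
A hypothetical sketch of what the corrected lookup might look like (my illustration of the fix, not necessarily the actual patch):

```cpp
// Drop the stray trailing space from the export name and guard the result.
// Assuming SK_GetProcAddress behaves like Win32's GetProcAddress, an
// unmatched name comes back as nullptr, not a usable function pointer.
D3DKMTDestroyDevice =
  SK_GetProcAddress (SK::DXGI::hModD3D11, "D3DKMTDestroyDevice");

if (D3DKMTDestroyDevice == nullptr)
{
  // Lookup failed; leave the D3DKMT cleanup path disabled rather than
  // handing Windows a bogus pointer during teardown.
}
```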


Thankfully, the damage here would mostly be limited to situations where you put SK in a game’s directory and call it d3d11.dll. That’s a pretty uncommon way of injecting Special K or any other render wrapper for D3D10/11/12. It’s way more common to just pretend to be dxgi.dll.

So basically only used by me, lol.

I don’t typically use dxgi.dll since I believe it still interferes with the “Disable fullscreen optimizations” compatibility mode.

ReShade occasionally gets the D3D11 or D3D12 .dll “slots” if it doesn’t want to hook by the usual means.
Otherwise, Special K mostly takes the dxgi.dll slot when doing local mode.
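
For anyone wondering how these .dll “slots” work: the loader checks the game’s directory before System32, so a wrapper DLL with the right name and exports gets pulled into the process ahead of the real one. A minimal sketch of the general technique (illustrative only, not SK’s or ReShade’s actual loader code):

```cpp
#include <windows.h>

using CreateDXGIFactory_pfn = HRESULT (WINAPI *)(REFIID, void **);

// Export the entry point the game imports from dxgi.dll; a real wrapper
// would hook or wrap the factory before returning it, this just forwards.
extern "C" __declspec (dllexport)
HRESULT WINAPI CreateDXGIFactory (REFIID riid, void **ppFactory)
{
  // Load the genuine DLL from System32 explicitly -- a plain
  // LoadLibrary ("dxgi.dll") would find this wrapper and recurse.
  static HMODULE hRealDXGI =
    LoadLibraryExW (L"dxgi.dll", nullptr, LOAD_LIBRARY_SEARCH_SYSTEM32);

  static auto real_CreateDXGIFactory =
    reinterpret_cast <CreateDXGIFactory_pfn> (
      GetProcAddress (hRealDXGI, "CreateDXGIFactory"));

  return real_CreateDXGIFactory (riid, ppFactory);
}
```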

I keep forgetting about the fullscreen optimizations thing too, and however that gets tangled up. :slight_smile:

To be fair, it is mostly an obscure issue for now that not many really ‘suffer’ from. The worst that happens is that disabling FSO fails, so ‘exclusive fullscreen’ mode still gets turned into that weird flip-model state that Microsoft enforces.

That said, this issue affects ReShade when it’s named as dxgi.dll as well, so it’s not only a Special K thing.

It’s worse with D3D9, where using ReShade and Special K as d3d9.dll and enabling the Disable fullscreen optimizations option used to (haven’t tested it recently) silently crash games on launch without any error prompt.


Most of those problems would just go away if you used global injection instead :stuck_out_tongue: Except for the ReShade one… that software’s always going to require injection as a game-local DLL and get all the compat. issues that come along with doing that :-\

We can do this …

“I’m the smartest programmer who has ever lived.” Priceless, lol. Paste together multiple pre-designed symbols and pass that off as an accomplishment.

This is a weird turn of events.

I did own some GameStop stock for a while when they were worth absolutely nothing and switched to dump-everything-into-dividend mode to attract investors, because the notion of making any return on their negative earnings amused me.

Zero faith in the company to actually make money, but I guess Microsoft bankrolling their incompetent business practices will keep them afloat for a little while longer.

It will be fun to watch them run this into the ground, mark my words. They understand absolutely nothing about the market they serve.

I’m not familiar with the person, so this isn’t entirely aimed at him, but I find it ironic how most people believe they’re smart, or at least smarter than the average person, which as we know is impossible.


Seems like SK can hook into the Dolphin emulator (GameCube/Wii) in D3D11 mode. I need to test more, but so far all I’ve noticed is that you can’t push the internal resolution beyond your monitor’s res. I’ve been able to debug shaders too.