lol, ironically… I had a job offer lined up at Crystal Dynamics around the time they were redoing their engine for the 2013 Tomb Raider reboot. I probably would have made a tool to sexualize Lara if I had officially worked there. Obviously the artists had the same idea, since they wasted art assets on layers of clothing that were never supposed to be visible.
Sounds like they kept you in mind
That’s really interesting, honestly. Was the job role more specific?
They were interested in my expertise in OpenGL at the time. Even back then, experienced GL developers were something of a novelty. Laughably, since starting Special K I have only written Direct3D code. I did a complete 180.
Screw it, I’m going full AMD. More AMD stock, more AMD CPUs, and now an AMD GPU.
No RTX 3xxx for me, I’m pulling the trigger on their 6900 (nice) XT and hoping drivers don’t make me regret it. Not that NVIDIA’s recent driver quality is anything to remark on, in fact … it might be nice not to have games breaking when they use NvAPI to enable HDR, for a change.
DLSS and RTX never really worked correctly on my NVIDIA GPUs (Control doesn’t even recognize that I own one), so weighed against that, it seems impossible for the AMD experience to be any worse.
I am split… Since G-Sync is still such a deal-breaker for me, I’ll probably go for an Nvidia card again (if they ever get back in stock, that is), but I am also really interested in replacing the GTX 1070 in my server with an AMD card. Nvidia nowadays is quite annoyingly restrictive when it comes to virtualized machines, and I am hoping AMD is more open to such use.
Speaking of NvAPI breaking HDR, I just reviewed the documentation for NvAPI HDR and something horrific came to my attention:
typedef enum
{
    // Official production-ready HDR modes
    NV_HDR_MODE_OFF              = 0, //!< Turn off HDR
    NV_HDR_MODE_UHDA             = 2, //!< Source: CCCS [a.k.a. FP16 scRGB, linear, sRGB primaries, [-65504.0, 65504.0] range, RGB(1,1,1) = 80 nits]
                                      //!< Output: UHDA HDR [a.k.a. HDR10: RGB/YCC 10/12bpc, ST2084 (PQ) EOTF, RGB(1,1,1) = 10000 nits, Rec2020 color primaries, ST2086 static HDR metadata]
                                      //!< This is the only supported production HDR mode.

    // Experimental
    NV_HDR_MODE_UHDA_PASSTHROUGH = 5, //!< Experimental mode only, not for production! Source: HDR10 RGB 10bpc. Output: HDR10 RGB 10bpc - signals UHDA HDR mode (PQ + Rec2020) to the sink but sends source pixel values unmodified (no PQ or Rec2020 conversions) - assumes the source is already in HDR10 format.
    NV_HDR_MODE_DOLBY_VISION     = 7, //!< Experimental mode only, not for production! Source: RGB8 Dolby Vision encoded (12bpc YCbCr422 packed into RGB8). Output: Dolby Vision encoded - the application is to encode frames in DV format and embed DV dynamic metadata as described in the Dolby Vision specification.
} NV_HDR_MODE;
So, if you read those comments, the correct choice of mode if you want your game to work is NV_HDR_MODE_UHDA, right?

Apparently not, if you’re every developer publishing software on Windows. They all use NV_HDR_MODE_UHDA_PASSTHROUGH, except Andromeda, which uses Dolby Vision.
Does nobody read documentation anymore? If the driver vendor tells you not to use something, and the software you ship to customers uses only that thing, WTF is wrong with you?
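For contrast, here is a minimal sketch of what the documented path looks like. This is written from memory of nvapi.h, so the struct version and field names are assumptions that may vary between SDK releases; the point is simply that the production mode is a one-enum choice:

#include <nvapi.h>

NvAPI_Status EnableHDR10 (NvU32 displayId)
{
    NV_HDR_COLOR_DATA hdrColorData = { };

    hdrColorData.version                       = NV_HDR_COLOR_DATA_VER;
    hdrColorData.cmd                           = NV_HDR_CMD_SET;
    hdrColorData.hdrMode                       = NV_HDR_MODE_UHDA; // Production mode; NOT _PASSTHROUGH, NOT _DOLBY_VISION
    hdrColorData.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;

    // ST2086 mastering metadata (mastering_display_data) would be filled in
    // here if the app knows its content's characteristics; zero-initialized
    // values leave it to driver defaults.

    // Assumes NvAPI_Initialize () succeeded earlier in the program.
    return NvAPI_Disp_HdrColorControl (displayId, &hdrColorData);
}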
Lucky me, I don’t care one way or the other about G-Sync Plus, my OLEDs support HDMI 2.1 VRR if I actually wanted that. Non-proprietary is always best.
I found FreeSync to be a lot more bothersome, since it causes massive flickering. And funnily enough, SK makes my game run much smoother than FreeSync ever did. So I am now definitely thinking of gunning for a 6800 XT, most likely.
FreeSync and G-Sync really have very little potential to improve the smoothness of a game. You can get the same level of smoothing (probably more, in fact) with Special K’s framerate limiter in drop-frame mode.
It’s why I don’t even bother with G-Sync even though my displays support it. Of course, I’m in the opposite boat from you: I like it when my displays flicker, because that reduces motion blur.
In my monitor’s case, anything lower than 60 Hz looks like a strobe-light type of flicker. On a side note, yeah, SK definitely does so much better in most games. Even framerates as low as 30 feel damn smooth, not laggy like normal VSync.

Also, by disabling FreeSync on my monitor I can raise the response speed up to extremely high, which cuts down frame latency by a ton. In a small way it uses BFI as well when FreeSync is disabled.
Hahah, you know this:
That image in SK’s “box art” comes from this scene in Tomb Raider. It’s visualizing SK’s HDR, but I think I like this version better.
Missed opportunity. Evidently this mod is built using Special K; it should have been in the trailer (lol).
Yeah, allow texture modding and, well, that’s what’s likely to happen, ha ha.

There are some really good implementations and uses of texture overrides besides certain sorts of mods, but do a quick check of the more popular downloads for some games and, yeah.
I do miss when developers provided more direct SDKs or modding kits for their games, although I suppose the prevalence of third-party libraries and extensive license and cross-license agreements and deals makes some of this problematic, and then there are internal custom engines and how you’d have to build with modding in mind.

That, or more developers saw what happened to Bethesda’s games and what the popular top mods for those were, ha ha!
Yeah, those unofficial patches, hah. Well, those and a few other things.

Quite some dedication though, going through not just script files and fixing assets but hand-correcting floating props and all sorts of changes, plus code optimizers and compatibility fixes, with Morrowind going even further still.

(Still borderline impossible to get Gamebryo to actually play nice, but getting some stability into it is somewhat possible at least.)
EDIT: Suppose you have the early Tomb Raider art and direction to thank for some of what that game had to endure; pretty sure I remember one of the original employees outright hating the direction Eidos was trying to push. Crystal Dynamics then tried to improve things, though the first game veered off course and kinda crashed, and then Lara became a terminator and borderline psychotic in the respective sequels, though they somewhat improved over the first game in many other areas.

(Toned down in Shadow, much like the kill sequences in the first game, which got oddly weird over the course of that game, but Lara has quite the mood swings instead of being just an action heroine.)

EDIT: Plus the notion of an archeologist who can’t explore an area without exploding it.

(And Jonah just goes through one redesign after another, and they threw the rest of the supporting cast in the fridge, or worse.)

EDIT: I do like that the original team disliked Eidos’s whole push for promoting sexiness and tried to intervene, not that it helped much. The redesign and reboot under Square wasn’t too bad, but the writing could have been a lot better, and having a trilogy of Lara becoming the Tomb Raider, then becoming the Tomb Raider again in the next two games too, was not the best.

(They tried to go more into characterization too, at least, not always with the best results for how some scenes came across, but eh, it’s something.)

They tried doing more puzzle stuff too, but that eventually almost reduced combat to a smaller bit at the end of the game and a few odd rush / horde moments.

(All the weapons and ridiculous amounts of ammo and nothing to really use them against, and the bow is still super overpowered anyway, heh.)
Now what else was there… oh yeah, Watch_Dogs Legion should have unlocked, so time to get that update and texture pack and see about stripping BattlEye before using Special K to toggle shaders and stripping the London citizens.

Suppose such things can’t quite happen if using D3D12, though, ha ha.

Might put the FPS limiter to good use, and a few general DXGI overrides at least.

Possibly test if D3D11 runs comparably and, if it can be yoinked off, see about DXVK once BattlEye isn’t a pain in the back-end of all this, ha ha.
It doesn’t even belong in the game. It bugs me that this kind of stuff is never touched on in reviews. Anti-cheat in a game where the only purpose for having anti-cheat is to protect microtransactions is a practice that should be cast in a negative light.
Wonder how many reviewers even touch on some of the more technical aspects in any meaningful detail.
Suppose there’s Digital Foundry, when they do cover PC versions, and Hardware Unboxed has a fairly good series on balancing optimal settings and comparisons.

But for reviews and details, if the PC version is even reviewed at all, much of the tech stuff often feels glossed over.
EDIT: Huh, what do you know, the Taiwan CDC has a disease magazine and calendar.
Just what some of these were lacking, sex appeal.
EDIT: And there’s that texture pack for Legion, now to see about that BattlEye thing.
(Seems similar enough at least, going to be interesting to see if it can be removed.)
EDIT: Same exact thing as Watch_Dogs 2, too: an extra view distance slider, and goodbye framerate.
Funny but I suppose that explains some of the performance reports then.
EDIT: And out goes BattlEye. ■■■■ that thing.
Awesome to see Epic finally put some effort into their platform.
That’s a really neat UI, shows the distribution between disk read/write/network I/O operations.
C:\Games\Uplay\Watch Dogs Legion\bin\BattlEye\
BELauncher.ini
[Launcher]
GameID=wd3
BasePort=9003
64BitExe=WatchDogsLegion.exe
SilentInstall=0
PrivacyBox=1
BEArg=-BattlEyeLauncher
Set SilentInstall to 0, then when BE starts up tell it “No”, or “Cancel” rather, and not to remind you again.

This goes into Uplay for the command line arguments: “-BattlEyeLauncher”

The game starts; BattlEye and its related services do not.
Some config tweaking: the game defaults to D3D12, but D3D11 can be selected. There’s a fair bit of VRAM usage, particularly when installing the optional texture pack DLC and enabling ultra texture quality, but dropping shadows one notch also frees up a lot of VRAM.

The extra details setting will do terrible things to framerate, but this time, as you slide the option above 0%, it’ll gradually show a higher GPU and CPU load from low to mid to high.

Haven’t gotten much done though, just tinkered with throwing out the anti-cheat, ha ha.

Effectively it’s pretty simple: deny the silent background install so you can cancel out and tell it to never bother again, then set the command line argument, or the game might bounce back and forth in the task manager trying to accomplish something, terminating, and trying again.

(Just ending it works, but that’s what the command line argument resolves, kinda like -EAC for Watch_Dogs 2.)
Hmm, Dunia, or rather Disrupt, has an anti-debugger in it. Curious.
Ah, nice. I can automate that so nobody ever needs to edit that, lol.
There are quite a few games for which Special K supplies command line overrides in order to work around anti-cheat automagically. This will be yet another one.
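For illustration only, here is one plausible shape such an override could take: an injected DLL hands the game a doctored command line by detouring GetCommandLineW. This is a hypothetical sketch using MinHook as the hooking library; it is an assumption about the general technique, not necessarily how Special K actually implements it.

#include <Windows.h>
#include <string>
#include <MinHook.h> // hypothetical choice of hooking library

using GetCommandLineW_pfn = LPWSTR (WINAPI *)(void);
static GetCommandLineW_pfn GetCommandLineW_Original = nullptr;

// When the game asks for its command line, return the real one plus the
// flag that keeps it from routing itself through the BattlEye launcher.
LPWSTR WINAPI GetCommandLineW_Detour (void)
{
    static std::wstring augmented =
        std::wstring (GetCommandLineW_Original ()) + L" -BattlEyeLauncher";

    return const_cast <LPWSTR> (augmented.c_str ());
}

void InstallCmdLineOverride (void)
{
    MH_Initialize    ();
    MH_CreateHookApi ( L"kernel32", "GetCommandLineW",
                       reinterpret_cast <LPVOID>   (GetCommandLineW_Detour),
                       reinterpret_cast <LPVOID *> (&GetCommandLineW_Original) );
    MH_EnableHook    (MH_ALL_HOOKS);
}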
I, of course, only do that for games that have no legitimate reason for having anti-cheat. I’m not about to get people banned.
Here’s a build of SK that doesn’t trigger the game’s anti-debug shenanigans. Why do they even bother with this stuff? It’s only a mild inconvenience, and anyone writing an actual debugger has the skill to reverse engineer their anti-debug.
SpecialK64.7z (7.6 MB)
Like, seriously, I can reverse engineer the hell out of their game and it only took me 5 seconds to defeat their debug check. They’re being completely obtuse.
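For anyone wondering what a five-second defeat looks like, here’s a hedged sketch of the classic check and its countermeasure. This is an assumption about the general technique (not a disassembly of Disrupt’s actual code): the debugger flag lives in the PEB, plain user-mode memory that any injected DLL controls just as much as the game does.

#include <Windows.h>
#include <winternl.h>

// IsDebuggerPresent () boils down to this one-byte read from the PEB.
static bool BeingDebugged (void)
{
    PPEB pPeb =
        reinterpret_cast <PTEB> (NtCurrentTeb ())->ProcessEnvironmentBlock;

    return pPeb->BeingDebugged != 0;
}

// ...which is why "defeating" it from an injected DLL is a one-line write.
static void HideDebugger (void)
{
    reinterpret_cast <PTEB> (NtCurrentTeb ())->ProcessEnvironmentBlock
                                             ->BeingDebugged = 0;
}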
Not to be that guy, but I am pretty sure that has been present in the launcher since day 1.