Topic-Free Mega Thread - v 1.11.2020

G-Sync is overrated when you have a GPU with those levels of performance. It’s only beneficial when you plan to have performance problems :wink: I never use G-Sync, even though my displays support it. It’s just not useful.

In my recent tests, the results were pretty conclusive: G-Sync adds more latency to well performing applications than it removes from poorly performing ones. It’s not meant for this market segment.

Ya, and this is why I feel so conflicted. I guess I’m just gonna need to see 3rd party benchmarks before I can make my decision. I honestly didn’t think I’d be this conflicted after the announcement. I figured Nvidia would still be the performance king, even if AMD gave you more for the money. The numbers shown at those prices (at least on the 6900) give me serious pause. Not to mention better Linux support, which would be damn nice.

This also happened, and is more important :smiley:


Reminds me of the SGI Onyx Fridge. I’ve wanted one forever, but now that the only place you can find SGI Onyxes is at the bottom of landfills, it’ll never happen :frowning:

This would just make the coolest refrigerator.

Smart Access Memory between an RDNA2 GPU and a 5000 series CPU is interesting as well, though overall it seems like a nice extra rather than anything too crucial.

Thought it was mostly a performance claim requiring a 500 series motherboard, a 5000 series processor and a 6000 series GPU, but there was a bit more to it.

Good pricing too; the 6800 XT seems to fit nicely, and they really pushed the 6900 XT against the 3090 price-wise, so even at nearly 1000 Euro it’s still 500 cheaper.
(For productivity tasks it has less VRAM, though for gaming that’s not too crucial; it’s still 16 GB.)

Nvidia kind of hamstrung themselves with the 3090 by disabling all the productivity features. That is a driver-level restriction, so it could be lifted, but Nvidia can be stubborn.


Ah yeah, the float performance, and how that’s not quite the same as the Quadro GPU lineup.
Could make the 6000 series from AMD an attractive alternative for when CUDA isn’t a factor. :slight_smile:
(Which is a pretty big hurdle for AMD as I understand it.)

Looks like Gamers Nexus also had a bit about how, while AMD now has support for DXR 1.0 to 1.1, they do not support custom implementations, which would be NVIDIA RTX.

Might be one reason why Cyberpunk 2077 was rumored to only support NVIDIA ray tracing at launch, with AMD possibly added later on.
(Earlier games especially those now out of active support being a bit of a question mark.)

GPUs should be available around November 18th (December 8th for the 6900 XT), so that should clear up a lot of the statements AMD was a bit vague about or didn’t quite answer.

I like it overall; it’s competitive both in getting the hardware out close to NVIDIA’s launch and in performance and pricing.

The software side and launch driver state are the only potential issues, and then whenever third-party customized models become available from partners like Sapphire, MSI and others. :smiley:

And, outside of AMD’s control, store availability and how that’s going to be handled.
(Sold out really fast for this first batch, I guess, as per the norm, ha ha.)

On similar news, it sounds like physical availability of the new consoles is also going to be limited and they’re promoting digital instead; guessing that’s due to the pandemic situation.

I was talking more about how they disabled the features that Titan cards offered and normal GeForce cards did not, not quite the Quadro features. While the 3090 is a “Titan level” card according to their presentation, it is not a Titan card. At least that is what they told Gamers Nexus when they noticed the discrepancy in their 3090 review.

Some of the leaks also list the AMD cards being able to boost near 2.5 GHz, which could be very nice. I really want to see Gamers Nexus’ review. I’ll probably go for the 6900 XT if anything, though I’ll make my decision based on how the 6800 XT reviews, whether I go through with it or not. The much better Linux support essentially offsets the lack of G-Sync for me, especially since G-Sync doesn’t really work under Linux anyway, at least not without a lot of hoops (the same holds true for FreeSync, though).


Ah, I see, so there are some additional bits: the driver might be restricting or limiting certain functionality, and NVIDIA might, but might also not, choose to lift this.

That’s a bit messy, and yeah, NVIDIA might not have any reason to lift these limitations from the driver unless pressured into doing so, which I guess would come from competition with AMD’s GPUs. But then they have CUDA as this big thing that AMD can’t support, as it’s NVIDIA tech.

Wonder what an Ampere Titan would even be. There were some rumors about an up-to-40 GB card, but that sounds more to me like the full server or workstation professional lineup, the A100 or whatever it’s called, I forgot.

Isn’t G-Sync useful whenever you can’t hit the refresh rate, though, especially in 4K high refresh rate gaming? I believe your OLED supports it if it’s a 2019 model or newer.

No more useful than a good framerate limiter would be :stuck_out_tongue:
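For the unfamiliar, a limiter just paces frame delivery to a fixed interval. A bare-bones C++ sketch of the concept (purely illustrative; not Special K’s actual limiter, which does far more):

```cpp
#include <chrono>
#include <thread>

// Bare-bones frame limiter: schedule the next frame deadline, sleep
// through most of the wait, then busy-wait the last bit for precision.
void LimitFrameRate (double fps)
{
  using clock = std::chrono::steady_clock;

  static clock::time_point next = clock::now ();

  next += std::chrono::duration_cast <clock::duration> (
            std::chrono::duration <double> (1.0 / fps));

  // Coarse wait, leaving ~2 ms of slack for OS scheduler jitter
  std::this_thread::sleep_until (next - std::chrono::milliseconds (2));

  // Fine-grained spin for the remainder
  while (clock::now () < next)
    ;
}
```

Called once per frame right before Present, this keeps frame delivery even, which is a large part of what adaptive sync buys you when the GPU can already outrun the display.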

My OLEDs support the feature, but I would have to give up Black Frame Insertion to use G-Sync. I’d get exactly zero benefit from G-Sync, just a blurrier image.

Hmm, so Watch_Dogs Legion uses BattlEye. Wonder if the same workaround as in Ghost Recon Breakpoint would work; otherwise it might accept ReShade, but that’s a bit of an if due to restrictions on injectors, including ReShade, and it would definitely block Special K.

EAC would have been about the same too, and more of a problem to remove in its newer versions compared to the older ones, where you could skip it and disable online multiplayer.

…Which the game doesn’t actually have yet, as it’s one of the updates that will be added later on, from my understanding.

They also cut out train rides, shop interiors and dogs for some reason.

Thought Queenie loved her dogs. Oh well.

It’s D3D12 anyway, so … I don’t expect to do anything fun in that game with Special K even if I can get the DLL injected.

It’s really sad that the days of D3D11 in AAA games are coming to a close. It’s a much more robust API for what I want to do with my games (take control of their engine and pull all the strings :P).

D3D11 and D3D12, though the focus is on D3D12.

Assassin’s Creed Valhalla is the same, and hopefully anti-cheat free.

EDIT:
The game runs “not well” regardless with the D3D12 API, so short of RTX reflections it doesn’t seem to matter, unless the performance of D3D11 somehow hits negative values.

From that topic, there are these images comparing the DLSS modes:

[Screenshots: Quality DLSS, Balanced DLSS, Performance DLSS, Ultra-Performance DLSS]

There’s testing without RTX and DLSS too, and yeah, the performance is not very great, and that’s with the 1.01 update, which is out now.

EDIT: Err, how does this work again?

Quality, Balanced, Performance and Ultra Performance scale by roughly 2.25x, 3x, 4x and 9x in pixel count, I believe; there’s a per-axis percentage that’s more accurate for how it does it (quick check below). But Ultra Performance, so upwards of a 9x resolution downscale sampled back up, still isn’t holding at least 60 minimum on some of the highest-end hardware available?
(And no blaming the draw calls when using D3D12, either, like what the extended view distance did to Watch_Dogs 2’s performance when set at a higher value.)
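To sanity-check those factors, here’s a quick sketch; the per-axis fractions are the commonly cited ones for DLSS 2.0, so treat them as approximate rather than official:

```cpp
#include <cstdio>

// Approximate DLSS 2.0 render resolutions at a 3840x2160 output,
// using commonly cited per-axis scale fractions (assumed, not official).
int main ()
{
  struct { const char* mode; double axis; } modes [] = {
    { "Quality",           2.0 / 3.0 }, // ~2.25x fewer pixels
    { "Balanced",          0.58      }, // ~3x fewer pixels
    { "Performance",       0.50      }, //  4x fewer pixels
    { "Ultra Performance", 1.0 / 3.0 }, //  9x fewer pixels
  };

  for (const auto& m : modes)
  {
    int    w     = (int)(3840 * m.axis + 0.5);
    int    h     = (int)(2160 * m.axis + 0.5);
    double ratio = 1.0 / (m.axis * m.axis);

    std::printf ("%-17s -> %4dx%-4d (%.2fx fewer pixels)\n",
                 m.mode, w, h, ratio);
  }

  return 0;
}
```

So Ultra Performance at a 4K output is only rendering around 1280x720 internally, which makes the sub-60 results all the stranger.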


EDIT: CPU scaling seems to be hampered by something.

How can it not scale well even to a quad core system?
Wonder what’s going on here; considering the Watch_Dogs 2 CPU hindrance and D3D11 issues, like having an extra view distance slider and hitting API limits, that seems like something that should have been addressed early on during the game’s development.

WTF? No wonder nobody uses HDR in Vulkan…

https://www.khronos.org/registry/EGL/extensions/EXT/EGL_EXT_gl_colorspace_scrgb.txt

Add two paragraphs after the 4th paragraph above:

When using a floating-point EGL surface with EGL_GL_COLORSPACE_SCRGB_EXT,
the display-referred values in the range of (0.0, 0.0, 0.0) to
(1.0, 1.0, 1.0) correspond to a luminance range of 0 to 80 nits, which is
the same luminance range for sRGB. To achieve a larger dynamic range of up
to 10000 nits, the output values can go beyond 1.0 and to a range of
[0.0, ~7.83] for each channel.

They went and invented some non-standard version of scRGB and cannot even be bothered to explain how you get 10,000 nits from a value of 7.83. scRGB is a linear colorspace: 10,000 / 80 = 125.0, so 10,000 nits is represented as 125.0 in the correct definition of scRGB.

There’s a gamma curve in whatever the hell this thing is that’s not supposed to be there.
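For what it’s worth, that ~7.83 falls out if you take the correct linear value of 125.0 and push it through the sRGB transfer function extended past its [0, 1] domain. A quick check of my own, not anything the extension spec states:

```cpp
#include <cmath>
#include <cstdio>

int main ()
{
  // Correct scRGB: linear, 80 nits at 1.0, so 10,000 nits = 125.0
  double linear = 10000.0 / 80.0;

  // sRGB OETF (gamma segment), extended past its [0, 1] domain
  double encoded = 1.055 * std::pow (linear, 1.0 / 2.4) - 0.055;

  std::printf ("%.2f\n", encoded); // prints 7.83
  return 0;
}
```

Which would mean this “scRGB” has the sRGB gamma curve baked in, exactly the thing scRGB exists to avoid.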


There are a bunch of extensions; thought the main one for color space was this:
https://www.khronos.org/registry/vulkan/specs/1.1-extensions/html/vkspec.html#VkColorSpaceKHR

I don’t know how NVIDIA does Vulkan HDR, though; AMD seems to go via a different extension entirely, and then these go through other extensions and functionality.

https://www.khronos.org/registry/vulkan/specs/1.1-extensions/html/vkspec.html#VK_AMD_display_native_hdr

(Through the GPU Open website.)

I had a hard time finding how it’s done for NVIDIA for some reason; I thought HDR was supported somewhere in Vulkan 1.1.x, but apparently that wasn’t the case. Curious.
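For reference, here’s a minimal sketch of what opting into scRGB HDR through VK_EXT_swapchain_colorspace looks like at swapchain creation. The format and color space enums are from the spec; the surrounding setup is simplified and assumes the surface actually reports this format/colorspace pair via vkGetPhysicalDeviceSurfaceFormatsKHR:

```cpp
#include <vulkan/vulkan.h>

// Minimal sketch: create a swapchain for scRGB HDR output. Assumes
// VK_EXT_swapchain_colorspace is enabled and the device/surface handles
// are already set up elsewhere.
VkResult CreateScRGBSwapchain (VkDevice device, VkSurfaceKHR surface,
                               VkExtent2D extent, VkSwapchainKHR* out)
{
  VkSwapchainCreateInfoKHR sci = {};

  sci.sType            = VK_STRUCTURE_TYPE_SWAPCHAIN_CREATE_INFO_KHR;
  sci.surface          = surface;
  sci.minImageCount    = 2;
  sci.imageFormat      = VK_FORMAT_R16G16B16A16_SFLOAT;           // 16-bit float
  sci.imageColorSpace  = VK_COLOR_SPACE_EXTENDED_SRGB_LINEAR_EXT; // linear scRGB
  sci.imageExtent      = extent;
  sci.imageArrayLayers = 1;
  sci.imageUsage       = VK_IMAGE_USAGE_COLOR_ATTACHMENT_BIT;
  sci.imageSharingMode = VK_SHARING_MODE_EXCLUSIVE;
  sci.preTransform     = VK_SURFACE_TRANSFORM_IDENTITY_BIT_KHR;
  sci.compositeAlpha   = VK_COMPOSITE_ALPHA_OPAQUE_BIT_KHR;
  sci.presentMode      = VK_PRESENT_MODE_FIFO_KHR;
  sci.clipped          = VK_TRUE;

  return vkCreateSwapchainKHR (device, &sci, nullptr, out);
}
```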

EDIT: Going by NVIDIA’s Vulkan beta drivers, though, it’s going through somewhere.

HDR is there, going by that last bit from this driver from some time back.

Possibly some passthrough functionality; well, I’m not technical enough to really figure it out, but I can get some bits of it from the info in these changelogs.

Why on Earth would you want to disable local dimming? AMD’s silly. That’s super important for HDR.

I may just have a look at that AGS stuff in the near future though; AMD GPUs are looking pretty compelling for the first time in years. If NVIDIA’s pre-order situation weren’t so hilariously messed up, I probably would not even be in this position right now.


lol, I have no idea what this has to do with me, but okay…

I’ve brought this up before :laughing: Believe it or not, you’ve unintentionally made the perfect tool for sexualising Lara. Shader debugging to remove clothing, plus texture injection.

Or was it all unintentional?? :eyes:

Joking of course

Edit:
I think every nude or sexualised mod for SOTTR uses SK. Tread with caution, or just avoid: Shadow of the Tomb Raider Nexus - Mods and community


“The game is ready for the PC and runs great on the next-gen consoles, and could be shipped on the scheduled date on those platforms. However, even though the game has been certified on the current gens by both Sony and Microsoft, some very final optimization processes for such a massive and complex game require a bit of additional time.”
