Topic-Free Mega Thread - v 1.11.2020

I don’t know if that feature even works anymore. I kind of had to downplay its existence while Special K was a product slated for Steam release, even though it was only jokingly DRM-related.

I would have to look at the specific details of that game to figure out what’s going on; clearly it was not tested properly and cannot deal with SteamAPI’s different behavior when no connection is present.

Often, parts of SteamAPI return -1 in offline mode, and then a developer tends to not check these values and tries to allocate memory for arrays large enough to hold them (-1 + 1 = 0 elements in the best case, or -1 reinterpreted as 0xFFFFFFFF… about 4 billion in the worst case), lol.
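A minimal C++ sketch of that failure mode; the query function here is hypothetical, not an actual SteamAPI call:

```cpp
// Hypothetical illustration of the bug shape described above: an
// unchecked -1 from an offline query used directly as an array size.
#include <cstddef>
#include <vector>

int CountUnlockedAchievements (void) {
  return -1; // offline mode: the query reports failure as -1
}

void LoadAchievements (void) {
  int count = CountUnlockedAchievements ();

  // Missing the `if (count < 0) return;` check: converting -1 to size_t
  // wraps to SIZE_MAX, so this allocation throws (or crashes the game).
  std::vector <int> unlocked (static_cast <size_t> (count));
}
```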

Yeah, my suspicion is the memory bandwidth increase is necessary to make RayTracing actually practical. AMD has shipped cards with way more bandwidth than needed for a very long time; NVIDIA is typically right on the money, only going full-tilt with their compute-only cards.

I’m still not that interested in RayTracing, but if this generation turns out to dramatically improve its throughput that could change.

I don’t expect much from ray tracing until hardware is good enough that rasterization can begin to be less of the main focus, with ray-traced effects layered on top. Even with only a single effect enabled, current RTX hardware takes almost a 50% performance hit.

It’s nice, but the cost is too steep just for shiny surfaces, improved shadows, or better self-shadowing, and some implementations still mix in screen-space effects, likely to lessen the performance impact.

World of Warcraft: Shadowlands shows a dramatic gain from its ray-traced shadow implementation, particularly for the smaller details that previously lacked shadowing entirely, but since it halves the framerate it’s still too demanding to recommend.

With memory as the bandwidth bottleneck, it makes sense that NVIDIA would focus so much on improving it if ray tracing is going to be the focus, even if that might take a while. Solutions could also end up split between NVIDIA’s RTX, whatever AMD ends up calling theirs, and game-engine approaches like CryEngine’s: an optimized mode that draws on ray tracing but, from my understanding, combines it with screen-space shaders, so it’s not fully physically correct. (But it would also work on hardware like the consoles without the massive performance decrease, especially if 4K30 is going to be feasible with these types of effects.)

Interesting about AMD there too, because even when they utilized HBM the bandwidth was still a bottleneck, though at least they had the bus width, compared to the current 256-bit GDDR6 on Navi 10, which is still limiting things somewhat.
(That was likely one of the big advantages the Radeon VII had, with its extra stacks and 16 GB total, until driver support got worse for Vega overall. Binning-wise it was really a lesser MI50 workstation card, repurposed and sold at a high price to the desktop consumer market, yet still at a loss.)

Vega had other issues though, not just memory or even the drivers. Navi 20 on a 384-bit bus or something similar, with 10+ GB of GDDR6, could also see some gains from having less of a memory bandwidth limit.

For the first few years, though, I don’t have high expectations for D3D12_2 overall unless developers or publishers force it; no one is going to target baseline hardware from late 2020, let alone rework everything already in development and coming out in the next year or two.

I assume the 0.9 and 1.1 feature tiers Microsoft set as the baseline for 12_2 also come from Ampere: finalizing that GPU while retaining 12_2 compatibility and compliance with the specs as they stood, while Microsoft put the finishing touches on the API implementation itself. Hence some of the extras here, though it’s probably nothing too important beyond just having support for the new features. For 12_0 and 12_1, AMD and NVIDIA also took a few GPU generations to support most of those levels in full.
(Not too sure whether ray tracing feature tier 2 is going to be much of an issue, or whether NVIDIA mostly goes with their own design through RTX anyway.)
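For what it’s worth, the individual capabilities bundled into 12_2 can already be probed one by one; a rough sketch (not Special K code), assuming a Windows SDK recent enough (version 2004 / 19041) to define these tiers:

```cpp
// Probes the main capabilities bundled into the 12_2 feature level
// on the default adapter, one tier at a time.
#include <d3d12.h>
#include <cstdio>
#pragma comment (lib, "d3d12.lib")

int main (void)
{
  ID3D12Device* device = nullptr;
  if (FAILED (D3D12CreateDevice (nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS (&device))))
    return 1;

  D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = { };
  D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = { };
  device->CheckFeatureSupport (D3D12_FEATURE_D3D12_OPTIONS5,
                               &opts5, sizeof (opts5));
  device->CheckFeatureSupport (D3D12_FEATURE_D3D12_OPTIONS7,
                               &opts7, sizeof (opts7));

  printf ("Raytracing Tier 1.1:  %s\n", opts5.RaytracingTier
            >= D3D12_RAYTRACING_TIER_1_1       ? "yes" : "no");
  printf ("Mesh Shaders:         %s\n", opts7.MeshShaderTier
            >= D3D12_MESH_SHADER_TIER_1        ? "yes" : "no");
  printf ("Sampler Feedback 0.9: %s\n", opts7.SamplerFeedbackTier
            >= D3D12_SAMPLER_FEEDBACK_TIER_0_9 ? "yes" : "no");

  device->Release ();
  return 0;
}
```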

Eventually I also expect this to become the baseline for Vulkan 1.3, with these capabilities to some extent: ray tracing drawn on for its VK_RT extensions, and however mesh shaders work out, which I am curious about. But it can’t really be a main focus, if it’s meant to replace some of the current shader methods, until hardware availability makes 12_2-focused development possible as the mainline rather than an extra. That also includes OS availability (Windows 10 version 2004 and newer) alongside the hardware itself, and however usage of DirectX 12 and Vulkan changes with this coming console generation.
(None of this can be used from D3D11, and that API isn’t going away anytime soon, but we might at least see more purely D3D12 or Vulkan games and engines now from the bigger studios.)

EDIT: I doubt I’ll be around to see rasterization phased out, though. Ray tracing will see broader support and more implementations as hardware improves, but it’s still nowhere close to the point where that’s possible. That will be a huge shift, just not anytime soon.

Non-broken reflections, self-shadowing of even the finer details, and accurate shadows aren’t bad though, once the performance impact lessens, followed by attempts at ray-traced global illumination and maybe also audio or other non-visual uses for this. :slight_smile:

Trying to be a bit realistic, or how to say it. :smiley:

The focus is still very much on HDR, 4K, and then whatever the framerate ends up at, now supported up to 120 Hz / 120 FPS, meaning developers could offer a performance mode alongside 30 FPS and 60 FPS targets.

If possible, even with a ton of research and years of work needed, I would assume the focus will also be on upscaling, reconstruction, and AI assistance like DLSS, because rendering a quarter of actual 4K and upscaling it to look close enough, especially from a distance, is huge in terms of performance gains.
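For concreteness, the pixel arithmetic behind that quarter-of-4K claim; nothing here is game- or vendor-specific:

```cpp
// 1920x1080 is exactly one fourth of 3840x2160 in pixel count,
// which is where the rough 4x shading saving comes from.
#include <cstdio>

int main (void)
{
  const long full   = 3840L * 2160L; // 8,294,400 pixels
  const long fourth = 1920L * 1080L; // 2,073,600 pixels

  printf ("4K:    %ld pixels\n", full);
  printf ("1080p: %ld pixels\n", fourth);
  printf ("Ratio: %.1fx\n", (double)full / (double)fourth); // 4.0x
  return 0;
}
```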

If NVIDIA could decouple DLSS 3.0+ from needing game-specific support and fine-tuning (it’s improved, but that requirement isn’t entirely gone just yet), I would also expect they could easily win this entire generation of the GPU performance fight and grow even beyond the already massive 80% consumer desktop market share they have.

Even 5K - 8K could be used as a downsampling mode while still running ridiculously well, and improvements over time would smooth out flaws, display glitches, and other issues. Plus there is the performance for mid- to low-end cards of the 3000 series and newer, which could do 1920x1080, 2560x1440, or even 3840x2160 at a fraction of what the GPU cost would really have been.

Working that out would be a huge thing: set and forget, maybe with some settings for scalability or just a fixed 4x ratio, refined through driver development, and that’s it.
But not yet; it needs less data for its training model, the training itself can’t just be thrown out currently, and I doubt some cloud solution or user-driven training data could work just like that either.

Still a possibility though; I don’t doubt NVIDIA would do it if they could, and it might not be completely unfeasible, although the focus might start with specific game engines, like the NVIDIA branch they already maintain for UE4. :slight_smile:

EDIT: Consoles are still going to be the main focus, so to speak, and the first year or two might be a bit up and down: upgraded versions of games, patches, exclusive deals and terms, cross-gen ports, and whatever the PC versions will get out of all of this.

Yay. Again.

WTF?


Usenet is a text newsgroup; WTF would you want 2 TiB of data for?

Usenet is only text? Nope.

“Binary Newsgroup”


Did they fix the stutter when walking around town? Even with Special K I couldn’t make it go away.

Why 250 Hz and not 1000 Hz polling? 250 hz, 2stuttery4me

Hi guys. Has anyone tested SpecialK on Control? I want to play Control soon. Knowing SpecialK works would tip the scales for me. Edit: I was rather disappointed that Control does not have native HDR.

Does it matter as long as it fits the frequency of your monitor? 250 Hz is 4 ms; not sure that divides evenly into 144 Hz (6.9 ms), but that’s assuming Aemony uses a 144 Hz monitor.
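A quick sketch of the interval math behind those numbers (the 144 Hz display is just the assumption from above; the rates are common USB polling options):

```cpp
// How far apart mouse updates arrive at each polling rate, and how
// many of them land within one refresh of a 144 Hz display.
#include <cstdio>

int main (void)
{
  const double refresh_hz = 144.0;

  for (int hz : { 125, 250, 500, 1000 })
  {
    double interval_ms = 1000.0 / hz;     // e.g. 250 Hz -> 4.00 ms
    double per_frame   = hz / refresh_hz; // updates per refresh
    printf ("%4d Hz: %.2f ms between updates, %.2f per 144 Hz frame\n",
            hz, interval_ms, per_frame);
  }
  return 0;
}
```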

DX11 mode will probably be fine, but you’ll lose out on ray tracing and DLSS if you care about either.


You are thinking of what happens when people do stupid things such as “buying on margin”.

Buying on margin is basically taking out a loan, but you are getting the loan from the brokerage, not a bank.
People do this so they can buy more stock than they could otherwise afford.

Typically you are allowed to borrow up to 50% of the purchase price of a stock. So if I deposit $20k, I can now buy $40k worth of stock. The brokerage also requires a minimum amount of equity in the account; this is called the maintenance margin. Let’s pretend the maintenance margin is 25%, and that I borrowed to my limit and bought all $40k.

My equity is $40k (the stock’s current value) - $20k (amount borrowed), so I have $20k in equity, and $20k equity is 50% of the $40k stock value, well above the 25% minimum.
To calculate the stock value I absolutely must keep in this account, the formula is:
$20k (amount borrowed) / 0.75 (1 - maintenance margin) = $26,666.67.
You can also check it: $26.67k (stock value) - $20k (amount borrowed) = $6.67k, and $6.67k / $26.67k = 25%, the absolute minimum of equity I can get away with.

If it falls below 25%, the brokerage can do what is called a margin call. This means that you must immediately deposit more money to make up the difference, or the brokerage can sell off your stock. Let’s pretend the stock plummeted to $24k.
$24k - $20k = $4k, and $4k / $24k = 16.67%; big oops, margin call time. I’m required to keep equity at 25% of the $24k, which is $6k. Therefore I must immediately deposit $2k or the brokerage will sell off my stock.

Here is the kicker: the brokerage does not have to inform me of this, and in fact can immediately sell my stock the second it occurs, even if the stock just had a bad day and climbs back to $30k an hour later thanks to some huge announcement. They just liquidated enough stock to pay back the $20k loan, so now I only have $4k left. And remember, I put down $20k of my own money; I’m now down $16k. Stonks

Here is the even bigger nightmare scenario: the $40k in stock drops to $25k today, triggering the margin call, but the brokerage doesn’t tell me, and the price keeps dropping and dropping and dropping. The brokerage finally decides to sell, but by now the stock is only worth $12k. Not only did I just lose my own $20k, I now have to come up with $8k more to pay back the loan.
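For anyone who wants to replay the numbers, a small sketch of the same maintenance-margin arithmetic:

```cpp
// The maintenance-margin math from the example above, same numbers.
#include <cstdio>

int main (void)
{
  const double borrowed    = 20000.0; // loan from the brokerage
  const double maintenance = 0.25;    // 25% minimum equity requirement

  // Minimum account value before a margin call:
  //   borrowed / (1 - maintenance) = $26,666.67
  printf ("Margin call below: $%.2f\n", borrowed / (1.0 - maintenance));

  // Worked scenario: the stock falls to $24k.
  const double value  = 24000.0;
  const double equity = value - borrowed;                         // $4k
  printf ("Equity ratio:      %.2f%%\n", 100.0 * equity / value); // 16.67%

  // Required equity at this value, and the shortfall to deposit:
  const double required = maintenance * value;                    // $6k
  printf ("Deposit needed:    $%.2f\n", required - equity);       // $2k
  return 0;
}
```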

Edit: I can’t believe I just spent an hour writing this :skull_and_crossbones:

Hmm, that sounds like a nightmarish thing to risk, though I can see why people would be enticed to do it.

Good freaking luck; Unwinder is extremely sensitive. How you feel about Steam is how he feels about the users of his product: he is sick and tired of everyone and everything. Years of bitterness add up. He is also Russian, and English does not seem to be his first language. I really appreciate the work he has done, but he does not play well with others. Maybe that would be different when interacting with other developers. If anyone has a decent shot at it, it would be you, simply from sharing a similar technical background.

Big risk, big reward. If that $40k turned into $60k and I cashed out, I would repay the loan and have just doubled my money. If I had invested only my own $20k, it would have been just a $10k profit.

Easily. For the first time, the most powerful hardware would no longer be the full picture. It would make for awkward debates over the best GPUs in the future, since you can’t expect DLSS in every game, nor in already existing games that users may want to replay. But if they can implement DLSS to a good standard in at least 50% of the biggest upcoming titles, I’d say that’s a win for Nvidia.

We have yet to see DLSS implemented well in a first-person game; I don’t think we have one. BFV had a horrible older implementation. Cyberpunk 2077 may be that game with a newer version, but in first person we have to hope the ghosting artifacts aren’t pronounced.

Because of issues caused by too high a mouse polling rate, in both applications and games. It’s something I have been aware of for quite some years now, and a couple of years back I decided to not even bother with 1000 Hz, since the gains were minuscule and the losses annoying whenever they happened.

250 Hz is a good middle ground where I haven’t experienced any noticeable issues but still get an improvement over the default 125 Hz.

In regards to games, you can read the below posts from Kal on the subject:


Yes – it works fine as long as you disable the Steam enhancements of Special K.

The HDR retrofit can also be enabled, although I’m currently unsure whether you get anything beneficial out of it or not.


Usenet is often used for piracy. It’s like the paid version of torrenting kinda, except I guess it’s more like filehosting. But it was around long before Google Drive and Zippyshare.

@Kaldaien
Looks like Apple got angry at Epic’s developer methods lol


It’s just Apple’s latest move in a longer series of events that was initiated when Epic decided to knowingly break Apple’s (and Google’s) terms of service for mobile apps in regard to payment processors for in-app purchases. Epic is fighting the 30% revenue cut on both platforms.

Apple’s terms of service do say that they will remove the developer accounts and apps tied to a corporation that breaks their terms, and so today they did exactly that to Epic Games.

Initially they also wanted to terminate all accounts related to the Epic International corporation (another legal entity, under which Epic handles Unreal Engine; this entity has not broken Apple’s terms of service), but the court blocked Apple from doing so, as removing the developer accounts related to Unreal Engine would have consequences for a shit ton of other developers and whatnot.