One of my biggest gripes with this kind of behaviour, or with "PSA: my performance improved by 20 FPS by doing this" posts on sites like Reddit and Steam, is that the majority of the time they never provide any benchmarked results. The person who wrote the memory pool adjustment post made zero attempt to justify their findings or their performance improvement. And I'm supposed to believe that, especially considering the numerous combinations of PC hardware out there? Actually, this touches on something I wanted to remark on because of these patch notes.
The patch note is basically a polite way of saying "you imagined that editing some numbers in a file that isn't even used changed performance". And quite a few people did imagine it drastically affected performance.
The person who originally wrote about it (with, among other things, the "qualification" that they know what to do with config files since they are a C# programmer; I found that quite amusing at the time) also implied that CDPR was somehow gimping performance by not doing something so apparently obvious. And lots of people jumped on the bandwagon, just like with the SMT thread count on some Ryzen CPUs, which was first attributed to missing optimizations due to Intel being evil, until it turned out to be just code from a library, with no Intel compiler involved (an AMD library, FWIW).
This is emblematic of the "voodoo + outrage" approach to performance and tuning that I noticed in the gaming community many years ago, and if anything it's getting worse. (I should note that this is really not unique to gaming; I feel like the same thing applies broadly to lots of online discourse.) I wish people would use incidents like these to reflect a bit on how quickly they trust random information, and in particular on what level of care they expect and demand when performance results are generated and presented.
And since this kind of opinion is often misunderstood, I want to make really clear that I don't at all believe that only people with a deep technical background should be discussing performance, or even trying to discover improvements or document behaviours. However, if you aren't reasonably sure about your information, you should stop short of blaming specific people, technologies or companies, and you should really try to follow basic benchmarking best practices and double-check your results before publishing your findings.
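To make that last point concrete, here's a minimal sketch of what "double-checking your results" could look like in practice: run the benchmark several times before and after a tweak, and compare the difference against the run-to-run noise. All the FPS numbers below are made up purely for illustration.

```python
import statistics

def summarize_runs(fps_samples):
    """Return mean and sample standard deviation of repeated benchmark runs,
    instead of trusting a single number from a single run."""
    return statistics.mean(fps_samples), statistics.stdev(fps_samples)

# Hypothetical numbers: five runs before and five runs after a config tweak.
before = [58.1, 59.4, 57.8, 58.9, 58.5]
after = [59.0, 58.2, 59.6, 58.4, 58.8]

b_mean, b_sd = summarize_runs(before)
a_mean, a_sd = summarize_runs(after)
print(f"before: {b_mean:.1f} +/- {b_sd:.1f} FPS")
print(f"after:  {a_mean:.1f} +/- {a_sd:.1f} FPS")

# If the difference between the means is well within the run-to-run
# spread, there is no evidence that the tweak did anything at all.
```

Nothing fancy is needed; even this level of rigor would have immediately shown that the memory-pool "fix" was indistinguishable from noise.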
Misinformation on Reddit spreads like wildfire anyway. Steve from Gamers Nexus did a video on this particular problem.