Gamers Nexus

You’ll hear some morons say that programmers don’t like writing parallel code. That couldn’t be further from the truth. Multi-threaded code has been a first-class citizen for a long time. It’s just not the panacea laymen think it is.
>some morons
>aka 5 people in the thread already
lol
I don’t like writing multi-threaded code in C because it can introduce subtle bugs that are extremely annoying to track down, and the concurrency primitives you get in most OSes just kinda suck. I hear Windows is kinda better than Linux/POSIX in this regard, but I don’t program on Windows so idk. Doing concurrency in other languages can be nice, though. I find Go really nice with its goroutine+channel model, and the async/await model used in other languages is pretty nice as well.
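For anyone who hasn't touched Go, here's a minimal sketch of the goroutine+channel model (a toy fan-out, not from any real codebase):

    package main

    import "fmt"

    func main() {
        jobs := make(chan int)
        results := make(chan int)

        // three workers pull from jobs and push squares to results;
        // the channels handle all the synchronization
        for w := 0; w < 3; w++ {
            go func() {
                for n := range jobs {
                    results <- n * n
                }
            }()
        }

        // feed the jobs from another goroutine, then close the channel
        go func() {
            for i := 1; i <= 5; i++ {
                jobs <- i
            }
            close(jobs)
        }()

        for i := 0; i < 5; i++ {
            fmt.Println(<-results)
        }
    }

The channels carry both the data and the synchronization, which is a big part of why it feels less error-prone than raw pthreads.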
Also, while it’s true that games typically have a lot of global state that makes proper concurrency difficult, parallelism can be beneficial even in mostly sequential programs for things like batch processing large amounts of data. Also also, some design patterns such as ECS lend themselves pretty well to concurrency depending on how you break out components and systems (rough sketch below).
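Something like this, say, where a movement system fans out over disjoint chunks of component data. The World layout and field names here are made up for illustration; needs Go 1.21+ for the builtin min:

    package main

    import (
        "fmt"
        "sync"
    )

    // hypothetical ECS-style storage: components as parallel slices
    type World struct {
        PosX, PosY []float64
        VelX, VelY []float64
    }

    // movement system: each goroutine owns a disjoint chunk,
    // so no locks are needed on the component data
    func (w *World) Move(dt float64, workers int) {
        var wg sync.WaitGroup
        n := len(w.PosX)
        chunk := (n + workers - 1) / workers
        for lo := 0; lo < n; lo += chunk {
            hi := min(lo+chunk, n)
            wg.Add(1)
            go func(lo, hi int) {
                defer wg.Done()
                for i := lo; i < hi; i++ {
                    w.PosX[i] += w.VelX[i] * dt
                    w.PosY[i] += w.VelY[i] * dt
                }
            }(lo, hi)
        }
        wg.Wait()
    }

    func main() {
        n := 1000
        w := &World{
            PosX: make([]float64, n), PosY: make([]float64, n),
            VelX: make([]float64, n), VelY: make([]float64, n),
        }
        for i := range w.VelX {
            w.VelX[i] = 1
        }
        w.Move(0.016, 4) // one 60fps-ish tick across 4 workers
        fmt.Println(w.PosX[0])
    }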
 
No amount of based takes can make me not look at him and think "that guy would get pussy if he just got a decent haircut instead of clinging to his 'personality' cut".
There's a certain type of audience that wants to see a hipster Jesus ramble to them about the evils of AI while occasionally delivering a humorous remark.

It's kind of stupid to say he has clean hair though. Like mofo you can't smell the hair through your screen. Very parasocial and creepy ngl.
 
Does a single game exist that actually uses more than... 1 core? If not, what's the use of several cores? Rendering, other programs? Shit, even RAM was a meme for basically as long as I can recall. Suddenly you need 64GB of the zigahertz variant, when anything over 16GB 3000MHz used to be overkill.
the games that could benefit from multithreading have devs who don't want to multithread, most likely because they're still targeting 2-core people.

Games typically do everything in one big while loop. For decades, calculating your position, all the enemies, any enemy AI logic, all the level logic... all of it was done in a single rendering loop. Whether you hit 40fps or 80fps or 100fps, it was doing all the game logic in one loop. That's one of the reasons, decades ago, everyone was benchmarking with Quake 3 Arena: it was one of the very early games that could support SMP (long before the Pentium D / multi-core era).
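In rough Go-flavored pseudocode (stubbed phases, just to show the shape; no real engine works exactly like this):

    package main

    import (
        "fmt"
        "time"
    )

    // stubbed-out phases; a real engine does actual work in each
    func readInput()               {}
    func updatePlayer(dt float64)  {}
    func updateEnemies(dt float64) {} // movement + AI
    func updateLevel(dt float64)   {} // triggers, physics, level logic
    func render()                  {}

    func main() {
        last := time.Now()
        for frame := 0; frame < 3; frame++ { // stand-in for "while running"
            now := time.Now()
            dt := now.Sub(last).Seconds()
            last = now

            // everything, every frame, in one pass on one thread
            readInput()
            updatePlayer(dt)
            updateEnemies(dt)
            updateLevel(dt)
            render()
            fmt.Println("frame", frame, "dt", dt)
        }
    }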

There are more multi-threaded games today, but you can do a lot in that main loop, and today you can offload a lot to the GPU. Most games use immediate-mode rendering, need snappy menus... a lot of stuff that doesn't require, or even lend itself to, multi-threading. You've got to deal with added synchronization, which can often cost more than just optimizing for fewer threads (toy comparison below).
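To make that sync cost concrete, a toy comparison. The mutex here is just a stand-in for whatever primitive a game would actually use, and exact numbers will vary wildly by machine:

    package main

    import (
        "fmt"
        "sync"
        "time"
    )

    const total = 10_000_000

    func main() {
        // one thread, no synchronization at all
        start := time.Now()
        c := 0
        for i := 0; i < total; i++ {
            c++
        }
        fmt.Println("1 thread, no lock:", time.Since(start), c)

        // same total work split across 4 goroutines, but every
        // increment now pays for a contended mutex
        start = time.Now()
        var mu sync.Mutex
        c = 0
        var wg sync.WaitGroup
        for g := 0; g < 4; g++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for i := 0; i < total/4; i++ {
                    mu.Lock()
                    c++
                    mu.Unlock()
                }
            }()
        }
        wg.Wait()
        fmt.Println("4 threads, mutex:", time.Since(start), c)
    }

On most machines the locked version loses badly despite using four cores, which is the whole point: parallelism only pays when the work per synchronization is big enough.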

Most games still don't need more than 16 GB RAM. Hardware Unboxed just tested this again. 24-32 GB is preferred to have some headroom.

I can see this. If you're a developer, Docker containers, VMs, and all the crap you need in integration flows can easily add up to needing 32GB+. I run 64GB and can max it out when I have a lot of stuff running... or if I start two Electron apps. But my gaming rig is still just 16GB of DDR4 with a 3080 Ti in it, and it's way more than I need for any game I give a shit about playing today.
 
Most games still don't need more than 16 GB RAM. Hardware Unboxed just tested this again. 24-32 GB is preferred to have some headroom.
I can see this. If you're a developer, Docker containers, VMs, and all the crap you need in integration flows can easily add up to needing 32GB+. I run 64GB and can max it out when I have a lot of stuff running... or if I start two Electron apps. But my gaming rig is still just 16GB of DDR4 with a 3080 Ti in it, and it's way more than I need for any game I give a shit about playing today.
I mean, the new COD is the only game that can push beyond the 16GB threshold, and it can easily reach 28GB of VRAM needed. I posted about a monkeytuber that did a benchmark with a modded MI50, and well, I just told you... 28GB...
So that's something to maybe possibly look forward to as tech goes on. Sure, 8GB + a fucktillion of fake frame bullshit is what pretty much all devs will target now, since ngreedia and AMD can push for 8GB VRAM to still be a thing in 2025+, but those that forked out a few G's on their card will still have the option to run natively and eat the 24GB-ish+ of VRAM of their expensive-ass cards... I mean, sort of; the modded MI50 is like $100...

Also, on RAM itself: most games don't load huge-ass maps, so yeah, 16GB is pretty comfy, but if you play large-map games you will hit 16GB easily. Unreal Engine doing a Gamebryo and loading maps as cells will have some impact on the overall RAM needed to play a game, yes, buuut that will depend on timmy tencent's team and how they optimize the overall asset streaming. CPU cores, by contrast, matter more for calculations and shieet; sim games are the ones that need them the most, and curiously enough, X3D CPUs are the ones that make the game run smoother than regular CPUs, something even Xeon cultists will snark over.

Also, sim games are the ones that usually run on two cores tops rather than multithreading, because... I don't know, bugs probably. I mean, these games are usually pretty buggy as is... Minecraft is an example I can give: people say Bedrock is multithreaded while Java isn't, and Bedrock is full of bugs that usually cause death in the most insane ways, like flying and falling to your death midair, or the game just getting stuck in an eternal lag that will kill you eventually...
 
curiously enough, X3D CPUs are the ones that make the game run smoother than regular CPUs, something even Xeon cultists will snark over.
Cache size is one of the most important factors in CPU performance, and it’s a wonder why every manufacturer isn’t pursuing absolute maximum cache size for all their CPUs.
If you want to design fast software on modern CPUs, make your program use a small section of contiguous memory over and over and over again. The bigger your cache, the bigger a “small section” can be and still have the previous statement apply.
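A toy way to see this on your own machine: sum the same numbers through a contiguous slice versus chasing pointers through nodes linked in shuffled order. The gap you see will depend on your cache sizes:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    const n = 1 << 22

    type node struct {
        val  int
        next *node
    }

    func main() {
        // contiguous: one flat slice walked in order,
        // so the prefetcher and cache lines do all the work
        flat := make([]int, n)
        for i := range flat {
            flat[i] = i
        }

        // scattered: same values, but linked in a randomized order,
        // so nearly every hop is a likely cache miss
        nodes := make([]*node, n)
        for i := range nodes {
            nodes[i] = &node{val: i}
        }
        rand.Shuffle(n, func(i, j int) { nodes[i], nodes[j] = nodes[j], nodes[i] })
        for i := 0; i < n-1; i++ {
            nodes[i].next = nodes[i+1]
        }

        start := time.Now()
        sum := 0
        for _, v := range flat {
            sum += v
        }
        fmt.Println("contiguous:   ", time.Since(start), sum)

        start = time.Now()
        sum = 0
        for p := nodes[0]; p != nil; p = p.next {
            sum += p.val
        }
        fmt.Println("pointer chase:", time.Since(start), sum)
    }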
 
AyyMD fanboys fell for it again and the "savior of the PC market" turned out to be as greedy as the rest of them.
https://youtube.com/watch?v=WsCrKGY9F1o
It's great schadenfreude to see all the fools who thought that AMD is somehow "the good guys" like Valve, suddenly piss and shit their pants when a publicly traded multi-billion dollar corporation does publicly traded multi-billion dollar corporation things. Like what the fuck did you expect? They didn't "betray" you; you're just a mouth-breathing idiot who had loyalty to a corpo that doesn't give a shit about you.
Lolcow arc data points:
1. Theater kid title: AMD failed nobody. Anybody who’s knowledgeable in this space knows you never trust AMD in the first place. AMD can’t fail anybody, because nobody has high expectations for AMD.
2. Entirely vague title and thumbnail: solely intended to get people to click the video just to find out what the subject even is.

So what is it about? Just general AMD updates and their shift towards AI. Big deal. What did you expect? That they wouldn’t try to get some of Nvidia’s pie?
 
Theater kid title
It's just PC gamers being mindless cattle and thinking AMD are the Jedi to Nvidia's and Intel's Sith. Same with how Steve keeps shitting on Windows and praising Linux while struggling with Linux benchmarks and only ever making one video, where the title and thumbnail were clickbaity "fuck Windows" type of shit while the video itself prefaced the benchmarks with a slew of major issues Linux gaming has that will effectively dissuade anyone who just wants their shit to work. I wouldn't say this is a "lolcow arc", just Steve getting too comfortable with sensational titles and thumbnails and reinforcing the dumb biases PC gamers hold, like AMD being some godly savior or Linux being infinitely better than Windows just because 11 is ever so slightly more shit than 10.

A bigger sign of a potential lolcow arc, which you'd perhaps have noticed if you bothered to watch the video, was Steve's oil comment. That's the spiciest thing he has said to date.
 
just Steve getting too comfortable with sensational titles and thumbnails and reinforcing the dumb biases PC gamers hold
I was going to disagree with you, but then I hit this part. I actually agree, in general.

The dude is totally hamming up a wholesome-chungus Tech Jesus act, and it's kind of annoying because of how basic it is, and in particular how well it's working for him. A lot of his recent stuff, for instance jumping on the RAM sperging bandwagon without adding any analysis or information of value, is simply lazy and annoying.
 
He finally uploaded the Nvidia keynote review after saying in the previous two keynote reviews that it would already be up.
https://youtube.com/watch?v=mFG3Ah-zf18
This was hard to watch... more so than the Intel and AMD ones, and that's all thanks to Jensen Huang, who acts like such a pompous asshole... the kind of fuck-face that would wind up getting shoved in a locker or having his head shoved in a toilet in a heartbeat by the other kids in school. That's how fucking annoying he is. This motherfucker is so insufferably cringe, I wind up getting 3rd-, 4th-, and 5th-hand embarrassment every time he opens his mouth. The fact that his shit slides kept breaking on him was the icing on the shit sundae he was serving.

The AI fatigue is fucking real. I can physically feel my IQ drop every time it's brought up, and the sooner this fucking bubble explodes, the better.
 
The AI fatigue is fucking real. I can physically feel my IQ drop every time it's brought up, and the sooner this fucking bubble explodes, the better.
It has reached the guberment. Expect them to flush a few billion before it explodes, and then there will be no consequences for the parties involved. Back in the day, the king usually had these people executed in the castle courtyard and invited all the peasants to watch.
 
This was hard to watch... more so than the Intel and AMD ones
Someone posted the Nvidia keynote in a work chat when it was live and I watched 10~15 min. It's not as bad in context... but it was still bad. I told my workmate I'd just watch the GN version where they cut it down to 3 minutes of just the word "AI" over and over again.

it has reached the guberment,
Oracle's first customer was the CIA, Facebook was funded by Thiel and launched the day DARPA shut down LifeLog (a coincidence, nothing to see here), the US Navy built Tor, and Signal was likely funded by the CIA via Open Whisper Systems.

It didn't "reach" the gooberment. Their hands funneled money into the machine back in the '80s/'90s, and it continues today.
 
Oracle's first customer was the CIA, Facebook was funded by Thiel and launched the day DARPA shut down LifeLog (a coincidence, nothing to see here), the US Navy built Tor, and Signal was likely funded by the CIA via Open Whisper Systems.
It didn't "reach" the gooberment. Their hands funneled money into the machine back in the '80s/'90s, and it continues today.
What I meant with "reached the guberment" is that Nvidia basically became a car company and will get direct taxpayer money, while the companies that get funneled dosh as contractors will still get theirs too. Although I did like the funding for the fabs; it will make importing from the US take less time than from China. Even express shipping from China takes a couple of weeks, while from the US it's 3 to 5 days tops. eBay is another thing that takes only a few days to ship. It's fucking awesome not having to wait a fucking month for computer parts.

Also, I kind of forgot to mention that this type of concession money-funneling is old as fuck and happens pretty much everywhere, especially if politicians are allowed to buy shares or let lobbyists give them shares. Kinda like how government construction projects take a ton of time to complete, because the workers turn the laziness up to 11 since they'll get paid anyway.
 
This was hard to watch... more so than the Intel and AMD ones, and that's all thanks to Jensen Huang, who acts like such a pompous asshole... the kind of fuck-face that would wind up getting shoved in a locker or having his head shoved in a toilet in a heartbeat by the other kids in school. That's how fucking annoying he is. This motherfucker is so insufferably cringe, I wind up getting 3rd-, 4th-, and 5th-hand embarrassment every time he opens his mouth. The fact that his shit slides kept breaking on him was the icing on the shit sundae he was serving.

The AI fatigue is fucking real. I can physically feel my IQ drop every time it's brought up, and the sooner this fucking bubble explodes, the better.
His jacket is so tasteless it physically hurts to look at him.
 
It's just PC gamers being mindless cattle and thinking AMD are the Jedi to Nvidia's and Intel's Sith. Same with how Steve keeps shitting on Windows and praising Linux while struggling with Linux benchmarks and only ever making one video, where the title and thumbnail were clickbaity "fuck Windows" type of shit while the video itself prefaced the benchmarks with a slew of major issues Linux gaming has that will effectively dissuade anyone who just wants their shit to work. I wouldn't say this is a "lolcow arc", just Steve getting too comfortable with sensational titles and thumbnails and reinforcing the dumb biases PC gamers hold, like AMD being some godly savior or Linux being infinitely better than Windows just because 11 is ever so slightly more shit than 10.

A bigger sign of a potential lolcow arc, which you'd perhaps have noticed if you bothered to watch the video, was Steve's oil comment. That's the spiciest thing he has said to date.
Fanboying for a company (or person) is idiotic and always will be. Stripped down to its most basic, it's investing time and money on the basis of something other than "what do I get for that time and money?" Add in the emotional commitment that accompanies it, and rationality has long departed. It's the Steve Jobs religion approach to marketing: faith, not facts.

Steve (GN Steve) comes across as a typical soft liberal. He pontificates about green and other social issues whilst salivating over multi-hundred-dollar cases, PSUs north of a kW, and GPUs and other components that are, in resource, energy-use, and cost terms, far in excess of his audience's need (as opposed to want), and tries to ignore the more unsavoury elements of their production chains while flying round the world to uncritically present the views of those wanting to sell things to his audience. (People almost universally recoil in horror at the nature of UK factories during the Industrial Revolution ("the dark Satanic Mills"). Funny thing is, they never needed suicide nets, and life expectancy and living standards, even at the bottom of society, dramatically improved.)

That oblivious but obvious hypocrisy is nowhere near enough for lolcow status; it's unfortunately far too common. What might get him there is his increasingly common tendency to make grand, self-congratulatory assertions of virtue on matters he has no real understanding of. The oil ref is just the latest, but we've had it on tariffs, markets, and economics generally in recent times. He's not gone full "muh capitalism bad amirite" yet, but it's a dangerous road to start down.
 
jumping on the RAM sperging bandwagon without adding any analysis or information of value, is simply lazy and annoying.
How can you be this dense? He was arguably the first one to report on it, within hours of RAM prices skyrocketing. It would be a different story if it was weeks or days after, but no: GN was one of the first to report on it.
 
Steve (GN Steve) comes across as a typical soft liberal. He pontificates about green and other social issues whilst salivating over multi-hundred-dollar cases, PSUs north of a kW, and GPUs and other components that are, in resource, energy-use, and cost terms, far in excess of his audience's need (as opposed to want), and tries to ignore the more unsavoury elements of their production chains while flying round the world to uncritically present the views of those wanting to sell things to his audience. (People almost universally recoil in horror at the nature of UK factories during the Industrial Revolution ("the dark Satanic Mills"). Funny thing is, they never needed suicide nets, and life expectancy and living standards, even at the bottom of society, dramatically improved.)
He dresses that stance up with a ‘consumer advocacy’ coat of paint, but it’s nonsensical. He seems to think that Intel, AMD, and Nvidia should only work on incremental upgrades and never deviate from the current paradigm because it’s ‘for the consumer’ and yet most of the tech that benefits consumers starts as a need for industry and business. And consumers are often business owners as well.

You’d think that, being a business owner himself and with all his industry contacts, he’d get this, but he just doesn’t.
 
He seems to think that Intel, AMD, and Nvidia should only work on incremental upgrades and never deviate from the current paradigm because it’s ‘for the consumer’ and yet most of the tech that benefits consumers starts as a need for industry and business. And consumers are often business owners as well.

I haven't watched GN long enough to see it that way. The most recent videos criticised a clear absence of consumer products at an event that's supposed to cater to, well, just that. He actually had something positive to say about big N's server bit, which is impressive considering it was 99% Jensen performing a humiliation ritual. (Something about the layered memory approach. Idk.)

The sharpest criticism is about N's shift towards the military-industrial complex and how that's costing everyone else and fucking them over. I find it rather bizarre that any reasonable person would take issue with GN on that.
 
He dresses that stance up with a ‘consumer advocacy’ coat of paint, but it’s nonsensical. He seems to think that Intel, AMD, and Nvidia should only work on incremental upgrades and never deviate from the current paradigm because it’s ‘for the consumer’

It may not have come out in this video, but the whole argument he's been making is that all these shops (Intel, AMD, Nvidia, Micron) have been getting CHIPS Act money. Taxpayer money, funding something that's ultimately fucking over the taxpayer/regular PC-building consumer. If it was just a matter of companies shifting and leaving consumer markets, that'd suck, but it'd still feel like a regular "free market" situation. But it's not. It's taxpayer money subsidizing enterprise products.

He actually had something positive to say about big N's server bit, which is impressive considering it was 99% Jensen performing a humiliation ritual.

I'm glad he did go over the impressive stuff in the second act of this video, and didn't just focus on the negative. Jensen talking about the machines being able to "think" and "reason" shows he either has no fucking idea at all how predictive token generation in LLMs actually works, or he fully realizes it's bullshit but is saying all the marketing speak that was written for him anyway. It doesn't matter if the teleprompter goes off and he has to improvise, because what was written on there literally didn't make any sense to begin with.

Who is all of this for? We're seeing OpenAI and Anthropic struggle to make a return, even with companies buying up GitHub Copilot licenses and consumers actually paying for the $200+ GPT Pro plans to get the older models back and regain access to their GPT girlfriends.

If you actually run this shit at home, you'll immediately see how computationally expensive it all is. I've got ComfyUI running on an AMD Pro R9700 and an older Pro W6800. For some of the more complex models, generating an image takes about the same amount of time on both cards (~18 min for a batch of 8~10, depending on the prompt and LoRAs loaded). The newer R9700 is faster at loading/switching models with the newer gen of PCIe. From what I've read, this pales in comparison to even the lowest-end Nvidia chips, but a 5070 isn't going to have 32GB of RAM. Rendering in Linux on an AMD card also slows the machine to a crawl and makes it unusable, so I'm often running renders on the machine I'm not currently using.

So maybe an Nvidia card would be nice for some of this stuff, but the useful ones are still insanely expensive. Most DGX units are $3k~$5k, and they often come in two-packs for people who, I guess, think a set of AI bricks is worth the price of a usable 2000s Honda Civic.
 