These people are so fucking stupid, man. I hope that wasn't some bum they talked into doing this. They'll 100% kill someone, which I guess isn't surprising given that's what they do. "Teenagers Kill Elderly Man With Watermelon" should tell you everything.Please let this catch on.
watermelon challenge.mp4
There's definitely shit heads everywhere, but Karen was supposed to be some annoying, entitled busybody. "I want to speak to the manager" is like where she tops out at, which is the opposite of Shaniqua. But there's tons of reasons to legitimately need to see the manager, but you don't want to risk being a Karen!I have seen some bodycam or other just civilian videos where the white Karen is nearly as bad or even worse than the average nigger in a similar setting but, having heard what I have about customer service in general, the white woman is typically a much, much safer option.
Even when white women are nags or upset, the stereotype is requesting an authority because they understand chain of command and shit. Blacks seem to have zero concept of that shit. A police officer with a gun pointed at me, telling me if I reach for my pockets, they're going to shoot me? LOL, bullshit. I'll hit him so he knows who is really in charge.
Anyway, the point I was making is the rapid, overnight, public acceptance of this term. Now it's used to silence white women for daring to call something out, such as Citibike "Karen." It's morphing into "all white women are inherently racist, so when a white woman calls something out, especially with other races, she deserves everything coming."
Agreed, but these aren't issues unique to LLMs. The same script kiddie will delete his shit by copying and pasting from the garbage result he Googled, too. I mean, a huge % of the available web and other human writing is the foundation of LLM training, so it's biased toward all the hugely propagandized topics, of course.

I've used DeepSeek and ChatGPT with a pretty decent amount of success but, like with any tool, being skilled in its use means knowing its limits. I will rarely ask any LLM for culinary advice save for simple things I can immediately vet. I've read about people copying and pasting AI-generated shell scripts and then nuking their files, when anyone a little knowledgeable about Linux and Unix scripting who just gave the output a quick once-over could have seen the deletion was anything but obfuscated, as in rm -r sitting right there several times in a short script. Also, if you ask DeepSeek anything about Xinjiang or organ harvesting in China or whatever, it immediately goes full propaganda mode for the CPC. It's somehow disturbing to look at.
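Not claiming this is anyone's official workflow, just a rough sketch of that "quick once-over" done automatically: grep the pasted script for the obviously destructive stuff before you ever run it. The filename ai_script.sh and the pattern list are made up, adjust to taste.

# Rough sketch: flag destructive-looking lines in a pasted script
# before executing it. "ai_script.sh" is a placeholder filename.
grep -nE 'rm[[:space:]]+-[A-Za-z]*r|mkfs|dd[[:space:]]+if=' ai_script.sh \
  && echo "Destructive-looking lines found above -- read them before running." \
  || echo "Nothing obvious, but still read the whole thing first."

It won't catch anything actually obfuscated, but it would have caught the plain rm -r cases people keep nuking themselves with.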
LLMs are amazing. I'm puzzled how people think search result pages are better. I still maintain people just suck at using them. Haven't tried DeepSeek yet.