Business Big Tech Layoffs Megathread - Techbros... we got too cocky...

Since my previous thread kinda-sorta turned into a soft megathread, and the tech layoffs will continue until morale improves, I think it's better to group them all together.

For those who want a QRD:


Just this week we've had these going on:

[attached: three screenshots of this week's layoff announcements]

But it's not just Big Tech; the vidya industry is also cleaning house bigly:

[attached: screenshot of game-industry layoff news]

All in all, rough seas ahead for the techbros.
 
It will break sales records: launch day, launch weekend, first week, and first month sales. That's a given. What happens afterwards is where the game goes to shit.
RDR2 sold 65 million copies and it's all the niggercattle talked about how it was the best, most realistic, most amazing game ever, day in, day out, for about a week. Then it was never talked about or mentioned again. I brought up the game a few weeks after and received "meh/I haven't played it in weeks/I haven't turned my PS5 on in weeks".

If that happens to GTA6, then Rockstar are cooked.
The core gameplay loop will be good.

And GTA Online will be nuts for the first ~90 days until it's a pay to win hellscape.
 
When one of my superiors told me and a few others that we should all learn to "program via prompts" I thought he was being a retard; months later, I've been proven right.
I admit I don't know much about "AI" and even less about how it actually works, but what's so special about prompting that it's treated as this arcane knowledge that needs to be mastered? If you know how to program, not just write code, then it is trivial to ask the machine "hey, I am working with this language and this framework and I need to do x, how can I make a function that can do x but also validate that y is a proper input?"
I want to believe I'm missing something and it's not just course salesmen trying to inflate the image of "prompting"

Edit: ah I didn't see the reply a few posts down, that answers it perfectly.
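For what it's worth, the kind of request described above ("do x, but also validate that y is a proper input") is exactly the sort of thing these tools handle without any arcane prompting. A minimal sketch of the kind of function you'd get back (the function and its validation rules are hypothetical, just to illustrate the shape of the answer):

```python
def scale_scores(scores, factor):
    """Scale a list of numeric scores by a factor (the "x"),
    after validating the input (the "y")."""
    # Validation: reject anything that isn't a non-empty list of numbers.
    if not isinstance(scores, list) or not scores:
        raise ValueError("scores must be a non-empty list")
    if not all(isinstance(s, (int, float)) for s in scores):
        raise ValueError("every score must be a number")
    # The actual work: multiply each score by the factor.
    return [s * factor for s in scores]
```

Nothing about asking for that requires a course; it's the same thing you'd ask a coworker.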
 
Avoid power levelling but I’m currently sweeping up the mess of an Actual Indian who did needful program by prompt for a project. I’m just a humble Excel whisperer but am somehow fixing the mess Mutahar level engineers caused.
 
I am just a database jockey, but a while back a division at my workplace was harassed by a third party into adopting Indians. The government actually hired them behind management's back, and it put the division months behind on its work. In the end, we had to consult one of our contacts and use some of their programmers to correct the errors, incurring significant costs and lost time.

We were a very liberal company, but the stuff I heard from white women was very based. HR now has Indian applicants on outright auto-ban. I do mostly finance stuff and warned that this would end horribly, along with my insanely based manager. It's nice being the "we told you so" group.

It feels like these people stalk codemy users but have no idea how to program. They don't realize that this affects other divisions too.
 
So they want to hire more Jeets and this is the only way to get more Jeets to pass their already low standards.
More like they can't get anyone under 25 to pass the tests and technical interviewers are being too damn anal about AI. They wouldn't even need to do this to hire jeets since they coach and train them to pass the interviews to begin with.
 
Well, if they want to commit codebase sudoku, they're welcome to go ahead. Curious to see how the actually informed investors take this news that Facebook is no longer interested in hiring the best, because that is what this is saying: that they're more interested in hiring people who can effectively prod an AI into holding their hand over people who actually know how to do a thing on their own. Guess they're hedging on the uneducated retail investors reading this as AI positivity and bumping the stock enough to keep the major firms onboard.

This is one of those hiring decisions that doesn't hurt too much at first, then rapidly turns septic well after it's too late to course correct. AI-assisted talent is both limited to the knowledge the AI systems themselves possess and, through that very reliance, lacks the technical foundations to ever develop past the AI. In the short term, in the immediate, likely lower position you hired them into? Not the end of the world. A couple of years down the line, when you're trying to actually do something new, or promote team members, you have no talent that can actually be moved, because that talent never actually developed; it just coasted on AI.

It actually gets worse for them too. As a codebase accumulates inevitable technical debt, and particularly the odd types of redundancies AI code generates, you're left with developers who don't understand the foundations well enough to actually fix it, because the AI did it for them. This generally turns into stacking hack atop hack to make deadlines, with the hope of a proper refactor in the future. With AI code, though, those weird floating redundancies can easily turn into an absolute nightmare of fatal conflicts and edge-case bugs that the AI can't identify, and now neither can the undertalented developer working with it.

They're effectively setting themselves up for a one-two punch of terminal code rot and no talent development among the people who rotted it, leaving them with a brittle, dysfunctional product and people who never even had the opportunity to actually learn from the mistakes of its development. They never made the core decisions, the AI did, leaving them with the wrong lessons learned in retrospect. Everyone fucks up a codebase with junior-level decisions, but everyone learns from that mistake just by having actually done it, and by now being able to see how they'd achieve the same goal differently.

When it'll blow up on them, or how big it'll be? No idea. It'll depend on how long they stay this course before business inevitably pivots again. Six to twelve months of this kinda hiring practice, and the implosion might never even be visible to us. But if they go past that, it'll be a lovely shitshow at some point in the future.
 
Eh, I don't think it's that bad.
In normal circumstances (read: smaller, smarter company) this would be a fairly decent decision. Give the candidate a problem, let them use AI to help determine a solution, and then question them on their solution and the choices made in coming up with it. Does the solution work when run? How have they tested it? Are there any optimizations they can think of adding? The goal isn't leetcode knowledge as much as seeing how the candidate thinks and problem solves (and how quickly), and no amount of AI use can cover that up if you're interviewing with that goal in mind.

Of course this is Facebook though so naturally it's just going to be leetcode 2.0, just with standardized prompt regurgitation instead of rattling off optimized algorithm knowledge.
 
Speaking of,

[attached screenshot]


Can't say I disagree with the zoomers on this one.

In normal circumstances (read: smaller, smarter company) this would be a fairly decent decision.
Exactly what I prefer, but each company has their own hiring method so interviews as a whole are a crapshoot.

Throw in AI agents into the mix, and it's a recipe for disaster.
 
HR, stretched thin? Nothing about HR workers is thin.

But seriously, what the fuck does HR actually DO at this point? They don't interview, they don't hire, they don't do any internal support. Are they seriously just dicking around on Instagram 8 hours a day?
 
But seriously, what the fuck does HR actually DO at this point?
Protecting the company from its employees.
 
Are they seriously just dicking around on Instagram 8 hours a day?
Of course not. They're on TikTok.


Protecting the company from its employees.
The corporate version of human shields.
 
Can't say I disagree with the zoomers on this one.

Throw in AI agents into the mix, and it's a recipe for disaster.
Hilarious to see this after reading about a company that does hiring/HR work being sued over its AI discriminating against candidates based on age.

And yes, I ask myself the same thing: what does HR do at this point? Harassing workers?
 