Eh, I don't think it's that bad.
Well, if they want to commit codebase sudoku, they're welcome to go ahead. Curious to see how the actually informed investors take the news that Facebook is no longer interested in hiring the best - because that is what this is saying: they're more interested in hiring people who can effectively prod an AI into holding their hand than people who actually know how to do a thing on their own. Guess they're betting on uneducated retail investors reading this as AI positivity and bumping the stock enough to keep the major firms onboard.
This is one of those hiring decisions that doesn't hurt too much at first, then rapidly turns septic well after it's too late to course-correct. AI-assisted talent is doubly limited: capped by what the AI systems themselves know, and lacking the technical foundations to develop past the AI, precisely because of that reliance on it. In the short term, in the immediate, likely junior positions you hired them into? Not the end of the world. A couple years down the line, when you're trying to actually do something new, or promote team members, you have no talent that can actually be moved, because that talent never actually developed - it just coasted on the AI.

It actually gets worse for them, too. As a codebase accumulates its inevitable technical debt - particularly the odd kinds of redundancies AI-generated code produces - you're left with developers who don't understand the foundations well enough to actually fix it, because the AI did it for them. That generally turns into stacking hack atop hack to make deadlines, with the hope of a proper refactor somewhere in the future. With AI code, though, those weird floating redundancies can easily turn into an absolute nightmare of fatal conflicts and edge-case bugs that the AI can't identify - and now neither can the undertalented developer working with it.
They're effectively setting themselves up for a one-two punch of terminal code rot and no talent development among the people who rotted it, leaving them with a brittle, dysfunctional product and people who never even had the opportunity to learn from the mistakes of its development. They never made the core decisions, the AI did, so the lessons they take away in retrospect are the wrong ones. Everyone fucks up a codebase with junior-level decisions, but everyone learns from that mistake just by having actually made it, and by now being able to see how they'd achieve the same goal differently.
When it'll blow up on them, or how big it'll be? No idea. It'll depend on how long they stay the course before the business inevitably pivots again. Six to twelve months of this kinda hiring practice, and the implosion might never even be visible to us. But if they go past that, it'll be a lovely shitshow at some point in the future.