imec (a Belgian semiconductor R&D institute) has a roadmap planning slow and steady progress out to 2039, with new GAAFET transistors and their successors, and other technologies deploying in parallel, like backside power delivery. They assume the use of high-numerical-aperture extreme ultraviolet lithography (high-NA EUV, 0.55 NA), and later beyond-high-NA EUV (0.75 NA).
So there's a plan to follow for the next 15 years, even if the gains are much slower and more modest than they were 20-50 years ago. AFAIK, it doesn't lean into 3D stacking of logic, beyond changing the 3D structure of the transistors themselves, as with future complementary FETs (CFETs). If we can figure out how to build layers of logic transistors without heat destroying them, there are potentially decades more improvements in transistor density to come after 2040.
SRAM (e.g. L3 cache) has been successfully 3D stacked in a 1-layer configuration since the introduction of AMD's X3D CPUs. Because SRAM no longer scales well with new process nodes, and it takes up a large share (e.g. 50-70%) of typical chip area, moving it off the main die can make chips smaller, cheaper, and better performing. More than 1 layer should already be possible. Out of necessity, this could become a standard feature rather than a premium one in the future.
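To see why shrinking the main die matters so much for cost, here's a rough sketch using the classic Poisson yield model. The wafer cost, defect density, and die sizes below are made-up illustrative numbers, not any foundry's real figures:

```python
import math

WAFER_DIAMETER_MM = 300.0
WAFER_COST = 15000.0    # assumed cost per wafer, USD
DEFECT_DENSITY = 0.001  # assumed defects per mm^2 (D0)

def dies_per_wafer(die_area_mm2: float) -> float:
    """Common approximation: gross wafer area minus edge losses."""
    r = WAFER_DIAMETER_MM / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2: float) -> float:
    """Poisson yield model: Y = exp(-D0 * A)."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2)

def cost_per_good_die(die_area_mm2: float) -> float:
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction(die_area_mm2)
    return WAFER_COST / good_dies

# A hypothetical 150 mm^2 die where SRAM is ~60% of the area,
# versus the ~60 mm^2 logic-only die left after stacking the SRAM.
print(f"150 mm^2 die: ${cost_per_good_die(150):.2f}")
print(f" 60 mm^2 die: ${cost_per_good_die(60):.2f}")
```

The smaller die wins twice: more dies fit on the wafer, and each one is less likely to catch a killer defect, so cost drops faster than area does.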
The major memory manufacturers are planning 3D DRAM for the early 2030s, analogous to the move NAND flash storage made starting in 2013. 3D DRAM is harder to build, but once they pull it off, I anticipate at least 1-2 orders of magnitude improvement in density and cost. That's more exciting to me than what we're getting with CPUs/GPUs. I think you're looking at $0.10/GB, or 1 TB of RAM for $100, by 2040, if not cheaper. Before the recent explosion in pricing, we were between $1 and $2/GB. Application memory requirements mostly stalled out as RAM scaling slowed down 15 years ago, but LLMs are a big new memory hog. RAM needs to be cheap and plentiful if you want to run local models with hundreds of billions of parameters.
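For a sense of scale on that last claim, here's the back-of-the-envelope arithmetic. The parameter counts, quantization widths, and prices are assumptions for illustration, not specs for any particular model:

```python
GB = 1e9

def model_ram_gb(params: float, bytes_per_param: float) -> float:
    """RAM needed just to hold the weights (ignores KV cache, etc.)."""
    return params * bytes_per_param / GB

for params in (70e9, 200e9, 400e9):
    for bytes_per_param, label in ((2.0, "fp16"), (0.5, "4-bit")):
        gb = model_ram_gb(params, bytes_per_param)
        # $1.50/GB ~ pre-spike DRAM pricing; $0.10/GB ~ the 2040 guess above
        print(f"{params / 1e9:.0f}B @ {label}: {gb:,.0f} GB "
              f"(~${gb * 1.50:,.0f} then vs ~${gb * 0.10:,.0f} at $0.10/GB)")
```

Even 4-bit quantized, a 400B-parameter model needs ~200 GB just for weights: a four-figure RAM purchase at pre-spike prices, but a ~$20 one at $0.10/GB.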
There's a race between many companies to make quantum computers practical. Eventually, they could fall into the hands of consumers, if they can be made to work at so-called "room temperature". But it's still unclear what consumer-facing software they would be useful for. The exciting part is that if the usual semiconductor manufacturing methods can be applied to quantum computers, we could see rapid exponential growth as qubit counts follow behind transistor counts.
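As a toy illustration of that scenario (the starting count and doubling period are assumptions, not a forecast): if qubit counts started around today's largest ~1,000-qubit chips and doubled every 2 years, Moore's-law style, you'd cross a million physical qubits around 2045:

```python
# Toy "qubits follow Moore's law" projection; all inputs are assumptions.
start_year, start_qubits = 2025, 1_000
doubling_years = 2.0

for year in range(start_year, 2046, 5):
    qubits = start_qubits * 2 ** ((year - start_year) / doubling_years)
    print(f"{year}: ~{qubits:,.0f} physical qubits")
```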
Aliens are real and there's a lot of hope/cope that President Trump will be more interested in revealing them than the Epstein files.