Moore's law had a great run: ~40 years from early 60s to early 00s.
During that time, every generation boosted density, gate count, clock speed, and value per dollar.
The rule of thumb was exponential: a 2x improvement every 18 months.
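It's worth pausing on how much compounding that is. A minimal back-of-the-envelope check (the 40-year span and 18-month doubling period are taken from the text; everything else is just arithmetic):

```python
# Compounding "2x every 18 months" over the ~40-year run described above.
years = 40
doubling_period = 1.5          # years, i.e. 18 months
doublings = years / doubling_period   # about 26.7 doublings
growth = 2 ** doublings               # total improvement factor

print(f"{doublings:.1f} doublings -> ~{growth:.1e}x improvement")
```

About 27 doublings, i.e. roughly a hundred-million-fold improvement over the run.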
Everyone knew it had to stop sometime: you can't make things smaller than atoms.
What finally did stop it (considerably north of atom-scale) was gate tunnelling current.
In a MOSFET, the gate is separated from the channel by an insulator (SiO2).
As you scale the transistor down, that insulator gets thinner, along with everything else.
When the insulator becomes thinner than the wavelength of an electron, significant tunnelling current starts to flow through it.
This acts like a short circuit from power to ground.
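The reason thinning the oxide is so punishing is that tunnelling depends exponentially on barrier thickness. A rough sketch of that scaling, using the standard WKB transmission factor exp(-2*kappa*t); the barrier height (~3.1 eV for Si/SiO2) and effective mass (~0.4 electron masses) are assumed textbook values, not figures from the text above:

```python
import math

# Physical constants
HBAR = 1.0545718e-34        # reduced Planck constant, J*s
M_E = 9.1093837e-31         # electron rest mass, kg
EV = 1.602176634e-19        # one electron-volt, J

# Assumed barrier parameters for the Si/SiO2 interface (textbook-ish values)
phi = 3.1 * EV              # barrier height
m_eff = 0.4 * M_E           # effective mass in the oxide

# WKB decay constant: kappa = sqrt(2*m*phi) / hbar, in 1/m
kappa = math.sqrt(2 * m_eff * phi) / HBAR

def transmission(t_nm):
    """Relative tunnelling probability through a barrier t_nm nanometres thick."""
    return math.exp(-2 * kappa * t_nm * 1e-9)

# Thinning the oxide from 2.0 nm to 1.5 nm multiplies the leakage by exp(kappa * 1 nm)
ratio = transmission(1.5) / transmission(2.0)
print(f"kappa ~ {kappa * 1e-9:.1f} per nm; 2.0 -> 1.5 nm raises leakage ~{ratio:.0f}x")
```

Under these assumptions, shaving half a nanometre off the oxide multiplies the leakage by a few hundred, which is why the leakage went from negligible to dominant within a couple of process generations.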
The technology hit the wall around 2003.
Gate tunnelling current then accounted for over half of total power dissipation.
The power density of the CPU chip was 150 W/cm^2 (like a stove top),
and going further was clearly impractical.
As it happens, the clock speed at that design node was 3 GHz,
and that's pretty much where we are today.
Everything since then has been building bigger, not faster: multi-core, bigger caches, SoCs;
plus architectural tweaks and optimizations, like pipelining and superscalar execution.
It was a great run while it lasted, but it's over,
and we're not getting another one without a fundamental scientific/technological breakthrough,
on the order of coal, or steel, or quantum mechanics.