That's the way it looks to enthusiasts, but that's not what Intel has been doing. About a decade ago we hit the point where processors were "fast enough" for mainstream tasks, and people stopped buying i5 and i7 processors in favor of i3, Celeron, and Pentium. For the last 10 years, only enthusiasts and gamers have cared about improved performance; the vast majority of the market cared more about power consumption. Intel hasn't been worried about AMD - they've been scared to death of ARM. That's why they rushed Atom to market: to keep the low end on x86/x64 instead of ceding it to ARM.
So they haven't been resting on their laurels. They've been working hard at reducing power consumption. That's what really hurt AMD after they lost their performance lead. For a few years AMD still offered more performance per Watt, making AMD the natural choice for moderate-load servers and systems meant to be left on 24/7. But Intel soon beat AMD there, taking away AMD's only advantage. (That's when AMD used their ATI acquisition to integrate a GPU which could beat Intel's integrated GPU - essentially carving out a spot in the low-price gamer market.)
A Core 2 Duo system would use about 70 Watts idle, 150 Watts under load. A Sandy Bridge system would use about 45 Watts idle, 120 Watts under load. A modern Skylake system uses about 35 Watts idle, 80 Watts under load. Subtract the 20-30 Watts consumed by components other than the CPU, and the CPU itself has gone from roughly 40-50 Watts idle and 120-130 Watts under load to roughly 5-15 Watts idle and 50-60 Watts under load. That reduction in CPU power consumption over the last 10 years has been remarkable.
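To make the subtraction explicit, here's a quick back-of-the-envelope sketch. The system-level figures are the approximate ones quoted above; the 25 W figure for non-CPU components is an assumption (the midpoint of the 20-30 Watt range), so treat the results as rough estimates, not measurements.

```python
# Rough CPU-only power estimates, derived by subtracting an assumed
# non-CPU draw from the approximate whole-system figures quoted above.
systems = {
    "Core 2 Duo":   {"idle": 70, "load": 150},
    "Sandy Bridge": {"idle": 45, "load": 120},
    "Skylake":      {"idle": 35, "load": 80},
}

OTHER_COMPONENTS = 25  # assumed non-CPU draw in Watts (midpoint of 20-30)

for name, watts in systems.items():
    cpu_idle = watts["idle"] - OTHER_COMPONENTS
    cpu_load = watts["load"] - OTHER_COMPONENTS
    print(f"{name}: ~{cpu_idle} W idle, ~{cpu_load} W load (CPU only)")
```

Run it and the Core 2 Duo CPU comes out around 45 W idle / 125 W load versus roughly 10 W idle / 55 W load for Skylake - a 3-4x drop at both ends.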