I can't believe I have to spell this out.
Let's look at a game like Far Cry.
To get good frame rates (graphics aside), back in the day you needed a desktop CPU. And that was fine, because Far Cry ran well on the mainstream desktop CPUs of the day.
Today there is Far Cry 5 or whatever they are up to. To get decent frame rates, you still need a desktop CPU, just like you always did. Because spoiler ... not only are the CPUs better today, so are the games. And the operating systems, and the productivity programs, and everything else.
Now, when developers optimize their products, they look at the tech of the day. They'll say "Hmmmm. What's the shittiest PC people are likely to have right now?" Say an average PC from 5 years earlier. Then they'll make the game work on that hardware at low settings, write down what those settings were, and call them the minimum requirements.
Then they'll say "Ok, so what is an average PC of today like?" and they'll make the medium settings of the game work for that hardware and call it recommended settings.
Then they'll say "We need to sell this game and so we want it to look boss. What are the best PCs enthusiast gamers have right now?" and they'll make sure their game with all settings at ultra match the best GPU and CPU combos of the day for the best gameplay experience possible.
Now, whether that was in 2001 or 2018, the recommended settings are going to be based on a 65W desktop processor of that day, which historically meant Intel's mainstream desktop offering.
So, if you wanted recommended settings in 2001, you needed a 65W desktop CPU, and those weren't in laptops back then because of heat. But today they are in laptops, and so ... you can play at recommended settings and enjoy the top titles of today without turning everything down to low.
And so ... because you can now get desktop CPUs and GPU chips into SFF and laptop devices, you can for the first time enjoy the same experience as a desktop user ... and your device will be filled with heat pipes to make that happen.