Well, some of the power is passed on to other components. The CPU needs to talk to the RAM? It needs to send electrical signals to it. Wants to talk to the GPU? More signals needed. The motherboard, resistors, signals to drives and other chips.
This is negligible, since the purpose of these signals is to transmit information and not power, so there isn't a lot of current involved. But more importantly, this will just be turned into waste heat when it reaches those components anyway, so it makes no difference to the total heat the system generates.
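To put a rough number on "not a lot of current involved": the dynamic power of toggling a signal line is approximately P = C·V²·f. The values below (10 pF of trace capacitance, 1.2 V swing, 1 GHz toggle rate) are illustrative assumptions, not measurements of any real board:

```python
# Rough dynamic power of toggling one signal trace: P = C * V^2 * f.
# All three values are illustrative assumptions, not measurements.
C = 10e-12   # trace capacitance, farads (assumed 10 pF)
V = 1.2      # signal swing, volts (assumed)
f = 1e9      # toggle rate, hertz (assumed 1 GHz)

P = C * V**2 * f
print(f"{P * 1e3:.1f} mW per line")  # 14.4 mW per line
```

A few tens of milliwatts per line is tiny next to a ~100 W CPU, and as noted, it all ends up as heat in whatever component receives the signal anyway.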
Some is lost to electromagnetic radiation.
Which is a form of heat, albeit not one that needs to be extracted by a cooling system.
Some is noise; you can hear a CPU.
Not over a 10W speaker. Also negligible.
Some is vibration. If a CPU operates at 4GHz ... that's a lot of very shallow-amplitude but high-frequency oscillation. Imperceptible to you as a human, but guzzling power all the same.
4 GHz waves cannot propagate effectively in air. A 4 GHz sound wave in air has a wavelength of about 86 nm, which is short enough that it spans only a couple of dozen molecular spacings. You're playing pool with individual molecules at this point. This means the vibrations are never going to leave the computer, and therefore do not qualify as a form of loss.
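The wavelength figure follows directly from the speed of sound in air (about 343 m/s at room temperature):

```python
# Wavelength of a 4 GHz pressure wave in air: lambda = v / f.
v = 343.0    # speed of sound in air at ~20 degrees C, m/s
f = 4e9      # oscillation frequency, Hz

wavelength = v / f
print(f"{wavelength * 1e9:.0f} nm")  # 86 nm
```

For comparison, the mean free path of air molecules at atmospheric pressure is around 68 nm, which is why a wave this short can't coherently propagate at all.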
Guess what happens to vibrations when they impart energy but don't travel very far as waves. Go on, you can think about this one for a bit.
Electrostatic losses, when not all power sent to capacitors/transistors reaches them.
Another form of heat...
These days even quantum tunneling is a loss. Not all your electrons do the thing they were sent to do.
"These days"? Are you aware that quantum tunneling didn't just spawn into existence when it was discovered?
Nevertheless, this changes nothing: the electrons themselves aren't consumed, and the energy carried by tunneling leakage current is dissipated as heat like everything else.
There are loads. A CPU isn't just made to generate heat.
Actually, that is literally what a CPU is designed to do. It's an entropy machine.
Are you familiar with the Second Law of Thermodynamics? It states that the total entropy of a closed system can never decrease. Now consider what a computer is supposed to do. The job of a computer is to take disorganised data -- with high entropy -- and turn it into a form that can be easily accessed by humans -- with low entropy.
Now we have a problem. To do its job, a computer needs to reduce the total entropy in its memory banks. Since the entropy in a closed system cannot decrease, this means that the computer cannot be a closed system, and it must generate more entropy than it removes. The more entropy it generates, the better it performs.
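There's even a theoretical floor on this trade-off. Landauer's principle (which formalises the argument above) says that erasing one bit of information must dissipate at least k·T·ln 2 of heat into the environment. A quick sanity check at room temperature:

```python
import math

# Landauer limit: minimum heat dissipated per bit erased, E = k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (assumed)

E = k_B * T * math.log(2)
print(f"{E:.2e} J per bit")  # 2.87e-21 J per bit

# Even erasing a billion bits per second only *must* dissipate ~3 pW.
# A real CPU burns ~100 W, so it generates vastly more entropy than
# the logical minimum the Second Law demands.
```

So real hardware sits about thirteen orders of magnitude above the thermodynamic minimum; the heat is overwhelmingly engineering overhead, but the floor itself is non-zero.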
This is the technical reason for the common-sense correlation between higher TDP and better performance. When it comes to computers, better performance is more waste heat.