On the general notion:
Consoles were a much better option back in the '70s, '80s, and even the early '90s. Computer hardware was expensive (a Tandy 1000 listed at around $1,200 in 1984, versus roughly $200 for an NES at launch), and without a hard drive you had to run everything from floppy disks anyway, so why buy a very expensive computer that could run games when a much cheaper solution was available? Especially since PC programming was extremely constrained: memory was tight, drivers were a pain to code, and a program had to fit everything on a 5.25" floppy disk.
Granted, the NES wasn't exactly a complex machine even back then. It could be made cheaply, and developers only had to deal with one language (6502 assembly) on one fixed hardware set. PC gaming couldn't compete on cost, but it made up for it with more complex games, like adventure games.
The Sega Genesis and the SNES were also big sellers, and their graphics were pretty good for the time. And what did the PC have? Windows 3.0 and DOS. Granted, there are some great DOS games from that era, but getting them to work required actual knowledge of how computers worked. You had to pick the right video mode (EGA, VGA, SVGA) as well as set your sound card's IRQ, and you usually had to type commands at a prompt just to get something running. The console gave players ease of use at an affordable price. And with both Nintendo and Sega not porting their games to PC for profit reasons (they'd have been morons if they had), this was the start of the console exclusives. Today, consoles aren't much different from PCs, just with very specific hardware designed for high-resolution graphics rendering. But they still hold that place as the "traditional" gaming machine.
As for the popularity of games: as I'm sure most of the programmers here will agree, coding for one set of hardware under one OS is much easier than coding for multiple hardware types under multiple OSes. Heck, just wiring up all the graphical options in a game (toggles for anti-aliasing, render distance, sky effects, etc.) takes time. On a console you don't have to care about any of that; just make it work on the one piece of hardware it needs to work on and you're done. Debugging is far simpler, and as such development is, theoretically, cheaper.
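To make that concrete, here's a minimal C++ sketch (all names hypothetical, not taken from any real engine) of the settings plumbing a PC build carries that a fixed-hardware console build can compile away entirely:

    #include <cstdint>
    #include <iostream>

    // Hypothetical graphics options a PC build must expose to the player.
    struct GraphicsOptions {
        bool     antiAliasing   = true;   // AA on/off
        uint32_t renderDistance = 1000;   // in world units
        bool     skyEffects     = true;   // clouds, haze, etc.
    };

    #if defined(CONSOLE_BUILD)
    // One known hardware target: tune the values once at build time and
    // never ship a settings menu at all.
    constexpr GraphicsOptions kOptions{};
    #else
    // PC target: every field has to be user-configurable, persisted, and
    // validated against whatever GPU/driver combo the player actually has.
    GraphicsOptions loadOptions() {
        GraphicsOptions opts;
        // ...read a config file here, clamp values to detected hardware limits...
        return opts;
    }
    #endif

    int main() {
    #if defined(CONSOLE_BUILD)
        const GraphicsOptions opts = kOptions;
    #else
        const GraphicsOptions opts = loadOptions();
    #endif
        std::cout << "AA=" << opts.antiAliasing
                  << " renderDistance=" << opts.renderDistance
                  << " sky=" << opts.skyEffects << "\n";
    }

The specific fields don't matter; the point is that the whole #else branch (plus the menu UI, save/load, and hardware detection behind it) simply doesn't need to exist when there's only one hardware target.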
That is all incidental, and a circular argument. The reason many games exist for consoles is that people buy consoles. If people didn't buy consoles, there would be no gain in making games for them.
Your argument therefore boils down to "consoles make sense because consoles make sense".
Not quite. Consoles make sense because there has been a growing market for consoles for more than 30 years.
PC games didn't exist 30 years ago?