Once upon a time, every home computer came with a programming language. You turned on the machine and were presented with a prompt at which you could enter code. One of the first machines I used was the BBC Model B, made by a British company called Acorn Computers (which later went on to design the ARM CPU, now the most widespread CPU architecture in the world). The Model B shipped with a dialect of BASIC that had some really neat features: the ability to generate machine code on the fly (so you could write your own compiler), proper support for structured programming, and a load of other things. The machine also had some nice hardware to play with, such as a teletext chip (so you could write teletext-style programs), a sound chip that supported several channels and a white-noise generator, and a host of I/O ports to which you could easily connect home-electronics projects.
Over the years, these features gradually went away. DOS came with GW-BASIC and later QBasic (a cut-down version of Microsoft's QuickBASIC). With the advent of Windows, if you wanted BASIC you had to buy Visual Basic, which was later rolled into Visual Studio. Other tools, such as Borland's Delphi, offered a similar experience, but again weren't bundled with the machine. Programming became something you had to want to do, rather than the default way of interacting with a computer.
Then the Web happened. People started running applications in web browsers, and suddenly the barrier to entry dropped again. Anyone with a text editor could run his or her own code in the same environment used for running web apps. For a small investment, you could get some space on a remote host that allowed server-side scripting, too.