Seeing young people today take for granted technology that was quite literally the stuff of sci-fi stories during my childhood makes me wonder how we're going to get to the next level if fewer and fewer people pursue engineering and science careers than at any point in the past 50-plus years.
Consider for a moment how much computers and their processing power have advanced over the past 24 years: when I started this business in 1992, we were playing video games like Zero Wing ("CATS: All your base are belong to us," released in 1991) and, a year later, Myst (1993).
Now we're immersing ourselves in virtual worlds like Destiny (2014) and The Order: 1886 (2015), and we're on the brink of even more immersive experiences, with VR goggles such as the Oculus Rift and Microsoft HoloLens on the horizon. Yet the advances from Zero Wing to Destiny represent only about two-thirds of the progress I've personally witnessed since I became interested in computers at age 12...
Back then we had a TRS-80 in my middle school, and a friend's dad owned a Commodore PET. Later, in high school, we worked with Commodore 8032s, and at my dad's laboratory at the university I had a chance to work with an Apple II (more precisely, a French Apple II clone).
It wasn't until my junior year that I could afford my very own first computer, an Apple IIe, followed by one of the first IBM PC-XTs and then one of the first Macs during college.
I started programming early on and wrote software for a variety of local small businesses, which allowed me to be an early adopter and buy some pretty neat computers at that time.
By today's standards, all of these machines had extremely slow CPUs and laughably small amounts of RAM (and none of them, except the PC-XT, even had a hard disk!).
In fact, your typical smartphone today has more computing power, memory, disk space, etc. than all of NASA had in their "supercomputers" when they placed a man on the moon.
So why is it, then, that we see so few young people interested in anything more than playing games on their computers, consoles, and phones? Why do we need efforts like code.org to encourage more students to explore programming and computer science? Why is the age-old question of "how do I program this darn thing?" not burning in the minds of more young people?
All I can imagine is that the significant difference between then and now comes down to increasing complexity. In the early days, it was easier to be fascinated by computers and to be drawn into programming them, because it was still possible to completely comprehend how a computer worked. Within just a few weeks you could teach yourself a programming language and write your first program, and within a few months you could create something cool. By contrast, creating something "cool" nowadays takes almost a movie-studio budget and a team of programmers working for several years.
The barrier to entry was much higher back then in economic terms, however: you had to use a computer in a lab, at school, or in college, because very few people could afford their own. By comparison, for under $80 today you can build your own Raspberry Pi, hook it up to an old monitor, and off you go. You get all the programming tools in the world and an open platform that invites you to experiment not only with the software but also with the hardware!
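To make that concrete: here is roughly the scale of a "first program" a beginner could write in a weekend on that same Raspberry Pi. This is just an illustrative sketch, in Python (which comes preinstalled on Raspberry Pi OS), of a complete guess-the-number game in about twenty lines:

```python
import random

def play(guesses, secret=None):
    """Play one round of guess-the-number.

    `guesses` is any iterable of ints: a plain list, or a generator
    reading from input() for interactive play. Returns the number of
    tries taken, or None if the guesses run out before the secret is hit.
    """
    if secret is None:
        secret = random.randint(1, 100)  # pick the secret at random
    for tries, guess in enumerate(guesses, start=1):
        if guess < secret:
            print("Too low!")
        elif guess > secret:
            print("Too high!")
        else:
            print(f"Got it in {tries} tries!")
            return tries
    return None

if __name__ == "__main__":
    def keyboard():
        # Keep asking the player until play() returns.
        while True:
            yield int(input("Guess a number from 1 to 100: "))
    play(keyboard())
```

The point isn't the game itself; it's that the entire program fits on one screen and can be understood end to end, which was equally true of a beginner's first BASIC program in 1982.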
So why are young people today more inclined to play video games (whether on their smartphones, PCs, or consoles) than to program computers? And is code.org the right approach to getting more people interested in computer science?