Monday, June 4, 2012
Are we getting mixed up between computer science and computer programming? Are degrees in the former trying to train more people for the latter? Is any of this a problem? Jason Gormon thinks so, and I think it's worth looking at his argument, even if I don't entirely agree with it.
Gormon starts with a comparison between music and computer science. You don't need to know much (if any) music theory to play, he says; many popular musicians learned to play by ear on their own.
Gormon writes:

"The generation of software developers created by the 80's home computing boom are largely self-taught, and are largely 'programming by ear' even today. Some will have gone off and studied computer science, but most of us didn't (because, frankly, yawn!)"

(Yawn? Good way to start - insult those of us who actually like computer science, and turn off those who might also have liked it...) Insult aside, his point is that you can learn programming on your own, and, as he discusses later, if you are passionate about it you will also take the initiative to learn the CS you need.
I'm sure there are those who have both the initiative and the ability to learn about data structures, algorithms, languages, and more from a CS perspective, but there are many who don't. The question is whether it matters. According to Gormon:
"I'm not going to suggest that there aren't times when it pays to know how a hash map works, or to be able to design a small domain-specific language. But such times are few and far between (if you're not working on compilers or core programming frameworks, which most of us aren't and don't need to be), and for that we have Google (the search engine, not the company)."

I have to disagree. If you don't know what you don't know, Google is not helpful. Having at least a basic knowledge of how various data structures, algorithms, and languages work makes it that much easier to recognize the right tool for the job. Or to debug code in a language whose unusual constructs you happen to understand conceptually. Or to see why an algorithm that seemed reasonable takes forever to finish, and what a better alternative might be.
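To illustrate the "algorithm that seemed reasonable takes forever" point with a hypothetical example (not from the original post): consider removing duplicates from a list while preserving order. The obvious approach rescans the result list for every element, which is quadratic; knowing that a hash-based set offers fast membership tests gives a linear alternative that produces the same output.

```python
def dedup_quadratic(items):
    """Remove duplicates, preserving order.

    Looks reasonable, but `item not in result` is a linear scan,
    so the whole thing is O(n^2) and crawls on large inputs.
    """
    result = []
    for item in items:
        if item not in result:
            result.append(item)
    return result


def dedup_linear(items):
    """Same output, but a hash set gives expected O(1) lookups: O(n) overall."""
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result


print(dedup_linear([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Both functions are correct; only someone who knows what a hash table buys them would know to reach for the second one, and Google can't suggest a fix for code that merely feels slow.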
It's possible that for Gormon, a self-proclaimed enthusiast of 20+ years, much of this came from experience. But the world is a different place than it was when he started. I'm not sure the systems being built today can afford to rely on developers building this experience and knowledge fast enough on their own.
And this is where I think he's got it spot on: we need to offer a mix of computer science and practical experience.
"So I propose that the right course would be a 5+ year apprenticeship with part-time degree study - CS in the classroom 1 day a week, software development in the office the other 4."

In my case, five solid co-op work terms gave me the experience I needed, and the courses I took did happen to teach me a variety of so-called useful things, from automated testing to user interface issues. Maybe more CS degrees could make these topics available, or even mandatory, without taking away the theoretical aspects. Maybe current models for internship experience should be rethought.
Either way, I still think that the best software developers are often those who do know at least some computer science, and that it's well worth teaching elementary and high school kids about it. If computational thinking and a high-level knowledge of computer science concepts become second nature by the time they graduate, their problem-solving abilities will benefit them wherever they go, and those who go into software development are more likely to hit the ground running.