And so, with unusually good timing the universe sometimes provides, I stumbled on a great article in Communications of the ACM called "What Should We Teach New Software Developers? Why?" by Bjarne Stroustrup. It discusses the disconnect between what industry wants from computer science graduates and what universities aim to teach.
Industry wants computer science graduates to build software (at least initially in their careers). That software is often part of a long-lived code base and used for embedded or distributed systems with high reliability requirements. However, many graduates have essentially no education or training in software development outside their hobbyist activities. In particular, most see programming as a minimal effort to complete homework and rarely take a broader view that includes systematic testing, maintenance, documentation, and the use of their code by others. Also, many students fail to connect what they learn in one class to what they learn in another. Thus, we often see students with high grades in algorithms, data structures, and software engineering who nevertheless hack solutions in an operating systems class with total disregard for data structures, algorithms, and the structure of the software. The result is a poorly performing unmaintainable mess.

There are some amazing software developers who come out of Carleton's computer science program. Some have even gone on to start their own successful companies. But there are also many who don't find meaningful internship experiences, and who have rarely (or never) written a large software project as part of a team.
Things look even bleaker for many grad students, for whom getting something that works just well enough to prove a particular result is more important than writing good, reusable code. I know I've learned to do exactly what is needed, and no more, when it comes to course projects, and though I always wish I could do more, I know it just isn't possible. Once courses are done, though, there is a great opportunity to keep your skills sharp. I try to choose projects that require me to write code, and lots of it. I am also considering dabbling in the world of entrepreneurship to give myself an opportunity to write real-world software that needs to work well.
For those who aren't sure what they need to do for industry, or just can't find the right job or project to practice it, it would be nice to have a bit more balance in the degree programs themselves.
Industry would prefer to hire "developers" fully trained in the latest tools and techniques whereas academia's greatest ambition is to produce more and better professors. To make progress, these ideals must become better aligned. Graduates going to industry must have a good grasp of software development and industry must develop much better mechanisms for absorbing new ideas, tools, and techniques.

I can't really speak for the industry side, but I am intrigued by the changes that faculty in Carleton's School of Computer Science are trying to make. Program streams like the upcoming option nicknamed the 'iPhone stream' are meant to provide the same core computer science everyone else gets, with a few extra classes geared towards a career in industry. Granted, these streams focus on only a narrow portion of industry, but they are definitely a start.
This balance between academia and industry isn't going to be easy to achieve. I honestly don't know exactly how it should work. I like Stroustrup's suggestion of encouraging profs to code, but somehow that doesn't seem to be enough. And whenever changes are proposed to our degree to help students better prepare for industry, some students cry out that the program is being dumbed down, and that if you want to be so industry-ready you should go to college instead of university. Figuring out how to get the best of both worlds and make everyone happy won't be a fun job, but it's one worth talking about.