There's a commercial on the radio around here that asks "What if [some thing or service] had never improved? [Clip of said thing or service not improving.] Here at [their insurance company], we believe insurance can be better."
The most recent installment wonders what music would be like if it hadn't 'improved' since the Baroque period. A man with a rock-star British accent yells into the microphone, "Hello Sudbury! Are you ready to... soNAta?" Harpsichord music follows.
It so happens that I think modern rock concerts are not an improvement over classical music, but it got me thinking about our field of computer science. What if certain areas had never advanced? Are there some areas that are still stuck in the technological stone age?
The first example that jumped into my mind was that of compilers. Imagine what software development would be like if we still had to write our computer programs in assembly code. No fancy object-oriented paradigms, no interpreted languages, not even simple procedural code! Only the likes of memory addresses and processor instructions. For all but the most ambitious, this sounds like torture. Fewer people would be capable of working in development, fewer large software products would be created, and many other areas of computer science would likely never progress, or would do so excruciatingly slowly.
There's no doubt that computer graphics have come a long way since the early days, growing from text-based terminals to the rich graphical user interfaces of today. If it weren't for the advances in this field, I believe that computers would never have become accessible to the masses. The ability to display anything the heart desires on a computer screen unlocks so much potential for making interfaces that make sense to everyone, young or old. Modern graphics bring us entertainment, productivity, and everything in between.
The Internet is an obvious example. Remember what web pages looked like in 1995? Plain, non-interactive, and not always entirely useful. Today we are treated to rich, interactive applications, secure online shopping and banking, and about a zillion ways to connect with friends and family from all over the world. Heck, I was even able to watch President Obama's inauguration thanks to the advent of streaming video! The Internet and its related technologies permeate our everyday lives so deeply that I'm not sure we remember what it was like to have to see the bank teller, or pay exorbitant long-distance fees to contact anyone outside your own town.
What about those areas that haven't broken a lot of ground? I'd like to suggest that artificial intelligence is such an area of computer science. Sure, lots of really great research has come out in the name of AI, but let's go back to the beginning and ask ourselves whether these advances are actually solving the original problem. The field of artificial intelligence was supposed to capture, model, and mimic the intelligence of human beings. This was, still is, and perhaps always will be an impossible task. Many techniques are able to somewhat mimic intelligence at levels much lower than our own, or replicate small, particular aspects of the way we think, but by and large the field seems to have branched away from its original vision. So, in that sense, it never really did progress, though what it has accomplished instead is still wonderful and valuable.
Just like the sonata in Sudbury, some may argue that certain areas of the world of computing would have been better off had they stayed the same. Similarly, maybe the direction AI has taken is indeed a much greater accomplishment than I'm giving it credit for. I suppose that, in the end, it comes down to personal taste and opinion. What I do know for sure is that I can't even imagine where the field is going to be in 100 years, or what commercials we could possibly be making about it then.