Tuesday, January 20, 2009

What if Computer Science Had Never Improved?

There's a commercial on the radio around here that asks "What if [some thing or service] had never improved? [Clip of said thing or service not improving.] Here at [their insurance company], we believe insurance can be better."

The most recent edition wonders what music would be like if it hadn't 'improved' since the Baroque period. A man with a rock star British accent yells through the microphone, "Hello Sudbury! Are you ready to... soNAta?" Harpsichord music follows.

It so happens that I think modern rock concerts are not an improvement over classical music, but it got me thinking about our field of computer science. What if certain areas had never advanced? Are there some areas that are still stuck in the technological stone age?

The first example that jumped into my mind was that of compilers. Imagine what software development would be like if we still had to write our computer programs in assembly code. No fancy object-oriented paradigms, no interpreted languages, and no simple procedural code! Only the likes of memory addresses and processor instructions. For all but the most ambitious, this sounds like torture. Fewer people would be capable of working in development, fewer large software products would be created, and many other areas of computer science would likely never progress, or do so excruciatingly slowly.
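
To make the contrast concrete, here's a quick, purely illustrative sketch (the example and the rough pseudo-assembly are mine, not from any real program): summing the numbers 1 through 100 in a modern high-level language is a single line, while the hand-written assembly equivalent means juggling registers, a loop counter, and the control flow yourself.

    # High-level version: one line, and the compiler/interpreter handles the rest.
    total = sum(range(1, 101))
    print(total)  # 5050

    # Roughly the same task in hand-written (pseudo-)assembly for an x86-like machine:
    #   mov  ecx, 100      ; loop counter starts at 100
    #   xor  eax, eax      ; running total = 0
    # loop_top:
    #   add  eax, ecx      ; total += counter
    #   dec  ecx           ; counter -= 1
    #   jnz  loop_top      ; repeat until the counter hits zero
    # ...and that's before worrying about how to print the result at all.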

There's no doubt that computer graphics have come a long way since the early days, growing from text-based terminals to the rich graphical user interfaces of today. If it weren't for the advances in this field, I believe that computers would never have become accessible to the masses. The ability to display anything the heart desires on a computer screen unlocks so much potential for making interfaces that make sense to everyone, young or old. Modern graphics bring us entertainment, productivity, and everything in between.

The Internet is an obvious example. Remember what web pages looked like in 1995? Plain, non-interactive, and not always entirely useful. Today we are treated to rich, interactive applications, secure online shopping and banking, and about a zillion ways to connect with friends and family from all over the world. Heck, I was even able to watch President Obama's inauguration thanks to the advent of streaming video! The Internet and its related technologies permeate our everyday lives so deeply that I'm not sure we remember what it was like to have to see the bank teller, or pay exorbitant long-distance fees to contact anyone outside your own town.

What about those areas that haven't broken a lot of ground? I'd like to suggest that artificial intelligence is such an area of computer science. Sure, lots of really great research has come out in the name of AI, but let's go back to the beginning and ask ourselves whether these advances are actually solving the original problem. The field of artificial intelligence was supposed to capture, model, and mimic the intelligence of human beings. This was, still is, and perhaps always will be an impossible task. Many techniques can somewhat mimic intelligence at levels much lower than our own, or capture small, particular aspects of the way we think, but for the most part the field seems to have branched away from its original vision. So, in that sense, it never really did progress, though what it has accomplished instead is still wonderful and valuable.

Just like the sonata in Sudbury, some may argue that certain areas of the world of computing would have been better off had they stayed the same. Similarly, maybe the direction AI has taken is indeed a much greater accomplishment than I'm giving it credit for. I suppose that, in the end, it comes down to personal taste and opinion. What I do know for sure is that I can't even imagine where the field is going to be in 100 years, and what commercials we could possibly be making about it then.

5 comments:

  1. This is a very interesting question - but I am going to frame it in a slightly different way - WHY? Why has AI not improved at the rate of software/PL/hardware/networking?

    I think the answer lies in abstraction.

    The thing is, software has improved because we are able to cleanly cleave layers of abstraction between the machine and the programmer. Same with hardware: lots of layers injected to ease development. Ditto for networking.

    But with AI, because we have so little understanding of intelligence/learning/etc. from a neurological standpoint, and we have not figured out how to inject intermediate representations of intelligence that abstract away the layers above and below, we have not been able to develop AI as much. It's too quantized; it's kind of all or nothing (or rather, all or little). Anyway, that's my theory that I just thought up as I read your post.

    It's nice to read a post about CS that is thought-provoking. Thanks :).

  2. That theory definitely works for me.

    I randomly realized that I should have made the disclaimer that I am commenting on AI from the outside in. I took a class or two, but it's definitely not my research area. Maybe that makes what I'm saying less valid, or maybe the fresh, unbiased look makes it all the more interesting ;)

  3. I remember my assembly language programming class.... Shudder! It's a wonder I stayed in comp sci, lol! I think you have a point about AI -- although it's sorta my area (applied AI in education), I'm still very much a n00b. Most of the techniques I'm reading about were developed in the 80s and 90s. You'll hear about the "dark ages" of AI, when it was really difficult to get funding for research in AI after pol were disappointed when early AI didn't deliver the magic that was initially expected. Here's hoping for change! :D

  4. gah, "after people were disappointed".... Typo... That's what I get for typing on a mobile device while breastfeeding..

  5. Just a guess, but...
    Maybe the field of computer graphics has advanced significantly due to the huge demand for pretty-looking user interfaces and video games. I think most advancements in graphics have probably come from software companies trying to meet this demand, rather than from research conducted in academic institutions. And I'm not sure whether there's as much demand for AI at the moment, or whether research into AI simply doesn't yield much profit.

    Actually, I think it's because our idea of what AI should be like by now is just unrealistic.

