Tuesday, July 29, 2008

The Next Step in Computing Surfaces

Remember the Microsoft Surface I talked about way back in November? Looks like it's being used in AT&T stores now, and has the potential to be useful in many other collaborative scenarios, like in hotel lobbies. But if you really think about it, there is one flaw with the table-top design.

If you have several people working at the same Surface, somebody's going to be sitting in what pretty much amounts to a "Master Control" spot, while others may end up getting upside down views of the current data. Or, if everyone can see a right-side-up view of their data, then things could get pretty darned jumbled looking.

The solution?

Why, use a sphere instead of a plane!

And that's exactly what Microsoft is working on. The obvious applications involve mapping, but even being able to send data to the "other side" of the sphere to make collaboration a bit easier on the eyes is pretty appealing. I'm sure there are some other innovative interfaces possible to make certain tasks easier with this new sphere surface; we just need some ultra-creative folks to think of them.

Monday, July 28, 2008

Women and the Quality of Code

This article was originally written for NerdGirls.com.

A good way to stir up some controversy is to bring up possible differences between the way men and women write code. Add to it a tone that suggests that women write better code, and you’ll really get people talking. Is there something to it? Would you really be able to tell the difference between male and female code, and could you really say that one is better than the other?

Programming computer applications involves writing multiple lines of instructions that will be turned into something a computer can understand. When writing these instructions, or code, a programmer can also add extra information that the computer will ignore, called comments. To get a bit of an idea of what code is all about, go to ‘View, Page Source’ on your browser to see what a web page really looks like. When we ask about the differences between code that men and women write, we are talking about what instructions are used, what style is used to write them, and what kinds of comments are included.
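For a concrete (made-up) taste of what code and comments look like, here's a tiny Python snippet; everything after a `#` is a comment the computer skips over:

```python
# Comments like this one are ignored by the computer; they exist
# purely to explain the code to other programmers.

def average(numbers):
    """Return the mean of a list of numbers."""
    # Guard against dividing by zero when the list is empty.
    if not numbers:
        return 0
    return sum(numbers) / len(numbers)

print(average([4, 8, 6]))  # prints 6.0
```

The instructions do the work; the comments explain the "why" to the next person who reads them.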

A story was recently featured on Slashdot, a news website geared towards nerds like us, called Men Write Code from Mars, Women Write More Helpful Code from Venus – I think you’ll probably agree that even the title sets the ‘women are better’ tone! This article describes how “Emma McGrattan, the senior vice-president of engineering for computer-database company Ingres – and one of Silicon Valley’s highest-ranking female programmers – insists that men and women write code differently.”

Apparently, McGrattan believes that women’s code tends to include more useful comments than men’s. She says that women use comments to explain what they were thinking when writing their code so it is easy to understand by programmers that haven’t seen it before. On the other hand, she suggests that men actually try to make their code more cryptic, with fewer useful comments, in an effort to show how clever they are. To help fix this problem, she set up rules and standards for Ingres programmers to follow, forcing everyone to document their code, and any changes they made to other peoples’ code, with detailed comments.

The comments section back at Slashdot has some interesting reactions. One reader says:

It’s now well established that the human brain builds negative stereotypes more easily than positive ones and that people see what they are expecting and apply a double standard. This person sees what she wants to see.

Another points out:

Personality is complex, not binary. I know many girls that code beautifully, and many more that can not code at all.

The general theme throughout the comments is that men and women do think differently, but that neither writes better code than the other, nor could you reliably tell the difference just by looking.

Interestingly, there seems to be a good selection of research trying to determine whether men and women write English compositions differently. One study concludes that “there are indeed different strategies employed by men and women in setting forth information and especially in encoding the relation between writer and reader in texts.” In other words, there are apparently several indicators that can help decide fairly reliably whether a text was written by a male or female, such as the number of personal pronouns (I, we, you) used in place of noun specifiers (much, some). Could this difference extend to how men and women write code, too?
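As a toy example of the kind of indicator counting such studies rely on, here's a sketch that tallies personal pronouns versus noun specifiers in a text (the word lists here are my own guesses, not the study's):

```python
import re

# Hypothetical word lists based on the indicators mentioned above.
PERSONAL_PRONOUNS = {"i", "we", "you", "me", "us"}
NOUN_SPECIFIERS = {"much", "some", "many", "few"}

def indicator_counts(text):
    """Count how often each word class appears in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    pronouns = sum(w in PERSONAL_PRONOUNS for w in words)
    specifiers = sum(w in NOUN_SPECIFIERS for w in words)
    return pronouns, specifiers

print(indicator_counts("I think we should give you some of the many options."))
```

A real study would compare these counts across many texts with known authors before trusting them as a signal.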

From my own job experience in two different companies, I can say that I was very careful to write useful comments, and was even complimented for it. While some of the guys did occasionally forget to add in the same kind of detailed comments that McGrattan is asking for, many of them wrote just as many comments as I did. On the flip side, some of the girls I worked with barely wrote any at all! In my case, it really did seem to depend on the person, not the gender.

So what’s the answer to the question of whether women write better code than men? And is it possible to tell the difference between women’s and men’s code just by looking at it? There may be some noticeable trends like those found in English composition, but this has yet to be scientifically studied as far as I know. Besides, until women make up a little closer to half of the programming workforce, how can we do a fair comparison? I think we’ll have to mark this one down as inconclusive for now, though I’m leaning toward a no for the part about women writing better code. What do you think?

Friday, July 25, 2008

Geek Dating

I couldn't resist the urge for another slightly off topic post. I'm getting ready to be a bridesmaid for a friend's wedding today (rehearsal tonight, big event tomorrow!). That made me reminisce on my own wedding, and I realized our one year anniversary is coming up at the end of August. Some people say you shouldn't partner with someone too much like you, but I have to say, marrying another geek was the best thing I could have done!

Let's have a dreamy flashback for a moment. In high school, I knew I wanted to do something with computers afterward, but wasn't sure exactly what. I had always had my own computer in my room (thanks to my dad and the government's surplus of machines). I toyed around with layout, graphics and design software, right back to the good old days of Print Shop. I wanted to know how these things worked "behind the screen". Computer science or software engineering were looking pretty good.

In Grade 11, I met Andrew. He was into computers, too. In fact, he seemed to already know about programming and networking. He was planning on going to Algonquin College for a computer science diploma program. Turns out he was able to inspire me and help me figure out which program I should take. As the time drew nearer for me to embark on that post secondary journey, he helped give me the confidence to dive into a subject (i.e. programming) I never had time for in high school.

And here we are, all those years later, happily married and both still geeks!

It may be true that doing the exact same thing might incite competition between two people, and could be bad for a relationship. Luckily for us, even though we are both computer science geeks, we are different enough in the field to avoid this problem. He works in industry and is more about learning about different technologies and understanding systems, while I'm still in school and enjoy working with algorithms and visual stuff.

But I have to say, there's nothing better than being able to come home from a long day and complain about it, with the other person actually understanding what the heck you are talking about!

And just for fun, I leave you with a classic Craig's List post: Why Geeks and Nerds Are Worth It.

Celebrating One Hundred

I just noticed that I had 99 posts. Wow, after a year and a few months, I've made it to 100? That's crazy, but great! Thanks to everyone who's been along for the ride. I've enjoyed writing this blog, and reaped some surprising benefits from doing so. High five!

Saturday, July 19, 2008

Artificial Intelligence in Animation

How timely stories found on the mental_floss blog can be! It was only a week ago that I wrote about some of the exciting applications the concept of evolution can have in computing. Around the same time, I found a story about a TED Talk called Simulating Humans.

If you didn't see the movie on that page yet, take a few minutes to watch it in its entirety. It's both amazing and amusing to see how researchers have used evolution to teach a stick man how to react to various stimuli (such as being pushed over).

The first results show the stick man learning to walk, and the first attempts are just plain hilarious. Many face plants ensue. But after several generations, things start to improve, and finally the stick man can walk in a straight line, more or less like you and me.

Next, the stick man was shown to wobble and brace himself for a hard landing after being pushed from the side. It was amazing how natural it looked. I don't think it would be possible to spend enough time animating this behaviour by hand and have it look as real, though perhaps a pricey motion capture system could do the trick.

Finally, on a whim, they decided to see how well their technology would work for a famous stunt that has James Bond jump off a dam and get caught by a bungee on the way down. The animated version performed very well! This means that dangerous and expensive stunts may be replaced by animations that look just as natural.

This kind of technology could be applied to everything from big film productions to video games to medical simulation. That last one piqued my interest. Apparently one of the upcoming projects was to give software to surgeons who could then use the walking stick man to predict the outcome of surgery performed on, say, children with cerebral palsy.

Friday, July 18, 2008

The Magic Behind Wall-E

This article was originally written for the Nerd Girls blog.

If there was ever a doubt that computer science and engineering are exciting career choices, let the recently released hit animated picture Wall-E change your mind. What most people see when they watch this movie is a touching love story from an unexpected source. Sure, I saw this too, but I also saw some of the many reasons it would be really cool to work in the computer animation industry!

Back in the old days, many of the people who worked on computer animations were in fact computer scientists (as explained here). While we may consider ourselves to be creative, we certainly aren’t all artists! It shouldn’t come as much of a surprise, then, that early animations were controlled by scripting, and that this wasn’t so easy for artists to use. Now there are many software packages that artists can use instead, like 3D Studio Max, Maya, and Blender. But none of these are complete, as you can see on this comparison chart. Wouldn’t it be cool to work on the tools that help make the next Wall-E? (Check out the animators and their software at the beginning of the second video here.)

But of course there is a lot going on behind the scenes of animation software. While the most basic form of computer animation would involve moving a model object bit by bit, and rendering each new position as a separate frame, there's no chance anybody could stand doing this for an entire feature film! The next step is to use key-framing, where only the major distinct positions are defined, and the stuff in between is calculated automatically. Still, animating, say, the scenes in Wall-E where hundreds or thousands of humans are floating around on their chairs, chatting away to their hover screens, would be pretty tedious. Enter the research that helps make this happen as automatically as possible.
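To get a feel for how key-framing works, here's a toy sketch that linearly interpolates between two hypothetical key poses, each represented as just a pair of joint angles (real systems use fancier curves and many more parameters, of course):

```python
def interpolate(key_a, key_b, frames):
    """Linearly interpolate between two key poses.

    key_a, key_b: tuples of joint angles (a simplified, made-up pose).
    frames: number of in-between frames to generate.
    """
    tweens = []
    for i in range(1, frames + 1):
        t = i / (frames + 1)  # fraction of the way from pose A to pose B
        tweens.append(tuple(a + (b - a) * t for a, b in zip(key_a, key_b)))
    return tweens

# Two key poses, three in-betweens computed automatically:
print(interpolate((0.0, 90.0), (40.0, 10.0), 3))
# → [(10.0, 70.0), (20.0, 50.0), (30.0, 30.0)]
```

The animator only sets the two key poses; the computer fills in the rest.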

Let’s look at this crowd example in more detail. To avoid manually telling each person what to do, artificial intelligence is employed. Wikipedia explains it best:

The entities – also called agents – are given artificial intelligence, which guides the entities based on one or more functions, such as sight, hearing, basic emotion, energy level, aggressiveness level, etc.. The entities are given goals and then interact with each other as members of a real crowd would. They are often programmed to respond to changes in environment, enabling them to climb hills, jump over holes, scale ladders, etc.

As you can imagine, there is a tradeoff between how complicated individual behaviour can get and how much computing power will be needed to run the simulation and ultimately create the animation. Luckily, the people who made Wall-E have a lot more time on their hands than, say, a video game running in real time, so they are able to not only get more complicated and interesting behaviours, but they have the ability to tweak anything they don’t like the look of. Still, time is money, so if we can come up with ways to get things right the first time, animation studios would be very happy.
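As a toy illustration of the agent idea, here's a bare-bones crowd sketch in Python where each agent steers toward a shared goal, with a little random jitter standing in for individual "personality" (nothing like a real studio pipeline, naturally):

```python
import random

def step(position, goal, speed=1.0, jitter=0.3):
    """Move one agent a single step toward the goal, with random wobble."""
    x, y = position
    gx, gy = goal
    dx, dy = gx - x, gy - y
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < speed:  # close enough: snap to the goal
        return goal
    x += speed * dx / dist + random.uniform(-jitter, jitter)
    y += speed * dy / dist + random.uniform(-jitter, jitter)
    return (x, y)

def simulate(agents, goal, steps=50):
    """Advance every agent in the crowd for a number of steps."""
    for _ in range(steps):
        agents = [step(a, goal) for a in agents]
    return agents

crowd = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(5)]
print(simulate(crowd, goal=(10.0, 10.0)))
```

Even with this little wobble, every agent wanders its own slightly different path to the goal, which is exactly the effect you want from a crowd.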

If you want to see how crowd simulation works for yourself, you're in luck! You can download the free, open source animation package called Blender and then try BlenderPeople, a set of crowd simulation scripts that work with Blender.

Of course, animation software and artificial intelligence techniques are only a couple of ways that a computer scientist could get involved with computer animation. Whether you enjoy topics in human-computer interaction, design of efficient algorithms, techniques for music and images, or even the massive, parallel systems that render as fast as they can, there’s something for you in this exciting industry.

Nerd Girls

Do you think that brains are beautiful? Geek is chic? Smart is sexy? Then run, don't walk, over to the Nerd Girls website now! With these beliefs in mind, the Nerd Girls mission is to "encourage other girls to change their world through Science, Technology, Engineering and Math, while embracing their feminine power." Now you can join the movement, too!

The Nerd Girls website just launched today, and there was supposed to be a segment about it on the Today Show (I'll hopefully find it online later, since I don't have cable). If you look around, you might find something interesting on the blog section of the site.

Did you see it?

Yup, one of the bloggers is yours truly! I'll be posting one or two stories like those you find here every week. I'll also point you to those stories from this blog so you'll always know when they show up.

While you're there, you may as well also sign up for the forums, where you'll undoubtedly be able to connect with other awesome girls from around the world.

Monday, July 14, 2008

Games and Learning

While I perused some of the many stories and emails I have saved for writing about later, I noticed a theme popping up in a good number of them. It seems that educators are taking advantage of the hold video games have on young people these days by pulling some pretty impressive head fakes with them. Let's see what kids (and adults) are learning while playing.

I came across a list of 25 educational simulators and games on a distance learning website. It reminded me, first and foremost, of some of the classics I used to play as a kid. Take, for instance, Oregon Trail and Where in the World is Carmen Sandiego. I used to play the originals for hours on end (and with only the most basic of graphics and controls, too!). The same list goes on to suggest that Age of Empires will help you learn history, for example, and Railroad Tycoon will teach business skills. I was a bit sceptical about these choices at first, but after thinking about it, I guess I can see where they're coming from. Through the simulations involved, you would get a bit of a sense of how people used to live or what works in the business world, even if the actual details aren't entirely accurate.

My next example takes us away from traditional games to those you might call "edutainment".
[Immune Attack is] an educational video game that introduces basic concepts of human immunology to high school and entry-level college students. Designed as a supplemental learning tool, Immune Attack aims to excite students about the subject, while also illuminating general principles and detailed concepts of immunology.
The educational nature of this game is much less subtle than it is with games like Age of Empires. To master the game, students must learn about how the immune system works, plain and simple. It's also free, which is a whole lot cheaper than setting up complex labs to learn similar concepts (and it also means you can download it and try it for yourself!). It seems to be a successful concept; one commenter mused that "I wish we had had games like this when I was flunking advanced biology in 1969!"

In a news article about how technology is reshaping the face of classrooms in the States, 11-year-old Jemella Chambers talks about her experience with math software that has students compete against each other for the highest score by solving the most math equations. She's quoted as saying "This makes me learn better. It's like playing a game." The software is called FASTT Math and claims that it "automatically differentiates instruction based on each student’s individual fluency levels in customized, 10-minute daily sessions." Reminds me of the Train Your Brain activity that gets you to fill in the sign for a simple equation as fast as you can. Fun because you want to beat your previous time, and educational because you get really good at fast mental math.
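Just for fun, here's a toy Python version of that fill-in-the-sign drill (entirely my own sketch, not how FASTT Math or Train Your Brain actually work):

```python
import random

# The player sees "a ? b = result" and must guess which sign fits.
OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def make_question(rng):
    """Generate a question: two operands, the result, and the hidden sign."""
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    op = rng.choice(list(OPS))
    return a, b, OPS[op](a, b), op

def check(a, b, result, guess):
    """Return True if the guessed operator makes the equation true."""
    return OPS[guess](a, b) == result

rng = random.Random(1)
a, b, result, answer = make_question(rng)
print(f"{a} ? {b} = {result}")
```

Add a timer around the guessing loop and you have the "beat your previous time" hook that makes these drills so compelling.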

As an added bonus, games seem to help bring out creativity in some students, according to a study described in this article. To quote, "in real-life terms, the study appears to indicate that after playing the game, happy or sad people are most creative, while angry or relaxed people are not." Perhaps there is an opportunity to kill two birds with one stone here.

While relying on video games (or even computers in general) to replace all traditional forms of education doesn't sound like a very good idea, the power to engage students' interest cannot be ignored. Educational games can serve a very beneficial supplementary role in the classroom, and their design could be an interesting research topic for the computer scientist interested in software engineering or human computer interaction.

Friday, July 11, 2008

Evolution is Exciting

I suppose you've all heard about Spore by now. Many of you have probably already tried out the Creature Creator, too, since the full game isn't available yet. I haven't downloaded it yet, and will probably just wait until I can get the full deal. There are also many other related projects to play with while we wait.

And if you don't know what Spore's all about yet, the important piece of info is this: with Spore, you can "EVOLVE Your Creature through Five Stages - It’s survival of the funnest as your choices reverberate through generations and ultimately decide the fate of your civilization."

Sounds pretty cool. But while Will Wright had the insight and ability to turn evolution into a compelling video game, there are many more applications of the concept. Check out this article written by a friend and fellow Carleton U alum, Elan Dubfrofsky. It's called The Ultimate Problemsolver: Computer + Evolution = Genius, and includes all kinds of interesting uses of evolution.

Take, for instance, the travelling salesman problem. It turns out it's not so easy to automatically figure out how to travel to an arbitrary number of cities, only once each except for a return home, in the shortest possible way. In fact, this problem is classified as NP-hard, and if the number of cities is large enough, only approximate solutions can be computed in any reasonable amount of time. But not to worry - by starting with some random solutions, mutating and recombining them, and finally filtering out the best solutions so far, we can eventually come up with a pretty darned good approximation of the best answer. We just need to give this process enough time. Ah, the beauty of evolution!
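Here's a minimal sketch of that evolutionary loop in Python, applied to a tiny five-city tour. Mutation swaps two cities and selection keeps the shortest tours; real genetic algorithms also recombine solutions with crossover, which I've left out for brevity:

```python
import random

def tour_length(tour, cities):
    """Total length of a round trip visiting cities in the given order."""
    return sum(
        ((cities[tour[i]][0] - cities[tour[i - 1]][0]) ** 2 +
         (cities[tour[i]][1] - cities[tour[i - 1]][1]) ** 2) ** 0.5
        for i in range(len(tour)))  # i - 1 wraps around: the return home

def mutate(tour, rng):
    """Create a child tour by swapping two randomly chosen cities."""
    i, j = rng.sample(range(len(tour)), 2)
    child = tour[:]
    child[i], child[j] = child[j], child[i]
    return child

def evolve(cities, generations=200, pop_size=30, seed=0):
    rng = random.Random(seed)
    # Start with random solutions...
    pop = [rng.sample(range(len(cities)), len(cities)) for _ in range(pop_size)]
    for _ in range(generations):
        pop += [mutate(t, rng) for t in pop]            # ...mutate them...
        pop.sort(key=lambda t: tour_length(t, cities))  # ...rank by fitness...
        pop = pop[:pop_size]                            # ...keep the best.
    return pop[0]

cities = [(0, 0), (0, 2), (2, 2), (2, 0), (1, 3)]
best = evolve(cities)
print(best, round(tour_length(best, cities), 2))
```

Five cities is trivially small, but the same loop scales to problems where checking every possible tour would take longer than the age of the universe.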

Another example that Elan points out is the design of an antenna for NASA, developed in 2006. Many very strange looking devices were suggested through this process of evolution, but once it was all said and done, the winning design really was the most fit for the job, consuming only a small amount of power and being easy to produce.

I've even talked about evolutionary techniques earlier in this post. After reading Elan's article, you might enjoy the discussion about the state of artificial intelligence and the evolution of a virtual checkers player. Be sure to read through the comments as well!

So the next time you are working on an optimization problem, ask yourself if you might be able to evolve a good solution rather than trial-and-error your own. Or just wait for Spore and watch evolution in action while creating some silly looking creatures on the side.

Monday, July 7, 2008

Give Your Photos a Cel-Shaded Look

Did you ever get the chance to play The Legend of Zelda: The Wind Waker for the Nintendo GameCube? If so, you may recall that the graphics of this game looked rather different from other games. The below image from Wikipedia gives a pretty good idea of what I mean.

Notice how it looks as though it was hand animated? This technique is called cel-shading, and although it appears to be a simplistic drawing made by hand, the process of achieving this kind of rendering on the computer is actually pretty complex.

According to the Wikipedia article on cel-shading, a 3D model is usually used as a starting point for creating this kind of image. To get the desired look, unconventional lighting is applied to the scene. For example, the way an object would look under normal lighting conditions would be calculated first. Then, each pixel that was lit would have its value discretized into a small number of specific ranges, taking away the smooth transition from dark to light and replacing it with something more abrupt. Of course, this is simplified, so have a look at the article to learn more about it.
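That discretizing step is easy to sketch. Here, a pixel's light intensity (0.0 to 1.0) gets snapped down into one of a few fixed bands, which is what turns a smooth gradient into those flat, hand-drawn-looking patches of tone:

```python
def cel_shade(intensity, bands=3):
    """Quantize a light intensity (0.0-1.0) into a few discrete levels."""
    if intensity >= 1.0:
        return 1.0
    step = 1.0 / bands
    # Snap down to the bottom of whichever band the value falls into.
    return int(intensity / step) * step

# A smooth row of intensities becomes three flat bands:
row = [0.05, 0.2, 0.45, 0.6, 0.85, 1.0]
print([round(cel_shade(v), 2) for v in row])
# → [0.0, 0.0, 0.33, 0.33, 0.67, 1.0]
```

Real renderers apply this per pixel after computing the normal lighting, just as the article describes.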

Using tutorials like this one found at instructables.com, it's possible for you and me to create our own objects and have them appear cel-shaded. But wouldn't it be cool to be able to take our own photographs and have them automatically converted?

I recently found an article to be published in IEEE Transactions on Visualization and Computer Graphics called Flow-Based Image Abstraction. The authors have been working on improving the automatic conversion of a photograph to a rendering that looks a lot like cel-shading.

They break down the problem into two steps: creating a line drawing, and performing region smoothing.

To get the line drawing, they analyze the direction that colours are changing in an image to get a sense of where the lines should be. Think of this as the optical flow. This step is like a fancy edge detector (you may have heard of standard edge detectors, like the Canny edge detector). The next image shows a comparison between other edge extraction techniques and the one developed by the authors, seen at the far right (image directly from the paper):

In a separate process, unimportant details (in terms of colour) are removed from the insides of regions defined by the detected lines. After this, the two images can be combined together.
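To make those two steps concrete, here's a toy Python sketch on a tiny grayscale grid: a crude gradient-based edge detector, a box smoother to wash out small details, and an overlay of the dark lines on the smoothed result. (The paper's flow-based method is far more sophisticated; this just shows the general shape of the pipeline.)

```python
def gradient_edges(img, threshold=50):
    """Mark a pixel as an edge if its value jumps sharply to a neighbour."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            gx = img[y][x + 1] - img[y][x] if x + 1 < w else 0
            gy = img[y + 1][x] - img[y][x] if y + 1 < h else 0
            edges[y][x] = (gx * gx + gy * gy) ** 0.5 > threshold
    return edges

def box_smooth(img):
    """Average each pixel with its 4-neighbours to remove small details."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[y][x]]
            if x > 0: vals.append(img[y][x - 1])
            if x + 1 < w: vals.append(img[y][x + 1])
            if y > 0: vals.append(img[y - 1][x])
            if y + 1 < h: vals.append(img[y + 1][x])
            out[y][x] = sum(vals) // len(vals)
    return out

def combine(img):
    """Overlay the detected lines (drawn as black) on the smoothed image."""
    smooth, edges = box_smooth(img), gradient_edges(img)
    return [[0 if edges[y][x] else smooth[y][x]
             for x in range(len(img[0]))] for y in range(len(img))]

# A tiny "photo": a bright left half meeting a dark right half.
photo = [[200, 200, 40, 40]] * 4
for row in combine(photo):
    print(row)
```

The sharp boundary between the halves comes out as a dark line, while the flat regions stay smooth, which is the essence of the cartoon-like look.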

(Once again, things are more complicated than this; if you want the technical details of how this is accomplished, the paper outlines the steps very well.)

The following image summarizes the process (image directly from the paper).

So who knows - thanks to computer science at work, maybe this will be the newest filter in the next version of Photoshop, available for you to make some pretty cool artwork from your own photos!

Sunday, July 6, 2008

Nintendo DS: First Impressions

As you know, I recently got a Nintendo DS. Now that I've messed around with it for just over a week, I have some first impressions that could be fun to look back on a year from now. The summary is that I played more video games this past week than I have for several years.

I've seen and read from several sources (like this episode of Good Game Stories) that women tend to be casual gamers who prefer to play for short periods of time when they have the opportunity between other jobs and projects. In fact, it seems that the majority of casual gamers are women. I have to say that this has proved to be quite true for me. Rather than sit for hours working away at a single game, my attention span seems to prefer messing around with something not too involved for just a few minutes.

The first game I tried out was Ninja Gaiden, and I was immediately impressed with the visuals. I thought it was really clever that the game had you hold the DS rotated, like a book (I would later discover that many games use this idea). Using the stylus for all navigation and fighting was another cool idea I hadn't seen before. I liked how the cut scenes looked like anime because I know the DS doesn't have a whole lot of memory. Why not have something "simple" look really good in favour of more complicated and less detailed 3D stuff that ends up looking kinda cheesy? Unfortunately, I couldn't concentrate on this game for long, as per above. We'll see if I go back to it later on.

Next, I found The Sims 2 DS at a discount price, so decided to give that a go. This game suited my tastes much better than Ninja Gaiden. Instead of having to attack non-ending bad guys all the time, I could take my time to explore Strangeville and complete goals at my own pace. It was easy to play for an extended period of time, and I could stop whenever I felt like it. This game was made in true 3D, as opposed to Ninja Gaiden's mixture of 2D backgrounds with 3D characters. This really surprised me - it's amazing what you can do with that little DS. Too bad the cut scene videos looked a little marginal.

Finally, I tried Brain Age 2: More Training in Minutes A Day. Once again, this game was great for playing a few minutes here and there. But it wasn't the main training activities that I really liked; it was the Sudoku feature. I can't help but figure that having Sudoku in its own prominent section has more to do with capitalizing on the current craze than the usual brain training, a decision no doubt made by marketing. Nonetheless, seeing as I had never tried one of these puzzles before, I looked past this fact and solved a puzzle (can't hurt your brain!). Boy oh boy, are those things addictive!

It'll be interesting to see if I can keep up my video gaming for more than a few weeks. I would feel pretty down if I ended up neglecting my new toy after getting bored. For now, though, I can't wait to try out a few other games, especially The Legend of Zelda: Phantom Hourglass!