There were many cool demos at ISMAR this year; you can check out their descriptions in the conference schedule. In this post, I've included a few photos and a video of some of my favourites. Click through the images to get a brief description. EDIT: You can also visit this blog post by Tom Carpenter for a more detailed description of what you see below.
Object depth and shape extraction for Augmented Reality Interaction
Put a Spell: Learn to Spell with Augmented Reality
A Mixed Reality Painting Experience for Physical Rehabilitation
Computing Alpha Mattes in Real-time for Noisy Mixed Reality Video Sources
ProFORMA: Probabilistic Feature-based On-line Rapid Model Acquisition
Animatronic Shader Lamps Avatars
Saturday, October 24, 2009
Thursday, October 22, 2009
ISMAR09: Human Factors and User Interfaces
There were more than a few good papers presented at ISMAR this year on human factors and user interfaces. Here's just a taste of them. See the conference schedule for information about authors and their affiliations.
Using Augmented Reality to Support Cross-Organizational Collaboration in Dynamic Tasks
This student paper was an honourable mention for the best paper awards. It was all about a crisis management system designed for use by commanders with different backgrounds. Augmented reality is intended to give each user a personalized view that they can most easily understand based on their culture and so on.
The scenario used for the user study - the first such study for joint real-time operations - was planning the fight against forest fires. Rescue, police, and military helicopter units are all involved.
The initial brainstorming stage with field experts in these areas suggested that handheld displays should be used to give individualized views of a command map. But we all know how important it is to ask the real users what works best for them, not their managers; it turned out that the field workers couldn't use the handhelds. The devices were too clumsy and took away the workers' ability to use their hands freely. What they wanted was a shared map that they could point to and have the others see - in other words, a heads-up display with joystick control.
When compared with a paper-based map, the AR system with custom markers for each type of field worker performed significantly better.
Interference Avoidance in Multi-User Handheld Augmented Reality
Have you ever wondered how safe multi-user augmented reality games really are? I mean, when you're competing furiously while looking through your mobile device, it seems like it'd be pretty easy to knock into each other as you move around in the virtual world in front of you, right? Well, trying to avoid this is what this paper is all about.
The concept is pretty simple. As you move closer to your opponent, the virtual objects in your view shift slightly away from them. The key is to make sure that you as a user don't notice this happening, so certain compensations are needed, such as covering the playing surface with a flat texture that can also shift with the virtual objects.
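To make the idea concrete, here's a minimal sketch of how such a shift might be computed (my own illustration, not the authors' implementation; the distance threshold, gain, and example positions are all assumptions):

```python
import numpy as np

SAFE_DISTANCE = 1.5  # metres; start shifting when players get closer than this (assumed)
MAX_SHIFT = 0.3      # metres; cap the offset so the user is unlikely to notice it (assumed)

def interference_offset(my_position, opponent_position):
    """Offset applied to all virtual objects (and the flat playing-surface
    texture) in this user's view, pushing the play area away from the opponent."""
    away = np.asarray(my_position) - np.asarray(opponent_position)
    distance = np.linalg.norm(away)
    if distance == 0.0 or distance >= SAFE_DISTANCE:
        return np.zeros(3)
    # The shift grows smoothly as the opponent approaches, so the player
    # chasing the virtual content drifts away from them in the real world.
    strength = (SAFE_DISTANCE - distance) / SAFE_DISTANCE
    return (away / distance) * strength * MAX_SHIFT

# Example: opponent 0.9 m away along x -> content shifts about 12 cm away from them.
print(interference_offset([0.0, 0.0, 0.0], [0.9, 0.0, 0.0]))
```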
What's amazing is how effective this approach is compared to other proximity warnings, like dimming the screen, beeping, or disabling user actions when players get too close to one another. Users perceived the shifting as less distracting, but also as less effective, than the other methods. However, the actual distance maintained between players in a competitive two-player game was significantly greater than with the other methods, so in practice it worked quite well.
Interaction and Presentation Techniques for Shake Menus in Tangible Augmented Reality
The investigation in this paper sought to find a way to interact with objects directly in the environment using some kind of menu system. Objects should not require any kind of tags or electronics added to them beforehand, and hands should be able to manipulate the object freely without having to pick up something else as well.
The idea of a shake menu was inspired by shaking a gift to see what's inside. So you shake an object to open a menu, and then move the object to the desired menu selection and hold it there to make the choice. But what's the best way to present the menu items in relation to the object?
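As a rough sketch of what that interaction loop might look like (my guess, not the paper's code; the window size, jitter threshold, and dwell time are invented):

```python
from collections import deque
import time

import numpy as np

class ShakeMenu:
    """Toy controller: a 'shake' is rapid back-and-forth motion of the tracked
    object; a selection is made by holding the object near a menu item."""

    def __init__(self, window=15, jitter_threshold=0.05, dwell_seconds=1.0):
        self.positions = deque(maxlen=window)  # recent tracked positions of the object
        self.jitter_threshold = jitter_threshold
        self.dwell_seconds = dwell_seconds
        self.menu_open = False
        self._target = None
        self._dwell_start = 0.0

    def update(self, position, menu_items):
        """Call once per tracked frame. menu_items maps names to world positions.
        Returns the selected item name, or None."""
        position = np.asarray(position, dtype=float)
        self.positions.append(position)
        if not self.menu_open:
            if self._is_shaking():
                self.menu_open = True  # shake detected: pop up the menu
            return None
        return self._check_dwell(position, menu_items)

    def _is_shaking(self):
        if len(self.positions) < self.positions.maxlen:
            return False
        pts = np.stack(self.positions)
        # High positional variance over a short window ~ rapid shaking.
        return float(pts.std(axis=0).sum()) > self.jitter_threshold

    def _check_dwell(self, position, menu_items):
        nearest = min(menu_items,
                      key=lambda name: np.linalg.norm(np.asarray(menu_items[name]) - position))
        if nearest != self._target:
            self._target, self._dwell_start = nearest, time.time()
        elif time.time() - self._dwell_start > self.dwell_seconds:
            self.menu_open = False
            return nearest  # held near this item long enough: selection made
        return None
```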
A user study looked at a clipboard paradigm in which menu items (which look like cubes) are aligned along the right of the object and "stick" to it as it moves around in the camera's view. Other layouts include arranging the choices around the object (which seems very similar to the clipboard version), aligning relative to the display only (so the menu sticks to the screen and doesn't move again), and aligning to world coordinates rather than the object's.
The hypothesis was that object alignment would be the fastest and most intuitive, and would be appreciated for the ability to examine a menu choice from different angles (after all, it could be any 3D object). However, the user study proved this wrong. Object alignment was nearly tied with display alignment for speed, but the display condition produced far fewer errors than any other method. Display was also rated the most intuitive, with object alignment in second place.
Wednesday, October 21, 2009
ISMAR09: Workshop on Handheld Augmented Reality Games
This was a great workshop given by Blair MacIntyre from Georgia Tech on mobile augmented reality games. I got a lot out of it, from being reminded of some solid game design topics to getting new ideas about the game I want to make for my PhD.
The goal of augmented reality in this context is to embody social interaction in the physical world, enabled by a tight integration of the physical and virtual worlds. In terms of games, it's important to remember that design is more than just form and function - it needs context, too (which AR can give). Game design is about solving a problem within a set of constraints, and making something fun, challenging, awe-inspiring, and captivating.
In augmented reality, mobility is usually assumed. But it's not just a combination of the physical and virtual worlds - there should also be registration between the virtual and physical worlds and real-time interaction.
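For readers new to the term, "registration" just means virtual content stays anchored to real-world coordinates as the camera moves. At its core that's a projection of world-space points through the tracked camera pose; here's a toy sketch (placeholder intrinsics and pose values, not any particular library's API):

```python
import numpy as np

# Camera intrinsics (focal lengths and principal point) - placeholder values.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_world, R, t):
    """Return the pixel where a world-space point should be drawn this frame,
    given the tracked camera rotation R and translation t."""
    point_cam = R @ point_world + t   # world -> camera coordinates
    u, v, w = K @ point_cam           # camera -> image plane
    return u / w, v / w

# Example: a virtual point 10 cm above the marker origin, camera 1 m in front of it.
R = np.eye(3)
t = np.array([0.0, 0.0, 1.0])
print(project(np.array([0.0, 0.1, 0.0]), R, t))
```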
It's worth remembering that there are two classes of AR systems: task-based and experiential. Task-based AR is perhaps not as well suited to handhelds, since your hands aren't totally free to complete the task, and even a light device is hard to hold up for a long time. This is one of the areas that gives head-mounted displays (HMDs) an advantage: they can provide zero-effort, hands-free interaction and continuous peripheral information. Both interfaces provide some privacy, an in-place display, and per-user customization.
So what makes "good" AR anyway? What is unique about it that can be leveraged? Multiple people can work in a shared space, for one. Each person gets a unique view of the world while not giving up the global perspective. It allows for direct and natural interaction, and the physical world can be leveraged with props, spatial understanding, and dexterity.
Some of the graphics issues to consider when determining a platform for your game (cell phone or something more advanced?) include lighting, shadows, occlusion, and physics capabilities. Graphics don't always have to be realistic, either - non-photorealistic effects can reduce the processing power needed. Remember that latency is a bigger issue in AR.
Back to game design. We, as computer scientists, have to think like game designers when coming up with new ideas. AR games shouldn't be all about the technology - that essentially turns them into demos. We need to create something that's fun to play. "The designer needs to envision how a game will work during play ... planning everything necessary to create a compelling player experience." In other words, you need to decide first what you want the player to experience, not what they will do, or learn, or whatever. This is a key point for me in thinking about my educational game.
The structure of a game includes the following components:
- players
- objectives
- rules
- resources (making the game not too easy, not too hard)
- boundaries
- outcome
Remember that to make a game something more than a toy, there must be goals, and interesting and meaningful choices to reach those goals. The story and characters are brought out through actions.
Some questions to ask when making a handheld AR game:
- Who is your target player?
- When or where are they playing?
- Single or groups?
- Will there be props? How comfortable and easy to use are they?
- What exactly will the player do while playing the game?
- Fast motions are a problem.
- How will having the device (phone) in the player's hand affect things?
- It's tiring to hold up a relatively light device for long stretches of time.
- Awareness of other players.
- Small screens are tiring to look at for a long time.
- Vibrations and sounds to give feedback, especially when looking elsewhere.
Tuesday, October 20, 2009
ISMAR09: Experiential Learning 3 of 3 - Group Discussion
This is the last of three posts on the ISMAR09 experiential learning workshop. Post one and post two covered the morning presentations on current applications, while this one will attempt to capture the excellent group discussion that took place in the afternoon.
The afternoon's format was to look at three main questions about education and augmented reality, each one building on the last. For each question, we broke ourselves into three groups, discussed the topic for 15 minutes (or more, in most cases), and then shared our thoughts with the whole group. My notes below will consist of our own group's findings first, which will naturally have more detail. Points from the other groups will follow - members of those groups are most definitely invited to add more insight or links to their own blog posts in the comments.
What are the Key Elements of Mixed and Augmented Reality that Create a Meaningful Experience?
I got this one started by explaining something I tell my friends and family when they want to know about augmented reality. I feel that one of the big benefits of AR is that you essentially reduce the number of levels of indirection required to do something. For example, consider a traditional map. You have a bird's eye, (usually) non-photorealistic view of the world that you must rotate and project onto the real world in front of you. What if that information was augmented for you in the first place? You can free up all that cognitive power for the actual task at hand (such as learning).
Another key element suggested was the idea that augmented reality should not provide the entire story - the imagination should have the ability to work its magic, too. You should also be able to bring in other senses beyond vision, making the presence of the physical world so important. Having an EyePet in a completely virtual world is somehow different than playing with it in your living room - in the latter case, the broader context of your own culture is included in the gameplay.
Augmented reality allows non-experts to participate in and understand tasks outside their field. For example, it seems unlikely that Disney could have succeeded in getting permission to build Disney World here in Orlando today. But if the city council (or whoever needed to vote) were able to see with their own eyes exactly how it would all look, and how, say, emergency evacuations would work, things might be different.
Our group also believed that the most meaningful experiences would come from free-range AR, where much larger environments can become immersive sandboxes for learning. This setup could also lead to a more social experience.
Another key point was on the adaptability of software. Ideally, AR programs would learn you as you learned them. Of course, this requires much more advanced artificial intelligence than what is available today, but we do get better all the time in mimicking this ability.
Finally, we decided that AR would be most meaningful when it was personalized. This refers to not just the changing viewpoint of the virtual objects, but also the content of the virtual portion of the environment itself. This, among other things, will help avoid information overload.
Points from other groups:
- AR needs to be consistent with what's expected in the real world (it has to "make sense").
- There must be an element of surprise and magic.
- It should be social, approachable, and easy to use.
- Users should enjoy being tricked/surprised.
- The end user experience is key (not the technology itself).
- There should be some degree of being novel or special.
- It should be scalable in terms of time, space, size, and orientation.
- It will provide the ability to experiment where it was once impossible.
- It must be reliable enough to reflect realism.
How Do We Continue the Learning Experience Once the User Leaves?
The first example our group discussed was the idea of capturing information about the experience that can then be used later in various ways. For instance, a military training exercise might record the decisions made for a particular scenario, and the user can bring that home and show his or her family what they experienced. They can compare their stats to others who have done the same scenario, and so on. The question then becomes: what is the best way to present the data? Whatever it is, it shouldn't replace the original experience. Otherwise, there's no reason to use the augmented reality again (or, for instance, no reason to go to a museum again).
An interesting discussion started about whether doing a good enough job in creating the experience is enough to spark interest in a topic such that the user will go home and learn more about it. The example given was the Louvre, where most visitors look at a piece of art for only 30 seconds or so, when you need at least two full minutes to fully appreciate the details. If proper viewing were encouraged by the AR experience, perhaps that would be enough to make visitors want to find out more. What if the Mona Lisa had an augmentation of da Vinci putting on the finishing touches after acting out some story related to life in that era? Would you be more inclined to find out more about da Vinci?
Finally, we felt it was key to avoid making it about the technology - the tech needs to be invisible. This way, the focus will be on the topic at hand, which again will make for an easier transition to, say, a follow up activity to be done at home.
Points from other groups:
- A museum exhibit can have a take-home piece so the adventure can be continued (for example, your own fish from the main giant fish tank exhibit). The individual experience is sparked thanks to the larger context of the exhibit.
- Make the follow-up activity viral. Share with friends and family.
- Allow learners to finish the story at home when they run out of time.
- Provide networking opportunities online.
- Create physical activities later on.
What is Novel When it Comes to Augmented Reality and Learning?
We agreed that augmented reality isn't a new paradigm shift, but rather another tool in a teacher's toolkit. However, this tool might benefit a teacher in many ways. For instance, it may be easier to employ than other computer-based demonstrations if it's as easy to use as we insisted it be in the earlier questions. Furthermore, the exploratory nature makes for an environment that allows a teacher to say "I don't know, let's find out," avoiding the fear of teaching a topic they don't understand well themselves. Finally, it might be that much better than just Googling a topic, because it would certainly be more immersive.
Another advantage of AR in the classroom is that it would be more repeatable than more free-form techniques, making it possible to standardize the content (though not the experiences) of AR scenarios across the board.
It may also open up opportunities for standardized learning at home. This might help capture the attention of the gifted students and help the struggling students catch up. It would even be possible to have distributed study groups who could interact with the same virtual object.
In a training context, augmented and mixed reality have already proven to be very effective. Apparently many commercial pilots take their first flight in a real jet because the simulators are just that good.
Thinking more simply to see what could be done now, it's clear that printed material can be augmented with markers and cell phones used to view them (and kids would love getting permission to pull out their phones in class!).
Points from other groups:
- Will AR be a revolution or just an evolution? Can we truly improve learning with AR? Perhaps we won't truly know for another few decades.
- AR provides a different dimension related to creativity and self-reflection. It can be about exploration, not necessarily just making abstract concepts concrete.
- Main barrier: How will it improve people's lives? We just don't know - there is a lack of understanding that won't be solved until we start getting more products into people's hands.
- What are we trying to accomplish with AR? Connection, relevance, and perspective? How?
Conclusion
That concludes the workshop on experiential learning. I will be taking away the excellent thoughts and insights from the three posts on this blog, as well as a better appreciation for the big picture. I hate to admit it, but when thinking about the game I want to build for my PhD research, I got stuck in thinking of a basic marker based interaction. There's so much more to AR that it would be tragic to miss considering it all.
ISMAR09: Experiential Learning 2 of 3 - Current Applications 2
This is the second of three posts on the experiential learning workshop held Monday at ISMAR09. The first post introduced the topic and summarized the first three presentations given in the morning on current AR applications. This post will summarize the last three speakers, and the last post will be on the group discussion held in the afternoon.
Infinite Story, Finite Space
Chris Stapleton, co-chair of ISMAR09, gave us his vision for augmented reality and told us about the projects he's worked on. He says "we think that if we deal with physical space, we can only deal with one story." But if the augmentations can change, this is no longer true. Using augmented reality, we can allow users to add their imagination, rather than just give them the story - imagination is the third reality.
A project that really intrigued me was a memory scape for the Maitland Holocaust Museum. The idea was to recreate stories told in children's diaries of the Holocaust so visitors could understand what happened in terms of humanity. A physical space would be created, and embedded projections used to bring the space to life. Bits and pieces of the story can be told through these augmentations, and imagination can fill in the rest. Even better, you could experience a different story each time you visited the museum.
Chris goes on to lay out the spectrum of levels of engagement:
- Passive : Absorb the media (TV)
- Engaging : Think and feel (film)
- Active : Participate (amusement park rides)
- Reactive : Choose (games)
- Interactive : Contribute (Second Life)
- Experiential : Live (enhanced media? augmented reality?)
Total Immersion
Unfortunately I didn't catch the presenter's name (it wasn't on the slides, and I didn't see it in the schedule), but he was from Total Immersion. He pointed out that 20-30% of the population are auditory learners, 40% are visual, and 30-40% are kinaesthetic. Naturally, augmented reality caters to kinaesthetic learners in a way many other media cannot. Experiences allow for engagement, reflection, insight, and of course, learning.
Also touched on was a set of best practices for augmented reality. It needs to attract users, be easy to use, and give instant access to the experience.
Finally, a comparison between entertainment and education was made to highlight some of the differences.
Entertainment:
- Audience: Groups
- Duration: Quick
- Engagement: Immediate gratification
- Outcome: Something immersive or magical
Education:
- Audience: Small groups or individuals
- Duration: Long term engagement
- Engagement: Deep exploration
- Outcome: A new visual perspective on topic of study
Museum Learning and AR
Last but not least, we have Kate Haley Goldman from the Institute for Learning Innovation. She's all about informal and free choice learning, creating voluntary, non-sequential learning experiences like Wolf Quest. Though Wolf Quest is not a space-based game, it's interesting nonetheless for its great success; kids played it much longer than the expected two to three hours it was designed for.
Kate explained that personal, sociocultural, and physical context are all factors that help influence learning. These are all things that can be employed in, say, museum exhibits. But why do people visit museums? Research has shown that reasons range from fun and entertainment, to social activity, to being a site of interest (a 'must-see' while on vacation), or even specifically to be challenged or learn something. How might augmented reality help with all this?
One project Kate talked about was an AR system that augmented the wing of a plane. Visitors could adjust various settings or move the wing, and see the resulting forces in the augmentation. They could then figure out whether the plane would actually fly under those conditions. This system helped with some of the above goals (especially learning, as tests showed), but not all. For example, the system was too separated and thus lacked the social aspect of a museum visit. Users couldn't really talk to or interact with each other.
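I don't know how the exhibit actually computed its augmentation, but to give a flavour of the kind of check it could show, a toy version might compare lift from the standard lift equation against the aircraft's weight (all numbers below are illustrative, not from the exhibit):

```python
def will_it_fly(airspeed_m_s, wing_area_m2, lift_coefficient, mass_kg,
                air_density=1.225):
    """Toy check using the lift equation L = 0.5 * rho * v^2 * S * C_L."""
    lift = 0.5 * air_density * airspeed_m_s ** 2 * wing_area_m2 * lift_coefficient
    weight = mass_kg * 9.81
    return lift >= weight, lift, weight

# Example: a small aircraft with made-up numbers.
flies, lift, weight = will_it_fly(airspeed_m_s=60.0, wing_area_m2=16.2,
                                  lift_coefficient=0.9, mass_kg=1100.0)
print(f"lift={lift:.0f} N, weight={weight:.0f} N, flies={flies}")
```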
Some of the aspirations of augmented reality that Kate mentioned were:
- creating conversations
- making the abstract tangible
- helping visualize change
- adding sensory capabilities
- supporting critical thinking skills
- enabling users to act like scientists (collect data, form and test theories)
Monday, October 19, 2009
ISMAR09: Experiential Learning 1 of 3 - Current Applications
The International Symposium on Mixed and Augmented Reality (ISMAR) held workshops today, and I was lucky enough to be assigned as student volunteer to exactly the workshop I wanted to attend.
From the program (though it was referred to as experiential learning on the conference signs):
Falling in Love with Learning: Education and Entertainment Converge with Learning Landscapes is designed to meet the needs of people who are currently designing memorable and lasting experiences for visitors and students through AR technology. These include professionals in the areas of:
- cultural heritage preservation
- education and in-situ learning
- entertainment and games for learning
- museum curation and design
The leaders of this workshop will discuss how they are currently using Mixed and Augmented Reality for education and entertainment and the challenges they face or most wish to tackle in the future.
First Post of Three
This is the first of three posts covering this workshop. Here, I will summarize the first three of six presentations given in the morning by those already using augmented reality for their particular purposes. In the next post, I will cover the remaining talks. Finally, the third post will cover the afternoon's discussions that sought to answer three main questions about augmented reality's place in education.
What Is Augmented Reality?
If you really want to know, check out the Wikipedia article. The points mentioned before the six presentations began include:
- AR gives context to the situation. It's not an out-of-body experience or a separate thing from the world we know.
- Blends the real and the synthetic.
- When the technology disappears, the imagination is enhanced.
- Involves multiple senses.
- Can record experiences in detail (such as high scores, stress of learners, etc.).
What's Happening at UCF
Eileen Smith, director at the Institute for Simulation and Training at the University of Central Florida, spoke first, telling us about some of the projects surrounding experiential learning going on at UCF. Some examples include informal learning at museums, teacher training, recreating the World Fair, and military training.
One of the most interesting and unique uses of AR was, for me, the green kitchen. This is a reconfigurable set of cabinetry that can be arranged to match anyone's kitchen. Someone requiring cognitive rehabilitation can then wear a head-mounted display, see what looks a lot like their own home, and practice performing simple tasks like making cereal.
Another neat project was Journey With the Sea Creatures. A magic window into a fossil exhibit that would otherwise never change made the museum worth visiting more than once. This particular program filled the room with virtual water and brought back to life the amazing creatures that lived many years ago. Apparently, once the children discovered this feature, they would go back into the main exhibit area and start "swimming" around so their friends and family could watch them on the magic window.
Eileen closed with a suggestion on when to use augmented reality. Don't use it when the real world will do just fine (in other words, if you can just do what you are trying to simulate, why bother with the simulation?). Instead, employ AR when you want to explore space, time, and scale, or to collect data you can then use or display to others later.
Museum Exploration, DNP Digitalcom
Next up was Tsutomu Miyashita from DNP Digitalcom [Japanese]. He discussed AR projects intended for use in the Louvre, aimed at encouraging visitors to better appreciate the art and at providing route guidance.
His group wanted to use markerless tracking at first, since they felt that the 2D bar codes would probably detract from the art itself, not being terribly attractive. Visitors using this technology were surprised and gleeful, but because they were not familiar with the concept of AR, they did not use it as expected. Furthermore, the weight and battery life of the devices used were a problem. (Something that may not be as important in research, but crucial in the real world!)
The next iteration used cell phones and markers instead. In the interface, a computer animated character taught users how to view art and properly appreciate it in addition to showing them where to go next. They understood the marker-based system much better, and the system also performed better in terms of recognition accuracy.
The key takeaway was that users feel surprised when they see augmented reality for the first time, leading to strong attention. But if they don't really know how to use it, then engaging them is really important so that they actually want to figure it out. Finally, once their attention is obtained, retention, understanding, and satisfaction become the aim.
EyePet
Istvan Siklossy spoke next, mainly showing us the new EyePet game for PlayStation 3. He explained that in camera-based games, you typically see yourself and use motions and gestures to interact. Player actions generally map directly to game actions, making the games accessible to everyone.
In EyePet, an adorable creature comes to life on your living room floor. Your interaction with it, which occurs through gestures as well as with a special marker, is robust and responsive. It's quite impressive! To get robust tracking even in low lighting (noisy images), the group took the usual tracking algorithms and made some improvements, such as rapid multiple thresholding to find many contours and locate the marker. In the skill-based games, it's crucial that tracking accuracy is no less than excellent.
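Sony didn't publish the details, but the "rapid multiple thresholding" idea can be approximated along these lines with OpenCV (a sketch under my own assumptions: OpenCV 4 API, invented threshold values and size filter):

```python
import cv2

def find_marker_candidates(gray_frame, thresholds=(60, 100, 140, 180),
                           min_area=400.0):
    """Run several binary thresholds over the same frame and collect convex,
    four-sided contours from each pass. The intuition: in a noisy, low-light
    image, at least one of the thresholds tends to isolate the marker cleanly."""
    candidates = []
    for t in thresholds:
        _, binary = cv2.threshold(gray_frame, t, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for contour in contours:
            approx = cv2.approxPolyDP(
                contour, 0.02 * cv2.arcLength(contour, True), True)
            if (len(approx) == 4 and cv2.isContourConvex(approx)
                    and cv2.contourArea(approx) > min_area):
                candidates.append(approx)  # quadrilateral of plausible size
    return candidates
```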
In terms of learning environments, the EyePet allows for experimentation in that some basic sketches drawn by players are interpreted and transformed into toys for the pet. Players learn how the pet reacts, get a personalized experience, and have an opportunity to record and share videos of their experience.
Saturday, October 17, 2009
ISMAR09: What I'm Looking Forward To
Ori Inbar over at Games Alfresco is doing a pretty top-notch job of getting me excited about this year's International Symposium on Mixed and Augmented Reality, or ISMAR, where I'm going to be a student volunteer. For example, who wouldn't feel giddy when they saw a program that could turn sketches made on paper into working 3D virtual models in real time?
Today I finally got the chance to look at the schedule in more detail. There are a few sessions that I feel are must-sees for me, so hopefully my volunteer schedule can somehow accommodate them. At the very least, I hope I can do my duties in the rooms where these talks happen!
There have been indications scattered throughout some of my past posts that I really wanted to do augmented reality for my PhD research. Lately I've been feeling more and more certain that I want to explore educational entertainment that makes use of augmented reality. My current vision involves building games for kids that help them learn computer science concepts, kind of like CS Unplugged does, but on an individual basis.
I hope to explore two main areas: first, I would love to look at how well children understand augmented reality and what interfaces are best suited for them; and second, based on those results, I want to build such an interface. I somewhat expect that kids will need some kind of tangible component to best understand augmented reality, as they don't tend to think very abstractly in general, so I'm hoping there will be some major (or at least somewhat major) new technological aspects to explore.
Based on this, Monday's workshop Falling in Love with Learning: Education and Entertainment Converge is an obvious choice for me. If I could do nothing else all conference, I think I would walk away satisfied.
A close contender for top choice is my second must-see: Tuesday's tutorial on AR Game Design, given by Blair MacIntyre of Georgia Tech. You know, I never really knew much about Georgia Tech before about a year ago, being a Canadian who can't keep track of all the schools down in the good ol' US of A. But once I started to see the amazing videos of their projects, they gained a lot of respect from me. No wonder they seem to rank among the top schools for computer science! Anyway, the connection of this workshop to my thesis ideas should be obvious.
Finally, there is a talk in the Arts, Media, and Humanities Track that I'm also pretty pumped about. It's called Science Meets Fiction: Imagining the Future of Mixed and Augmented Reality. Aside from the inspiration for new ideas that I expect to get from this session, the speaker's affiliation is rather intriguing. He's a Disney Imagineer, something I learned about thanks to Randy Pausch's Last Lecture. I never really cared to go to Disney World until I heard about the Imagineers. Now I want to go solely to see what cool tech they've come up with!
So that's what's making me excited as I write my packing list for ISMAR. I fly out tomorrow and hope to document as much of what I see for you guys as I have time for. I am bringing my Nikon D90, which has a built-in video feature, so watch for photos and videos, too!
Wednesday, October 14, 2009
How Does Usability Engineering Fit Into the Field of Software Engineering?
I just finished reading Usability Engineering Turns 10, a paper from 1996 by Keith Butler. One of the main questions that jumped out at me, as a computer scientist, was how usability engineering can fit into the larger field of software engineering. I suspect things have changed in thirteen years (for example, a company that I did some co-op placements with has changed from being run by the engineers in the nineties to being run by the business and marketing types today, and there are now entire teams of user design experts). Despite this, I have noticed a resistance from some students to even consider making a basic user and task analysis course mandatory, even just for the most relevant streams in our computer science degree.
The usability engineering cycle outlined by Butler is fairly straightforward, and likely looks familiar to software engineers:
- User and Task Analysis
- Interface Design
- Building (Iterative Prototyping)
- Usability Evaluation
Some interesting key points from the article:
- The abstract objective of usability engineering is the minimization of cognitive and perceptual overhead required from the user.
- Intuitive interfaces are the result of designers connecting three layers: the mapping of the user's conceptual model to the functions of the system, the user determining the exact commands and arguments needed to control those functions, and the user's physical execution of the commands.
- User and task analysis has two objectives: first, to understand the situation as it is, and second, to improve it.
- Analogies help users connect the software with their mental model of the world. If an analogy is not defined in the software, the user will invent one.
- The default practice is often assigning only as many functions to the computer as budget and time will allow, but it's better to understand what computers do better and what humans do better, and assign functions accordingly.
- When designing layouts and operation of screens in software, the low level details can be worked out using UI standards. Higher level details, on the other hand, are driven by analogy and mental models.
First of all, a lot of what we learn in computer science could arguably be deemed "not computer science" if one wanted to be particularly pedantic. After all, software engineering isn't about algorithms or system design; it's about engineering processes. Yet software engineering is a course we all have to take. Wouldn't at least being aware of how the designers came up with their decisions help us realize their vision more accurately?
According to our undergraduate calendar, students in the software engineering stream have to take a quality assurance course. Again, this could be argued to be a little less computer science and a little more engineering. If those students learn about what happens during and after development, shouldn't they have the complete picture by learning what comes before?
Here's another good point that Butler makes:
"Application development projects, however, must already deal with function, costs, schedule, GUIs, data management, communications, software architecture, methods, tools, standards. Unless part of a comprehensive, integrated approach to application development, usability can easily end up being just one more tail trying to wag the dog."
So in addition to simply having a big-picture understanding of the entire software development process, we have to be careful that the usability design phase doesn't get shoved aside partially due to the attitudes of those on the development side of things. Unless, of course, you think that us programmers can design a perfect product on our own. (Yeah, right.)
Butler sums it up perfectly:
"Cultural obstacles in the computing community must be overcome in adding a user-centered perspective to the existing technology-centered focus."
Once again, though I think things have improved, there's still work to do, as evidenced by some of my colleagues at school. Here's hoping that one day an introduction to user and task analysis will be on the curriculum of anyone taking a software engineering class.
Tuesday, October 13, 2009
GHC09: Girls, Computer Science, and Games
Although Grace Hopper came and went some time ago, the excitement still lingers. My second talk, the one I did on my own, was in the second last session of the whole conference. As you may recall, this talk was all about my computer science and games mini-course for grade eight girls.
I was supposed to only have half an hour for my talk, so my formal slides fit into that slot perfectly. However, the person for the second half hour didn't show, so I got to talk for the entire time! Cool! I took questions, showed one of the games the girls made in the first iteration of the course, and showed some course slides.
Terri was the official blogger for this session and had some great things to say about it:
Gail Carmichael hit upon the idea of doing a 1 week course on games for girls when her university was soliciting proposals for "enrichment mini courses." These courses are largely attended by grade 8s (~13 year olds), typically the advanced students from the local schools. They're intended to give the students a one-week taste of the university environment. If you are interested in running such a program, Gail suggests that there are often similar programs in other cities, local summer camps, local WISE groups, the Girl Guides/Girl Scouts and many others who could help set something up.
She notes that another thing the girls craved is Starbucks coffee... who knew?
Gail ended up having the entire hour to herself, since the second speaker, Anne Marie Agnelli, was unable to attend. This gave an opportunity for Gail to showcase one of the games created by her students, as well as have a longer question/discussion section. In fact, the second half of the presentation became much more like a Birds of a Feather session where a variety of women talked about their questions and experiences.
I also had an awesome note taker for this session, Eshe, who had come out on the Tuesday night before the conference when we had a dinner to discuss outreach efforts for young women. Check out the notes she wrote for my talk, where you can also find the slides I used.
I was so pumped when I was done this session. Many audience members came up to talk to me about the course and their ideas for their own outreach. I have never felt so important before! ;)
Friday, October 2, 2009
GHC09: Tips, Tricks and Software for Keeping Research Organized
As somebody who naturally loves to organize, this session was close to my heart. Oddly enough, I didn't really do a whole lot of organizing for my Masters research (I guess it was 'simple' enough that I didn't need to), but I'm really excited to use some of this advice as I start my PhD. One of the first things I'm going to do, after thinking about it a lot, is (finally) set up an SVN server on my own webserver.
Why Organization Matters
You will do a lot of stuff in 5-7 years, and you'll forget a lot of it. Why waste time recreating work you've already done by being disorganized? (Your advisor doesn't teach you this kind of thing!)
Mistakes Made
The panelists shared the following mistakes they have made:
- Not commenting code.
- Not taking notes during meetings.
- Not keeping track of papers (also known as the messy piles on your desk).
- Not using source control systems.
- Not writing down research ideas.
Considerations
- Do I work alone or with collaborators?
- Do I work on multiple machines that require synchronization?
- Do I have limited amounts of storage?
- Do I need to keep paper records or record data off my computer?
- Is my work backed up?
Index cards, loose leaf paper, or notebooks are good for temporary notes and drawings, but are easy to lose, not portable, and not searchable. A research blog might be a good place to process ideas and search them later, as well as allow group members to follow your work and make comments, but makes it difficult to organize ideas. You can keep weekly notes in Google Docs, using coloured highlighting to track what is done and what is not; however, this often produces very large documents. A Google Site takes this a step further, allowing multiple pages that can be used to track progress, share with group members, and so on.
Audience suggestions: Webspiration (an online visual thinking tool). For math notes, some use TeX and SVN. Delicious is used to remember websites visited, and Diigo is a web highlighter and sticky note tool. MS OneNote is also popular.
Keeping Papers Organized
Keep track of the author, title, and so on, but also notes about key points and criticisms. Even if you've only skimmed a paper, make a note of it. When choosing tools, look for the ability to generate citations and bibliographies, take notes, and link to the paper's PDF.
I've blogged before about the tools available on Windows, and another mentioned here is Pybliographer. I also hadn't included EndNote in my list since it's not free.
Pro tip from audience: As soon as you read a paper, get the FULL citation information. It's amazing how hard it can be to find later when you only note the title. Always put every document you've read in your organizing software.
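To make that advice concrete, here's a minimal sketch of the kind of record I imagine keeping per paper. It isn't the format of any particular tool; the fields, venue placeholder, and file path are made-up examples (the author, title, and year come from the Butler paper discussed above).

```python
# A small, hypothetical record for each paper read (even skimmed ones):
# full citation details, notes, and a link to the PDF.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PaperNote:
    authors: List[str]
    title: str
    venue: str                     # record the full citation details here
    year: int
    pdf_path: str                  # link to the PDF on disk or the web
    key_points: List[str] = field(default_factory=list)
    criticisms: List[str] = field(default_factory=list)
    skimmed_only: bool = False     # note it even if only skimmed

papers = [
    PaperNote(
        authors=["Keith Butler"],
        title="Usability Engineering Turns 10",
        venue="(fill in the full citation as soon as you read it)",
        year=1996,
        pdf_path="papers/butler1996.pdf",
        key_points=["Minimize cognitive and perceptual overhead"],
    ),
]

# Having the full citation info up front makes bibliographies painless later.
for p in papers:
    print(f"{', '.join(p.authors)} ({p.year}). {p.title}. {p.venue}")
```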
Keeping Experiments Organized
At stake: sanity, time, and reputation. When you turn out to be wrong about "never using that code again," you will waste a lot of time if you didn't bother to keep everything organized.
Organize your file system by project and experiment. Make your code modular by separating the code for preprocessing data, running the method, summarizing results, and creating figures/tables. When something goes wrong, make it so you can re-run only the part that went bad. Make your experiments reproducible: store random seeds and input parameters, and know which versions of libraries (etc.) were used.
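Here's a minimal sketch of what that reproducibility advice could look like in practice. It's my own made-up example, not something the panelists showed; the parameters, score, and file name are placeholders.

```python
# Fix the random seed and log the seed, parameters, and environment details
# alongside the results so a run can be repeated later.
import json
import platform
import random
from datetime import datetime, timezone

def run_experiment(seed: int, params: dict) -> dict:
    random.seed(seed)  # fixed seed => the "random" parts repeat exactly
    # ... replace with the real preprocessing / method / summary steps ...
    return {"score": random.random()}

if __name__ == "__main__":
    seed = 42
    params = {"learning_rate": 0.01, "iterations": 1000}
    results = run_experiment(seed, params)

    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "seed": seed,
        "params": params,
        "python_version": platform.python_version(),
        # Record real library versions here too, e.g. numpy.__version__.
        "results": results,
    }
    with open("experiment_log.json", "a") as f:
        f.write(json.dumps(record) + "\n")
```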
Other tips:
- Use good programming practices.
- Handle errors.
- Code unit tests.
- Use an IDE which integrates with debuggers and revision control.
- Use a good LaTeX editor.
- Use revision control and/or track changes (especially with multiple authors!).
- Keep track of what version of a paper has been submitted where.
- Start early, and remember that writing can help organize your thoughts.
GHC09: I Am a Technical Woman!!
...and I really am! I'm even in the video. :) Please watch and pass on - make it viral!
(It was filmed at last year's Grace Hopper in Keystone, Colorado. More info on the Anita Borg website.)
Thursday, October 1, 2009
GHC09: Have You Ever Considered Being an Entrepreneur?
I'm going to try doing this post a little differently. I'm recording information during the actual session instead of taking notes and writing it up later. Below I have the introductions of the panelists, some general session notes, and a few of the audience questions.
Sandy Jen, Meebo
- First job after graduation had cubicle walls. She was short and walls were really tall!
- Worked on own ideas after working a regular 8-5 day.
- Got funding after launching.
- Two lives: during the day, working on mobile start-ups; at night, helping run Women 2.0.
- Women 2.0 helps women launch start-ups via networking, workshops, competitions, etc.
- Grew up in Canada, travelled back and forth to Pakistan. Helped her see how much she had living in Canada. Technology is one thing that's missing in places like Pakistan.
- Thought she wanted to code but found out otherwise in first course with JavaScript. Took business to be able to work with engineers instead.
- Interviewed at Google after graduation. Saw Silicon Valley for first time. Didn't get Google job, but came to Silicon Valley anyway.
- It's not about what you've already done or the failures you've had - it's about what you want to do next.
- Don't over plan and see what interesting bumps come along on the way.
- Liberal arts snob in undergrad.
- Worked at Random House book publishing.
- Found someone also interested in the idea of spreading ideas while editing his book. Started Squidoo.
- Wants us to realize what a cool moment we're in, and to just pick something and start doing it (don't spend too much time researching, etc!).
- Social startups are a current trend (where social is code for non-profit).
- Biggest question: I have this idea. How do I make it happen?
- Get the feeling that you can make things better.
- In tech, you can choose ideas that don't cost a lot of money up front (especially on web). Sandy used her own money to start Meebo, making it her baby (and making cost-cutting decisions easier).
- Lately, the start-ups that have made it got funding from friends and family they trust rather than venture capitalist money. Start making money with ads and freemium models.
- Don't ask for permission to execute your idea when asking for money. Be frank that it might fail in a year.
- Share your idea and get feedback. Others will have the same idea, but will approach it differently, so you don't have to keep it to yourself.
- Passion: You see an opportunity really clearly, and you'd feel really bummed if you missed it. Helping people and enabling people. Gives you that internal energy that nobody else can give you.
How can I find the right people to become co-founders, stock holders, etc?
- Look for people who want the experience and the celebrity status.
- Being in the right environment can help a lot. E.g. Silicon Valley is ripe with technologists.
- Check entrepreneurial resources.
- Go to Elance.
- You can't know. But that doesn't make your idea invalid.
- "No point in trying to out-Google Google." If the other people fill every need you ever had and you're envious of it... you probably don't need to add to the market.
- The idea itself should never be a secret. The idea might be the same but the implementation different.
- Those you tell haven't gone through the evolution you have, so you're most likely to do it better.
GHC09: PhD Forum 2
As the mentor for this PhD session noted, the three talks given really showed the eclectic mix that can be found in computer science. This was the first time I attended these forums, and I did my best to fill in the feedback forms with useful comments. All three presenters did a really good job and were really well prepared, so my comments were only about small things!
Warehousing Markovian Streams
Julie Letchner
Imagine that you have an RFID tag attached to you, and that several sensors record your movement around a building with time stamps. You might want to ask questions like "when did Bob enter the coffee room?" The only problem is that you can't be 100% sure where exactly someone is based on the RFID sensors, since there are overlapping signals, etc. Instead, there are a bunch of probabilities of Bob physically being somewhere, probably based on how close the signal is to the sensor.
Julie's research was all about having a database of all these probabilities stored as Markovian streams (I think). The key question was how to make it more efficient, and the main points of the answer centre around indexing and approximation. The Lahar database developed is efficient enough to run in real time as the data is streamed.
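To give a feel for the idea, here's a toy sketch of what a Markovian stream over noisy RFID data might look like and how a "when did Bob enter the coffee room?" query could be answered against it. This is only my own illustration of the concept, not the Lahar system or its actual data model; the locations, probabilities, and the simplified independence assumption are all made up.

```python
# A toy "Markovian stream": one probability distribution over locations per
# timestamp, inferred from noisy RFID readings.
from typing import Dict, List

Distribution = Dict[str, float]  # location -> probability

# Hypothetical distributions for "Bob" at three timestamps.
bob_stream: List[Distribution] = [
    {"hallway": 0.7, "coffee room": 0.2, "office": 0.1},  # t = 0
    {"hallway": 0.3, "coffee room": 0.6, "office": 0.1},  # t = 1
    {"hallway": 0.1, "coffee room": 0.8, "office": 0.1},  # t = 2
]

def prob_entered(stream: List[Distribution], place: str, t: int) -> float:
    """Probability of entering `place` at time t: elsewhere at t-1, in `place`
    at t. (Treats timesteps as independent for simplicity; a real system
    would use the Markov transition model.)"""
    if t == 0:
        return stream[0].get(place, 0.0)
    p_not_before = 1.0 - stream[t - 1].get(place, 0.0)
    return p_not_before * stream[t].get(place, 0.0)

for t in range(len(bob_stream)):
    print(f"P(Bob entered the coffee room at t={t}) = "
          f"{prob_entered(bob_stream, 'coffee room', t):.2f}")
```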
There are some cool applications of Markov streaming, so making use of this kind of data is definitely desirable. Some examples include using tracked information for diaries, health monitoring and fitness assessments. Markov streams can also be used to process audio streams, which may be very useful for sound search.
Classroom Resources and Impact on Learning
Margaret A Dickey-Kurdziolek
The big question for Margaret is whether there is worth in having technology in classrooms. I think this is a very interesting question indeed. After all, it's easy to try and bring in all the newest and coolest tech, but are kids actually learning more because of it?
Margaret focused on SimCalc. She found that when it came to test scores on standardized tests, the use of this technology didn't improve student results all that much. But when it came to the students' abilities to learn advanced math skills, the technology made a huge difference. This brings up a whole other issue about standardized tests hurting more than helping, but that's another blog post for another day.
The research focused on a selection of teachers from Texas who used SimCalc in various setups, from all students using it in the computer lab with their own computer, to the teacher just projecting one computer in the classroom. I actually don't recall the results for the different setups, but Margaret did mention that the students who shared often faced problems, though learning to share was highly valued by teachers.
I think this sort of research will be very useful in shaping the future of technology in the classroom, and am looking forward to seeing more of it as time goes on.
Augmenting Biographical Memory
Andrea Schweer
The goal of this research is to help people remember the details and events of their lives. For instance, have you ever wondered "when did I meet this person and what did we talk about" after a day at Grace Hopper? Wouldn't it be great to have some easy way to recall these little details?
Current solutions for this are what we might call 'male-oriented'. It's kind of like someone noticed some cool tech out there and wanted to figure out a way to use it. Instead, Andrea took a more human approach and used cognitive science to figure out how people and memory work. She found out about the differences between memory cues and the memories themselves; the older methods of lifelogging and the semantic desktop don't really differentiate these things.
The highlight of this talk for me was the idea that computer science can benefit so much from so called 'softer' sciences (especially psychology). I completely agree with this, and I wish more computer scientists could be exposed to these ideas, even if they don't have to work with them directly.