Thursday, October 22, 2009

ISMAR09: Human Factors and User Interfaces

There were more than a few good papers presented at ISMAR this year on human factors and user interfaces. Here's just a taste of them. See the conference schedule for information about authors and their affiliations.

Using Augmented Reality to Support Cross-Organizational Collaboration in Dynamic Tasks

This student paper was an honourable mention for the best paper awards. It described a crisis management system designed for use by commanders with different backgrounds. Augmented reality gives each user a personalized view that they can most easily understand, based on their organizational culture and so on.

The scenario used for the user study - the first such study for joint real-time operations - was planning the fight against forest fires. Rescue, police, and military helicopter units are all involved.

The initial brainstorming stage with field experts in these areas suggested that handheld displays should be used to give individualized views of a command map. But we all know how important it is to ask the real users what works best for them, not their managers; it turned out that the field workers couldn't use the handhelds, which were too clumsy and took away their ability to use their hands freely. They wanted a shared map that they could point to and have the others see. In other words, they wanted a heads-up display with joystick control.

When compared with a paper-based map, the AR system with custom markers for each type of field worker performed significantly better.

Interference Avoidance in Multi-User Handheld Augmented Reality

Have you ever wondered how safe multi-user augmented reality games really are? I mean, when you're competing furiously while looking through your mobile device, it seems like it'd be pretty easy to knock into each other as you move around in the virtual world in front of you, right? Well, trying to avoid this is what this paper is all about.

The concept is pretty simple. As you move closer to your opponent, the virtual objects in your view shift slightly away from them. The key is to make sure that you as a user don't notice this happening, so certain compensations are needed, such as covering the playing surface with a flat texture that can also shift with the virtual objects.
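To make the idea concrete, here's a rough sketch of how such a per-frame shift might be computed. The function names, the distances, and the linear falloff are all my own guesses for illustration, not details taken from the paper:

```python
def shifted_position(obj_pos, my_pos, opponent_pos, safe_dist=1.5, max_shift=0.3):
    """Shift a virtual object's apparent position away from the opponent
    as the two players get closer than safe_dist (units here are metres).
    A hypothetical sketch of the shifting idea, not the paper's implementation."""
    dx = my_pos[0] - opponent_pos[0]
    dy = my_pos[1] - opponent_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist >= safe_dist or dist == 0:
        return obj_pos  # players far enough apart: no adjustment needed
    # Shift grows smoothly from 0 (at safe_dist) to max_shift (at contact),
    # so the change stays small enough to go unnoticed frame to frame
    strength = max_shift * (1 - dist / safe_dist)
    # Push the object along the direction pointing away from the opponent
    return (obj_pos[0] + strength * dx / dist,
            obj_pos[1] + strength * dy / dist)
```

The flat texture mentioned above would be translated by the same offset, so the whole scene moves together and no single object appears to drift.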

The amazing thing is how effective this approach is compared to other proximity warnings, like dimming the screen, beeping, and disabling user actions when players get too close to one another. Users perceived the shifting to be less distracting but also less effective than the other methods. However, the actual distance maintained between players in a competitive two-player game was significantly larger than with the other methods, making it quite effective in practice.

Interaction and Presentation Techniques for Shake Menus in Tangible Augmented Reality

The investigation in this paper sought to find a way to interact with objects directly in the environment using some kind of menu system. Objects should not require any kind of tags or electronics added to them beforehand, and hands should be able to manipulate the object freely without having to pick up something else as well.

The idea of a shake menu was inspired by shaking a gift to see what's inside. So you shake an object to open a menu, and then move the object to the desired menu selection and hold it there to make the choice. But what's the best way to present the menu items in relation to the object?
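Here's how I imagine the interaction loop might look in code. This is purely my own sketch of shake-to-open plus hold-to-select; the class, thresholds, and per-frame inputs are invented for illustration and aren't from the paper:

```python
class ShakeMenu:
    """Hypothetical sketch of the shake-to-open, hold-to-select interaction.
    Real systems would feed this from a 6-DOF object tracker."""

    def __init__(self, shake_reversals=3, dwell_frames=30):
        self.shake_reversals = shake_reversals  # motion reversals that count as a shake
        self.dwell_frames = dwell_frames        # frames to hold still over an item
        self.open = False
        self._last_dx = 0.0
        self._reversals = 0
        self._dwell = 0

    def update(self, dx, over_item):
        """Feed per-frame horizontal motion dx of the tracked object and
        whether it currently hovers over a menu item. Returns 'select'
        when a hold completes, otherwise None."""
        if not self.open:
            # A shake is several rapid reversals of motion direction
            if dx * self._last_dx < 0:
                self._reversals += 1
                if self._reversals >= self.shake_reversals:
                    self.open = True
                    self._reversals = 0
            self._last_dx = dx or self._last_dx
            return None
        # Menu is open: holding the object still over an item selects it
        if over_item and abs(dx) < 0.01:
            self._dwell += 1
            if self._dwell >= self.dwell_frames:
                self.open = False
                self._dwell = 0
                return 'select'
        else:
            self._dwell = 0
        return None
```

The nice property of dwell selection is that it needs no buttons at all, which matches the requirement that hands stay free to manipulate the object.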

A user study looked at a clipboard paradigm in which menu items (which look like cubes) are aligned along the right of the object, and "stick" to it as it moves around in the camera's view. Other layouts include arranging the choices around the object (this seems very similar to the clipboard version), aligning relative to the display only (so the menu sticks to the screen and doesn't move again), and aligning to the world coordinates, but not the object's.

The hypothesis was that the object alignment would be the fastest and most intuitive, and would be appreciated for the ability to examine each menu choice from different angles (after all, it could be any 3D object). However, the user study proved this wrong. Object alignment was almost tied with display alignment for the best speed, but display had far fewer errors than any other method. Display was also rated best for perceived intuitiveness, with object alignment in second place.

