So far I’ve spent 8 weeks working with the HCI Lab in the SimTech building of the University of Stuttgart. During my time here, I’ve not only been working on IRES projects, but I’ve also seen (and sometimes participated in) research conducted by members of the HCI lab.
For example, a few weeks ago I participated in a focus group conducted by Dr. Niels Henze, in which we discussed the options for a movable/portable public display. The selling point for such a device was that it could either be moved around by a user, or it could move itself around (even on walls!) with a treaded drive system located below the display.
I also participated in an experiment run by Mariam Hassib, in which I was asked for my views on ways to navigate the menus of a smartwatch. In short, the watch can detect your hand gestures, including individual finger movements and positions, which can then be translated into navigation commands within the device's apps.
Finally, I had the great experience of watching the PhD defense of Dr. Alireza Sahami. During his defense he presented his work in the Stuttgart HCI Lab, much of which focused on emotive communication through digital devices such as smartphones.
Overall, I feel I have a pretty good idea of the kind of innovative research that the Stuttgart HCI Lab produces, and I'm glad to have absorbed some of their techniques and approaches into my own methodology. The lab has provided us with a wealth of equipment to use during our project, much of which was completely new to me and took some time to get used to. This includes eye-tracking cameras, electroencephalograms (EEGs), heart-rate monitors, galvanic skin response (GSR) sensors, body temperature sensors, and respiration sensors. In particular, our IRES project has focused on eye tracking and its effectiveness as a tool for measuring cognitive workload. However, I'll go into more detail about that in a future post.