First week in the HCI lab in Stuttgart

This is my first blog post on the UNH human-computer interaction site. I will be blogging about my travel experiences, my lab visits in Europe, and the project I am working on during my stay in Stuttgart, Germany.

My name is Rudra Timsina. I am an Electrical Engineering student at the University of New Hampshire. I graduated from UNH with a Bachelor of Science in Electrical Engineering in May 2014 and have been attending the UNH Graduate School since fall 2014. I am very thankful to Professor Kun and Professor Miller for providing me with the opportunity to participate in a summer research project in Germany through the International Research Experience for Students (IRES), a program of the University of New Hampshire.

It’s a great pleasure for me to spend two months in the Human-Computer Interaction (HCI) lab in Stuttgart with three other friends. We are working with eye-tracking devices, brain-computer interfaces, and biofeedback systems to measure physiological responses induced by cognitive load tasks and lighting conditions. We spent the first few weeks learning the new technologies needed for our project and reading literature related to it. We learned to use eye trackers (SMI, FaceLab, Tobii), the EPOC brain-computer interface, and the Nexus-10 biofeedback system.

It was very exciting to meet the group of very nice people at the HCI lab in Stuttgart and to learn about their interesting projects, which included an augmented workplace with user-defined tangibles, public displays, large-scale analysis of mobile notifications, and driving simulators.

I want to thank the whole HCI team at Stuttgart for helping us with housing, transportation, getting familiar with the city, and project-related matters.

More posts coming soon…

Working with a head-mounted eye tracker.

Eindhoven Technical University (TU/e), (7/7/2014)


Dr. Jacques Terken of the Eindhoven University of Technology (TU/e) hosted a very interesting tour of the TU/e campus and the Industrial Design Department during our day-trip to Eindhoven a few weeks ago.

Much of the research we were shown had to do with the study of peripheral interaction, which incorporates elements from the fields of HCI, Human Factors/Ergonomics (HF/E), and Industrial Engineering.

We were first shown the department’s five-screen driving simulator, and we were given a chance to drive it ourselves. What set this simulator apart from others I’ve seen is that they used a digital projector (behind the driver’s head) to display information ‘on top of’ the driving environment. This feature had been used to show a driver information about the road ahead when their view was blocked by a freight truck in front of them. I think it’s a really innovative way of addressing problems with limited driver visibility.

We were also shown a demonstration that blended driving/commuting with social networking. Using hand gestures, drivers in the simulation could select and “Like” something about their driving environment, whether it be a cool car they saw, a place they drove past, or any number of other things someone might encounter in traffic. This was really interesting to me, and was the highlight of my trip to TU/e.

Big thanks to Jacques Terken, Saskia Bakker, Hanneke Hooft van Huysduynen, and Chao Wang for showing us around and sharing their work!

Hasso Plattner Institute, HCI Lab (6/30/2014)


Dr. Dominik Schmidt of the Hasso Plattner Institute (near Berlin) kindly hosted the IRES team (myself, Micah Lucas, Michael Nguyen, and Rudra Timsina) at his lab on Monday, June 30th. He and some of his colleagues gave us a tour of their impressive facility and demonstrated some of the research they’re currently working on.

One really neat setup they had was an “interactive floor,” whose hardware took up space in two rooms of their building (one on top of the other). The idea is that you can expand your working area to include the floor, and the floor-projected display can detect whether a person is standing on it, sitting in a chair on it, or sitting on the floor itself. Your adjustable display within the available floorspace (comparable to a program window on your desktop monitor) can follow you around the room, letting you “drag” your workspace to different areas of the floor. Very, very interesting implications for future workplace designs.

We were also shown some work with their human-powered virtual-reality simulation system, which is actually a simpler concept than it sounds. Basically, you wear a head-mounted display while being held in the air by four other people. Each person holds one ‘corner’ of your body: one person holds up the left leg, another the right leg, one the right arm, and another the left arm. These four people get cues from the simulation about how to move the person they’re carrying (i.e., you) in order to simulate realistic motion and g-forces. It’s a surprisingly effective way to reduce the cost of expensive VR systems.

Overall, very neat work is being done in the HCI lab at the HPI campus. The researchers there are definitely people to keep an eye on for near-future HCI innovations.

Research at the Stuttgart HCI Lab


So far I’ve spent eight weeks working with the HCI Lab in the SimTech building at the University of Stuttgart. During my time here, I’ve not only been working on IRES projects, but I’ve also seen (and sometimes participated in) research conducted by members of the HCI lab.

For example, a few weeks ago I participated in a focus group conducted by Dr. Niels Henze, in which we discussed the options for a movable/portable public display. The selling point for such a device was that it could either be moved around by a user, or it could move itself around (even on walls!) with a treaded drive system located below the display.

I also participated in an experiment guided by Mariam Hassib, in which I was asked for my views on ways to navigate the menus of a smartwatch. Basically, the watch can detect your hand gestures, including individual finger movements and positions, which can then be translated into navigation commands within the device’s apps.

Finally, I had the great experience of being able to watch the PhD defense of Dr. Alireza Sahami. During his defense he gave a presentation on his work in the Stuttgart HCI Lab, much of which focused on emotive communication through digital devices such as smartphones.

Overall, I feel I have a pretty good idea of the kind of innovative research that the Stuttgart HCI Lab produces, and I’m glad to have absorbed some of their techniques and approaches into my own methodologies. The lab has provided us with a wealth of equipment to use during our project, much of which was completely new to me and took some time to get used to. This includes eye-tracker cameras, electroencephalograms (EEGs), heart-rate monitors, skin conductance monitors (GSR), body temperature sensors, and respiration sensors. In particular, our IRES project has focused on eye-tracking and its effectiveness as a tool for measuring cognitive workload. However, I’ll go into more detail about that in a future post.

Introduction to the HCI Lab, getting started on IRES projects, and thank-yous

So this is my first post on this blog, which was designed to chronicle the thoughts and experiences of the students privileged enough to participate in the 2014 “International Research Experience for Students” organized by the University of New Hampshire. It is part of a four-year program in which students spend their summer working with the HCI lab at the University of Stuttgart, Germany, under the advisement of Dr. Albrecht Schmidt.

I have to say it’s been quite an experience, and much more than I was expecting (in a good way, of course). Professors Andrew Kun and Tom Miller at the University of New Hampshire set me up with a great opportunity for career and personal development, and I hope I’ve delivered up to their expectations. I’ve certainly surpassed my own.

Most of our time in Stuttgart has been spent working and researching with the HCI lab, but we’ve also spent a great deal of time traveling around Europe. I will go into further detail about both of those aspects later; for now, I’ll use this post to give some background for the posts to come.

My background is in Human Factors Psychology, and I’m currently a doctoral student at Clemson University’s Visual Perception & Performance Lab. Human Factors has been a big part of HCI for a long time now, and I’m glad I’ve been able to contribute to this multidisciplinary trend in scientific research.

When I got here, I had a general idea that our IRES team would be working on a research project involving eye-tracking, cognitive workload, and human-computer interaction with visual and auditory displays. However, over the course of the last eight weeks, my IRES colleagues and I have formulated a project that (in my opinion) has the potential to contribute much to these sub-fields while also setting the stage for future IRES projects.

I’d also like to say, before I go on to future posts, a big thanks to everyone at the Stuttgart HCI Lab, including, but not limited to, Albrecht, Bastian, Yomna, Stefan, Niels, Markus, Mauro, Mariam, Ali, Miriam, Katrin, Sven, Lars, Michael, and anyone else I might have forgotten (sorry if I did!).

Stay tuned for more updates about my experience in the IRES program.

University of New Hampshire