
2015 senior project ideas

Are you a UNH junior looking for an exciting senior project? Are you interested in driving research and/or eye tracking research? Would you like to publish your work at a conference (three recent senior projects resulted in publications: pupil diameter, navigation, driver authentication)? Would you like to design new interaction techniques, such as an LED-based augmented reality navigation aid?

If so, here is a list of ideas for 2015 senior projects:

  1. Collision warning systems. Collision warning systems issue auditory, visual, or multimodal warnings when a collision is imminent. But do drivers pay attention to these warnings? Do these systems reduce braking reaction time? These are some of the questions the senior project team will explore through driving simulator-based studies.
  2. Intelligent agent controller for automated vehicle. Automated vehicles are of great interest to the automotive industry. The senior project team will develop an intelligent agent to control a simulated vehicle. In future work the intelligent agent will be used in exploring HCI issues related to automated driving.
  3. Intelligent human-computer interaction that supports reengagement in driving. A central question in automated driving is: how will the driver reengage in the driving task once the automation needs assistance? The senior project team will design strategies for alerting the driver, as well as methods to evaluate how fully the driver has reengaged in the driving task.
  4. Using Apple Siri while driving. With the support of Apple engineers we are setting up Siri in our driving simulator. The senior project team will design experiments to assess the safety of interacting with Siri while driving.
  5. Eye tracking for early detection of Alzheimer’s. Alzheimer’s disease is devastating. Early detection of the disease, and a subsequent early intervention, might improve the odds of successful treatment. The senior project team will explore the use of eye behavior and pupil diameter as measures for early detection.
  6. Comparing Prezi and slides. Prezi presentations are exciting. The senior project team will explore the strengths and weaknesses of this presentation style compared to traditional slide presentations.
  7. Your ideas. Do you have a senior project idea in the general areas of driving and eye tracking? Let us know – send email to Andrew Kun.

Ludwig-Maximilian University Institute for Computer Science, Munich (7/21/2014)


As I mentioned in a previous post, the IRES students (including myself) were lucky enough to visit two HCI research labs in the same day during our trip to Munich. This post chronicles my experience at the LMU IFI, where we were hosted and guided by Dr. Florian Alt.

Most of the work we were shown in Florian’s lab dealt with interactive public displays. Researchers at LMU IFI are working with eye-tracking and gesture-tracking technologies to adapt the information presented on a public display (e.g., billboards, posters) to the individual viewer. For example, a display can sharpen its content or angle it toward a viewer as they approach to see what it is showing. Much of this work is still in progress, so I’ll refrain from sharing any more about it, but I’ve got to say it was impressive.

I also had the chance to see a custom smartphone application developed by this lab, designed to offer an alternative way of unlocking your phone with a password. Most phones let you enter a PIN or swipe a pattern across a 9-target grid to unlock, but the downside is that this leaves visible, oily streak marks on your device’s screen, which may allow someone else to see your password pattern. Instead, this app shows a photograph of your choosing every time you unlock your phone, and you touch specific areas of the photo to unlock it. However, the picture is presented slightly differently each time (e.g., zoomed in, upside down, mirrored), so you never leave a consistent pattern of finger movements on your screen.
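Florian’s team didn’t walk us through their implementation, but here is a rough sketch of how I imagine such a scheme could work. Everything below is my own assumption, not their code: the Point and Transform types, the unit-square image coordinates, the specific zoom/mirror/flip transforms, and the tolerance value are all made up for illustration. The idea is to draw the enrollment photo with a random transform on each unlock attempt, map the touch points back to canonical image coordinates, and compare them against the stored password regions.

```kotlin
import kotlin.random.Random

// Hypothetical sketch only; the real app's internals were not shown to us.
// A point in unit-square coordinates (0..1 in both axes).
data class Point(val x: Double, val y: Double)

// One randomly chosen presentation of the photo (zoom, mirroring, 180-degree flip),
// applied about the image center (0.5, 0.5).
data class Transform(val scale: Double, val mirrored: Boolean, val upsideDown: Boolean) {
    companion object {
        fun random() = Transform(
            scale = 1.0 + Random.nextDouble(0.0, 0.5),  // zoom in by up to 50%
            mirrored = Random.nextBoolean(),
            upsideDown = Random.nextBoolean()
        )
    }

    // Map a canonical image point to where it appears under this presentation
    // (used here only to simulate the user's touches).
    fun fromCanonical(p: Point): Point {
        var x = p.x
        var y = p.y
        if (upsideDown) { x = 1.0 - x; y = 1.0 - y }
        if (mirrored) x = 1.0 - x
        return Point((x - 0.5) * scale + 0.5, (y - 0.5) * scale + 0.5)
    }

    // Map a touch point on the transformed view back to canonical image coordinates
    // by undoing the zoom, mirroring, and flip in reverse order.
    fun toCanonical(p: Point): Point {
        var x = (p.x - 0.5) / scale + 0.5
        var y = (p.y - 0.5) / scale + 0.5
        if (mirrored) x = 1.0 - x
        if (upsideDown) { x = 1.0 - x; y = 1.0 - y }
        return Point(x, y)
    }
}

// Accept a touch if it lands within a small radius of a stored password region.
fun matches(touch: Point, target: Point, tolerance: Double = 0.05): Boolean {
    val dx = touch.x - target.x
    val dy = touch.y - target.y
    return dx * dx + dy * dy <= tolerance * tolerance
}

fun main() {
    val password = listOf(Point(0.2, 0.3), Point(0.7, 0.6))  // regions chosen at enrollment
    val t = Transform.random()                               // new random presentation each unlock
    val touches = password.map { t.fromCanonical(it) }       // user touches the same photo features
    val unlocked = touches.zip(password).all { (touch, target) ->
        matches(t.toCanonical(touch), target)
    }
    println("Unlocked: $unlocked")  // prints "Unlocked: true"
}
```

Because the same photo regions end up at different screen locations on every unlock, the smudges left on the glass never form a repeating pattern, which is exactly the property the app is after.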

I’m very thankful that Florian and his colleagues were able to show us their working space, especially on such short notice. They even took us out to lunch at a very nice Korean restaurant in downtown Munich, which was great. I look forward to seeing what other kinds of work this lab produces in the future.