Tag Archives: hcilab

2016 UNH IRES student research – part 2

In its third and final year, the UNH International Research Experiences for Students (IRES) program has selected eight students to conduct research in the HCI Lab at the University of Stuttgart, under the supervision of my colleague Albrecht Schmidt. The UNH IRES program is funded by the National Science Foundation, Office of International and Integrative Activities, and we are grateful for their support. The eight students were each assigned to a group within the HCI Lab and participated in the research activities of that group.

I asked each of the students to write a one-paragraph report on their summer experience in Stuttgart, focusing on their research, and on their cultural experience. This is the second installment of these brief reports, where we look at some of the research conducted by the students. (You can see the first installment here.)

Natalie Warren worked with EEG recording devices:

Learning about EEG during the past two months under the supervision of Valentin and Jakob has been very rewarding. I’ve learned a huge amount about signal processing, experiment design, MATLAB, coding stimulus presentations, and brain activity, not to mention using EEG recording systems! We also got to put our knowledge to use early in the program by measuring electrical activity generated by the eye movement of some of our colleagues (like Anna, pictured here).
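The kind of signal processing described here, picking eye-movement activity out of a recorded trace, can be illustrated with a tiny threshold-based event detector. This is a hypothetical sketch (the function name, threshold, and sampling rate are all made up); real EEG/EOG pipelines filter and baseline-correct the signal before anything like this step.

```python
import numpy as np

def detect_eye_events(signal, fs, thresh_uv=100.0):
    """Toy EOG event detector: flag samples whose absolute amplitude
    exceeds a threshold (in microvolts), then collapse consecutive
    flagged samples into (start_seconds, end_seconds) events."""
    above = np.abs(signal) > thresh_uv
    events, start = [], None
    for i, a in enumerate(above):
        if a and start is None:
            start = i                       # event begins
        elif not a and start is not None:
            events.append((start / fs, i / fs))  # event ends
            start = None
    if start is not None:                   # event runs to end of trace
        events.append((start / fs, len(signal) / fs))
    return events
```

For example, a 2-second trace sampled at 100 Hz with a single large deflection between 0.5 s and 0.6 s would yield one detected event spanning that interval.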

Whitney Fahnbulleh worked on augmenting human memory:

This summer I have been developing a photo gallery application for the “recall” project, a project that explores ways to augment human memory. I have been implementing various ways users can interact with the gallery through touch gestures, mid-air gestures, speech recognition, and keyboard input. My end goal for this project is to flesh out the user interface design and run user studies on the application. I have learned so much about computer vision this summer, and I look forward to working on future projects for recall.

Aditi Joshi worked on visualizing uncertainty:

For the past two months, I have been working on designing and implementing a study investigating uncertainty visualizations. In the future, we will have access to increasing amounts of uncertain, and often conflicting, information. With this study, we are trying to understand how people aggregate uncertainty information so that we can build these aggregation techniques into future technologies. In this picture, Anna is participating in the study and providing us with some great data.

Donovan O.A. Toure worked on how the realism of virtual faces affects the human observer:

This summer, I worked on the perception of computer-generated/virtual faces within the Uncanny Valley by analyzing brain waves as an individual is presented with virtual faces with varying levels of detail. In addition to learning about EEG, digital signal processing, and the uncanny valley, I worked on stimulus creation, including 3D modelling, to help carry out the experiment design.

2016 UNH IRES student research – part 1

In its third and final year, the UNH International Research Experiences for Students (IRES) program has selected eight students to conduct research in the HCI Lab at the University of Stuttgart, under the supervision of my colleague Albrecht Schmidt. The UNH IRES program is funded by the National Science Foundation, Office of International and Integrative Activities, and we are grateful for their support. The eight students were each assigned to a group within the HCI Lab and participated in the research activities of that group.

I asked each of the students to write a one-paragraph report on their summer experience in Stuttgart, focusing on their research, and on their cultural experience. Here’s the first installment of these brief reports, where we look at some of the research conducted by the students.

Taylor Gotfrid worked on using augmented reality in assistive systems:

During my time here I learned about experiment design and augmented reality. Over this summer I’ve been working on designing and conducting a user study to determine whether picture instructions or projected instructions have better recall for assembly tasks over a long period of time. This experiment assesses which form of media would lead to fewer errors, faster assembly times, and better recall over a span of three weeks. The picture above is of the projector system indicating where the next LEGO piece needs to be placed to complete the next step in the assembly process.

Dillon Montag worked on tactile interfaces for people with visual impairments:

HyperBraille: I am working with Mauro Avila on developing tactile interfaces for people with visual impairments. The tool we developed this summer will allow users to explore scientific papers while receiving both audio and tactile feedback. We hope this new tool will help people with visual impairments better understand and navigate papers.

Anna Wong worked on touch recognition:

For my project with the University of Stuttgart lab, I was tasked with using images like the one on the left to detect the user’s hand, and then classify the finger being used to touch a touch screen. This involved transforming the images in a variety of ways, such as finding the edges using a Canny edge detector, as in the top image, and then using machine learning algorithms to classify the finger.
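The pipeline Anna describes, edge detection followed by a learned classifier, can be sketched in a few lines. Everything below is a hypothetical stand-in: a numpy gradient-threshold detector substitutes for the Canny detector, the features are toy ones, and a nearest-centroid rule stands in for whatever classifier the lab actually used.

```python
import numpy as np

def edge_map(img, thresh=0.5):
    """Crude edge detector: threshold the gradient magnitude.
    (A stand-in for the Canny detector used in the real pipeline.)"""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()

def finger_features(edges):
    """Toy features: edge density plus the normalized centroid
    of the edge pixels."""
    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return np.zeros(3)
    return np.array([edges.mean(),
                     ys.mean() / edges.shape[0],
                     xs.mean() / edges.shape[1]])

def classify(feat, centroids):
    """Nearest-centroid classifier over per-finger feature centroids
    (a placeholder for a trained model)."""
    names = list(centroids)
    dists = [np.linalg.norm(feat - centroids[n]) for n in names]
    return names[int(np.argmin(dists))]
```

In use, one would compute `finger_features(edge_map(img))` for labeled training images, average them per finger to get the centroids, and then classify new touches against those centroids.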

Elizabeth Stowell worked on smart notifications:

I worked with smart objects and the use of notifications to support aging-in-place. I enjoyed building prototypes for a smart pillbox, sketching designs for a smart calendar, and exploring how people who are elderly interact with such objects. In these two months, I learned a lot about notification management within the context of smart homes.

Saarbrücken lab tours

Recently, the UNH IRES group visited Saarbrücken, Germany to tour two HCI labs at Universität des Saarlandes. We were all quite amazed by the research efforts at both labs.

The Innovative Retail Lab at the German Research Center for Artificial Intelligence (DFKI) is conducting multiple studies on innovative shopping solutions and more! The simulated intelligent supermarket (pictured) provides shoppers with virtual assistance through smart shopping carts and sensor-based shelves that provide nutritional feedback. Very cool!


The Embodied Interaction Lab, housed in the Max Planck Institute for Informatics (Cluster of Excellence “Multimodal Computing and Interaction”), studies a range of HCI-related topics. We got to see all sorts of futuristic inventions, from use cases of flexible displays to sensor-based on-body devices. iSkin (pictured) is a flexible, stretchable, ground-breaking technology that detects touch input on skin (it is also the winner of the Best Paper Award at CHI ’15!).


After a day of lab tours, we had the chance to indulge in some German-Mexican fare! A few of us had been missing “American food,” so this was a great opportunity to have some enchiladas while socializing with the Saarbrücken researchers. Of course, some delicious dessert was enjoyed as well (pictured).


And a selfie was taken!


2015 UNH IRES: HCI student summer research experience in Germany

HCI Lab, Stuttgart

The UNH HCI Lab is happy to announce the 2015 UNH International Research Experiences for Students (IRES) program. Within the program we plan to fund 3 undergraduate and 3 graduate students to conduct research at the Human Computer Interaction (HCI) Lab of Professor Albrecht Schmidt at the University of Stuttgart. Professor Schmidt and his lab are among the world leaders in the field of HCI. Successful applicants will participate in the program between June 1 and July 31, 2015. They will receive full financial support for participation, covering items such as airfare, room and board, health insurance, as well as a $500/week stipend. The total value of the financial package is approximately $8,500 for 9 weeks.

Student research within the UNH IRES program will focus on developing and testing tools for estimating cognitive load in the domains of in-vehicle user interfaces, knowledge acquisition, speech interfaces, or similar areas.


2014 UNH IRES students visiting Ludwigsburg palace in Stuttgart

UNH IRES students will live and work in Stuttgart. Stuttgart is a city of about 600,000, where students will encounter history around every corner. For example, Stuttgart is believed by many to be the cradle of the automobile, and students can visit the Mercedes-Benz museum that is devoted to the history of this iconic brand.

The program is funded by the National Science Foundation, Office of International and Integrative Activities. We are grateful for the support.

To learn more about the program (including student experiences from 2014) and to apply, click on the UNH IRES menu at the top of this page, or here.

First Lab Visit in UNH IRES Program: Hasso Plattner Institute


On June 30th, 2014, we met with Dominik Schmidt at the Hasso Plattner Institute (HPI). HPI was founded in 1998 and is the first, and still the only, entirely privately funded university college in Germany.


Dominik is currently doing research in human-computer interaction. More specifically, he scales natural user interfaces to span entire rooms and creates novel interaction technologies and techniques with the goal of enabling seamless and powerful interaction across physical space. Before joining Patrick Baudisch at HPI’s Human Computer Interaction Lab in Potsdam, Germany, Dominik received his Ph.D. from Lancaster University, UK, where he was part of the Embedded Interactive Systems (EIS) group. You can check out his blog here.


The objective of the human computer interaction department at HPI is to unify the virtual world of the computer with the physical world of the user into a single space. During our visit we got a glimpse of a few projects they are working on. The most intriguing piece of technology they have is an interactive floor. One research project Dominik showed us is GravitySpace. GravitySpace is a new approach to tracking people and objects indoors. Unlike traditional solutions based on cameras, GravitySpace reconstructs scene data from a pressure-sensing floor. While the floor is limited to sensing objects in direct contact with the ground, GravitySpace reconstructs contents above the ground by first identifying objects based on their texture and then applying inverse kinematics.

A picture during our visit: this is the room below the interactive floor. GravitySpace recognizes people and objects; a mirror metaphor shows how it identifies users and tracks their locations and poses, solely based on the pressure imprints they leave on the floor.

HPI says:

“Smart rooms support users by offering not only a series of convenient functions, like home automation, but also by acting pro-actively on the user’s behalf. To this end, such rooms need to know their own geometry as well as the people and their actions within it.”

The GravitySpace prototype senses pressure at 1 mm resolution and projects across an active area of 8 m² in a single seamless piece, a 10x larger version of Multitoe.
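As a rough illustration of the first stage of such a pipeline, finding distinct contact regions on the pressure grid before any texture classification or inverse kinematics, here is a minimal numpy sketch. It is a toy stand-in, not GravitySpace's actual implementation; the threshold and flood-fill labeling are assumptions for illustration.

```python
import numpy as np

def contact_blobs(pressure, thresh=0.1):
    """Label 4-connected regions of above-threshold pressure on a
    floor grid. Returns (label_image, number_of_blobs). Each blob
    would then be handed to later stages (e.g. texture-based
    identification) in a GravitySpace-style pipeline."""
    active = pressure > thresh
    labels = np.zeros(pressure.shape, dtype=int)
    count = 0
    for start in zip(*np.nonzero(active)):
        if labels[start]:
            continue                      # already part of a blob
        count += 1
        stack = [start]                   # iterative flood fill
        while stack:
            y, x = stack.pop()
            if not (0 <= y < active.shape[0] and 0 <= x < active.shape[1]):
                continue
            if not active[y, x] or labels[y, x]:
                continue
            labels[y, x] = count
            stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, count
```

Two feet standing apart on the floor would show up as two separately labeled blobs, which is the raw input the identification and pose-tracking stages build on.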

Check out this video of GravitySpace:

Another project they showed us is called Haptic Turk, a walk-up VR system in which a user’s friends provide motion-based feedback in a virtual world. This is a great solution: it not only involves your friends, it is also far cheaper than buying an expensive motion platform, while still delivering a fully immersive experience with motion feedback powered by people.