What is InC?
InC is the Innovative Communication Research Group at the IT University of Copenhagen. It focuses on
- design and development of interactive technologies in the contexts of prior and emerging cultures of information
- advanced and innovative communication trends
- historical and rhetorical methods of innovation
Category Archives: HCI
Title: Affective Computing and the Communication Technologies
Time: Friday August 15 from 11:00 to 12:00 at IT University of Copenhagen
Affective Computing is leading to a deeper understanding of people’s emotional relationships with educational products, environments, and experiences. Through exploratory design and user testing of smart systems, embedded technologies, and collaborative environments, researchers are developing a new framework for learners’ interactions with educational technologies. Real-time affective sensing is being used to measure and interpret elements of user experience such as physiology, contextual actions, and social interactions. This awareness enables dynamic tailoring of function and focus to shape user experience and outcomes.

For example, an expressive Affective Learning Companion that senses user interest through patterns of posture, facial expression, pressure exerted on a mouse, and skin conductivity might choose to delay intervention and allow the user to continue exploring. If frustration were sensed instead, the companion might display concern through its appearance and body posture, engaging in non-verbal expression as a form of empathy. This interaction could provide social support and draw attention to the user’s affect, facilitating self-awareness and mitigating the negative impact of frustration.

These interactions form relationships between learners, products, environments, and experiences that are enhanced because they take emotions and context into account. Investigations at the confluence of affect, experience, and usage are transforming the design of educational products and the role of collaborative information systems. These products and systems are empowering learners, teachers, researchers, and designers to better understand and promote learning, collaboration, creativity, and innovation.
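The companion behavior described above is essentially a rule-based policy: delay intervention while interest is high, express empathy when frustration is detected. A minimal sketch in Python, where all names, thresholds, and the fused affect estimates are hypothetical (the abstract does not specify how sensor channels are combined):

```python
from dataclasses import dataclass

@dataclass
class AffectEstimate:
    # Hypothetical fused estimates in [0, 1], derived from posture,
    # facial expression, mouse pressure, and skin conductivity.
    interest: float
    frustration: float

def companion_action(affect: AffectEstimate,
                     interest_threshold: float = 0.6,
                     frustration_threshold: float = 0.6) -> str:
    """Rule-based policy sketch for an Affective Learning Companion."""
    if affect.frustration >= frustration_threshold:
        # Display concern and empathy through appearance and body posture.
        return "express_empathy"
    if affect.interest >= interest_threshold:
        # Learner is engaged: hold back and let exploration continue.
        return "delay_intervention"
    # Neither strong interest nor frustration: a neutral prompt.
    return "offer_help"
```

The point of the sketch is only the branching structure; a real system would of course estimate interest and frustration from the sensing channels listed in the abstract.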
About the Speaker
Winslow Burleson is an Assistant Professor of Human Computer Interaction with a joint appointment in the School of Computing and Informatics and the Arts, Media, and Engineering graduate program at Arizona State University. He received his PhD from the MIT Media Lab, working with the Affective Computing and Lifelong Kindergarten research groups. He has also worked with the Entrepreneurial Management Unit at the Harvard Business School on creativity research methodologies and frequently serves on National Academies of Science organizing committees and NSF Review Panels. At IBM’s Almaden Research Center he was awarded ten patents for inventing educational and assistive technologies and novel forms of human-computer interaction. He holds a bachelor’s degree in biophysics from Rice University and a Master of Science in Engineering degree from Stanford University’s Mechanical Engineering Product Design Program, where he taught brainstorming, creativity, and visual thinking skills. His research is supported by awards and gifts from NSF, NASA-JPL, Deutsche Telekom, iRobot, and LEGO Group. He has been a Curriculum Developer at the NASA-SETI Institute, Co-Principal Investigator on the Hubble Space Telescope’s Investigation of Binary Asteroids, member of the LEGO Learning Institute, and Consultant to UNICEF and the World Scout Bureau on Healthy Lifestyles for Youth.
Four of the InC people participated in the CHI 08 conference in Florence, Italy, last week. Anker Helms Jørgensen organized a Special Interest Group on the history of User Interfaces together with Brad Myers, CMU. The session was well attended, with more than 20 people participating, and Anker has now started a blog to continue the discussion.
Javier San Agustin and John Paulin Hansen presented their work-in-progress comparing mouse and gaze selection. They found a combination of gaze pointing and EMG clicking to be faster than mouse pointing and clicking. The research is important for the development of more efficient interaction with, e.g., games and communication systems for disabled people. This work is done in collaboration with Julio Mateo at Wright State University, and an abstract can be found here.
Sune Alstrup participated in a workshop on the evaluation of user experience in games, where he presented work on evaluating gamers’ experience using eye tracking and retrospective think-aloud.
The new 3D typing system “Stargazer”, developed by Henrik H. Jensen and the gaze interaction research group at the IT University of Copenhagen, was tested with great success at the open-house event “Culture Night” on October 12, 2007. All of the 35 visitors who tried it could use it after just one minute. We randomly assigned some of them to small, PDA-sized screens and added noise disturbances to the tracker, and no matter what, they wrote their names. Some of them were typing at a speed of more than 20 characters per minute, and the zooming speed can be increased considerably once you master the layout and interaction. We are now adding word prediction and expect that people may type more than 50 characters per minute with some training. Pan and zoom selections are well guided by gaze. Expect to see more in the near future!
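The core of a zoom-based gaze selection scheme like the one described above can be illustrated in a few lines: the target nearest the gaze is continuously zoomed in on, and looking away resets it, which is what gives the tolerance to tracker noise. This is a hypothetical 2D simplification, not Stargazer’s actual implementation (which works in 3D), and every name and parameter below is invented:

```python
def zoom_select(gaze_samples, targets, radius=50.0,
                zoom_rate=1.5, select_zoom=8.0, dt=0.1):
    """Return the first character whose zoom level exceeds `select_zoom`.

    gaze_samples: iterable of (x, y) gaze points over time.
    targets: dict mapping character -> (x, y) screen position.
    A target zooms in while the gaze stays within `radius` of it;
    glancing away resets it, so brief tracker noise never selects.
    """
    zoom = {c: 1.0 for c in targets}
    for gx, gy in gaze_samples:
        for c, (tx, ty) in targets.items():
            near = (gx - tx) ** 2 + (gy - ty) ** 2 <= radius ** 2
            if near:
                zoom[c] *= zoom_rate ** dt      # zoom in toward the target
                if zoom[c] >= select_zoom:
                    return c                    # selection complete
            else:
                zoom[c] = 1.0                   # reset when gaze leaves
    return None
```

With these parameters, a selection takes a few seconds of steady gaze; raising `zoom_rate`, as experienced users would, shortens that, which matches the observation that typing speed grows with practice.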
We have been collaborating with a US company, Cyberlink, on the development of an EMG interface to our GazeTalk communication system. Much to our surprise, we realized that clicking by small forehead muscle movements was in fact faster than (finger-)clicking the mouse button, presumably because the distance from the brain to the forehead is shorter than to the finger. Gamers would love this, we thought. And yes, this year at CeBIT the company OCZ presented their integration of the Cyberlink sensor with a game controller. This is great news, since the cost of the EMG switch may now be significantly reduced and it may become a widespread input device for disabled people.
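Turning a continuous EMG amplitude stream into discrete click events can be as simple as thresholding the rectified signal with a short refractory period so one contraction does not fire twice. A hypothetical sketch, not Cyberlink’s actual algorithm; all names and parameters are invented:

```python
def detect_emg_clicks(samples, threshold=0.5, refractory=20):
    """Map raw EMG amplitude samples to click events (sample indices).

    A click fires when the rectified signal reaches `threshold`;
    further crossings are ignored for `refractory` samples to
    debounce a single muscle contraction.
    """
    clicks = []
    cooldown = 0
    for i, s in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
            continue
        if abs(s) >= threshold:   # rectify, then compare to threshold
            clicks.append(i)      # click event at this sample index
            cooldown = refractory
    return clicks
```

The latency of such a detector is essentially one sample plus the muscle’s own activation time, which is consistent with forehead clicks beating a full finger press on a mouse button.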
Please note that the person in the video is *NOT* using a brain-computer interface (BCI) but “just” a very reliable, fast, and sensitive sensor of facial muscle activity (EMG).
Last week, Arne Lykke Larsen gave a talk to students at the School of Social Work in Odense, Denmark. Arne is paralyzed because he has ALS/MND and uses gaze interaction to type and control speech.
Arne is an associate professor in theoretical physics at the University of Southern Denmark and gives lectures regularly. Normally he uses the Brainfinger EMG switch, and this was his very first public talk given with the use of gaze interaction. Arne has THE darkest sense of humor, and for those of you who read Danish, I highly recommend downloading his daily-life observations. One of them, “33 Nej men jeg har læst bogen” (“33 No, but I have read the book”), gave us some good ideas for the GazeTalk system that Arne is helping us design.