Page last updated November 8th 2015
Wearable Personal Assistants -- From Vision to Reality. PhD student Shahram Jalaliniya, IT University of Copenhagen, Denmark. (Ongoing.)
Egocentric Interaction for Ambient Intelligence. PhD student Dipak Surie, Umeå University, Sweden. (Graduated 2012.)
Current project proposals can be found on the PIT Lab student project proposal page.
I explore the development of mobile and wearable systems that seamlessly blend with and support ongoing real-world tasks (cf. Mark Weiser's vision of calm computing) and give rise to what we refer to as an egocentric interaction paradigm.
I believe that human perception, cognition, and action capabilities are defining factors for future wearable systems that address us in increasingly subtle ways, at the periphery of our attention.
Keywords: Human-Computer Interaction, Embodied Cognition, Peripheral Interaction, Ubiquitous Computing, Wearable Computers.
Perception, cognition, action in wearable HCI [work in progress]
A wearable personal assistant for surgeons [work in progress]
The Situative Space Model (Pederson, Janlert, & Surie, 2011) captures what a specific human agent can perceive and not perceive, reach and not reach, at any given moment in time. This model is for the emerging egocentric interaction paradigm what the virtual desktop is for the PC/WIMP (Window, Icon, Menu, Pointing device) interaction paradigm: more or less everything of interest to a specific human agent is assumed to, and supposed to, happen here.
Breakfast scenario involving mobile, wearable, and embedded interactive devices alongside everyday objects.
The Situative Space Model applied to the breakfast scenario pictured to the left, centered around the person having breakfast.
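As a loose illustration (not the model's formal definition, and with invented object names and simplified space labels), a momentary snapshot of the Situative Space Model can be thought of as two sets kept per human agent: what the agent can currently perceive, and what the agent can currently reach and act on. Classifying an object then amounts to set membership tests:

```python
from dataclasses import dataclass, field

@dataclass
class SituativeSnapshot:
    """One agent's situation at a single moment in time.

    NOTE: 'perceivable' and 'reachable' are simplified stand-ins for the
    spaces in Pederson, Janlert, & Surie (2011), chosen for illustration.
    """
    perceivable: set = field(default_factory=set)  # objects the agent can perceive right now
    reachable: set = field(default_factory=set)    # objects the agent can act on right now

    def status(self, obj: str) -> str:
        """Classify an object relative to the agent at this moment."""
        if obj in self.perceivable and obj in self.reachable:
            return "perceivable and reachable"
        if obj in self.perceivable:
            return "perceivable only"
        if obj in self.reachable:
            return "reachable only"
        return "outside the situation"

# Hypothetical breakfast scenario:
snapshot = SituativeSnapshot(
    perceivable={"milk carton", "radio", "wall clock"},
    reachable={"milk carton", "coffee cup"},
)
print(snapshot.status("milk carton"))  # within both sight and reach
print(snapshot.status("wall clock"))   # visible but out of reach
```

In a real egocentric-interaction system the two sets would be updated continuously from sensor data as the agent moves, rather than declared by hand as above.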
A more complete list is available here. Clicking on the publisher logos will search for my publications directly in their respective digital libraries. You can also find me on Google Scholar.