The purpose of this study, which is part of the ARC project “A Pictorial Communication Framework for Inclusion”, is to explore the design of applications that would allow people to point to images, in addition to using their voice, to interact with online search engines. Such applications could support group interactions (on interactive whiteboards) or individual use (on tablets).
At the beginning of the study, an experimenter (a member of the research team) will simulate the responses of a search engine, ensuring that participants are not limited by what search engines can currently do and understand.
We built our own software to ensure that it is accessible to all participants, and that experimenters can respond to voice, touch, and text input as quickly as possible, using voice, text, and images at once.
Team
Associate Professor Laurianne Sitbon
Sirin Roomkham
Shannon Terris
Alicia Mitchell