A variety of evaluation methods have been developed or adapted from related fields to address identified methodological gaps. Yet choosing the right study setting in User Experience (UX) research remains challenging for both academic researchers and industry practitioners. Although the importance of mixed-methods and data-driven approaches for obtaining well-founded study results on interactive systems has been emphasized numerous times, there is still a lack of clear recommendations and of a common understanding of when and how to combine different methods, theories, and data related to the UX of interactive systems.
This workshop aims to gather UX professionals' and academics' experiences with user studies in order to contribute to the knowledge of mixed methods, theories, and data in UX evaluation. We want to discuss individual experiences, best practices, and potential risks and gaps, and to reveal commonalities among individual triangulation strategies.
As an academic discipline, the field of User Experience (UX) research has a multi-disciplinary heritage, bringing together a variety of perspectives on studying human experiences with products, systems, and services. This has led to a wide spectrum of methods for studying users' experiences. Traditional Human-Computer Interaction (HCI) theory has passed on methodological approaches akin to those used in usability evaluation studies, while the social sciences, ethnography, and philosophy have also significantly influenced UX research. Academia has made great efforts to create new methods for effectively evaluating UX, aimed at both academic and industrial application. Our proposition in this workshop, however, is that we often do not need to develop new methods but rather need to use the wide existing flora of UX evaluation tools and approaches more efficiently. UX evaluation is no longer unknown territory, and we want to encourage reflection on established approaches as well as on lessons learned along the way. We want to explore the existing know-how of UX professionals, from academia and industry, in combining different UX evaluation methods (e.g., qualitative and quantitative methods) within so-called mixed-methods approaches and triangulation strategies.
Specifically, we want to discuss the following questions:
- What are the motivations and the outcomes of different UX research and evaluation methods?
- How do we best draw conclusions from multiple and different sources, such as qualitative and quantitative or attitudinal and behavioral data?
- Can combinations of contrasting theories that exist in UX be better exploited, and if so how?
- How can we define best practices, and where are the gaps or development needs in mixed-methods approaches in the field of UX?
The workshop will encourage and enable participants to learn from existing practices and to understand the needs of industry and academia in UX evaluation. A cooperatively developed mixed-methods map will summarize the outcomes.
How to participate
We invite you to reflect on a UX evaluation study of your own, currently in progress or already finished, as a positive and/or challenging example. Submissions should be 2–4 pages long (SIGCHI Extended Abstracts Format) and include:
- A short description of the study's aim, the context of use (e.g., healthcare, transportation), the methods used, and the motivations for including particular methods in the study setup
- A discussion of the outcomes (expected or actual results, depending on the stage of the research) and the challenges in preparing, running, analyzing, and documenting the study and its results
- Personal reflections on the current state and possible gaps of UX evaluation methodology (not required for participation, but we welcome such reflections)
Submissions must be emailed to the organizers by April 25 at the latest: