| dc.creator | Meiners, Anna-Lena | |
| dc.creator | Schrepp, Martin | |
| dc.creator | Hinderks, Andreas | |
| dc.creator | Thomaschewski, Jörg | |
| dc.date.accessioned | 2023-06-01T10:03:31Z | |
| dc.date.accessioned | 2023-09-07T15:20:11Z | |
| dc.date.available | 2023-06-01T10:03:31Z | |
| dc.date.available | 2023-09-07T15:20:11Z | |
| dc.date.created | 2023-06-01T10:03:31Z | |
| dc.identifier | 1989-1660 | |
| dc.identifier | https://reunir.unir.net/handle/123456789/14811 | |
| dc.identifier | https://doi.org/10.9781/ijimai.2023.05.003 | |
| dc.identifier.uri | https://repositorioslatinoamericanos.uchile.cl/handle/2250/8732134 | |
| dc.description.abstract | Questionnaires are a highly efficient method to compare the user experience (UX) of different interactive products or of different versions of a single product. Concretely, they allow us to evaluate the UX easily and to compare different products by means of a numeric UX score. Often, however, only one UX score from a single evaluated product is available. Without a comparison to other measurements, it is difficult to interpret an individual score, e.g., to decide whether a product’s UX is good enough to compete in the market. Many questionnaires offer benchmarks to support researchers in these cases. A benchmark is the result of a larger set of product evaluations performed with the same questionnaire. The score obtained from a single product evaluation can be compared to the scores in this benchmark data set to quickly interpret the results. In this paper, the first benchmark for the UEQ+ (User Experience Questionnaire +) is presented; it was created from 3,290 UEQ+ responses covering 26 successful software products. The UEQ+ is a modular framework that contains a large number of validated user experience scales that can be combined to form a UX questionnaire. Until now, no benchmark has been available for this framework, which makes the benchmark constructed in this paper a valuable interpretation tool for UEQ+ questionnaires. | |
| dc.language | eng | |
| dc.publisher | International Journal of Interactive Multimedia and Artificial Intelligence | |
| dc.relation | In Press | |
| dc.relation | https://www.ijimai.org/journal/bibcite/reference/3316 | |
| dc.rights | openAccess | |
| dc.subject | benchmark | |
| dc.subject | UEQ | |
| dc.subject | user experience | |
| dc.subject | UX | |
| dc.subject | IJIMAI | |
| dc.title | A Benchmark for the UEQ+ Framework: Construction of a Simple Tool to Quickly Interpret UEQ+ KPIs | |
| dc.type | article | |