dc.creator: de Santana V.F.
dc.creator: Baranauskas M.C.C.
dc.date: 2008
dc.date: 2015-06-30T19:15:08Z
dc.date: 2015-11-26T14:40:19Z
dc.date.accessioned: 2018-03-28T21:46:32Z
dc.date.available: 2018-03-28T21:46:32Z
dc.identifier: 9780387096773
dc.identifier: IFIP International Federation for Information Processing, v. 272, p. 99-104, 2008.
dc.identifier: 15715736
dc.identifier: 10.1007/978-0-387-09678-0_9
dc.identifier: http://www.scopus.com/inward/record.url?eid=2-s2.0-47249146123&partnerID=40&md5=26e1ecf4dddcdb518220cd657c882b17
dc.identifier: http://www.repositorio.unicamp.br/handle/REPOSIP/105425
dc.identifier: http://repositorio.unicamp.br/jspui/handle/REPOSIP/105425
dc.identifier: 2-s2.0-47249146123
dc.identifier.uri: http://repositorioslatinoamericanos.uchile.cl/handle/2250/1250363
dc.description: The variety of website evaluation tools based on event logs is growing, but some of their limitations are already visible (e.g., the need for a task model, plug-in dependency, use of simulated tasks, separation of accessibility from usability, etc.). Some of these characteristics result in coupled systems and make the configuration and use of these tools more expensive. This work aims to show the main features and weaknesses of these tools. The discussion and the requirements pointed out in this paper are expected to help developers of evaluation tools reuse consolidated ideas and avoid the identified weaknesses. © 2008 Springer Science+Business Media, LLC.
dc.description: 272
dc.description: 99
dc.description: 104
dc.description: Claypool, M., Le, P., Wased, M., Brown, D., Implicit interest indicators (2001) IUI '01: Proceedings of the 6th Int. Conference on Intelligent User Interfaces, pp. 33-40, New York, NY, USA, ACM
dc.description: Correani, F., Leporini, B., Paternò, F., Automatic inspection-based support for obtaining usable web sites for vision-impaired users (2006) Universal Access in the Information Society, 5 (1)
dc.description: Etgen, M., Cantor, J., What does getting wet (web event-logging tool) mean for web usability? (1999) Proceedings of 5th Conference on Human Factors & the Web
dc.description: Guzdial, M., Deriving software usage patterns from log files (1993), Technical report, Georgia Institute of Technology
dc.description: Hilbert, D.M., Redmiles, D.F., Extracting usability information from user interface events (2000) ACM Comput. Surv., 32 (4), pp. 384-421
dc.description: Hong, I.J., Heer, J., Waterson, S., Landay, A.J., WebQuilt: A proxy-based approach to remote web usability testing (2001) ACM Transactions on Information Systems, 19 (3), pp. 263-285
dc.description: Ivory, M.Y., Hearst, M.A., The state of the art in automating usability evaluation of user interfaces (2001) ACM Comput. Surv., 33 (4), pp. 470-516
dc.description: Paganelli, L., Paternò, F., Intelligent analysis of user interactions with web applications (2002) IUI '02: Proceedings of the 7th Int. Conference on Intelligent User Interfaces, pp. 111-118, ACM
dc.description: Rubin, J., (1994) Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 1st edn., John Wiley & Sons Inc.
dc.description: Stamper, R., A semiotic theory of information and information systems / applied semiotics. In: Invited Papers for the ICL/University of Newcastle Seminar on Information (1993)
dc.description: Woo, D., Mori, J., Accessibility: A tool for usability evaluation. In: Masoodian, M., Jones, S., Rogers, B., eds.: APCHI, LNCS 3101, Springer (2004), pp. 531-539
dc.language: en
dc.relation: IFIP International Federation for Information Processing
dc.rights: fechado (closed access)
dc.source: Scopus
dc.title: A Prospect of Websites Evaluation Tools Based on Event Logs
dc.type: Conference proceedings

