Explainable AI and User Experience. Prototyping and Evaluating an UX-Optimized XAI Interface in Computer Vision
by Georg Dedikov


Ship to Me
In Stock.
FREE Shipping for Club Members

Overview

Master's Thesis from the year 2023 in the subject Computer Science - SEO, Search Engine Optimization, grade: 1,0, University of Regensburg (Professur für Wirtschaftsinformatik, insb. Internet Business & Digitale Soziale Medien), language: English.

Abstract: This thesis presents a toolkit of 17 user experience (UX) principles, categorized according to their relevance to Explainable AI (XAI). In the literature, the goal of Explainable AI has been widely associated with the dimensions of comprehensibility, usefulness, trust, and acceptance. Moreover, authors in academia postulate that research should focus on the development of holistic explanation interfaces rather than on single visual explanations. Consequently, XAI research should concentrate more on potential users and their needs than on the purely technical aspects of XAI methods. Considering these three impediments, the author argues for bringing valuable insights from the research areas of User Interface (UI) and User Experience design into XAI research. In essence, UX is concerned with the design and evaluation of the pragmatic and hedonic aspects of a user's interaction with a system in some context.

These principles inform the subsequent prototyping of a custom XAI system called the Brain Tumor Assistant (BTA). Here, a pre-trained EfficientNetB0 is used as a Convolutional Neural Network that classifies x-ray images of the human brain into four classes with an overall accuracy of 98%. To generate factual explanations, Local Interpretable Model-agnostic Explanations (LIME) are then applied as the XAI method.

The evaluation of the BTA is based on the User Experience Questionnaire (UEQ) of Laugwitz et al. (2008), with single items of the questionnaire adapted to the specific context of XAI. Quantitative data from a study with 50 participants in each of the control and treatment groups is used to present a standardized way of quantif
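The UEQ evaluation mentioned in the abstract follows a standard scoring scheme: each item is a 7-point semantic differential recoded to the range -3 to +3 (with some items reverse-keyed), and a scale score is the mean of its items; the full questionnaire of Laugwitz et al. (2008) has 26 items across six scales (Attractiveness, Perspicuity, Efficiency, Dependability, Stimulation, Novelty). Below is a minimal sketch of that recoding and averaging; the two-scale, four-item mapping is a deliberately small hypothetical example, not the real UEQ item set.

```python
# Illustrative UEQ-style scoring. Assumptions: 7-point responses are recoded
# to -3..+3, reverse-keyed items are flipped, and a scale score is the mean
# of its recoded items. The item-to-scale mapping below is hypothetical.

def recode(response, reversed_item=False):
    """Map a 1..7 response to the -3..+3 range; flip reverse-keyed items."""
    value = response - 4
    return -value if reversed_item else value

def scale_scores(responses, scales):
    """responses: {item_id: 1..7}; scales: {scale_name: [(item_id, reversed)]}."""
    out = {}
    for scale, items in scales.items():
        recoded = [recode(responses[item], rev) for item, rev in items]
        out[scale] = sum(recoded) / len(recoded)
    return out

# Hypothetical two-scale example with four items:
scales = {
    "Attractiveness": [("a1", False), ("a2", True)],
    "Perspicuity":    [("p1", False), ("p2", False)],
}
responses = {"a1": 6, "a2": 2, "p1": 7, "p2": 5}
print(scale_scores(responses, scales))  # {'Attractiveness': 2.0, 'Perspicuity': 2.0}
```

Per-scale means computed this way are what allow the thesis to compare control and treatment groups in a standardized fashion.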

This item is Non-Returnable

Details

  • ISBN-13: 9783346874191
  • ISBN-10: 3346874192
  • Publisher: Grin Verlag
  • Publish Date: May 2023
  • Dimensions: 8.27 x 5.83 x 0.36 inches
  • Shipping Weight: 0.47 pounds
  • Page Count: 156
