Please use this identifier to cite or link to this item: http://localhost:80/xmlui/handle/123456789/5043
Full metadata record
DC Field | Value | Language
dc.contributor.author | Khan, Muhammad Akif | -
dc.date.accessioned | 2019-06-28T05:51:27Z | -
dc.date.accessioned | 2020-04-11T15:35:25Z | -
dc.date.available | 2020-04-11T15:35:25Z | -
dc.date.issued | 2018 | -
dc.identifier.govdoc | 17142 | -
dc.identifier.uri | http://142.54.178.187:9060/xmlui/handle/123456789/5043 | -
dc.description.abstract | Worldwide, 39 million people are blind, and this number is predicted to reach 79 million by 2020. The loss of vision limits a blind person's ability to perform daily activities smoothly, which hampers participation and progress in education, employment, and social networking. Access to information on ubiquitous devices, including smartphones, smartwatches, and wearable assistive bands, is an emerging trend not only for sighted people but also for those who are visually impaired. A large number of blind people use smartphone-based assistive technologies and accessibility services, such as TalkBack and voice assistants, to perform common-life activities including placing calls, sharing pictures, reading books, and sending messages. However, existing accessibility services face several problems, including late learning, accessing and selecting non-visual items on the screen, finding an item of interest in a complex menu structure, and operating multiple connected devices. In addition, the diversity of devices, changing user requirements, equipment, apparatus, systems, apps, and services, along with accessibility, usability, and context-sensitivity, contributes significantly to a frustrating user experience. Therefore, an accessibility-inclusive, blind-friendly user interface model is required for performing activities on touchscreen interfaces. This thesis defines a semantically enriched universal accessibility framework by designing a usable, personalized, lightweight, and semantically enriched accessibility-inclusive user interface for operating common applications on a smartphone. The proposed blind-friendly interface design simplifies User Interface Artefacts (UIAs) such as layouts, labels, buttons, and panels on touchscreen interfaces. Users can share UIAs with other devices instead of sharing the whole screen, enhancing availability and consistency across multiple devices. The user experience of blind people in operating smartphone and smartwatch interfaces was evaluated through empirical and automated user studies. The evaluation framework specifically recorded and investigated the behaviour of blind people on several usability parameters, including attitude, intention to use, perceived usefulness, understandability and learnability, operability, ease of use, system usability scale, minimal memory load, consistency, and user satisfaction. The study yielded an improved user experience for blind people in performing common-life activities on a smartphone. The findings of this thesis provide an essential reference for usability experts, Human-Computer Interaction (HCI) experts, and application developers in designing accessibility-inclusive services for touchscreen user interfaces. | en_US
dc.description.sponsorship | Higher Education Commission, Pakistan | en_US
dc.language.iso | en_US | en_US
dc.publisher | University of Peshawar, Peshawar. | en_US
dc.subject | Computer Sciences | en_US
dc.title | Blind-Friendly Universal User Interfaces: Towards The Design of Adaptive User Interfaces for Blind People | en_US
dc.type | Thesis | en_US
Appears in Collections: Thesis

Files in This Item:
File | Description | Size | Format
9991.htm |  | 120 B | HTML


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.