An investigation into the effectiveness of using acoustic touch to assist people who are blind.
Zhu, HY
Hossain, SN
Jin, C
Singh, AK
Nguyen, MTD
Deverell, L
Nguyen, V
Gates, FS
Fernandez, IG
Melencio, MV
Bell, J-AR
Lin, C-T
- Publisher: Public Library of Science (PLoS)
- Publication Type: Journal Article
- Citation: PLoS One, 2023, 18, (10), pp. e0290431
- Issue Date: 2023
This item is open access.
Full metadata record
Field | Value | Language |
---|---|---|
dc.contributor.author | Zhu, HY | |
dc.contributor.author | Hossain, SN | |
dc.contributor.author | Jin, C | |
dc.contributor.author | Singh, AK | |
dc.contributor.author | Nguyen, MTD | |
dc.contributor.author | Deverell, L | |
dc.contributor.author | Nguyen, V https://orcid.org/0000-0003-3744-1523 | |
dc.contributor.author | Gates, FS | |
dc.contributor.author | Fernandez, IG | |
dc.contributor.author | Melencio, MV | |
dc.contributor.author | Bell, J-AR | |
dc.contributor.author | Lin, C-T | |
dc.contributor.editor | Khan, IA | |
dc.date.accessioned | 2023-11-07T02:55:09Z | |
dc.date.available | 2023-08-09 | |
dc.date.available | 2023-11-07T02:55:09Z | |
dc.date.issued | 2023 | |
dc.identifier.citation | PLoS One, 2023, 18, (10), pp. e0290431 | |
dc.identifier.issn | 1932-6203 | |
dc.identifier.uri | http://hdl.handle.net/10453/173155 | |
dc.description.abstract | Wearable smart glasses are an emerging technology gaining popularity in the assistive technologies industry. Smart glasses aids typically leverage computer vision and other sensory information to translate the wearer's surroundings into computer-synthesized speech. In this work, we explored the potential of a new technique known as "acoustic touch" to provide a wearable spatial audio solution for assisting people who are blind in finding objects. In contrast to traditional systems, this technique uses smart glasses to sonify objects into distinct auditory icons when the object enters the device's field of view. We developed a wearable Foveated Audio Device to study the efficacy and usability of using acoustic touch to search, memorize, and reach items. Our evaluation study involved 14 participants: 7 blind or low-vision participants and 7 blindfolded sighted participants (as a control group). We compared the wearable device to two idealized conditions, a verbal clock face description and a sequential audio presentation through external speakers. We found that the wearable device can effectively aid the recognition and reaching of an object. We also observed that the device does not significantly increase the user's cognitive workload. These promising results suggest that acoustic touch can provide a wearable and effective method of sensory augmentation. | |
dc.format | Electronic-eCollection | |
dc.language | eng | |
dc.publisher | Public Library of Science (PLoS) | |
dc.relation.ispartof | PLoS One | |
dc.relation.isbasedon | 10.1371/journal.pone.0290431 | |
dc.rights | info:eu-repo/semantics/openAccess | |
dc.subject.classification | General Science & Technology | |
dc.subject.mesh | Humans | |
dc.subject.mesh | Acoustics | |
dc.subject.mesh | Blindness | |
dc.subject.mesh | Speech | |
dc.subject.mesh | Touch Perception | |
dc.subject.mesh | Vision, Ocular | |
dc.title | An investigation into the effectiveness of using acoustic touch to assist people who are blind. | |
dc.type | Journal Article | |
utslib.citation.volume | 18 | |
utslib.location.activity | United States | |
pubs.organisational-group | /University of Technology Sydney | |
pubs.organisational-group | /University of Technology Sydney/Faculty of Engineering and Information Technology | |
pubs.organisational-group | /University of Technology Sydney/Faculty of Health | |
pubs.organisational-group | /University of Technology Sydney/Strength - AAII - Australian Artificial Intelligence Institute | |
pubs.organisational-group | /University of Technology Sydney/Faculty of Engineering and Information Technology/School of Computer Science | |
pubs.organisational-group | /University of Technology Sydney/Faculty of Health/Graduate School of Health | |
pubs.organisational-group | /University of Technology Sydney/Faculty of Health/Graduate School of Health/GSH.Orthoptics | |
utslib.copyright.status | open_access | * |
dc.date.updated | 2023-11-07T02:55:07Z | |
pubs.issue | 10 | |
pubs.publication-status | Published online | |
pubs.volume | 18 | |
utslib.citation.issue | 10 |
Abstract:
Wearable smart glasses are an emerging technology gaining popularity in the assistive technologies industry. Smart glasses aids typically leverage computer vision and other sensory information to translate the wearer's surroundings into computer-synthesized speech. In this work, we explored the potential of a new technique known as "acoustic touch" to provide a wearable spatial audio solution for assisting people who are blind in finding objects. In contrast to traditional systems, this technique uses smart glasses to sonify objects into distinct auditory icons when the object enters the device's field of view. We developed a wearable Foveated Audio Device to study the efficacy and usability of using acoustic touch to search, memorize, and reach items. Our evaluation study involved 14 participants: 7 blind or low-vision participants and 7 blindfolded sighted participants (as a control group). We compared the wearable device to two idealized conditions, a verbal clock face description and a sequential audio presentation through external speakers. We found that the wearable device can effectively aid the recognition and reaching of an object. We also observed that the device does not significantly increase the user's cognitive workload. These promising results suggest that acoustic touch can provide a wearable and effective method of sensory augmentation.
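The sketch below illustrates the core idea the abstract describes: an object is sonified with a distinct auditory icon only while it sits inside the head-worn device's field of view, with its direction available for spatial-audio rendering. All names, icon files, and the field-of-view value are illustrative assumptions, not the authors' Foveated Audio Device implementation.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # class label, e.g. from an on-device object detector (assumed)
    azimuth_deg: float  # horizontal angle of the object relative to the wearer's head

# Assumed mapping from object class to a short, distinct auditory icon.
AUDITORY_ICONS = {
    "cup": "cup_chime.wav",
    "book": "book_click.wav",
    "bottle": "bottle_tone.wav",
}

FIELD_OF_VIEW_DEG = 60.0  # assumed horizontal field of view of the glasses

def icons_to_play(objects: list[DetectedObject]) -> list[tuple[str, float]]:
    """Return (icon, azimuth) pairs for objects currently inside the field of view.

    The azimuth would drive spatial-audio rendering so each icon is heard
    from the object's direction; objects outside the view stay silent.
    """
    playable = []
    for obj in objects:
        in_view = abs(obj.azimuth_deg) <= FIELD_OF_VIEW_DEG / 2
        if in_view and obj.label in AUDITORY_ICONS:
            playable.append((AUDITORY_ICONS[obj.label], obj.azimuth_deg))
    return playable

if __name__ == "__main__":
    scene = [
        DetectedObject("cup", azimuth_deg=-10.0),  # inside the view: sonified
        DetectedObject("book", azimuth_deg=75.0),  # outside the view: silent
    ]
    for icon, azimuth in icons_to_play(scene):
        print(f"play {icon} spatialised at {azimuth:+.1f} degrees")
```

As the wearer turns their head, the set of in-view objects changes, so the icons effectively let the user "touch" the scene acoustically by scanning it.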
Please use this identifier to cite or link to this item: http://hdl.handle.net/10453/173155