
Research into Technology for People with Deafblindness in 2020-21

With the SUITCEYES project now finished, it seems a good time to review what other researchers in the field have been up to over the last couple of years. To that end, this blog post, written by Raymond Holt, aims to review the academic literature on technologies for people with deafblindness published in 2020 and 2021.

About the Author


Dr Holt is a lecturer in the School of Mechanical Engineering at the University of Leeds. He is a member of the University’s Institute of Design, Robotics and Optimisation, and its Centre for Disability Studies. His research is primarily concerned with haptic perception and the co-creation of assistive and rehabilitative technologies with users. He was part of the team working on Perception and Navigation on the SUITCEYES project, and is currently extending this work as part of the Wellcome Trust-funded Imagining Technologies for Disability Futures project (http://itdfproject.org).

Scope and Limitations

Before I launch into the details, let me first put up a few caveats:

Firstly, this is a blog post, not a formal systematic review for publication in an academic journal: the aim is to highlight work going on in the field of technologies for deafblindness, but I make no claims that it is exhaustive. Indeed, if I’ve missed anything, I’d be grateful if you could let me know! I’m aiming to highlight work people can look at, not to provide detailed accounts or analysis of that work.

Secondly, this blog post focuses exclusively on academic research that is specifically about technology for people with deafblindness. There has been plenty of interesting research on deafblindness in other areas, but I am a technologist and an academic, so I’m reviewing the academic literature on technologies. This is a review of research being undertaken, not of technologies already on the marketplace.

Thirdly, this post focuses on research that explicitly concerns itself with technologies for people with deafblindness. There are various pieces of research on, for example, haptic navigation that are aimed at people who are visually impaired. Again, maybe future blog posts can pick up on more general issues of haptic navigation and communication, but here I want to highlight research that is specifically concerned with deafblindness.

Fourthly, I am going off the official publication date, not the date of acceptance or the date made available online, so some of these papers may have been available since before 2020.

Finally, I have focussed exclusively on peer-reviewed journal articles that are available through the Leeds University website. So if there is exciting work reported in a conference, or in a journal we don’t subscribe to, then I won’t have picked it up. Again, let me know!

Now, with all that out of the way, let’s take a look at some of the interesting research from the last couple of years.

Wayfinding Tools and Deafblindness

Amy T Parker and Martin Swobodzinski, along with several of their colleagues at Portland State University, have published three interesting papers related to wayfinding tools and deafblindness over the course of 2020 and 2021. None present new technology specifically, but what makes them particularly interesting is their emphasis on user views and experiences. Parker et al. (2020) presents a focus group of ten people with deafblindness discussing the wayfinding apps that they use. Unsurprisingly, the actual apps used and preferences regarding them vary greatly from individual to individual, but some interesting common desires emerged: better visual access for people with low vision; better touch-based vibrotactile information; better support for braille outputs; sound-based environmental information to be made accessible; and more consistent connectivity. Parker et al. (2021) supplements this with a review of the literature on wayfinding tools for people with visual impairments. It is a very detailed review of the 35 identified papers, and is well worth reading: it offers many rich insights that a short summary here could not do justice to.

Swobodzinski et al. (2021) follow this up with a case study of an individual with deafblindness using three different methods to navigate the Portland State University campus through both indoor and outdoor environments (this transition between indoors and outdoors being something they found was neglected in most studies, which focussed on one or the other). The individual in this case had a cochlear implant, so had access to audio cues in the environment. The three methods were written instructions (accessed through Apple VoiceOver on the individual’s iPhone), a tactile map, and a purpose-built mobile app that used a combination of Bluetooth beacons for indoor navigation and GPS for outdoor navigation. The mobile app is of particular interest from a SUITCEYES perspective, as the use of Bluetooth beacons to navigate an indoor environment was one of the areas we had planned to investigate before the COVID-19 lockdowns in 2020 and 2021 prevented it. Again, the findings are very rich and well worth reading, even though they come from a sample of one participant, and I cannot do them justice here. Still, the main headlines were that the tactile map afforded the best performance and the mobile app the worst, although some of that was due to the user having high expectations of the app that it could not meet. Those of us who spend our lives making and testing early prototypes know that feeling well!
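The core idea of such an app is to trust Bluetooth beacon fixes indoors and fall back to GPS outdoors. The following is a minimal sketch of how that handover decision might look; the class names, signal thresholds and readings are my own illustrative assumptions, not details of the Swobodzinski et al. (2021) app.

```python
# Hypothetical sketch of indoor/outdoor wayfinding handover:
# prefer Bluetooth beacon fixes indoors, fall back to GPS outdoors.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class BeaconFix:
    beacon_id: str      # nearest known indoor beacon
    rssi_dbm: float     # received signal strength

@dataclass
class GpsFix:
    lat: float
    lon: float
    accuracy_m: float   # estimated horizontal error

def choose_position_source(beacon: Optional[BeaconFix],
                           gps: Optional[GpsFix],
                           rssi_floor_dbm: float = -85.0,
                           gps_accuracy_limit_m: float = 20.0) -> str:
    """Pick which positioning source to trust for the next instruction."""
    if beacon is not None and beacon.rssi_dbm >= rssi_floor_dbm:
        return "beacon"   # strong beacon signal: assume we are indoors
    if gps is not None and gps.accuracy_m <= gps_accuracy_limit_m:
        return "gps"      # usable satellite fix: assume we are outdoors
    return "none"         # neither source is reliable enough

print(choose_position_source(BeaconFix("lobby-3", -60.0), None))   # beacon
print(choose_position_source(None, GpsFix(45.51, -122.68, 8.0)))   # gps
```

The "seamless" part of the problem is exactly this decision boundary: a weak beacon signal near a doorway can leave the user with neither source, which is one reason the transition is so often neglected.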

Shopping When You Are Deafblind

Walter Wittich of the Université de Montréal is a familiar face from the SUITCEYES symposium last year, and the inaugural director of the Deafblind International Research Network. He has been involved in a lot of interesting research on dual sensory loss over the last few years, and was one of the co-authors of an interesting “pre-technology” study of shopping while deafblind (Vincent et al., 2021). In this study, a braille notetaker and a smartphone were combined to enable communication between individuals with deafblindness and others in their environment – for example, passers-by or shop assistants whom the user might need to ask for directions during a shopping trip. The concept is that the user interacts with the braille notetaker, writing notes on it which are then displayed on the smartphone. The other person can read these and write a reply on the smartphone, which is then displayed on the braille notetaker. This is, as noted, a “pre-technology” study: the system was tested with a single user – a visually impaired clinician simulating deafblindness – to act as a proof of concept before testing it with users who have deafblindness. I think that’s reasonable – I always argue that time with your users is precious, and you don’t want to spend your time with them finding flaws that you could have identified for yourself. And even with a single participant, it’s still an interesting case study, as the testing requires the participation of members of the public. Three tests are reported, including transcripts of the dialogues achieved through the system. This is a report of the first year of a project, so it will be interesting to see the subsequent follow-ups!
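The concept described above is, at heart, a two-way text relay between two displays. A minimal sketch of that idea follows; the class, method names and message queues are invented for illustration and do not describe the actual system in Vincent et al. (2021).

```python
# Illustrative sketch of the two-way relay concept: the user types on
# a braille notetaker, the text appears on a paired smartphone, and
# replies flow back to the braille display. All names are assumptions.

import queue

class TextRelay:
    """Shuttle short text messages between two paired displays."""

    def __init__(self) -> None:
        self.to_phone = queue.Queue()      # notetaker -> smartphone
        self.to_notetaker = queue.Queue()  # smartphone -> notetaker

    def user_writes(self, text: str) -> None:
        """Text entered on the braille notetaker, shown on the phone."""
        self.to_phone.put(text)

    def other_replies(self, text: str) -> None:
        """Reply typed on the phone, rendered on the braille display."""
        self.to_notetaker.put(text)

    def read_phone_screen(self) -> str:
        return self.to_phone.get_nowait()

    def read_braille_display(self) -> str:
        return self.to_notetaker.get_nowait()

relay = TextRelay()
relay.user_writes("Where is the dairy aisle, please?")
print(relay.read_phone_screen())
relay.other_replies("Aisle 4, straight ahead on your left.")
print(relay.read_braille_display())
```

The interesting design constraint is not the plumbing but the interaction: the sighted passer-by has to understand, unprompted, what the phone is asking them to do, which is exactly what the reported transcripts probe.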

Device Abandonment

Wittich is also a co-author of a review of the literature on the use or abandonment of technology by people living with deafblindness (Wittich et al., 2021). Again, it makes for a very interesting read, identifying ten studies and noting that these focus on devices intended for either vision or hearing loss, but not dual sensory loss – even though it is often the interaction between vision and hearing impairments that impedes access to technologies for people with deafblindness. It also highlights a gap in research into haptic communication, which will be of interest to those involved in the SUITCEYES project and its follow-ups.

Accessible TV Captions

GoCC4All is a mobile phone app intended to increase the accessibility of TV shows for people with deafblindness by displaying closed captions on a smartphone, or on a braille display connected to the smartphone. You can find details over at its website: https://gocc4all.dicaptafoundation.org/. I found two publications relating to this, the first introducing the system as GoAll (García-Crespo et al., 2020) and the second describing wider testing as GoCC4All (Rodríguez et al., 2021). Both are co-authored by Ángel García-Crespo, from University Carlos III Madrid. The mobile app is available from both the Apple and Google Play stores, I believe – I’ve been able to download and use the Android version on my phone (though I should note that I don’t have a braille display, so could not test that part). It does require a certain amount of infrastructure from the broadcasters, so only a small number of channels were available (Ideal Channel, PBS, Discovery, CNN, Fox News, Telemundo and TBS on the basis of the version I downloaded). It is worth noting that the app also includes delivery of local emergency alerts – though I’ve yet to look through the website to see exactly what areas are covered.

Wearable Devices for Tactile Communication

Marion Hersh from the University of Glasgow has produced several interesting articles on subjects related to deafblindness in recent years. Within the scope of this review, she has produced two interesting publications with her colleagues Oliver Ozioko and Ravinder Dahiya (Ozioko et al., 2020a; 2020b). Both publications concern wearable devices for displaying six-dot braille letters by delivering vibrations to three fingers on each hand, and allowing two-way communication by sensing pressure from the fingertips to send characters to a smartphone display or another tactile device. The design features custom-built sensors and actuators suitable for a flexible glove, so it is much more sophisticated than gluing FlexiForce sensors and coin vibration motors onto a glove. Well worth a look.
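With six dots in a braille cell and three fingers on each hand, the mapping from letters to actuators is a natural one-to-one assignment. Here is a hedged sketch of that idea: the dot numbering follows the standard braille convention (dots 1–3 in the left column, 4–6 in the right), but the finger assignment is my own illustrative assumption, not the layout used by Ozioko and colleagues.

```python
# Sketch: map six-dot braille cells onto six vibrotactile channels,
# three fingers per hand. The finger assignment is an assumption.

# Standard braille dot patterns for a few letters (raised dots only).
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "l": {1, 2, 3},
}

# One actuator per dot: dots 1-3 on the left hand, 4-6 on the right.
DOT_TO_ACTUATOR = {
    1: "left-index", 2: "left-middle", 3: "left-ring",
    4: "right-index", 5: "right-middle", 6: "right-ring",
}

def actuators_for(letter: str) -> list:
    """Return which vibration motors to drive for a braille letter."""
    return sorted(DOT_TO_ACTUATOR[d] for d in BRAILLE_DOTS[letter])

print(actuators_for("c"))  # ['left-index', 'right-index']
```

The reverse direction works the same way in principle: pressure sensed at a fingertip sets the corresponding dot, and the completed cell is looked up to send a character to the smartphone or the other device.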

In the same vein, Jenna Gorlewicz of the CHROME (Collaborative Haptics, Robotics and Mechatronics) Lab at Saint Louis University and her colleagues present a “Protactile-Inspired Wearable Haptic Device for Capturing the Core Functions of Communication” (MacGavin et al., 2021). It’s hard to describe what the paper is about without using its full title – it’s definitely appropriate! In this case, the device is a sleeve that covers the back of the hand and the whole of the forearm, with an array of vibration motors on the back of the hand, an array of servomotors to press on one side of the forearm, and a heating pad to provide thermal feedback on the other side of the forearm. This paper is interesting for several reasons. Firstly, it includes thermal feedback and pressure feedback, both of which were considered for SUITCEYES but abandoned due to the size of the devices required. We investigated Peltier modules (which can produce both hot and cold sensations), to be fair, whereas this is a heating pad, but I imagine the same problem applies – that is, you either need to provide thermal feedback infrequently, or risk a build-up of heat. Secondly, it is the first paper I have seen to use vibration motors both parallel and perpendicular to the skin. This is something we gave thought to, since when parallel to the skin, the vibrations spread out more and are harder to localise – but mounting them perpendicular again increases bulk. In this paper, both orientations are used, specifically to allow both easy-to-localise and more diffuse sensations. Finally, beyond the technical elements, it goes beyond just delivering directions or phrases, to other aspects of communication present in ProTactile, such as phatic communication (initiating, maintaining and terminating a channel of communication) and back-channeling (signals used by a listener to demonstrate engagement). Really interesting reading.
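One way to think about a multi-channel sleeve like this is as a router from communication functions to haptic channels. The sketch below illustrates that framing only: the channels paraphrase the hardware described above, but the function-to-channel mapping is my own assumption for illustration, not the cue set the paper actually defines.

```python
# Illustrative routing of communication functions to haptic channels
# on a forearm sleeve. The mapping below is an assumed example.

CHANNELS = {
    "vibration": "vibration motor array on the back of the hand",
    "pressure": "servo-driven presses on one side of the forearm",
    "thermal": "heating pad on the other side of the forearm",
}

# ProTactile-inspired functions mapped to channels (assumed mapping).
FUNCTION_TO_CHANNEL = {
    "open_channel": "pressure",      # phatic: initiate communication
    "close_channel": "pressure",     # phatic: terminate communication
    "backchannel_ack": "vibration",  # listener signals engagement
    "emphasis": "thermal",           # warmth as an affective marker
}

def route(function: str) -> str:
    """Describe which hardware channel delivers a given function."""
    channel = FUNCTION_TO_CHANNEL[function]
    return f"{function} -> {channel} ({CHANNELS[channel]})"

print(route("backchannel_ack"))
```

Framed this way, the heat build-up problem becomes a scheduling constraint on one channel: anything routed to "thermal" has to be rationed, while vibration and pressure can carry the frequent traffic.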

SUMMARY

This post has given an overview of ten recent papers (four published in 2020 and six in 2021) on the subjects of technology and deafblindness. These are divided across the topics of wayfinding (three papers), communication (three papers), accessible TV captions (two papers), and shopping and the causes of device abandonment (one paper each). Certainly interesting work, and I look forward to seeing how some of these early stage projects unfold. You can find citations for all the papers under “References” below.

As noted above, this is not necessarily exhaustive – if you think I’ve missed some important publications on this topic, then let me know! There are certainly interesting papers that just missed my 2020 cut-off date, and interesting papers published in the period that weren’t about technology, or weren’t specific to deafblindness. I’ll try to get some follow-up posts on these in the future. In the meantime, if you have any comments about this post, or suggestions for how such posts might be improved in the future, then let me know!

REFERENCES

García-Crespo Á, Montes-Chunga M, Matheus-Chacin CA & Garcia-Encabo I. (2020). Increasing the autonomy of deafblind individuals through direct access to content broadcasted on digital terrestrial television. Assistive Technology, 32(5), 268–276. https://doi.org/10.1080/10400435.2018.1543219

MacGavin B, Edwards T & Gorlewicz JL. (2021). A Protactile-Inspired Wearable Haptic Device for Capturing the Core Functions of Communication. IEEE Transactions on Haptics, 14(2), 279–284. https://doi.org/10.1109/TOH.2021.3076397

Ozioko O, Karipoth P, Hersh M & Dahiya R. (2020). Wearable Assistive Tactile Communication Interface Based on Integrated Touch Sensors and Actuators. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 28(6), 1344–1352. https://doi.org/10.1109/TNSRE.2020.2986222

Ozioko O, Navaraj W, Hersh M & Dahiya R. (2020). Tacsac: A Wearable Haptic Device with Capacitive Touch-Sensing Capability for Tactile Display. Sensors (Basel, Switzerland), 20(17), 4780–. https://doi.org/10.3390/s20174780

Parker AT, Swobodzinski M, Brown-Ogilvie T & Beresheim-Kools J. (2020). The Use of Wayfinding Apps by Deafblind Travelers in an Urban Environment: Insights From Focus Groups. Frontiers in Education (Lausanne), 5. https://doi.org/10.3389/feduc.2020.572641

Parker AT, Swobodzinski M, Wright JD, Hansen K, Morton B & Schaller E. (2021) Wayfinding Tools for People With Visual Impairments in Real-World Settings: A Literature Review of Recent Studies. Frontiers in Education (Lausanne), 6. https://doi.org/10.3389/feduc.2021.723816

Rodríguez J, Díaz MV, Collazos O & García-Crespo Á. (2021). GoCC4All a pervasive technology to provide access to TV to the deafblind community. Assistive Technology, 1–9. https://doi.org/10.1080/10400435.2020.1829176 

Swobodzinski M, Parker AT, Wright JD, Hansen K & Morton B. (2021) Seamless Wayfinding by a Deafblind Adult on an Urban College Campus: A Case Study on Wayfinding Performance, Information Preferences, and Technology Requirements. Frontiers in Education (Lausanne), 6. https://doi.org/10.3389/feduc.2021.723098

Vincent C, Wittich W, Bergeron F, Hotton M & Achou B. (2021). Shopping When You Are Deafblind: A Pre-Technology Test of New Methods for Face-to-Face Communication—Deafblindness and Face-to-Face Communication. Societies (Basel, Switzerland), 11(4), 131. https://doi.org/10.3390/soc11040131

Wittich W, Granberg S, Wahlqvist M, Pichora-Fuller MK & Mäki-Torkko E. (2021). Device Abandonment in Deafblindness: A Scoping Review of the Intersection of Functionality and Usability through the International Classification of Functioning, Disability and Health Lens. BMJ Open 11, e044873. https://doi.org/10.1136/bmjopen-2020-044873
