

This funded project seeks to make coding more accessible for developers with physical impairments.
People with physical impairments who may experience challenges in using traditional input devices (e.g., mouse, keyboard, and touch) are often excluded from technical professions such as software engineering. Alternative input methods such as eye gaze tracking and speech recognition have become more readily available in recent years, although there has been a lack of work exploring the potential of these technologies to make coding more accessible.
To address this gap, we are exploring the potential of combining multiple alternative methods of input (i.e., gaze interaction, speech recognition, and large mechanical switches) to make coding more accessible for developers with physical impairments. This work has resulted in a new development platform (Voiceye) that facilitates multimodal input as an approach for writing HTML, CSS, and JavaScript code (further details are available in our paper).
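To give a flavour of how speech-driven code entry can work, the sketch below maps spoken commands to HTML snippets. This is an illustrative assumption, not the Voiceye platform's actual command set or API: the command phrases, the `commandToSnippet` function, and the `editor` object are all hypothetical.

```javascript
// Hypothetical sketch: resolving spoken commands to HTML snippets.
// The command names and structure are illustrative assumptions,
// not the actual Voiceye command set.
const SNIPPETS = {
  "insert paragraph": "<p></p>",
  "insert heading": "<h1></h1>",
  "insert link": '<a href=""></a>',
};

// Match a recognised utterance to a snippet, ignoring case and
// surrounding whitespace; returns null when the command is unknown.
function commandToSnippet(utterance) {
  const key = utterance.trim().toLowerCase();
  return SNIPPETS[key] ?? null;
}

// In a browser, the Web Speech API could feed utterances in:
//   recognition.onresult = (event) => {
//     const results = event.results;
//     const text = results[results.length - 1][0].transcript;
//     const snippet = commandToSnippet(text);
//     if (snippet) editor.insert(snippet); // 'editor' is hypothetical
//   };
```

In practice, a dictionary lookup like this would be one small piece of a larger pipeline; gaze or switch input could then position the cursor inside the inserted snippet.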
We have also been investigating the working practices of voice coders with physical impairments, as well as the strengths and limitations of different multimodal speech coding approaches. Our paper on this topic provides additional detail on the work we have undertaken in this area.
Our longer-term aim is to create a customisable development platform that supports people with a range of impairments (using different input modalities) to write and manage code efficiently and effectively to a professional standard. We are also particularly interested in further examining the potential for intelligent coding assistants to support disabled developers and enhance their coding experiences.
Project Team
- Professor Chris Creed (Project Lead)
- Bharat Paudyal
- Professor Ian Williams
- Dr Maite Frutos-Pascual
- Dr Sayan Sarcar
Funders
This work has been funded through Microsoft AI for Accessibility and Google Inclusive Research Program awards.
Contact
For more information on the project, contact Professor Chris Creed (chris.creed@bcu.ac.uk).