
Engineered a real-time auditory feedback system to help visually impaired learners write and debug code. Implemented sonification techniques that represent syntax errors, indentation, function execution, and loop iterations as distinct auditory cues. Integrated adaptive feedback that scales with user proficiency, plus optional tactile feedback via wearable devices. The system aims to reduce cognitive load and improve accessibility in programming education.
Accessibility in software development tooling is often an afterthought. This project takes a novel approach to coding without vision: it maps abstract programmatic structures to intuitive soundscapes (sonification). Built on the Web Audio API, the system parses code in real time and generates distinct audio signatures. For instance, deeper indentation levels correspond to higher pitches, syntax errors trigger dissonant chords, and rapid loop iterations produce rhythmic percussive patterns. This non-visual representation reduces the cognitive load of relying solely on text-to-speech screen readers, letting visually impaired programmers 'hear' the shape, structure, and execution flow of their code.
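The mappings described above could be sketched roughly as follows. This is an illustrative sketch, not the project's actual implementation: the function names (indentToFrequency, errorChordFrequencies, playTone) and the specific musical intervals chosen are assumptions.

```javascript
// Hypothetical sketch of the sonification mappings described above.
// BASE_FREQ and the interval choices are illustrative assumptions.

const BASE_FREQ = 220; // A3 as the pitch for top-level (depth 0) code

// Deeper indentation -> higher pitch: raise by a perfect fifth
// (7 semitones) per indentation level, using equal temperament.
function indentToFrequency(depth) {
  return BASE_FREQ * Math.pow(2, (depth * 7) / 12);
}

// Syntax error -> a dissonant cluster: root, minor second, and tritone.
function errorChordFrequencies(root = BASE_FREQ) {
  return [root, root * Math.pow(2, 1 / 12), root * Math.pow(2, 6 / 12)];
}

// In a browser, these frequencies would drive Web Audio API oscillators,
// e.g. one short tone per event (indentation change, error, loop tick):
function playTone(audioCtx, freq, durationSec = 0.2) {
  const osc = audioCtx.createOscillator();
  osc.frequency.value = freq; // Hz
  osc.connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + durationSec);
}
```

Keeping the pitch mapping as a pure function separate from the Web Audio playback makes the mapping itself easy to unit-test without an AudioContext.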
Achievement
Inclusive Computing Initiative
Timeline
April 2025 - Present