Aryan.

Auditory Feedback System for Visually Impaired Programmers

01. Overview

Engineered a real-time auditory feedback system to support visually impaired individuals learning to code. Implemented sonification techniques to represent syntax errors, indentation, function execution, and loop iterations using distinct auditory cues. Integrated adaptive feedback based on user proficiency and provided optional tactile feedback via wearable devices. The system aims to reduce cognitive load and improve accessibility in programming education.

02. Deep Dive

Accessibility in software development tooling is often an afterthought. This project takes a novel approach to coding without vision by mapping abstract programmatic structures to intuitive soundscapes (sonification). Using the Web Audio API, the system parses code in real time and generates specific audio signatures: deeper indentation levels correspond to higher pitches, syntax errors trigger distinct dissonant chords, and rapid loop iterations produce rhythmic percussive patterns. This non-visual representation reduces the cognitive load of relying solely on text-to-speech screen readers, letting visually impaired programmers 'hear' the shape, structure, and execution flow of their code.
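The indentation-to-pitch mapping above can be sketched with the Web Audio API. This is a minimal illustration, not the project's actual implementation; the function names (`indentDepth`, `indentToFrequency`, `playIndentCue`) and the choice of a 220 Hz base with a seven-semitone step per indent level are assumptions made for the example.

```javascript
// Illustrative sketch: sonify indentation depth as rising pitch.
// Constants below are assumed values, not the project's tuning.
const BASE_FREQ = 220;         // Hz at indent depth 0 (A3)
const SEMITONES_PER_LEVEL = 7; // each indent level rises a fifth

// Count the indent level of a source line (tabs expanded to spaces).
function indentDepth(line, tabSize = 4) {
  const leading = line.match(/^[ \t]*/)[0];
  const spaces = leading.replace(/\t/g, " ".repeat(tabSize)).length;
  return Math.floor(spaces / tabSize);
}

// Equal-temperament step: f = base * 2^(semitones / 12).
function indentToFrequency(depth) {
  return BASE_FREQ * Math.pow(2, (depth * SEMITONES_PER_LEVEL) / 12);
}

// Browser-only: play a short sine tone whose pitch encodes a line's depth.
function playIndentCue(audioCtx, line) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = indentToFrequency(indentDepth(line));
  gain.gain.setValueAtTime(0.2, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.15);
  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.15);
}
```

Keeping the mapping in a pure function (`indentToFrequency`) separate from the audio playback makes the pitch scheme easy to test and to retune per user, which matters for the adaptive-feedback goal described in the overview.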

Project Info

  • Achievement

    Inclusive Computing Initiative

  • Timeline

    April 2025 - Present


Tech Stack

  • JavaScript

  • HTML/CSS

  • Web Audio API

  • Assistive Technology