Bayan
Read the Quran in Arabic, one word at a time.
Progressive substitution. Spaced repetition. On-device speech recognition.
iOS app that helps non-Arabic speakers gradually learn Quranic Arabic through progressive substitution. Built for the Quran Foundation Ramadan 2026 Hackathon. Integrates Tarteel AI's Whisper model via CoreML for on-device pronunciation feedback.
§ Abstract
Most Quran learning apps assume you already speak Arabic. Bayan assumes you don’t, and progressively trains you. Verses are displayed as a mix of English and Arabic based on your current vocabulary. Common words substitute first; rare ones later. The substitution slider lets you control the ratio — from 0% (all English) to 100% (all Arabic script).
Built in Swift for iOS. Submitted to the Quran Foundation Ramadan 2026 Hackathon.
§ What surfaced
- Cognitive science backs the approach. The Involvement Load Hypothesis predicts that the depth of vocabulary processing — not raw exposure count — determines retention. Progressive substitution forces depth at every encounter.
- Tarteel AI’s open-source Whisper model runs on-device via CoreML, giving learners pronunciation feedback without a network round-trip and without sending audio off the device.
- Word-frequency-driven scoring (common words like Allah, Rahman substitute first) mirrors how children acquire language naturally.
- Two learning paths support both readers (Arabic Script mode) and pre-readers (Transliteration mode).
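The frequency-driven ordering mentioned above can be sketched as a simple sort: the most common words substitute first. A minimal sketch, with an illustrative function name and made-up counts (not real Quranic frequencies):

```swift
// Order vocabulary so higher-frequency words are introduced (and
// substituted into verses) first. Counts are illustrative only.
func substitutionOrder(_ counts: [String: Int]) -> [String] {
    // Sort descending by count; break ties alphabetically for stable output.
    counts.sorted { $0.value != $1.value ? $0.value > $1.value : $0.key < $1.key }
          .map(\.key)
}
```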
§ Method
The substitution engine assigns each word a mastery score (unseen → introduced → learning → familiar → mastered). On every render, the engine evaluates each word’s score against the user’s threshold and the global substitution slider. Words above threshold display in Arabic; below, in English.
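A minimal sketch of that render-time decision. The names (`Mastery`, `Word`, `render`) and the linear mapping from the slider position to a mastery bar are assumptions, not the app's actual API:

```swift
// Hypothetical mastery ladder matching the stages described above.
enum Mastery: Int, Comparable {
    case unseen = 0, introduced, learning, familiar, mastered
    static func < (lhs: Mastery, rhs: Mastery) -> Bool { lhs.rawValue < rhs.rawValue }
}

struct Word {
    let english: String
    let arabic: String
    var mastery: Mastery
}

/// Renders a verse as a mix of English and Arabic. `slider` is the global
/// substitution ratio in 0...1: at 0 only mastered words substitute, at 1
/// any word the learner has at least been introduced to does.
func render(_ verse: [Word], slider: Double) -> String {
    let s = min(max(slider, 0), 1)
    // Assumed linear mapping from slider position to the required mastery bar.
    let bar = Mastery(rawValue: Int(((1.0 - s) * 4).rounded()))!
    return verse
        .map { $0.mastery >= bar && $0.mastery > .unseen ? $0.arabic : $0.english }
        .joined(separator: " ")
}
```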
Pronunciation drills follow a fixed loop: hear the word at normal speed, hear it at half speed, hear it at normal speed, then attempt it. The on-device Whisper model scores the attempt and updates the mastery score.
unseen ── introduced ── learning ── familiar ── mastered
   │          │            │           │           │
 shown      heard       drilled    quizzed     fluent
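The attempt-scoring step could be as simple as an edit-distance comparison between the on-device transcript and the target word. A sketch under that assumption (`transcript` stands in for the Tarteel Whisper output; the app's real scoring rule is not documented here):

```swift
/// Similarity score in 0...1 between the target word and what the
/// learner actually said, using Levenshtein edit distance.
func attemptScore(target: String, transcript: String) -> Double {
    let a = Array(target), b = Array(transcript)
    guard !a.isEmpty, !b.isEmpty else { return a.isEmpty && b.isEmpty ? 1 : 0 }
    // Single-row dynamic program over the edit-distance table.
    var row = Array(0...b.count)
    for i in 1...a.count {
        var prev = row[0]
        row[0] = i
        for j in 1...b.count {
            let tmp = row[j]
            row[j] = min(row[j] + 1,                                 // deletion
                         row[j - 1] + 1,                             // insertion
                         prev + (a[i - 1] == b[j - 1] ? 0 : 1))      // substitution
            prev = tmp
        }
    }
    // 1.0 = perfect match; 0.0 = nothing in common.
    return 1.0 - Double(row[b.count]) / Double(max(a.count, b.count))
}
```

A thresholded score (say, above 0.8) could then promote the word one mastery stage.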
§ Implementation
- Pure Swift, SwiftUI throughout
- No external network dependencies for the core learning loop
- CoreML model loaded at first launch and cached
- Audio playback with word-by-word highlighting (text never rearranges, preserving the reader's position)
- Per-verse play controls
- Vocabulary quiz with flashcards
- Reading streak tracker, calendar heatmap, milestone celebrations
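The word-by-word highlighting can be sketched as a timestamp lookup driven by the playback clock. The `TimedWord` shape and the per-word start times are assumptions about the audio alignment data, not the app's documented format:

```swift
import Foundation

// Hypothetical alignment record: one entry per word in the verse.
struct TimedWord {
    let text: String
    let start: TimeInterval
}

/// Index of the word to highlight at playback time `t`: the last word
/// whose start time has passed. Returns nil before the first word begins.
/// The text itself is never moved, so the reader's position is preserved.
func highlightIndex(at t: TimeInterval, in words: [TimedWord]) -> Int? {
    words.lastIndex { $0.start <= t }
}
```

In the app this lookup would run from a periodic playback-time observer, re-highlighting whenever the returned index changes.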
§ Outcomes
- Shipped on time for the Ramadan 2026 hackathon
- On-device speech recognition working end-to-end via Tarteel Whisper through CoreML
- Score-based substitution that reliably surfaces high-frequency vocabulary first
- Vocabulary quiz, daily word, streak heatmap all live
Future direction: a web companion that pulls the user’s vocabulary state and renders the same progressive view, plus optional teacher dashboards for groups learning together.