Built an on-device computer-vision system for a retail bank that turns a customer's phone into a fully offline KYC station: real-time liveness detection, face matching against the ID portrait, and document capture, all running locally with high accuracy and no biometric data ever leaving the handset. Engineered before the generative-AI era on classical computer vision and tightly tuned mobile ML, the platform enabled remote account opening without the cost, latency, or connectivity assumptions of video-call or server-side verification.
The Challenge
A local bank wanted to open accounts remotely — and under KYC and AML rules, that means proving in real time that the person on the other end of the phone is real, present, and the legitimate owner of the ID they are holding. Video-call verification was too slow and expensive to scale; server-side automation forced biometric data off the device and depended on connectivity retail customers don't always have. The bar was to do all of it on-device — accurately enough for compliance, fast enough to feel instant, and robust enough to defeat printed photos, replayed videos, and screen attacks — across the wide range of mid- and low-end Android and iOS phones the bank's customers actually carry.
Our Approach
We built an end-to-end mobile vision pipeline grounded in classical computer vision and tightly tuned ML, with no foundation models and no cloud round-trips. Face detection and landmark alignment locate and stabilize the subject in real time. A passive liveness stack reads texture, micro-motion, and color/reflectance cues to flag presentation attacks (printed photos, screen replays, and 3D masks) without forcing the user through awkward head choreography; optional active challenges can be layered on for higher-assurance flows. A parallel path captures the ID document, runs OCR-based field extraction, and matches the portrait on the document to the live selfie using a face-recognition model quantized for on-device inference. The whole pipeline is engineered for mobile reality: aggressive model quantization, ARM SIMD acceleration, frame-budgeted inference, and graceful behavior under uneven lighting and varying camera quality, all without any biometric data leaving the device.
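The decision core of a pipeline like this can be sketched in a few lines: a texture-based passive liveness cue (low Laplacian variance hints at a flat, blurry surface such as a printed photo or a replayed screen) gated together with a cosine-similarity match between the live selfie's face embedding and the ID portrait's. This is a minimal illustrative sketch, not the deployed system; all function names and thresholds here are hypothetical, and production liveness combines many more cues than one blur measure.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def laplacian_variance(gray):
    """Variance of a 4-neighbour Laplacian over a grayscale image
    (a list of rows of pixel intensities). Low variance suggests a
    low-texture surface -- one crude cue among several that a passive
    liveness stack would combine."""
    h, w = len(gray), len(gray[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x]
                   + gray[y][x - 1] + gray[y][x + 1]
                   - 4 * gray[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def verify(selfie_emb, id_portrait_emb, selfie_gray,
           match_threshold=0.6, texture_threshold=50.0):
    """Accept only if the selfie passes the texture liveness cue AND
    its embedding matches the ID portrait's (hypothetical thresholds)."""
    live = laplacian_variance(selfie_gray) >= texture_threshold
    match = cosine_similarity(selfie_emb, id_portrait_emb) >= match_threshold
    return live and match
```

In a real on-device build, the embedding distance and the liveness score would each come from quantized models running under a per-frame compute budget; the point of the sketch is only the shape of the gate: liveness and identity match are independent checks, and both must pass.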
Results
The system shipped into the bank's mobile app as the verification engine for remote account opening — capturing, matching, and liveness-checking a new customer end-to-end in real time, fully offline, with accuracy high enough to satisfy KYC review and resist real-world spoofing. Beyond the deployment itself, the project is a reference point for what disciplined applied-ML engineering delivered before the generative-AI era: production-grade on-device biometrics built from classical CV and carefully trained mobile models, on the phones customers already carry — and the same on-device-first discipline now carries directly into our modern multimodal work.


