Hacker News
Show HN: Qwen 3.5 running on a $300 Android phone – on-device, open source (github.com/alichherawalla)
6 points by ali_chherawalla 19 days ago | 10 comments
Qwen 3.5 Small dropped two days ago. I had it running on a mid-tier Android phone within hours.

It's great seeing the on-device AI community light up around this release. Off Grid brings it to Android: phones with 6GB RAM in the $200-300 range, ~8 tok/sec on the 2B model. Fully offline.

Text generation, vision AI, image gen, voice transcription, tool calling, document analysis — all on-device, nothing uploaded, ever. Works in airplane mode.

780+ GitHub stars. ~2,000 downloads across Android and iOS. Early days.

GitHub: https://github.com/alichherawalla/off-grid-mobile-ai

Play Store: https://play.google.com/store/apps/details?id=ai.offgridmobi...

App Store: https://apps.apple.com/us/app/off-grid-local-ai/id6759299882



You've eliminated the latency problem and the need for a flagship phone. Amazing, my old Android has AI now. Gave you a star on GitHub. Ciao.


lol thanks buddy!


The current copy function copies the entire message.

Could it please be improved to allow selection and copying of only the desired text?


Fair point, let me work on that. That's a small lift.


Can you share some technical details? How did you do it? What's under the hood?


Of course, of course.

I've documented everything here: https://github.com/alichherawalla/off-grid-mobile-ai/blob/ma...

llama.cpp compiled as a native Android library via the NDK, linked into React Native through a custom JSI bridge. GGUF models loaded straight into memory. On Snapdragon devices we use QNN (Qualcomm Neural Network) for hardware acceleration. OpenCL GPU fallback on everything else. CPU-only as a last resort.
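The QNN → OpenCL → CPU fallback chain can be sketched roughly like this (a hypothetical TypeScript sketch; the type and function names are illustrative, not the project's actual API):

```typescript
// Hypothetical capability flags the native layer would report at startup.
type Backend = "qnn" | "opencl" | "cpu";

interface DeviceCaps {
  isSnapdragon: boolean; // QNN acceleration is Qualcomm-only
  hasOpenCL: boolean;    // GPU fallback via OpenCL
}

// Pick the fastest available backend, falling back in order:
// QNN (NPU) -> OpenCL (GPU) -> CPU.
function pickBackend(caps: DeviceCaps): Backend {
  if (caps.isSnapdragon) return "qnn";
  if (caps.hasOpenCL) return "opencl";
  return "cpu";
}
```

So a Snapdragon phone gets `"qnn"`, a non-Qualcomm phone with a working GPU driver gets `"opencl"`, and anything else runs on CPU.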

Image gen is Stable Diffusion running on the NPU where available. Vision uses SmolVLM and Qwen3-VL. Voice is on-device Whisper.

The model browser filters by your device's RAM so you never download something your phone can't run. The whole thing is MIT licensed - happy to answer anything about the architecture.
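The RAM filter amounts to something like this (a minimal sketch; the 2 GB headroom figure and all names here are assumptions for illustration, not the app's real code):

```typescript
// Minimal model metadata for the browser (hypothetical shape).
interface ModelInfo {
  name: string;
  minRamGb: number; // approximate RAM needed to load and run the model
}

// Hide models the device can't run, keeping headroom for the OS and app.
// The 2 GB headroom is an assumed threshold, not the project's actual value.
function runnableModels(models: ModelInfo[], deviceRamGb: number): ModelInfo[] {
  const headroomGb = 2;
  return models.filter((m) => m.minRamGb <= deviceRamGb - headroomGb);
}
```

On a 6 GB phone this would surface a ~2B model needing 3 GB but hide a 7B model needing 8 GB.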


Any roadmap to add MediaTek NPU support?


I'm working on that as we speak. It shouldn't be that difficult of a lift; I should be able to do it tonight or in the next couple of nights.


Happy to test. I have a Poco X6 Pro, the 12GB RAM model.


Awesome. I'll let you know once that's in.



