React Native Local LLM Chatbot: Mobile App Development Needed
Location: Kolkata, India
Budget: Recommended by industry experts
Time to start: As soon as possible
Project description:
"I need an Expo-managed React Native application that runs a lightweight Large Language Model entirely on the device. The app’s single purpose is a text-based chatbot that can understand and answer free-form user prompts through natural language processing—no server calls, no cloud inference.
Core scope
• Set up or compile a compact LLM (e.g., [login to view URL], GGUF model, or similar) so it loads and runs inside the React Native project for both iOS and Android.
• Build a minimal chat UI: scrolling message history, user input field, streaming model responses with basic markdown support.
• Keep everything offline: all inference must occur locally; the app should work in airplane mode once installed.
• Expose a simple helper module so I can swap or upgrade the model later without changing the UI layer.
• Provide clear build/run instructions and any custom native modules or configuration needed for Expo.
Acceptance criteria
1. Clone, install, and run with `expo run:ios` / `expo run:android` without extra setup.
2. Entering a prompt returns a coherent answer in under 5 seconds on a recent mobile device.
3. Memory use stays within reasonable device limits (≤1 GB peak on a mid-range phone).
4. No external API calls are triggered at any point; all traffic stays local.
Predefined response flows and user authentication hooks are optional for a later phase and not required now, so structure the code to allow them to be bolted on later.
Ship the complete Expo project in a public or private repo along with a brief README covering build steps, model size, and performance tips." (client-provided description)
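For the first scope item (loading and running a compact GGUF model inside the React Native project), a minimal sketch is shown below. It assumes the llama.rn bindings for llama.cpp; the `initLlama` options and the `completion` callback shape are taken from that library's documented usage but should be treated as assumptions and verified against whichever runtime the developer actually picks. The model path, context size, and stop token are placeholders.

```ts
// model/loadModel.ts -- sketch only, assuming the llama.rn bindings are installed
// and the .gguf file has already been bundled or downloaded to the device.
import { initLlama } from 'llama.rn';

type LlamaContext = Awaited<ReturnType<typeof initLlama>>;

export interface LoadOptions {
  modelPath: string;   // absolute path to the .gguf file on the device
  contextSize?: number;
}

// Initialise a llama.cpp context that runs entirely on the device (no network).
export async function loadModel(opts: LoadOptions): Promise<LlamaContext> {
  return initLlama({
    model: opts.modelPath,
    n_ctx: opts.contextSize ?? 2048, // keep small to stay under the ~1 GB memory budget
    n_gpu_layers: 0,                 // CPU-only is the safe default across devices
  });
}

// Stream a completion for one prompt; onToken fires for each generated token.
export async function ask(
  context: LlamaContext,
  prompt: string,
  onToken: (token: string) => void,
): Promise<string> {
  const result = await context.completion(
    { prompt, n_predict: 256, stop: ['</s>'] },   // parameters are illustrative
    (data) => onToken(data.token),                // partial-result callback (assumed shape)
  );
  return result.text;
}
```

Note that a native binding like this cannot run inside Expo Go; it needs a custom dev client (`expo prebuild` plus `expo run:ios` / `expo run:android`), which is consistent with the acceptance criteria above.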
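For the chat UI scope item, the sketch below shows one way to wire a scrolling message history, an input field, and streamed token updates using only standard React Native components. The `sendPrompt` prop is a hypothetical streaming helper (for example, the `ask` function from the previous sketch); markdown rendering and styling are omitted for brevity.

```tsx
// ui/ChatScreen.tsx -- minimal chat screen sketch (no markdown, no styling polish).
import React, { useCallback, useRef, useState } from 'react';
import { Button, FlatList, Text, TextInput, View } from 'react-native';

type Message = { id: string; role: 'user' | 'assistant'; text: string };

// Resolves with the full answer and calls onToken for each partial token.
type SendPrompt = (prompt: string, onToken: (token: string) => void) => Promise<string>;

export function ChatScreen({ sendPrompt }: { sendPrompt: SendPrompt }) {
  const [messages, setMessages] = useState<Message[]>([]);
  const [draft, setDraft] = useState('');
  const listRef = useRef<FlatList<Message>>(null);

  // Append streamed tokens to the assistant message as they arrive.
  const appendToken = useCallback((id: string, token: string) => {
    setMessages((prev) =>
      prev.map((m) => (m.id === id ? { ...m, text: m.text + token } : m)),
    );
  }, []);

  const onSend = useCallback(async () => {
    const prompt = draft.trim();
    if (!prompt) return;
    setDraft('');
    const userMsg: Message = { id: `u-${Date.now()}`, role: 'user', text: prompt };
    const botMsg: Message = { id: `a-${Date.now()}`, role: 'assistant', text: '' };
    setMessages((prev) => [...prev, userMsg, botMsg]);
    await sendPrompt(prompt, (token) => appendToken(botMsg.id, token));
  }, [draft, sendPrompt, appendToken]);

  return (
    <View style={{ flex: 1 }}>
      <FlatList
        ref={listRef}
        data={messages}
        keyExtractor={(m) => m.id}
        renderItem={({ item }) => (
          <Text>{item.role === 'user' ? 'You: ' : 'Bot: '}{item.text}</Text>
        )}
        onContentSizeChange={() => listRef.current?.scrollToEnd({ animated: true })}
      />
      <TextInput value={draft} onChangeText={setDraft} placeholder="Ask something" />
      <Button title="Send" onPress={onSend} />
    </View>
  );
}
```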
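For the requirement that the model can be swapped or upgraded without touching the UI layer, one common approach is to hide the runtime behind a small interface and inject it into the chat screen. The names below (`LocalLLM`, `createGgufLLM`) are illustrative, not from any existing library, and the adapter wraps the hypothetical helpers from the first sketch.

```ts
// model/LocalLLM.ts -- illustrative abstraction so the UI never imports the runtime directly.
import { ask, loadModel } from './loadModel'; // hypothetical helpers from the earlier sketch

export interface LocalLLM {
  /** Human-readable identifier, e.g. the GGUF file name, for the README or a debug screen. */
  name: string;
  /** Generate an answer for a prompt, streaming tokens through onToken. */
  generate(prompt: string, onToken: (token: string) => void): Promise<string>;
  /** Free native memory when the model is no longer needed. */
  release(): Promise<void>;
}

// Adapter for the GGUF runtime. Swapping to a newer model file or a different runtime
// only means writing another adapter; the chat UI keeps depending on LocalLLM alone.
export async function createGgufLLM(modelPath: string): Promise<LocalLLM> {
  const context = await loadModel({ modelPath });
  return {
    name: modelPath.split('/').pop() ?? modelPath,
    generate: (prompt, onToken) => ask(context, prompt, onToken),
    release: () => context.release(), // release() is assumed from the llama.rn context API
  };
}
```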
Matched companies (2)

Conchakra Technologies Pvt Ltd
