Run Ollama Models on Android
Last updated
Ollama is a CLI-based tool that allows users to run and manage various AI models locally.
This module is designed for Android users who want to install and run Ollama on their devices using Termux. Your device needs at least 8 GB of RAM.
By following this guide, you will be able to utilize Ollama's capabilities directly from your Android device.
Termux is a terminal emulator that enables Android devices to run a Linux environment without requiring root access.
It is freely available and can be downloaded from the official F-Droid repository.
You need to set up Termux first.
Grant storage access
This command grants Termux access to your Android storage.
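A minimal sketch of this step — `termux-setup-storage` is the built-in Termux command that requests the storage permission:

```shell
# Request Android storage permission for Termux;
# a system dialog will appear — tap "Allow".
termux-setup-storage
```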
Update packages
This updates your installed packages to their latest versions. Press Y when prompted.
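The update step above, as a sketch using Termux's `pkg` package manager:

```shell
# Refresh the package lists, then upgrade all installed packages.
# Confirm with Y when prompted.
pkg update && pkg upgrade
```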
Install Essential Tools
These packages provide Git for version control, CMake for software compilation, and Go, the programming language used to develop Ollama.
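A sketch of installing the tools named above with Termux's `pkg` manager (package names assume the current Termux repositories, where the Go toolchain is packaged as `golang`):

```shell
# Install Git (version control), CMake (build system),
# and the Go toolchain used to build Ollama.
pkg install git cmake golang
```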
Clone Ollama's GitHub Repository
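A sketch of this step, cloning the official Ollama repository:

```shell
# Download the Ollama source code from GitHub.
git clone https://github.com/ollama/ollama.git
```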
Navigate to the Ollama directory
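Change into the directory created by the clone (the directory name assumes the default `git clone` behavior):

```shell
# Enter the cloned repository.
cd ollama
```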
Generate Go code
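Assuming this follows the standard Go build flow used by older Ollama source builds, a sketch:

```shell
# Run all go:generate directives in the repository
# to produce the generated sources the build needs.
go generate ./...
```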
Build Ollama
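A sketch of the build step, assuming the standard Go toolchain:

```shell
# Compile the ollama binary into the current directory.
go build .
```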
After that, start the Ollama server.
The Ollama server is now running, and you can interact with it.
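The step above can be sketched as follows, running the freshly built binary in the background so the shell stays usable:

```shell
# Start the Ollama server as a background process.
./ollama serve &
```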
With Ollama installed and running, you can run any Ollama model.
You can browse all available models on the Ollama website.
For example, I will use .
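As a sketch of running a model — the model name below is only a placeholder; substitute whichever model you chose from the Ollama library:

```shell
# Pull (if needed) and start an interactive chat with a model.
# "llama3" is a placeholder — replace it with your chosen model.
./ollama run llama3
```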