Tanvir Kour is a passionate technical blogger and open source enthusiast. She is a graduate in Computer Science and Engineering and has 4 years of experience providing IT solutions. She is well-versed in Linux, Docker, and cloud-native applications. You can connect with her on Twitter: https://x.com/tanvirkour

How to Completely Uninstall Ollama and Erase LLM Models on Linux Systems?


Ollama is a powerful tool for running large language models (LLMs) locally. However, there may come a time when you need to uninstall it entirely, whether to free up disk space or troubleshoot an issue. This guide walks you through completely removing Ollama and erasing all related files, including any LLM models downloaded on Linux systems.

Why Completely Uninstall Ollama?

There are several reasons you might want to uninstall Ollama and erase its associated data:

  • Freeing Up Disk Space: LLM models can consume significant storage.
  • Starting Fresh: Clean installation to troubleshoot errors.
  • Switching to a Different Tool: Transitioning to another solution.

Steps to Completely Uninstall Ollama

1. Stop and Remove the Docker Container

If you are running Ollama via Docker, stop and remove the container.

Stop the Container

docker stop ollama

Remove the Container

docker rm ollama

Remove the Docker Image

List the images to confirm the ollama/ollama image exists:

docker images

Remove the image:

docker rmi ollama/ollama
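After removing the container and image, a quick read-only check confirms nothing Docker-side is left behind. This sketch assumes the container and image names used above; it lists only and deletes nothing:

```shell
# List any remaining Ollama containers or images; empty output means clean.
# Skips silently if Docker itself is not installed.
if command -v docker >/dev/null 2>&1; then
  docker ps -a --filter "name=ollama" --format '{{.Names}}'
  docker images ollama/ollama --format '{{.Repository}}:{{.Tag}}'
fi
```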

2. Uninstall Ollama Installed via Package Manager

If you installed Ollama directly on your Linux system, uninstall it using your package manager.

For Debian/Ubuntu:

sudo apt-get remove --purge ollama
sudo apt-get autoremove

For Fedora/CentOS/RHEL:

sudo dnf remove ollama

For Arch Linux:

sudo pacman -Rns ollama
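The package-manager commands above only apply if Ollama came from a distro repository. If it was installed with the official `curl -fsSL https://ollama.com/install.sh | sh` script, there is no package to remove; the installer places a binary in /usr/local/bin and registers a systemd service. A hedged cleanup sketch for that case (the paths and the `ollama` service user reflect the installer's defaults; verify them on your system before deleting):

```shell
# Run as root (e.g. sudo sh uninstall-ollama.sh); prints a notice otherwise.
if [ "$(id -u)" -ne 0 ]; then
  echo "please run as root" >&2
else
  # Stop and disable the systemd service, if one was registered
  if systemctl list-unit-files 2>/dev/null | grep -q '^ollama.service'; then
    systemctl stop ollama
    systemctl disable ollama
    rm -f /etc/systemd/system/ollama.service
    systemctl daemon-reload
  fi
  # Remove the binary placed by the install script (no-op if absent)
  rm -f /usr/local/bin/ollama
  # Remove the service account created at install time, if present
  if id ollama >/dev/null 2>&1; then
    userdel ollama
  fi
fi
```

The installer may also create an `ollama` group; `groupdel ollama` cleans that up if it remains after `userdel`.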

3. Remove Configuration Files

After uninstalling Ollama, residual configuration files may still exist. Remove them manually to ensure a clean slate.

Common Directories to Check

Home Directory Configurations:

rm -rf ~/.ollama

System-Wide Configurations:

sudo rm -rf /etc/ollama

Log Files: Check for logs and delete them:

sudo rm -rf /var/log/ollama

4. Erase LLM Models

LLM models downloaded by Ollama can take up significant disk space. These models are often stored in specific directories.

Find and Remove LLM Models

Locate the folder storing the models:

sudo find / -type d -name "ollama"

Remove the models:

sudo rm -rf /path/to/ollama
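Before running a blanket `find`, it can help to check the usual model locations first. On Linux these are typically ~/.ollama/models for user installs and /usr/share/ollama/.ollama/models when Ollama runs as a systemd service (defaults assumed here; verify against your setup, especially if you set OLLAMA_MODELS). A read-only size check:

```shell
# Report how much space the default model stores occupy before deleting them.
for dir in "$HOME/.ollama/models" /usr/share/ollama/.ollama/models; do
  [ -d "$dir" ] && du -sh "$dir"
done
true  # the loop's status only reflects the last path check, not an error
```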
        

5. Clean Docker Volumes (If Applicable)

If you used Docker to run Ollama, its volumes might still exist, storing model data.

List Volumes

docker volume ls

Remove Ollama Volume

docker volume rm ollama
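To be sure you are deleting the right volume, `docker volume inspect` shows where it lives on disk before you remove it. This is read-only; the volume name `ollama` assumes the common `docker run -v ollama:/root/.ollama` convention:

```shell
# Show the named volume's on-disk mountpoint; skips silently without Docker.
if command -v docker >/dev/null 2>&1; then
  docker volume inspect --format '{{ .Mountpoint }}' ollama 2>/dev/null \
    || echo "no volume named ollama"
fi
```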
            

6. Verify the Uninstallation

To ensure Ollama and its files are entirely removed, run the following command to check whether any traces remain:

sudo find / -name "ollama" -type f

Confirm that no results are returned.
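The individual checks in this guide can also be rolled into a single pass. This sketch covers the default paths mentioned above (a standard install layout is assumed) and prints anything it still finds; no output means the system is clean:

```shell
# Print any leftover Ollama artifacts at the common default locations.
for p in "$HOME/.ollama" /etc/ollama /var/log/ollama \
         /usr/local/bin/ollama /usr/share/ollama; do
  [ -e "$p" ] && echo "still present: $p"
done
command -v ollama >/dev/null 2>&1 && echo "ollama binary still on PATH"
true  # reset status from the final short-circuit check
```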

Additional Tips

Check Disk Usage

After uninstalling, verify the reclaimed disk space:

df -h

Reinstall Ollama

If you plan to reinstall Ollama later, follow the official Ollama installation guide.

Conclusion

By following this guide, you can completely uninstall Ollama and erase all LLM models from your Linux system. This process ensures no residual files or data occupy your disk space, leaving your system clean and ready for other tasks. Whether you’re troubleshooting, switching tools, or reclaiming space, these steps will help you achieve your goal.

Have Queries? Join https://launchpass.com/collabnix
