How to Install DeepSeek Locally with Ollama LLM in Ubuntu 24.04

Running large language models like DeepSeek locally on your machine is a powerful way to explore AI capabilities without relying on cloud services.

In this guide, we’ll walk you through installing DeepSeek using Ollama on Ubuntu 24.04 and setting up a Web UI for an interactive and user-friendly experience.

What Are DeepSeek and Ollama?

  • DeepSeek: An advanced AI model designed for natural language processing tasks such as answering questions and generating text.
  • Ollama: A platform that simplifies running large language models locally by providing tools to manage and interact with models like DeepSeek.
  • Web UI: A graphical interface that allows you to interact with DeepSeek through your browser, making it more accessible and user-friendly.

Prerequisites

Before we begin, make sure you have the following:

  • Ubuntu 24.04 installed on your machine.
  • A stable internet connection.
  • At least 8GB of RAM (16GB or more is recommended for smoother performance).
  • Basic familiarity with the terminal.
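A quick pre-flight check can confirm your machine meets these requirements before you download a multi-gigabyte model (the thresholds mentioned in the comments are illustrative):

```shell
# Total RAM in GB (the 7B model runs best with 8GB or more)
awk '/MemTotal/ {printf "RAM: %.1f GB\n", $2 / 1024 / 1024}' /proc/meminfo

# Free space in your home directory (the 7B model download is several GB)
df -h "$HOME" | tail -1
```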

Step 1: Install Python and Git

Before installing anything, it’s a good idea to update your system to ensure all existing packages are up to date.

sudo apt update && sudo apt upgrade -y

Ubuntu 24.04 ships with Python 3 pre-installed, but it’s worth confirming that you have the required version (Python 3.8 or higher).

sudo apt install python3
python3 --version
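If you'd rather check the 3.8-or-higher requirement in a script than read the version string by eye, a one-liner like this works:

```shell
# Exit code 0 if the interpreter is 3.8+, non-zero otherwise
python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 8) else 1)' \
  && echo "Python version OK"
```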

pip is the package manager for Python, and it’s required to install dependencies for DeepSeek and Ollama.

sudo apt install python3-pip
pip3 --version

Git is essential for cloning repositories from GitHub.

sudo apt install git
git --version

Step 2: Install Ollama for DeepSeek

Now that Python and Git are installed, you’re ready to install Ollama to manage DeepSeek.

curl -fsSL https://ollama.com/install.sh | sh
ollama --version

Next, start the Ollama service and enable it to launch automatically when your system boots.

sudo systemctl start ollama
sudo systemctl enable ollama
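Ollama's API listens on http://localhost:11434 by default, and its root endpoint answers with the text "Ollama is running". A small helper like the one below (an illustrative sketch, not part of Ollama itself) can confirm the service is reachable:

```shell
# Returns success only if the Ollama API answers at the given URL
check_ollama() {
  curl -s --max-time 3 "${1:-http://localhost:11434}" | grep -q "Ollama is running"
}

if check_ollama; then
  echo "Ollama API is reachable"
else
  echo "Ollama API is not responding"
fi
```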

Now that Ollama is installed, we can proceed with installing DeepSeek.

Step 3: Download and Run DeepSeek Model

With Ollama running, you can download the DeepSeek-R1 model (this guide uses the 7B variant).

ollama run deepseek-r1:7b

This may take a few minutes depending on your internet speed, as the model is several gigabytes in size. Once the download finishes, the command drops you into an interactive chat session; type /bye to exit.

Install DeepSeek Model Locally

Once the download is complete, you can verify that the model is available by running:

ollama list

You should see deepseek-r1:7b listed among the available models.
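Besides the interactive CLI, Ollama also exposes a local REST API on port 11434, which is what the Web UI in the next step talks to. A minimal sketch of a one-off, non-streaming request (the prompt text is just an example):

```shell
# JSON body for a single non-streaming completion
PAYLOAD='{"model": "deepseek-r1:7b", "prompt": "Why is the sky blue?", "stream": false}'

# POST it to the generate endpoint (requires the ollama service to be running)
curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```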

List DeepSeek Model Locally

Step 4: Run DeepSeek in a Web UI

While Ollama lets you interact with DeepSeek from the command line, you might prefer a more user-friendly web interface. For this, we’ll use Open WebUI, a simple web-based interface for interacting with Ollama models.

First, create a virtual environment that isolates your Python dependencies from the system-wide Python installation.

sudo apt install python3-venv
python3 -m venv ~/open-webui-venv
source ~/open-webui-venv/bin/activate

Now that your virtual environment is active, you can install Open WebUI using pip.

pip install open-webui

Once installed, start the server with:

open-webui serve

Open your web browser and navigate to http://localhost:8080 – on first visit you’ll be prompted to create an admin account, after which you’ll see the Open WebUI interface.
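Note that open-webui lives inside the virtual environment, so in any new terminal session you need to re-activate it before starting the server again (recent versions also accept a --port flag if 8080 is taken; treat that flag as version-dependent):

```shell
# Re-activate the virtual environment created earlier
source ~/open-webui-venv/bin/activate

# Start the server again; --port is optional (8080 is the default)
open-webui serve --port 8080
```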

Open WebUI Admin Account

In the Web UI, select the deepseek-r1:7b model from the dropdown menu and start interacting with it. You can ask questions, generate text, or perform other tasks supported by DeepSeek.

Running DeepSeek on Ubuntu

You should now see a chat interface where you can interact with DeepSeek just like ChatGPT.

Running DeepSeek on Cloud Platforms

If you prefer to run DeepSeek on the cloud for better scalability, performance, or ease of use, here are some excellent cloud solutions:

  • Linode – Affordable, high-performance cloud hosting where you can deploy an Ubuntu instance and install DeepSeek with Ollama for a seamless experience.
  • Google Cloud Platform (GCP) – Powerful virtual machines (VMs) with GPU support, ideal for running large language models like DeepSeek.

Conclusion

You’ve successfully installed Ollama and DeepSeek on Ubuntu 24.04. You can now run DeepSeek in the terminal or use a Web UI for a better experience.

Ravi Saive
I am an experienced GNU/Linux expert and a full-stack software developer with over a decade in the field of Linux and Open Source technologies.
