{"id":175,"date":"2025-02-06T07:14:27","date_gmt":"2025-02-06T07:14:27","guid":{"rendered":"https:\/\/innohub.powerweave.com\/?p=175"},"modified":"2025-02-06T07:14:27","modified_gmt":"2025-02-06T07:14:27","slug":"self-hosting-deepseek-ai-models-on-aws-ec2-with-docker-ollama-and-nginx","status":"publish","type":"post","link":"https:\/\/innohub.powerweave.com\/?p=175","title":{"rendered":"Self-Hosting DeepSeek AI Models on AWS EC2 with Docker, Ollama, and Nginx"},"content":{"rendered":"\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Take Back Your PRIVACY and Run DeepSeek on EC2 in 5 Minutes!\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/DCWtyd9AW4w?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><\/p>\n\n\n\n<p>DeepSeek&#8217;s AI models have gained significant attention, but privacy concerns regarding their web UI have led many to seek alternative solutions. 
This guide demonstrates how to deploy a DeepSeek model on an AWS EC2 instance using Docker, Ollama, and Nginx, giving you complete control over your AI environment.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why Self-Host DeepSeek?<\/h2>\n\n\n\n<p>Self-hosting offers enhanced privacy and control over your AI environment, eliminating concerns associated with using third-party web UIs.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Prerequisites<\/h2>\n\n\n\n<ul class=\"wp-block-list\">\n<li>An AWS account<\/li>\n\n\n\n<li>Basic knowledge of Docker, Ollama, and Nginx<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Steps<\/h2>\n\n\n\n<ol class=\"wp-block-list\">\n<li><strong>Launch an EC2 Instance:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Select Amazon Linux as the operating system.<\/li>\n\n\n\n<li>Choose an appropriate instance type. A GPU-backed g4dn instance is ideal for inference, but a CPU-only r5.xlarge can be used as a lower-cost alternative for the smaller models.<\/li>\n\n\n\n<li>Create a new key pair or select an existing one.<\/li>\n\n\n\n<li>Allow HTTPS and HTTP traffic in the security group.<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Connect to Your EC2 Server:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Use SSH to connect to your EC2 instance, e.g.\u00a0<code>ssh -i your-key.pem ec2-user@your-public-ip<\/code><\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Install Docker:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Update installed packages:\u00a0<code>sudo yum update -y<\/code><\/li>\n\n\n\n<li>Install Docker:\u00a0<code>sudo yum install docker -y<\/code><\/li>\n\n\n\n<li>Add your user to the docker group (log out and back in for this to take effect):\u00a0<code>sudo usermod -aG docker $USER<\/code><\/li>\n\n\n\n<li>Enable and start the Docker service:\n<ul class=\"wp-block-list\">\n<li><code>sudo systemctl enable docker<\/code><\/li>\n\n\n\n<li><code>sudo systemctl start docker<\/code><\/li>\n<\/ul>\n<\/li>\n\n\n\n<li>Alternatively, open up the permissions of the Docker socket (note: this makes the socket writable by every user on the instance, so prefer the docker group for anything beyond a quick test):\u00a0<code>sudo chmod 666 \/var\/run\/docker.sock<\/code><\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Deploy Ollama with Docker:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Run the following command to deploy 
Ollama in detached mode:\u00a0<code>docker run -d -v ollama:\/root\/.ollama -p 11434:11434 --name ollama ollama\/ollama<\/code><\/li>\n\n\n\n<li>Verify that the Ollama container is running:\u00a0<code>docker ps<\/code><\/li>\n\n\n\n<li>Check that Ollama responds:\u00a0<code>curl http:\/\/localhost:11434<\/code> (it should reply with <code>Ollama is running<\/code>)<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Install DeepSeek Models:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Use Ollama to pull the desired DeepSeek models. For example, to install the 1.5B and 7B Qwen-distilled DeepSeek-R1 models, run the following commands inside the Ollama container:\n<ul class=\"wp-block-list\">\n<li><code>docker exec -it ollama ollama pull deepseek-r1:1.5b<\/code><\/li>\n\n\n\n<li><code>docker exec -it ollama ollama pull deepseek-r1:7b<\/code><\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Run the Ollama Web UI:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Deploy the Ollama web UI using Docker:\u00a0<code>docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:\/app\/data --name ollama-webui ghcr.io\/ollama-webui\/ollama-webui:latest<\/code><\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Configure Nginx:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Install Nginx:\u00a0<code>sudo yum install nginx -y<\/code><\/li>\n\n\n\n<li>Edit the Nginx configuration file (<code>\/etc\/nginx\/nginx.conf<\/code>) and add the following server block within the\u00a0<code>http<\/code>\u00a0block (keep the <code>#<\/code> comment on its own line, or it will comment out the rest of the directive):<pre class=\"wp-block-code\"><code>server {\n    listen 80;\n    server_name your_domain.com;  # Replace with your domain or public IP\n\n    location \/ {\n        proxy_pass http:\/\/localhost:3000;\n        proxy_set_header Host $host;\n        proxy_set_header X-Real-IP $remote_addr;\n    }\n}<\/code><\/pre><\/li>\n\n\n\n<li>Test the Nginx configuration:\u00a0<code>sudo nginx -t<\/code><\/li>\n\n\n\n<li>Enable and start Nginx:\n<ul class=\"wp-block-list\">\n<li><code>sudo systemctl enable nginx<\/code><\/li>\n\n\n\n<li><code>sudo systemctl start nginx<\/code><\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/li>\n\n\n\n<li><strong>Testing Your Deployment:<\/strong>\n<ul class=\"wp-block-list\">\n<li>Open your EC2 
instance&#8217;s public IP address in a web browser, using <code>http:\/\/<\/code> rather than <code>https:\/\/<\/code>, since TLS is not configured in this setup.<\/li>\n\n\n\n<li>Sign up on the Ollama web UI.<\/li>\n\n\n\n<li>Interact with the installed DeepSeek models.<\/li>\n<\/ul>\n<\/li>\n<\/ol>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>By following these steps, you can deploy DeepSeek AI models on an AWS EC2 instance while keeping full privacy and control over your AI environment. This setup can be further scaled with higher-parameter models and GPU-enabled instances, but it already provides a solid foundation for self-hosting DeepSeek.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>DeepSeek&#8217;s AI models have gained significant attention, but privacy concerns regarding their web UI have led many to seek alternative solutions. This guide demonstrates how to deploy a DeepSeek model on an AWS EC2 instance using Docker, Ollama, and Nginx, giving you complete control over your AI environment. Why Self-Host DeepSeek? Self-hosting offers enhanced privacy 
[&hellip;]<\/p>\n","protected":false},"author":5,"featured_media":176,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[33,40,34,53],"tags":[26,138,85,28,91],"class_list":["post-175","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-aws","category-cloud-computing","category-software-development","tag-ai","tag-deepseek","tag-docker","tag-future-of-web-development","tag-ollama"],"jetpack_featured_media_url":"https:\/\/innohub.powerweave.com\/wp-content\/uploads\/2025\/02\/sddefault-1.jpg","_links":{"self":[{"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/posts\/175","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=175"}],"version-history":[{"count":1,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/posts\/175\/revisions"}],"predecessor-version":[{"id":177,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/posts\/175\/revisions\/177"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=\/wp\/v2\/media\/176"}],"wp:attachment":[{"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=175"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=175"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/innohub.powerweave.com\/index.php?rest_route=%2Fwp%2Fv2%2Fta
gs&post=175"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}