How I Built and Deployed a Background Removal App with Docker

By 10xdev team August 02, 2025

One of the most annoying things in the world is when you're trying to grab an image from the internet that appears to have a transparent background, but when you download it, the classic checkerboard pattern turns out to be baked into the pixels instead of actual transparency. Luckily, nowadays there are all kinds of good tools for removing backgrounds from images, like Remove.bg or the new AI tools in Photoshop.

In my work, I use a lot of images with the background removed, and it's extremely inefficient to have to go into Photoshop, open an image, remove its background, re-export it, and then bring it back into my project. As a developer, I find that kind of inefficiency unacceptable, so my only option was to build my own app from scratch.

In this article, I want to show you how I built this background remover from scratch. More importantly, I want to talk about why I dockerized it and explain how I deployed it to the cloud for free.

I recently published an introduction to Docker, which is a great starting point if you have no idea what it is. I prefer to focus on tools that I have firsthand experience with, and Docker is something I use all the time. By containerizing this background remover, which is just a Python web app, I'm able to run it locally with the click of a button and also deploy it to the cloud with a single command. Without Docker, I would have to go into my terminal, ensure I have the right Python dependencies installed, and then run the app in the background every time I want to use it. On top of that, deployment to the cloud would be a lot more complex and also more expensive.

Building the Core Application

First, let's talk about the app itself. The reason I built this app in Python, and not my typical choice of JavaScript, is that there's a Python package called rembg, which uses the U-2-Net model to magically remove the background with AI. But really, this article has nothing to do with the intricacies of AI models.

The model itself is heavily abstracted, to the point where all we do is open an image with Pillow and call the remove function from this library, which returns a new image with the background removed. It's an extremely practical use case for image models. However, I don't want to use it from the terminal. I want to be able to drag and drop images directly from my browser into it so I can then drag the result straight back into my project.
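
In code, that core step is about as minimal as it sounds. Here's a rough sketch, not my exact script; the file names are just placeholders:

from PIL import Image
from rembg import remove

# Open the source image, strip the background, and save the result as a PNG
# so the alpha channel (the transparency) is preserved.
input_image = Image.open("input.jpg")
output_image = remove(input_image)
output_image.save("output.png")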

To do that, I built a little app with Flask, which creates a single HTTP route that handles both GET and POST methods. A GET request displays the initial web page, and when we drag an image onto that page, it makes a POST request, which calls the remove-background function.
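
Stripped down, the route looks roughly like this; the form field name and the way the result is returned are simplifications for illustration, not copied from my code:

import io

from flask import Flask, render_template, request, send_file
from PIL import Image
from rembg import remove

app = Flask(__name__)

@app.route("/", methods=["GET", "POST"])
def index():
    if request.method == "POST":
        # The dropped image arrives as multipart form data
        uploaded = request.files["image"]
        result = remove(Image.open(uploaded.stream))

        # Send the transparent PNG straight back to the browser
        buffer = io.BytesIO()
        result.save(buffer, format="PNG")
        buffer.seek(0)
        return send_file(buffer, mimetype="image/png", download_name="no-bg.png")

    return render_template("index.html")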

The website itself is rendered from an index.html file, which uses nothing but plain JavaScript and CSS. It contains an HTML form with a file input. When that form is submitted, it makes a POST request to the root URL. I also wrote a little bit of JavaScript that automatically submits the form when a file is dropped onto it, just to make the process even more efficient. That's the entire app. I can run it from the terminal with the python command, but this is where Docker comes in.

Containerizing with Docker

I want to be able to use this code on multiple computers and also deploy it to the web so I could even use it from my phone or some other device, allowing anyone to use this tool.

First, you'll need to have Docker installed. I'm using Docker Desktop, but in the past, I've used tools like Podman, which is developed by Red Hat and is also a good option. Now, we need to go into our code and create a Dockerfile.

The Dockerfile itself is very simple. It starts with the official Python base image, creates a working directory for the app, installs the dependencies, copies the code, exposes a port, and then runs the app.

# Use the official Python image
FROM python:3.9-slim

# Set the working directory
WORKDIR /app

# Copy and install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

# Expose the port the app runs on
EXPOSE 5000

# Run the app (this assumes the entry point is app.py, so flask run can find it)
CMD ["flask", "run", "--host=0.0.0.0"]

The only unusual thing I'm doing here is taking the actual AI model weights and copying them into the Docker image. The weights are about 175 MB, and baking them in means the Python package doesn't have to download them at runtime the first time the app is used, which would slow things down.
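
In the Dockerfile above, the weights are picked up implicitly by COPY . . as long as they sit in the project folder. If you wanted to make that step explicit, it could look something like this, assuming rembg reads its model directory from the U2NET_HOME environment variable; the paths and file name here are assumptions, not copied from my project:

# Bake the pre-downloaded U-2-Net weights into the image so rembg
# doesn't try to fetch them at runtime.
ENV U2NET_HOME=/app/models
COPY models/u2net.onnx /app/models/u2net.onnx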

Now, let's build the image and make sure to give it a tag. This will take a minute. Then, if we go into Docker Desktop, we should see it in the images panel. To actually run the image as a container, we simply hit the play button and make sure to map the port to something we can use on localhost. Now the app is always ready to go in the background with Docker. That's how I use this tool 90% of the time.
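
For reference, the terminal equivalent of those Docker Desktop clicks looks something like this; the image name is just a placeholder:

# Build the image and give it a tag
docker build -t bg-remover .

# Run it in the background, mapping the container's port 5000 to localhost:5000
docker run -d -p 5000:5000 bg-remover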

Deploying to the Cloud for Free

But I also want to show you how to deploy it to the web. There are numerous options for deploying containers to the cloud, and there are also several free options available.

The most well-known option is Elastic Container Service (ECS) on AWS with a related service called Fargate that can deploy your container in a serverless way. This means it will scale down to zero when it's not in use and then scale back up once requests are coming in. You've also got services like the App Platform on DigitalOcean, which has a free tier. But my go-to for deploying random utilities like this is Google Cloud Run.

To deploy something that's dockerized, you first need to get your image into a registry. Every cloud has one built in, and on Google it's called Artifact Registry. You create a repository for your images, which stores them in a specific region. Then you can copy the repository path and use it as a prefix when tagging your images, so Docker knows where to push them.
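
You can create that repository in the console, or with the gcloud CLI; the repository name and region below are placeholders:

gcloud artifacts repositories create my-repo \
  --repository-format=docker \
  --location=us-central1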

Let's go into the terminal and use the docker tag command to tag our existing image with this namespace. Once that's done, we can use the docker push command to upload it to Google Cloud.
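
Concretely, those two steps look something like this; the project, repository, and region are placeholders for whatever you set up in Artifact Registry:

# One-time setup: let Docker authenticate against the Artifact Registry host
gcloud auth configure-docker us-central1-docker.pkg.dev

# Tag the local image with the repository's namespace, then push it
docker tag bg-remover us-central1-docker.pkg.dev/my-project/my-repo/bg-remover
docker push us-central1-docker.pkg.dev/my-project/my-repo/bg-remover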

Note: One caveat is that you need a Google Cloud account and you'll also need the gcloud CLI tool installed on your system. Once that's done, you should be able to see the image in Google Cloud.

One nice thing about this is that if you want to use this image on a different machine, you can simply pull it from this repo. But now, let's head over to Cloud Run and deploy it to the internet.

Create a new service, and the first thing you'll do is select that container image. From here, we have a bunch of configuration options, but if you want to make this a public web service, the most important one is to "Allow unauthenticated invocations." That means anybody can hit it as a public API or open its URL, with no login required.

The next option is CPU allocation. One problem with serverless deployments is that when the app is not being used, it scales down to zero, which is great because it means you're not paying for anything. But the trade-off is a 'cold start': it takes around four or five seconds for the application to boot up when the next request comes in. In my case, that's not a problem, but if you want to eliminate cold starts, you can make sure the CPU is always allocated. It's just going to cost more, because you'll be burning through your monthly allotment of free CPU seconds even when the app is idle.

From there, let's go down to the container options. One thing we'll also want to change here is the allocated memory for the container. It takes a lot of memory to run the AI model, so let's bump this up to 2 GB.

One other thing I want to do is decrease the amount of autoscaling this application can do. Instead of 100 maximum instances, I'm only going to allow three. We just don't need to be prepared to scale for this type of app, but it's nice to have that option if you're building something with viral potential.
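
If you'd rather skip clicking through the console, roughly the same configuration can be expressed as a single gcloud command; the service name, image path, and region are placeholders:

gcloud run deploy bg-remover \
  --image us-central1-docker.pkg.dev/my-project/my-repo/bg-remover \
  --region us-central1 \
  --allow-unauthenticated \
  --memory 2Gi \
  --max-instances 3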

Let's go ahead and deploy it, and a few minutes later, we should have a URL where we can access our Python app on the web. Pretty awesome.

And one huge benefit of having this all dockerized is that our code is portable. If we ever want to get off Cloud Run, we can take the same image to any other cloud service and deploy it there just as easily. And that's basically all there is to it. There's a ton of other stuff we could talk about when it comes to Docker and Cloud Run.

