How to deploy Dify.ai to Fly.io

July 19, 2024

Hey! I recently faced the challenge of deploying Dify.ai because I didn't want to use my local machine as a playground. I wanted Dify to be constantly available for experimentation. After some research, Fly.io emerged as the best hosting candidate due to its affordability, ease of maintenance, and straightforward deployment process. However, Fly.io does not support Docker Compose or Kubernetes, so I had to find a way to work around this limitation. I'd like to share my initial approach, hoping it might be helpful for you as well.

Setting Up Your Fly.io Account

Before we dive into the deployment process, the first step is to set up an account on Fly.io. Follow these simple steps to get started:

  1. Sign Up: Visit Fly.io and sign up for a free account.
  2. Install Flyctl: Fly.io's command-line interface, Flyctl, is essential for deploying your application. Install it by following the instructions on the Fly.io documentation.
  3. Login to Fly.io: Open your terminal and log in to your Fly.io account using the command:
flyctl auth login
  4. Create an Application: Initialize a new Fly.io application by running:
flyctl launch

Follow the prompts to set up your new application.
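
For reference, on Linux or macOS the whole sequence looks roughly like this (the install one-liner is the one from the Fly.io docs; dify-single-docker is just the app name assumed throughout the rest of this post):

# Install flyctl
curl -L https://fly.io/install.sh | sh

# Log in and create the app without deploying anything yet
flyctl auth login
flyctl launch --name dify-single-docker --no-deploy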

Preparing Dify.ai for Deployment

With your Fly.io account ready, the next step is to prepare Dify.ai for deployment. Here's how to do it:

  1. Clone the Dify.ai Repository:
git clone https://github.com/langgenius/dify
cd dify
  2. Follow this guide to set up the .env file in the /docker folder - Dify AI: How to deploy Docker Compose (a minimal sketch of this step is also shown after the scripts below).

  3. In the root of the repository, create a Dockerfile:

# Use the Docker official DinD image
FROM docker:latest

# Install docker-compose
RUN apk add --no-cache bash
RUN apk add --no-cache docker-cli-compose

COPY . ./app

WORKDIR /app

RUN chmod 755 /app/start.sh

# start.sh starts the Docker daemon itself, so no separate CMD is needed
ENTRYPOINT ["/app/start.sh"]

and in the root of the repository create start.sh:

#!/bin/sh

# Create a folder on the Fly.io volume to hold Dify's data
mkdir -p /mnt/app
mkdir -p /mnt/app/volumes

# Replace the repository's volumes folder with a symlink into the volume,
# so everything Dify writes to /app/docker/volumes lands on persistent storage
rm -rf /app/docker/volumes
ln -s /mnt/app/volumes /app/docker

# Start the Docker daemon, keeping its data root on the Fly.io volume
dockerd --storage-driver=overlay2 --data-root /mnt &

# Wait for the Docker daemon to be ready
while ! docker stats --no-stream; do
    echo "Waiting for Docker daemon to start..."
    sleep 1
done

# Remove all stopped containers
docker container prune -f
# Remove all unused images
docker image prune -a -f
# Remove all unused volumes
docker volume prune -f
# Remove all unused networks
docker network prune -f
# Remove build cache
docker builder prune -f
# Remove all unused data including volumes
docker system prune -a --volumes -f

# Start Dify with the Compose plugin installed in the Dockerfile (docker compose)
cd ./docker && docker compose up

These two files are used to build a Docker image that runs Dify.ai with Docker Compose inside Docker (DinD) on a single Fly.io machine.
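
As mentioned in step 2 above, Dify reads its configuration from a .env file next to its Docker Compose setup. If you just need a starting point, the repository ships an example file (docker/.env.example at the time of writing) that you can copy and then adjust; a minimal sketch, and do review the values, especially secrets and exposed ports, before deploying:

# Inside the cloned dify repository
cp docker/.env.example docker/.env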

Create a volume on Fly.io

To store data on Fly.io, you need to create a volume. Run the following command:

fly volumes create dify_single_docker_data --size 10 --region atl

You will need around 5 GB of space just for Dify.ai's Docker image cache.
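
The scripts above expect that volume to be mounted at /mnt inside the machine. Assuming flyctl launch generated a fly.toml for an app named dify-single-docker, the relevant excerpt might look like this (app name, region, and mount path are assumptions from this setup, so adjust them to yours):

# fly.toml (excerpt)
app = "dify-single-docker"
primary_region = "atl"

[mounts]
  source = "dify_single_docker_data"
  destination = "/mnt"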

Deploying Dify.ai to Fly.io

Now that everything is set up, it's time to deploy Dify.ai to Fly.io. Here's how you can do it:

# Build the Docker image
docker build --rm -t dify-single-docker .

# Tag the image for the Fly.io registry
docker tag dify-single-docker:latest registry.fly.io/dify-single-docker:latest

# Push the image to the Fly.io registry
docker push registry.fly.io/dify-single-docker:latest

# Deploy to Fly.io
fly deploy
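
Note that docker push can only reach registry.fly.io after you authenticate your local Docker client, and fly deploy needs to know which image to run, either via an image entry in fly.toml or the --image flag. A hedged variant of the last two steps (using the app name assumed above):

# Authenticate the local Docker client against the Fly.io registry
fly auth docker

# Deploy the image that was just pushed
fly deploy --image registry.fly.io/dify-single-docker:latest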

Conclusion

Deploying Dify.ai to Fly.io may initially seem challenging due to the lack of Docker Compose and Kubernetes support, but with the right steps it becomes manageable. Fly.io offers a cost-effective and efficient platform for hosting your AI applications, keeping them constantly available for experimentation. I hope my experience and solutions help you successfully deploy your own projects. Please share your experience and suggestions. Thanks!