last_updated: 2025-10-29

# llm.datasette.io generates compose.yml

Sometimes the documentation of an open source tool only shows how to run it with Docker.

I want it to fit my ingress setup though, which is based on Docker Compose.

This is what I do:

```shell
clipboard | llm -t docker-compose
```

with the following line in the clipboard (from the Prefect docs):

```shell
docker run -p 4200:4200 -d --rm prefecthq/prefect:3-latest -- prefect server start --host 0.0.0.0
```
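The `clipboard` command in the pipeline above is not a standard utility; a minimal sketch of what it might look like, assuming one of the usual paste tools is installed (the author's actual command or alias may differ):

```shell
# Hypothetical "clipboard" helper: print the system clipboard to stdout.
clipboard() {
  if command -v pbpaste >/dev/null 2>&1; then
    pbpaste                          # macOS
  elif command -v wl-paste >/dev/null 2>&1; then
    wl-paste                         # Wayland
  elif command -v xclip >/dev/null 2>&1; then
    xclip -selection clipboard -o    # X11
  else
    echo "clipboard: no paste utility found" >&2
    return 1
  fi
}
```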

Which returns:

```yaml
networks:
  ingress:
    external: true

services:
  prefect:
    image: prefecthq/prefect:3-latest
    command: prefect server start --host 0.0.0.0
    networks: [ default, ingress ]
    labels:
      ingress.host: prefect.0-main.de
```
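The generated file declares the `ingress` network as external, so it has to exist before the stack comes up. Saving the output as `compose.yml`, a typical sequence (the network name comes from the template, the rest is standard Docker Compose usage):

```shell
# One-time: create the shared ingress network the compose file expects.
docker network create ingress

# Then bring the service up.
docker compose -f compose.yml up -d
```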

Here’s the llm template:

```shell
cat <<'EOF' > "$(llm templates path)/docker-compose.yaml"
model: openrouter/qwen/qwen3-coder
system: >
    Convert the given input to a docker-compose.yaml file.
    Remove any exposed port statements.
    Add the following at the top of the file:

        networks:
          ingress:
            external: true

    For each service, add

        networks: [ default, ingress ]
        labels:
          ingress.host: <name>.0-main.de

    Replace <name> with the service name.
    Only output the file's contents - nothing else.
    Do not wrap the output in code fences.
EOF
```
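The template names an OpenRouter-hosted model, so `llm` needs the `llm-openrouter` plugin and an API key before the pipeline works. Setup as documented by the plugin (the sample prompt at the end is illustrative):

```shell
llm install llm-openrouter
llm keys set openrouter    # paste an OpenRouter API key when prompted
llm -t docker-compose "docker run -p 8080:80 -d --rm nginx"
```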