| author | Ben Sima (aider) <ben@bsima.me> | 2025-06-04 11:50:05 -0400 |
|---|---|---|
| committer | Ben Sima <ben@bsima.me> | 2025-06-13 11:19:37 -0400 |
| commit | 495db3caa6101514c576d9bac18206cd88011871 (patch) | |
| tree | 38d255be41fddc0ecd2f5cd995958c02fc23f47b /Omni/Cloud/OpenWebui.nix | |
| parent | 9ad72ecd8657449873162114c3d04008fc7adbd3 (diff) | |
Add Open Web UI AI Chat Container and Nginx Proxy
Introduce a new container definition for the Open Web UI AI Chat service
in `OpenWebui.nix`, specifying its Docker image, volume, and environment
variables. This change includes the addition of a new port in `Ports.nix`
to facilitate communication with the service.
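A minimal sketch of the kind of entry the commit adds to `Ports.nix`. The actual port number chosen in the commit is not shown on this page, so the value below is an assumption:

```nix
# Omni/Cloud/Ports.nix (sketch) -- a flat attribute set mapping
# service names to port numbers, imported by other modules.
{
  # Hypothetical port; the real value picked in the commit is not shown here.
  open-webui-aichat = 8222;
}
```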
Furthermore, configure Nginx to serve the AI Chat application by adding
a new virtual host entry in `Web.nix`, ensuring SSL is enforced and
websocket support is enabled. This setup allows seamless integration
of the AI Chat service into the existing infrastructure, improving
accessibility and security.
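A sketch of the `Web.nix` virtual host the commit message describes, using standard NixOS `services.nginx.virtualHosts` options. The hostname `aichat.example.com` is an assumption (the commit uses the author's own domain), as is the use of ACME for certificates:

```nix
# Omni/Cloud/Web.nix (sketch, not the commit's exact contents)
{config, ...}: let
  ports = import ./Ports.nix;
in {
  config.services.nginx.virtualHosts."aichat.example.com" = {
    forceSSL = true;   # enforce SSL, as the commit message states
    enableACME = true; # assumption: TLS certificates via ACME/Let's Encrypt
    locations."/" = {
      # Proxy to the container, which binds on the host network
      # at the port defined in Ports.nix.
      proxyPass = "http://localhost:${toString ports.open-webui-aichat}";
      proxyWebsockets = true; # websocket support for the chat UI
    };
  };
}
```

`proxyWebsockets = true` sets the `Upgrade`/`Connection` headers nginx needs to pass websocket traffic through to the backend.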
Diffstat (limited to 'Omni/Cloud/OpenWebui.nix')
-rw-r--r-- | Omni/Cloud/OpenWebui.nix | 14 |
1 file changed, 14 insertions, 0 deletions
```
diff --git a/Omni/Cloud/OpenWebui.nix b/Omni/Cloud/OpenWebui.nix
new file mode 100644
index 0000000..fe71608
--- /dev/null
+++ b/Omni/Cloud/OpenWebui.nix
@@ -0,0 +1,14 @@
+{config, ...}: let
+  ports = import ./Ports.nix;
+in {
+  config.virtualisation.oci-containers.backend = "docker";
+
+  config.virtualisation.oci-containers.containers.open-webui-aichat = {
+    image = "ghcr.io/open-webui/open-webui:main";
+    volumes = ["/var/lib/open-webui-aichat:/app/backend/data"];
+    environment = {
+      PORT = toString ports.open-webui-aichat;
+    };
+    extraOptions = ["--network=host"];
+  };
+}
```