docs: fix Mermaid edge label parsing in architecture diagram
@@ -0,0 +1,46 @@
# Docker + Traefik Homelab Stack

This repository defines a multi-compose Docker environment with Traefik as ingress, app workloads, and a monitoring/alerting plane.

## High-Level Architecture
```mermaid
flowchart TB
    Internet((Internet Clients)) -->|"HTTPS 443 / HTTP 80"| Traefik["Traefik Ingress<br/>ACME TLS + Security Middlewares"]

    subgraph DockerHost[Primary Docker Host]
        Traefik
        Authelia["Authelia SSO / ForwardAuth"]
        CrowdSec["CrowdSec + Traefik Bouncer"]
        ErrPages[Error Pages Fallback]

        subgraph Apps["Business / User Applications"]
            Nextcloud[Nextcloud]
            Passbolt[Passbolt]
            Gitea[Gitea]
            FamilyTree[Gramps Web]
            Searxng[SearXNG]
        end

        subgraph Ops["Operations & Monitoring"]
            Grafana[Grafana]
            Prometheus[Prometheus]
            InfluxDB[InfluxDB]
            NodeRED[Node-RED]
            Portainer[Portainer]
            UptimeKuma[Uptime Kuma]
            Gotify[Gotify Notifications]
        end
    end

    Traefik --> Apps
    Traefik --> Ops
    Traefik -->|"ForwardAuth for selected routes"| Authelia
    Traefik -->|"Threat decisions"| CrowdSec
    Traefik -->|"4xx/5xx fallback"| ErrPages

    Prometheus --> Grafana
    Prometheus --> Gotify
```

For a request-flow/network view and architecture notes, see [docs/architecture.md](docs/architecture.md).
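
The label-driven routing behind this diagram can be sketched for a single service. This is a minimal hypothetical Compose fragment, not copied from the repository: the service name, router name, and hostname are illustrative assumptions; only the `traefik` network and the `myresolver` certificate resolver appear in the actual stack.

```yaml
# Hypothetical Compose fragment: exposing one app through Traefik.
# Service/router names and the Host rule are assumptions.
services:
  nextcloud:
    image: nextcloud:stable
    networks:
      - traefik   # shared reverse-proxy bridge network
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.nextcloud.rule=Host(`cloud.example.com`)"
      - "traefik.http.routers.nextcloud.entrypoints=websecure"
      - "traefik.http.routers.nextcloud.tls.certresolver=myresolver"

networks:
  traefik:
    external: true
```

Traefik discovers these labels via its Docker provider, so adding a new app is a matter of attaching it to the `traefik` network and declaring a router rule.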
@@ -0,0 +1,109 @@

# Architecture Summary

## Overview

This stack uses **Traefik v3** as the internet-facing ingress for application and operations UIs. Service routing is primarily label-driven from Docker Compose files, with a shared `traefik` bridge network for reverse-proxied traffic and a `monitor` network for internal telemetry components.

TLS is terminated at Traefik using the ACME HTTP challenge (`myresolver`), with additional hardening via:

- a default middleware chain (security headers, CrowdSec bouncer, error pages),
- Authelia forward-auth middleware on selected routes,
- an mTLS TLS option (`mtls-private-admin`) on private-admin endpoints.
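
These hardening layers could be expressed as a Traefik dynamic-configuration fragment roughly like the sketch below. Only `myresolver` and `mtls-private-admin` come from the repository; the chain and middleware names, the Authelia address, the verify endpoint path, and the CA file path are all illustrative assumptions.

```yaml
# Hypothetical Traefik dynamic config: default chain, forward-auth, mTLS option.
# All names except myresolver / mtls-private-admin are assumptions.
http:
  middlewares:
    default-chain:
      chain:
        middlewares:
          - security-headers    # assumed headers middleware
          - crowdsec-bouncer    # CrowdSec bouncer plugin
          - error-pages         # 4xx/5xx fallback
    authelia:
      forwardAuth:
        address: "http://authelia:9091/api/verify?rd=https://auth.example.com"
        trustForwardHeader: true

tls:
  options:
    mtls-private-admin:
      clientAuth:
        caFiles:
          - /certs/internal-ca.pem   # assumed internal CA path
        clientAuthType: RequireAndVerifyClientCert
```

Routers then opt in per route, e.g. with a `traefik.http.routers.<name>.middlewares=authelia@file` label, so public and private-admin endpoints can carry different protection.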

## Network / Request Flow
```mermaid
flowchart LR
    C[Internet Client] -->|"80/443"| T[Traefik Ingress]
    T -->|"HTTP -> HTTPS redirect"| T
    T -->|"ACME HTTP challenge"| LE["Let's Encrypt ACME"]

    subgraph TraefikNet["Docker network: traefik (172.21.0.0/16)"]
        A[Authelia]
        CS[CrowdSec LAPI]
        EP[Error Pages]

        NC[Nextcloud]
        PB[Passbolt]
        GT[Gitea]
        GW[Gramps Web]
        SX[SearXNG]

        GF[Grafana]
        PR[Prometheus]
        NR[Node-RED]
        PT[Portainer]
        UK[Uptime Kuma]
        IF[InfluxDB]
        GO[Gotify]
    end

    T -->|"forwardAuth for selected services"| A
    T -->|"plugin decisions"| CS
    T -->|"4xx/5xx middleware"| EP

    T --> NC
    T --> PB
    T --> GT
    T --> GW
    T --> SX

    T --> GF
    T --> PR
    T --> NR
    T --> PT
    T --> UK
    T --> IF
    T --> GO

    subgraph MonitorNet["Docker network: monitor"]
        NE[Node Exporter]
        TE[Telegraf]
        DE[Docker Update Exporter]
        PE[Pi-hole Exporter]
        DSP[Docker Socket Proxy]
    end

    PR --> NE
    PR --> TE
    PR --> DE
    PR --> PE
    PR --> UK
    PR -->|"remote scrape"| RH[Remote Hosts]
    TE --> DSP
    NR --> DSP
    PT --> DSP
    T --> DSP
```

## Key Components

- **Ingress & security plane:** Traefik, Authelia, CrowdSec, Error Pages.
- **User-facing applications:** Nextcloud, Passbolt, Gitea, Gramps Web (Family Tree), SearXNG.
- **Monitoring/ops:** Prometheus, Grafana, InfluxDB, Node-RED, Uptime Kuma, Portainer, Gotify.
- **Support plane:** Docker Socket Proxy (shared Docker API gateway for Traefik/automation/ops tools).
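
The Docker Socket Proxy pattern in the support plane can be sketched as below. This is an assumption about implementation: the image and environment variables follow the common `tecnativa/docker-socket-proxy` convention, and the repository may use a different proxy or permission set.

```yaml
# Hypothetical sketch of the shared Docker API gateway.
# Image and variable names are assumptions (tecnativa convention).
services:
  docker-socket-proxy:
    image: tecnativa/docker-socket-proxy:latest
    environment:
      CONTAINERS: 1   # allow read-only container listing (e.g. Traefik discovery)
      POST: 0         # deny mutating Docker API calls
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    networks:
      - monitor

networks:
  monitor:
    external: true
```

Consumers (Traefik, Telegraf, Node-RED, Portainer in the diagram above) then talk to `tcp://docker-socket-proxy:2375` instead of mounting the raw Docker socket, which limits the blast radius of a compromised container.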

## Remote Hosts Observed

Prometheus scrape targets indicate additional infrastructure outside the local Compose deployment, including hostnames for:

- `raspberrypi.tail13f623.ts.net`
- `pve.sweet.home`
- `pbs.sweet.home`
- `pihole`
- `server`
- `nix-cache`
- `kuma.lan.ddnsgeek.com`
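
In `prometheus.yml` terms, such targets would appear roughly as in the sketch below. The hostnames are the ones observed in the repository; the job names and port 9100 (the conventional node-exporter port) are assumptions.

```yaml
# Hypothetical prometheus.yml fragment; job names and ports are assumptions,
# hostnames are those observed in the repository's scrape targets.
scrape_configs:
  - job_name: "node-local"
    static_configs:
      - targets: ["node-exporter:9100"]
  - job_name: "node-remote"
    static_configs:
      - targets:
          - "raspberrypi.tail13f623.ts.net:9100"
          - "pve.sweet.home:9100"
          - "pbs.sweet.home:9100"
```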

## Assumptions / Unknowns

The repository provides enough detail to infer **container-level architecture**, but not full **Proxmox host/VM topology**.

Unknowns (left intentionally as placeholders):

- **Proxmox physical hosts:** _unknown from repo contents._
- **VM/LXC inventory and placement:** _unknown from repo contents._
- **Which services run on which Proxmox node(s):** _unknown from repo contents._
- **Inter-host VLAN/subnet layout beyond Docker bridges:** _unknown from repo contents._

If you want, this section can be replaced with a concrete Proxmox topology once you add an inventory source (e.g., Terraform, Ansible inventory, or a diagram export).