A service application for running LLMs locally.

Official site: Ollama

Test command

curl http://localhost:11434/api/generate -d '{
  "model": "<model name>",
  "prompt": "Hello, world!",
  "stream": false
}'
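The same request can be built programmatically. A minimal Python sketch, assuming the default Ollama port (11434) and using a hypothetical model name as a stand-in:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def build_generate_request(model, prompt, stream=False):
    """Build the JSON body for Ollama's /api/generate endpoint,
    mirroring the curl command above."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request("llama3", "Hello, world!")  # "llama3" is a placeholder
# Send it with any HTTP client, e.g.:
# import urllib.request
# req = urllib.request.Request(OLLAMA_URL, data=body.encode(),
#                              headers={"Content-Type": "application/json"})
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

With `"stream": false` Ollama returns a single JSON object whose `response` field holds the full completion, instead of a stream of partial chunks.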

Adding API-key authentication with Caddy

ref: How to secure the API with api key · Issue #849 · ollama/ollama

Edit the Caddyfile, replacing API_KEY with the key clients will present (the same key you configure as the OpenAI-compatible API key on the client side):

api.example.com {
	# If ports 80 and 443 are open on the Ollama server, enable tls and
	# Caddy will provision SSL automatically, so external clients connect
	# to your Ollama API securely without certificate warnings.
	# tls somemail@mail.com

	@requireAuth {
		not header Authorization "Bearer API_KEY"
	}

	respond @requireAuth "Unauthorized" 401

	reverse_proxy 127.0.0.1:11434 {
		header_up Host 127.0.0.1:11434
	}
}
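The @requireAuth matcher above is an exact string comparison on the Authorization header: anything other than `Bearer API_KEY` is answered with 401, and only matching requests reach the proxy. The logic can be sketched in Python (API_KEY is the same placeholder as in the Caddyfile):

```python
API_KEY = "API_KEY"  # placeholder; substitute your real key, as in the Caddyfile

def authorize(headers):
    """Mimic the Caddy @requireAuth matcher: reject the request unless
    the Authorization header is exactly 'Bearer <API_KEY>'."""
    if headers.get("Authorization") != f"Bearer {API_KEY}":
        return 401, "Unauthorized"
    return 200, "proxied to 127.0.0.1:11434"
```

Note this is a single static key shared by all clients, not per-user credentials; rotating it means editing the Caddyfile and reloading Caddy.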

Note: per Error 403 with zrok and other reverse proxies · Issue #3269 · ollama/ollama, the Host header must be set inside the reverse_proxy block; otherwise Ollama rejects the request with a 403. This appears to be a deliberate origin check in Ollama's security configuration.