Ollama Proxy

(github.com)

2 points | by kesor 9 hours ago

1 comment

  • kesor 9 hours ago

    A Dockerized Nginx server configured to act as a reverse proxy for Ollama, a local AI model serving platform. The proxy includes built-in authentication using a custom Authorization header and exposes the Ollama service over the internet using a Cloudflare Tunnel.
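
    For illustration, here is a rough Python sketch of what the proxy does conceptually: reject requests whose Authorization header does not carry the expected token, and forward everything else to the local Ollama server. The actual project does this with an Nginx config running in Docker; the token value, port, and lack of streaming support below are simplifications and assumptions, not taken from the repo.

      import http.server
      import urllib.request

      # Token the proxy expects and the local Ollama endpoint; both are
      # assumed values here -- the real project configures this in Nginx.
      EXPECTED_TOKEN = "Bearer my-secret-token"
      OLLAMA_UPSTREAM = "http://127.0.0.1:11434"  # Ollama's default local port

      class AuthProxy(http.server.BaseHTTPRequestHandler):
          def do_POST(self):
              # Reject anything without the expected Authorization header.
              if self.headers.get("Authorization") != EXPECTED_TOKEN:
                  self.send_error(401, "Unauthorized")
                  return
              # Forward the request body to Ollama and relay its response.
              body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
              req = urllib.request.Request(
                  OLLAMA_UPSTREAM + self.path,
                  data=body,
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  payload = resp.read()
                  self.send_response(resp.status)
                  self.send_header("Content-Type",
                                   resp.headers.get("Content-Type", "application/json"))
                  self.send_header("Content-Length", str(len(payload)))
                  self.end_headers()
                  self.wfile.write(payload)

      if __name__ == "__main__":
          http.server.HTTPServer(("0.0.0.0", 8080), AuthProxy).serve_forever()

    In the real setup, Nginx also handles streamed responses, and public exposure plus TLS come from the Cloudflare Tunnel rather than binding a port directly.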

    Ollama exposes an OpenAI-compatible API, so with this proxy in front of it you can use the Cursor.ai IDE with the models on your own computer by pointing Cursor's custom OpenAI URL + token settings at the proxy.
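
    For example, any OpenAI-compatible client can be pointed at the proxy the same way Cursor's custom OpenAI settings would be. A minimal sketch with the official openai Python client; the hostname, token, and model name are placeholders, not values from the repo:

      from openai import OpenAI

      client = OpenAI(
          base_url="https://ollama.example.com/v1",  # your Cloudflare Tunnel hostname (placeholder)
          api_key="my-secret-token",                 # token the proxy checks (placeholder)
      )

      resp = client.chat.completions.create(
          model="llama3",  # any model pulled into your local Ollama (placeholder)
          messages=[{"role": "user", "content": "Hello from behind the proxy"}],
      )
      print(resp.choices[0].message.content)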