
Exposing the Ollama API externally on a Mac

by 앗사비 2024. 7. 23.

OLLAMA_HOST=0.0.0.0 ollama serve

 

First, start the server with the command above. Setting OLLAMA_HOST=0.0.0.0 makes it listen on all interfaces instead of only loopback, so other machines on the network can reach it.
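Once the server is listening on 0.0.0.0, a quick way to confirm it is reachable is to hit the tags endpoint from another machine. This is a sketch; the IP address below is a hypothetical example and should be replaced with your Mac's actual LAN address:

```shell
# Hypothetical LAN address; substitute your Mac's actual IP.
OLLAMA_URL="http://192.168.0.10:11434"

# /api/tags lists the locally installed models; getting any JSON
# response back means the server is reachable from outside the Mac.
curl -s "$OLLAMA_URL/api/tags"
```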

https://github.com/ollama/ollama/issues/703#issuecomment-1951444576

 


https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server

 


 

* To allow this permanently

Run vi ~/.zshrc and add the line below:

export OLLAMA_HOST=0.0.0.0:11434

 

Then run source ~/.zshrc to apply it.
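Note that the ~/.zshrc export only affects processes started from a shell. For the macOS Ollama app launched from the GUI, the FAQ linked above suggests setting the variable with launchctl and then restarting the app:

```shell
# Per the Ollama FAQ: set the variable for GUI-launched apps on macOS,
# then quit and relaunch the Ollama application to pick it up.
launchctl setenv OLLAMA_HOST "0.0.0.0"
```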

 

* To run it from the CLI

ollama serve

ollama run llama3.1:8b
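Besides the interactive prompt from ollama run, the running server can be queried over HTTP. A minimal sketch using Ollama's /api/generate endpoint with the model from the post (the prompt text is just an example):

```shell
# One-shot (non-streaming) generation request against the local server.
# Requires "ollama serve" to be running and llama3.1:8b to be pulled.
curl -s http://localhost:11434/api/generate -d '{
  "model": "llama3.1:8b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```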

 

* To use it with a GUI

https://docs.openwebui.com/getting-started/
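The Open WebUI getting-started guide linked above documents a Docker quick-start. At the time of writing it is roughly the following (check the guide for the current form; the port and volume name come from their docs):

```shell
# Run Open WebUI in Docker, reachable at http://localhost:3000.
# host.docker.internal lets the container reach Ollama on the host Mac.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```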
