Review of ThunderAI (ChatGPT, Gemini, Claude, Ollama) by Sebastian
Rated 4 out of 5 stars
Great addon with a lot of good tools. Thank you for your work. It works fine with OpenAI, but I can't get it to work with a local Ollama LLM.
I tried http://127.0.0.1:11434 and also https://host.docker.internal:11434 after moving Ollama to Docker. I always get
"Ollama API request failed: TypeError: NetworkError when attempting to fetch resource." when I try to fetch the model list. Anyone with an idea?
Have you set up CORS as described here?
https://micz.it/thunderbird-addon-thunderai/ollama-cors-information/
Please open an issue so we can interact more effectively: https://github.com/micz/ThunderAI/issues
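For reference, the usual fix for this NetworkError is allowing the extension's origin in Ollama's CORS settings via the `OLLAMA_ORIGINS` environment variable. Below is a minimal sketch; the exact origin pattern (`moz-extension://*` for Thunderbird WebExtensions) and the systemd unit name are assumptions based on common Ollama setups, so check the linked page for the authoritative steps.

```shell
# Allow Thunderbird extension pages to call the local Ollama API (CORS).
# Assumption: OLLAMA_ORIGINS is read by Ollama as a comma-separated list
# of allowed origins; "moz-extension://*" matches WebExtension origins.

# Native install (systemd): add the variable to the service, then restart.
#   sudo systemctl edit ollama.service
#   [Service]
#   Environment="OLLAMA_ORIGINS=moz-extension://*"
#   sudo systemctl restart ollama

# Docker install: pass the variable when starting the container.
docker run -d -p 11434:11434 \
  -e OLLAMA_ORIGINS="moz-extension://*" \
  --name ollama ollama/ollama
```

Note that when Ollama runs inside Docker, the add-on still connects to the published port (e.g. http://127.0.0.1:11434 from the host); `host.docker.internal` is only resolvable from inside containers, not from Thunderbird on the host.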