Rated 4 out of 5 stars

Great addon with a lot of good tools. Thank you for your work. It works fine with OpenAI, but I can't get it to work with a local Ollama LLM.

I tried http://127.0.0.1:11434 and also https://host.docker.internal:11434 after moving Ollama to Docker. I always get
"Ollama API request failed: TypeError: NetworkError when attempting to fetch resource." when I try to fetch the model list. Anyone with an idea?

This review is for a previous version of the add-on (3.1.2).

Have you set up CORS as described here?
https://micz.it/thunderbird-addon-thunderai/ollama-cors-information/
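The "NetworkError when attempting to fetch resource" message typically means the browser blocked the cross-origin request before it reached Ollama. Ollama restricts allowed origins via the `OLLAMA_ORIGINS` environment variable; a minimal sketch of the setup is below (the exact origin value to allow is described on the page linked above — `moz-extension://*` here is an assumption for a Thunderbird/Firefox extension origin):

```shell
# Native install: allow the extension's origin, then restart the server.
# (moz-extension://* is an assumed origin pattern; see the linked CORS guide.)
OLLAMA_ORIGINS="moz-extension://*" ollama serve

# Docker install: pass the same variable when starting the container.
docker run -d -p 11434:11434 \
  -e OLLAMA_ORIGINS="moz-extension://*" \
  --name ollama ollama/ollama
```

Note that with Docker you would normally keep the add-on pointed at http://127.0.0.1:11434 (via the published port) rather than host.docker.internal, which resolves from inside containers, not from the host.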

Please open an issue so we can interact more effectively: https://github.com/micz/ThunderAI/issues