[1.3.4]
Update ollama to 0.15.5
Improvements to ollama launch
Sub-agent support in ollama launch for planning, deep research, and similar tasks
ollama signin will now open a browser window to the connect page, making signing in easier
Ollama will now choose its default context length based on available VRAM (an override example follows these notes)
GLM-4.7-Flash support on Ollama's experimental MLX engine
Fixed an off-by-one error when using num_predict in the API
Fixed an issue where tokens from a previous sequence would be returned when hitting the num_predict limit
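
With the off-by-one fix, num_predict should now cap generation at exactly the requested number of tokens. A minimal sketch of setting it through the local REST API, assuming Ollama is serving on its default port and with "llama3.2" standing in for any locally pulled model:

```python
# Sketch: cap generation length with num_predict via the local REST API.
# Assumes Ollama is running on the default port; "llama3.2" is a placeholder model.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",           # placeholder for any locally pulled model
        "prompt": "List three colors.",
        "stream": False,               # return a single JSON object instead of a stream
        "options": {"num_predict": 8}, # generate at most 8 tokens
    },
    timeout=120,
)
body = resp.json()
print(body["response"])
# eval_count reports how many tokens were generated; with the off-by-one fix
# it should not exceed the requested num_predict.
print(body.get("eval_count"))
```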
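Similarly, if the new VRAM-based default context length is not what a workload needs, a specific context window can still be pinned per request with the num_ctx option. Another sketch under the same assumptions (default port, placeholder model name):

```python
# Sketch: pin a specific context window instead of the VRAM-based default.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.2",          # placeholder for any locally pulled model
        "messages": [{"role": "user", "content": "Give me a one-line greeting."}],
        "stream": False,
        "options": {"num_ctx": 8192}, # request an 8K context window explicitly
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```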