Techalicious Academy / 2026-01-08-ai-watchman

TROUBLESHOOTING

Common problems and how to fix them.

"Connection refused" when calling Ollama

The Ollama server isn't running.

Fix: Start it with:

ollama serve

Keep this terminal open while you work. Ollama needs to be running to accept requests.
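If your own scripts call Ollama, it helps to fail fast with a clear message instead of a raw connection error. A minimal sketch using only the Python standard library; the /api/version endpoint and port 11434 are Ollama's defaults:

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        # Covers connection refused, timeouts, and DNS failures.
        return False

if not ollama_is_up():
    print("Ollama is not running. Start it with: ollama serve")
```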

"Model not found" error

You're trying to use a model that isn't downloaded.

Fix: Pull the model first:

ollama pull ministral

Check what you have installed:

ollama list

Request times out

Vision models are slower than text models. They need time to process images.

Fixes:

- Increase your client's timeout; a large image on CPU can take several minutes.
- Resize the image before sending it; smaller images process much faster.
- The first request after pulling or switching models is slow while the model loads into memory; try again once it's warm.
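If you're calling the API from a script rather than curl, give the call a generous timeout. A sketch using Python's standard library; the endpoint and port are Ollama defaults, and the model name follows this tutorial:

```python
import json
import urllib.request

def generate(prompt: str, model: str = "ministral",
             base_url: str = "http://localhost:11434", timeout: float = 300.0) -> str:
    """Call /api/generate with a long timeout; vision prompts can take minutes on CPU."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]
```

The curl equivalent is the --max-time flag, e.g. curl --max-time 300.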

Response doesn't follow the format

The model is ignoring your structured prompt.

Fixes:

- Restate the required format at the very end of the prompt, after the task description.
- Lower the temperature so the model sticks closer to your instructions.
- Ask for less at once; a smaller structure is easier for the model to follow.
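You can also enforce structure at the API level: Ollama's /api/generate accepts a "format": "json" field and an options.temperature setting. A sketch of building such a payload; the model name and prompt just follow this tutorial's example:

```python
import json

def build_structured_request(model: str, prompt: str) -> str:
    """Payload for /api/generate that pushes the model toward strict JSON."""
    payload = {
        "model": model,
        # Restating the requirement at the very end of the prompt helps.
        "prompt": prompt + "\nRespond with JSON only, no extra text.",
        "format": "json",               # Ollama constrains output to valid JSON
        "options": {"temperature": 0},  # less sampling drift
        "stream": False,
    }
    return json.dumps(payload)

body = build_structured_request("ministral", "List the objects in the image.")
```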

Response is empty or garbled

The model might have hit a problem.

Fixes:

- Check the Ollama terminal; crashes and out-of-memory errors are reported there.
- Make sure the Base64 string is complete and contains no stray line breaks.
- Restart ollama serve and send the request again.

"Image too large" or memory errors

The image file is too big, or the inline Base64 string is bumping into the shell's command-line length limit.

Fixes:

- Resize the image so its longest side is 768 pixels or less.
- Put the request body in a file and send it with curl -d @payload.json instead of pasting the Base64 inline.

To resize on macOS:

sips --resampleHeightWidthMax 768 image.png --out resized.png

Check your system's maximum command-line length (ARG_MAX):

getconf ARG_MAX

Check your Base64 string size:

base64 -i image.png | perl -pe's~\s~~g' | wc -c

If the count is over 800,000 characters, resize the image first.
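You can run the same size check from Python before building the request; 800,000 is the same rough safety margin used above, not a hard API limit:

```python
import base64
from pathlib import Path

MAX_B64_CHARS = 800_000  # rough command-line safety margin, as above

def b64_length(path: str) -> int:
    """Length of a file's Base64 encoding, with no newlines."""
    return len(base64.b64encode(Path(path).read_bytes()))

# Usage: if b64_length("image.png") > MAX_B64_CHARS, resize the image first.
```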

JSON parsing errors

The response isn't valid JSON.

Fixes:

- Add "format": "json" to the request body; Ollama will constrain the output to valid JSON.
- Strip markdown code fences from the response before parsing; models often add them.
- Lower the temperature so the model improvises less.
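Models often wrap their JSON in markdown code fences even when asked not to. A small helper that strips the fences before parsing; this handles a common failure mode and is not an Ollama-specific API:

```python
import json
import re

def parse_model_json(text: str):
    """Parse JSON from a model response, tolerating ``` fences around it."""
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", text.strip())
    return json.loads(cleaned)
```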

curl command not working

Common issues:

- Unescaped or mismatched quotes in the JSON body.
- A trailing comma or missing brace in the payload.
- Wrong port: Ollama listens on 11434 by default.
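Quoting mistakes are much easier to spot if you validate the body before curl ever sees it. A sketch; payload.json stands for whatever file you pass with curl -d @payload.json:

```python
import json

def payload_error(path: str):
    """Return a human-readable error if the file isn't valid JSON, else None."""
    try:
        with open(path) as f:
            json.load(f)
        return None
    except json.JSONDecodeError as e:
        # Points at the offending spot - usually a stray quote or trailing comma.
        return f"{path}: line {e.lineno}, column {e.colno}: {e.msg}"
```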

Test with a simple request first:

curl http://localhost:11434/api/tags

Inconsistent results

Running the same image twice gives different answers.

Fixes:

- Set "temperature": 0 in the request options to reduce sampling randomness.
- Set a fixed "seed" in the options for repeatable runs.
- Make the prompt more specific; vague questions invite varied answers.
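Both knobs live in the request's "options" object. A minimal payload sketch; the model name follows this tutorial and the seed value is arbitrary:

```python
import json

# temperature 0 minimizes sampling randomness; a fixed seed makes the
# remaining sampling repeatable across runs.
payload = json.dumps({
    "model": "ministral",
    "prompt": "Describe the image.",
    "options": {"temperature": 0, "seed": 42},
    "stream": False,
})
```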

Still stuck?

Check the Ollama terminal for error messages. Most problems show up there before they show up in your code.

Ollama GitHub issues:

https://github.com/ollama/ollama/issues

Techalicious forum:

https://techalicious.forum