
Generating Post Summaries with Local LLMs Using Ollama


Last year, I started using OpenAI to generate post summaries. In that post, I mentioned:

The catch is that the script is apparently not free. Generating metadata for this post cost me $0.0017. It's not much, but it feels weird to pay for metadata generation. Maybe I should use a local llama model instead.

I haven't written much since then, but the idea recently came back to mind, and I decided to switch to a local Llama model for generating post summaries.

The obvious choice is Ollama, a utility that wraps various LLMs behind a local API. The change was simple: a few lines of code and a small tweak to the prompt did the job.
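For reference, here's a minimal sketch of what such a script might look like, calling Ollama's local REST API with nothing but the Python standard library. The model name, prompt wording, and `summarize` helper are my own illustration rather than the actual script, and it assumes Ollama is running on its default port with a llama3 model already pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def summarize(post_text: str, model: str = "llama3") -> str:
    """Ask a local model for a short summary of a blog post."""
    prompt = (
        "Summarize the following blog post in two or three sentences:\n\n"
        + post_text
    )
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object with the full reply
        return json.loads(resp.read())["response"].strip()

if __name__ == "__main__":
    with open("post.md") as f:
        print(summarize(f.read()))
```

Setting `stream` to `False` makes Ollama return the whole completion as a single JSON object instead of a token stream, which is what keeps the script to a handful of lines.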

You can check out the new version on GitHub.