
Building blocks for local agents in C++.

Note: This library is designed for running small language models locally using llama.cpp. If you want to call external LLM APIs, this is not the right fit.

Examples

To run the examples, you first need to download a GGUF model. The default model configuration is set for granite-4.0-micro:

wget https://huggingface.co/ibm-granite/granite-4.0-micro-GGUF/resolve/main/granite-4.0-micro-Q8_0.gguf

Important: The examples use default ModelConfig values optimized for granite-4.0-micro. If you use a different model, adapt these values (context size, temperature, sampling parameters, etc.) to that model and your use case.


Tags: ai   native   agent   library  

Last modified 15 January 2026