Inference Library: An LLM inference library built on top of llama.cpp (https://github.com/ggerganov/llama.cpp) for integrating LLMs into programs.
Model Context Protocol Library: A C++ MCP client/server library that includes all fundamental features, with support for both STDIO and StreamableHTTP transport methods.
Standard Library: A standard library containing fundamental data structures and useful utilities, such as built-in UUID generation and timers.
JSON Library: A lightweight JSON library.