Simple Python bindings for @ggerganov’s llama.cpp library. This package provides:
Low-level access to the C API via a ctypes interface
High-level Python API for text completion
OpenAI-like API
LangChain compatibility
LlamaIndex compatibility
OpenAI compatible web server
Local Copilot replacement
Function Calling support
Vision API support
Multiple Models
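The high-level text-completion API mentioned above can be sketched roughly as follows. This is an illustrative sketch, not taken from the project's docs: the model path is a placeholder, and it assumes you have installed `llama-cpp-python` and downloaded a GGUF model locally.

```python
def complete(prompt: str) -> str:
    """Sketch of a high-level completion call with llama-cpp-python.

    Assumes `pip install llama-cpp-python` and a local GGUF model file;
    the model path below is a placeholder, not a real file.
    """
    from llama_cpp import Llama  # deferred import: only needed when actually called

    llm = Llama(model_path="./models/model.gguf")  # placeholder path
    output = llm(
        "Q: " + prompt + " A: ",
        max_tokens=64,
        stop=["Q:", "\n"],  # stop generating at the next question or newline
    )
    # Completion results follow an OpenAI-style response shape
    return output["choices"][0]["text"]
```

The call returns a dict shaped like an OpenAI completion response, which is what makes the library easy to slot in behind OpenAI-style client code.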
Documentation is available at https://llama-cpp-python.readthedocs.io/en/latest.
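The OpenAI-compatible web server speaks the familiar chat-completions wire format. As a sketch (assuming the server was started with `python -m llama_cpp.server --model <path>` and is listening on its default port, 8000), a request body would look like this; the model name here is a placeholder:

```python
import json

# Build an OpenAI-style chat-completions payload for the local server.
payload = {
    "model": "llama-2-7b-chat",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Name the planets in the solar system."},
    ],
    "max_tokens": 64,
}
body = json.dumps(payload).encode("utf-8")
# POST `body` to http://localhost:8000/v1/chat/completions with any HTTP client.
```

Because the request and response formats match OpenAI's, existing OpenAI client libraries can be pointed at the local server by changing only the base URL.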
I always appreciate it when people build friendly Python wrappers around C code, but if you’re going to share an example in the documentation, the answer the LLM gives should at least be correct: