# Language Model Management Example
This guide provides a professional, step-by-step walkthrough for registering, setting, and using a language model with the LLMManager in the agenticaiframework package. It is intended for developers integrating custom or third-party LLMs into their applications.
> **18+ LLM Providers Supported**
> Part of 400+ modules supporting OpenAI, Anthropic, Gemini, Azure, AWS Bedrock, Ollama, and more. See the LLM Documentation.
## Prerequisites & Configuration

- **Installation**: Ensure `agenticaiframework` is installed and accessible in your Python environment.
- **Configuration**: No additional configuration is required for this example.
- **Python Version**: Compatible with Python 3.10+.
## Code
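The example can be sketched as follows, assuming the interface described in the steps below (`register_model`, `set_active_model`, `generate`, and a `models` registry). The class defined in the `except` branch is an illustrative stand-in only, included so the sketch runs even without the package installed.

```python
# Minimal sketch of the example script. The real class lives in
# agenticaiframework.llms; the stand-in below mirrors the documented
# interface so this snippet is runnable on its own.
try:
    from agenticaiframework.llms import LLMManager
except ImportError:
    class LLMManager:
        """Illustrative stand-in mirroring the documented interface."""

        def __init__(self):
            self.models = {}          # name -> callable registry
            self.active_model = None  # name of the currently selected model

        def register_model(self, name, model_callable):
            self.models[name] = model_callable

        def set_active_model(self, name):
            self.active_model = name

        def generate(self, prompt):
            # Delegate to the callable registered under the active name.
            return self.models[self.active_model](prompt)

# Instantiate the manager.
manager = LLMManager()

# Register a model: any callable that maps a prompt string to generated text.
manager.register_model("echo-model", lambda prompt: f"Echo: {prompt}")

# Select the active model and generate text with it.
manager.set_active_model("echo-model")
print(manager.generate("Hello, world!"))

# List all registered models.
print(list(manager.models.keys()))
```

In production, the lambda would typically wrap a real provider SDK call rather than echoing the prompt.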
## Step-by-Step Execution

1. **Import the class**: Import `LLMManager` from `agenticaiframework.llms`.
2. **Instantiate the manager**: Create an instance of `LLMManager` to handle model registration and usage.
3. **Register a model**: Use `register_model` with:
    - `name`: A unique identifier for the model.
    - `callable`: A function or lambda that takes a prompt and returns generated text.
4. **Set the active model**: Use `set_active_model` to select which model will be used for generation.
5. **Generate text**: Call `generate` with a prompt to produce output from the active model.
6. **List available models**: Access the `models` dictionary to see all registered models.

> **Best Practice**: Wrap third-party LLM APIs in a callable function to standardize the interface for `LLMManager`.
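The wrapping best practice can be sketched as a thin adapter. `FakeClient` and its `complete` method below are hypothetical stand-ins for a real vendor SDK; the point is that whatever request/response shape the vendor uses gets reduced to a plain prompt-in/text-out callable:

```python
def make_vendor_callable(client, model_name):
    """Adapt a vendor SDK call to LLMManager's prompt -> text interface."""
    def call(prompt):
        # Translate the vendor's request/response shape into plain text.
        response = client.complete(model=model_name, prompt=prompt)
        return response["text"]
    return call

class FakeClient:
    """Hypothetical vendor client, used here purely for illustration."""
    def complete(self, model, prompt):
        return {"text": f"[{model}] {prompt}"}

vendor_fn = make_vendor_callable(FakeClient(), "demo-1")
print(vendor_fn("hi"))  # -> [demo-1] hi
```

The resulting `vendor_fn` can be passed directly to `register_model`, keeping vendor-specific details out of the rest of the application.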
## Expected Input
No user input is required; the script uses hardcoded values for demonstration purposes. In production, prompts could be dynamically generated from user input, workflows, or other runtime data.
## Expected Output
The script prints the text generated by the active model, followed by the names of the registered models.
## How to Run
Run the example from the project root:
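For example (the script path below is a hypothetical location; adjust it to match where the example lives in your checkout):

```shell
# Hypothetical script location; adjust the path to your project layout.
python examples/llm_manager_example.py
```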
If installed as a package, you can also run it from anywhere:
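For instance, if the package ships the example as an importable module (the module path here is an assumption, not a documented entry point):

```shell
# Hypothetical module path; replace with the actual location of the example.
python -m agenticaiframework.examples.llm_manager_example
```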
> **Tip**: Use `LLMManager` to manage multiple models and switch between them dynamically based on task requirements.
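Dynamic switching can be sketched as a small routing function. The model names and the routing rule are illustrative, and a stand-in manager is defined so the snippet runs without the package installed:

```python
# Sketch: per-task model routing on top of LLMManager.
try:
    from agenticaiframework.llms import LLMManager
except ImportError:
    class LLMManager:
        """Illustrative stand-in mirroring the documented interface."""
        def __init__(self):
            self.models = {}
            self.active_model = None
        def register_model(self, name, model_callable):
            self.models[name] = model_callable
        def set_active_model(self, name):
            self.active_model = name
        def generate(self, prompt):
            return self.models[self.active_model](prompt)

manager = LLMManager()
manager.register_model("fast", lambda p: f"fast: {p}")
manager.register_model("accurate", lambda p: f"accurate: {p}")

def generate_for_task(prompt, needs_precision):
    # Route precision-critical tasks to the stronger model, the rest
    # to the cheaper one, then generate with whichever is active.
    manager.set_active_model("accurate" if needs_precision else "fast")
    return manager.generate(prompt)

print(generate_for_task("summarize this note", needs_precision=False))
print(generate_for_task("review this contract", needs_precision=True))
```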