Add system_prompt param to GeneralLLM.invoke function
#191
This PR adds a `system_prompt` parameter to the `GeneralLLM.invoke` function. The main motivation is to remove the need to rely on the deprecated `Perplexity` class for setting system-level instructions. With this change, system prompts can be passed directly to `GeneralLLM.invoke`, making the API more flexible and future-proof. This is particularly useful for existing implementations such as `Q3TemplateBot2024`, where system prompts are currently handled via the deprecated `Perplexity` abstraction.

## Changes

- Added an optional `system_prompt` parameter to `GeneralLLM.invoke` (see the sketch after this list)
- Allows explicit system-level instructions without using deprecated classes
- Improves compatibility with current and future LLM backends
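
A minimal sketch of what the new signature could look like, assuming the class builds an OpenAI-style message list internally; the `_call_model` helper and the message format are assumptions for illustration, not the actual internals:

```python
class GeneralLLM:
    async def invoke(
        self,
        prompt: str,
        system_prompt: str | None = None,  # new optional parameter
    ) -> str:
        messages = []
        if system_prompt is not None:
            # Prepend system-level instructions when provided
            messages.append({"role": "system", "content": system_prompt})
        messages.append({"role": "user", "content": prompt})
        return await self._call_model(messages)  # assumed internal helper
```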
## Backward compatibility

- Existing calls to `GeneralLLM.invoke` remain unchanged
- The new parameter is optional and does not break current usage
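
For illustration, both call styles under the new signature (the model name and prompts are placeholders):

```python
llm = GeneralLLM(model="perplexity/sonar-pro")  # placeholder model name

# Existing call sites keep working exactly as before
summary = await llm.invoke("Summarize recent news about AI regulation.")

# New call sites can inject system-level instructions directly
summary = await llm.invoke(
    "Summarize recent news about AI regulation.",
    system_prompt="You are a concise research assistant.",
)
```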
## Motivation

Deprecating `Perplexity` requires an alternative way to inject system prompts. This change provides a minimal and clean solution while keeping the existing API stable.

## Example
This implementation of `Q3TemplateBot2024.run_research` can be changed to pass the system prompt directly to `GeneralLLM.invoke` instead of going through the deprecated `Perplexity` class.
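
The original snippet is not reproduced here; the following before/after is an illustrative sketch, assuming `run_research` currently builds a `Perplexity` model with a system prompt (the `MetaculusQuestion` type usage, the model name, and the prompt text are placeholders):

```python
# Before (illustrative): system prompt injected via the deprecated Perplexity class
async def run_research(self, question: MetaculusQuestion) -> str:
    system_prompt = "You are an assistant to a superforecaster."  # placeholder
    model = Perplexity(temperature=0.1, system_prompt=system_prompt)  # deprecated
    return await model.invoke(question.question_text)

# After (illustrative): system prompt passed directly to GeneralLLM.invoke
async def run_research(self, question: MetaculusQuestion) -> str:
    system_prompt = "You are an assistant to a superforecaster."  # placeholder
    model = GeneralLLM(model="perplexity/sonar-pro", temperature=0.1)
    return await model.invoke(question.question_text, system_prompt=system_prompt)
```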