[TheiaAi] Support referencing prompt fragments via variable #14985

Open · wants to merge 5 commits into master from issues/14899-reference-prompt-fragments
Conversation

lucas-koehler (Contributor)

What it does

Prompt fragments allow defining parts of prompts in a reusable way. You can then embed these fragments via a variable in the chat or in our prompt templates. This also makes it possible to have a set of prompts available in the chat without always defining a custom agent.

This PR defines a Theia AI variable `prompt` with an argument containing the prompt fragment's id. The variable is fully resolved, including variables and function calls in the fragment.

This also adds autocompletion for available custom and built-in prompt fragments to the chat UI.

The variable is defined as a context variable, meaning it is added to the chat context similar to referenced files. This was discussed here: #14899 (comment)

Note that agents need to be adapted to consume this from the context, as explained here: #14787 (comment).
This is not part of this PR.
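
For illustration, the resolution can be thought of as sketched below. This is a minimal, hypothetical sketch: the names (PromptVariableRequest, PromptLookup, resolvePromptVariable) do not mirror the PR's actual Theia AI classes.

```typescript
// Minimal sketch of resolving the `prompt` variable; all identifiers
// are illustrative, not the PR's actual types.
interface PromptVariableRequest {
    variableName: string; // 'prompt' for this variable
    arg?: string;         // the prompt fragment's id, e.g. 'myprompt'
}

interface PromptLookup {
    /** Returns the fragment with variables and function calls resolved. */
    getPrompt(id: string): Promise<string | undefined>;
}

async function resolvePromptVariable(
    request: PromptVariableRequest,
    prompts: PromptLookup
): Promise<string | undefined> {
    if (request.variableName !== 'prompt' || !request.arg) {
        return undefined; // not our variable, or no fragment id given
    }
    return prompts.getPrompt(request.arg);
}
```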

prompt-template-autocomplete.webm

Part of #14899

How to test

  • Configure the setting Ai-features › Prompt Templates: Prompt Templates Folder to point to a folder of your choosing.
  • Create a .prompttemplate file in the specified folder.
  • Define a prompt in the file. Ideally, it uses a variable so you can verify that variables are resolved, e.g.
    Convert the timestamp to a human readable date: {{today:inUnixSeconds}}.
  • Open the AI Chat and verify that the autocompletion works (see the video in What it does).
  • Send the request and make sure that your prompt fragment is used. You can use the AI Agent History to see what was sent to the LLM (see the example after this list).
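
For illustration, assuming the fragment id is derived from the file name (e.g. myprompt.prompttemplate → myprompt) and using the template above, the flow looks roughly like this (the resolved timestamp is a made-up value):

```
Chat input:        #prompt:myprompt
Added to context:  Convert the timestamp to a human readable date: 1740060000.
```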

Follow-ups

Step 2 of #14899: implementing a settings UI and editing support for prompt fragments.

Breaking changes

  • This PR introduces breaking changes and requires careful review. If yes, the breaking changes section in the changelog has been updated.

Attribution

Review checklist

Reminder for reviewers

@lucas-koehler (Contributor, Author)

Per discussion with @JonasHelming: The current implementation does not yet account for the fact that the functionDescriptions map, resolved from referenced functions, is required. Currently, it is lost when resolving a prompt template variable, because variable values can only contain text.

Thus, function references in prompt fragments should not be resolved immediately; they should be resolved later, when the fragment's parent (i.e. a user chat input or a system prompt) is resolved. This needs to consider that functions are referenced differently in prompts (~{functionId}) and in chat input (~functionId).
Suggested approach: leave the prompt syntax (~{functionId}) as is, and enable resolving this format in chat inputs as well, in addition to the current chat format.
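
For illustration, both reference formats could be matched by a single rule like the sketch below (illustrative only, not the PR's parser code):

```typescript
// Prompt template format: ~{functionId}   Chat input format: ~functionId
const functionReference = /~(?:\{([\w.-]+)\}|([\w.-]+))/g;

function extractFunctionIds(text: string): string[] {
    const ids: string[] = [];
    for (const match of text.matchAll(functionReference)) {
        // Group 1 matches the prompt syntax, group 2 the chat syntax.
        const id = match[1] ?? match[2];
        if (id) {
            ids.push(id);
        }
    }
    return ids;
}

// extractFunctionIds('Call ~{getDiagnostics} or ~runTests')
//   -> ['getDiagnostics', 'runTests']
```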

Adds a new AI variable that resolves registered prompt templates based on their ID.
The variable resolves to the fully resolved prompt template, i.e. variables and functions in the template are already resolved in the returned value.

In the chat, a prompt template with ID `myprompt` is referenced as `#prompt:myprompt`.

Adds prompt template ID autocompletion to the chat window, triggered on entering the variable separator `:`
- different icon for custom prompts
- sort custom prompts first
- add detail text specifying whether a prompt is built-in or custom
- internationalize texts
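
For illustration, a completion provider triggered on `:` could look like the Monaco-based sketch below. The chat language id and the fragment id lookup are assumptions, not Theia's actual code.

```typescript
import * as monaco from 'monaco-editor';

// Hypothetical stand-in for querying the PromptService for known fragment ids.
const knownFragmentIds = ['myprompt', 'my-other-prompt'];

monaco.languages.registerCompletionItemProvider('theia-chat' /* placeholder id */, {
    // Trigger completion when the variable separator ':' is typed.
    triggerCharacters: [':'],
    provideCompletionItems(model, position) {
        const prefix = model
            .getLineContent(position.lineNumber)
            .substring(0, position.column - 1);
        // Only offer fragment ids directly after the variable prefix.
        if (!prefix.endsWith('#prompt:')) {
            return { suggestions: [] };
        }
        const range = new monaco.Range(
            position.lineNumber, position.column,
            position.lineNumber, position.column
        );
        return {
            suggestions: knownFragmentIds.map(id => ({
                label: id,
                kind: monaco.languages.CompletionItemKind.Variable,
                detail: 'custom prompt fragment', // PR distinguishes built-in vs. custom
                insertText: id,
                range
            }))
        };
    }
});
```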
Marks the prompt variable as a context variable. This adds the referenced prompt template to the context.
Functions must not be resolved immediately in prompt fragments, because the function objects are lost that way.
Instead, they are left in place and resolved when the final chat message or prompt containing the prompt fragment is resolved.

- Extend PromptService with a getPromptFragment method that resolves variables but not functions
- ChatRequestParser can now handle both the chat and prompt function formats
This allows resolving function references that are part of a resolved variable, e.g. a prompt fragment.
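
A hedged sketch of the resulting API distinction (method names follow the commit message; the exact Theia signatures may differ):

```typescript
interface PromptServiceSketch {
    /** Fully resolves a prompt: variables and function references. */
    getPrompt(id: string): Promise<string | undefined>;
    /**
     * Resolves variables only. Function references (~{functionId}) are left
     * in place so they can be resolved later, when the chat message or
     * system prompt embedding the fragment is itself resolved.
     */
    getPromptFragment(id: string): Promise<string | undefined>;
}
```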
@lucas-koehler force-pushed the issues/14899-reference-prompt-fragments branch from d4c4b0f to 009aeff on February 20, 2025, 14:42
@lucas-koehler (Contributor, Author)

The PR is currently on hold due to issues with parsing functions of resolved variables in chat messages. See #14899 (comment) for more details.

Status: Waiting on reviewers