feat(prompt): guide LLM to read session history #3263
Closed
aiguozhi123456 wants to merge 2 commits into HKUDS:main from
Conversation
Contributor
Author
I recall that the runtime context already carries the information needed to identify the session; of course, we could tweak it further to call out that this identifying information is there.
…file path
- Add chat_id parameter to build_system_prompt and _get_identity
- Render channel_chat_id.jsonl in identity.md so the LLM can locate the exact session file without guessing
- Use a grep-focused description to keep the prompt concise
Collaborator
Hi, thanks for the PR. Could you flesh out the description a bit?
- Test that the exact session path is rendered with channel + chat_id
- Test graceful fallback when channel/chat_id are None
- Test that build_messages passes chat_id to the system prompt
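The first two checks above might be sketched as pytest-style tests. Note that build_system_prompt here is a hypothetical stand-in that only mirrors the behavior described in this PR, not the project's real implementation:

```python
# Hypothetical stand-in for the function under test; only the behavior
# described in the PR (path rendering and None fallback) is mirrored here.
def build_system_prompt(workspace_path, channel=None, chat_id=None):
    return (f"Session file: {workspace_path}/sessions/"
            f"{channel or ''}_{chat_id or ''}.jsonl")

def test_exact_session_path():
    # channel + chat_id present: the exact per-session file is rendered
    prompt = build_system_prompt("/ws", channel="telegram", chat_id="42")
    assert "/ws/sessions/telegram_42.jsonl" in prompt

def test_fallback_when_ids_missing():
    # channel/chat_id of None degrade the path gracefully instead of raising
    prompt = build_system_prompt("/ws")
    assert "/ws/sessions/_.jsonl" in prompt

test_exact_session_path()
test_fallback_when_ids_missing()
```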
Contributor
Author
1. I had some trouble understanding what the session contents actually contain.
Summary
Enable the LLM to locate and read the full session history file on demand, mitigating context loss caused by prompt compression in multi-turn conversations.
Motivation
In long-running sessions, early messages are often compressed or dropped to fit within the context window. Once discarded, those details are irrecoverable from the LLM's perspective, degrading multi-turn consistency.
By surfacing the exact session file path in the system prompt, we give the LLM a way to pull the original, uncompressed history when it needs to recall something that no longer fits in the active context.
Changes
- Add a chat_id parameter to build_system_prompt() and _get_identity(), and thread it into the identity template.
- Render the session file path in identity.md (Session file: {{ workspace_path }}/sessions/{{ channel }}_{{ chat_id }}.jsonl) with a short grep usage hint.
- Tests: exact session path rendered with channel + chat_id, graceful fallback when channel/chat_id are None (the path degrades to sessions/_.jsonl), and full build_messages integration.
Prompt Caching Impact
None.
channel + chat_id are constant within a session, so the system prompt stays stable across turns.
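The change can be sketched roughly as follows. This is a minimal illustration, not the project's code: build_system_prompt and _get_identity are named in the PR, but the bodies are assumptions, and a plain f-string stands in for the real identity.md template:

```python
# Minimal sketch of the described change. Function names come from the PR;
# the bodies and the f-string (in place of the identity.md template) are
# illustrative stand-ins only.

def _get_identity(workspace_path, channel=None, chat_id=None):
    # Missing channel/chat_id fall back to empty strings, so the path
    # degrades gracefully to sessions/_.jsonl as the PR's tests describe.
    session_file = f"{workspace_path}/sessions/{channel or ''}_{chat_id or ''}.jsonl"
    return (
        f"Session file: {session_file}\n"
        "Tip: grep this file to recall earlier turns that were "
        "compressed out of the active context."
    )

def build_system_prompt(workspace_path, channel=None, chat_id=None):
    # chat_id is threaded through to the identity section unchanged.
    return _get_identity(workspace_path, channel, chat_id)

print(build_system_prompt("/workspace", "telegram", "12345"))
```

Since the rendered path depends only on values that are fixed for the lifetime of a session, the system prompt is byte-identical across turns, which is why the prompt-caching impact is none.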