Replies: 8 comments
-
hey <@936247882292551690> thanks for the detailed write-up. Looking at the OpenClaw plugin as it stands, it doesn't pass a systemPrompt to the search call. From your example it sounds like the issue is that a "nothing relevant" response is still returned as text and then injected into the agent context. If you're aiming for stricter, retrieval-only behaviour, adding a system prompt that explicitly returns an empty string when nothing is relevant would be a reasonable way to make recall more deterministic.
-
If you try that, it would be good to hear how it behaves on your side.
-
hi Kate, you're right, we haven't added delete to the cognee plugin yet, so there's currently no way to remove documents through it. We'd love your support on adding delete support.
-
Looks like this is the one used for GENERATE_GRAPH: https://github.com/topoteretes/cognee/blob/main/cognee/infrastructure/llm/prompts/generate_graph_prompt.txt

So if I provide a query as part of the search call, presumably it gets tacked onto the end of the system prompt when cognee is called (I had a quick search in the cognee code but it took too long to see how it was used and I had to stop), and so it does have value in improving how cognee responds to openclaw (see my MEMORY.md bug above). I was thinking to add a searchPrompt var into the config so that users (me) can choose to set it as above. Do you think this adds value?

Also, on openclaw itself: do you know how/when in the flow the plugin's data is passed in? I've been using https://github.com/jazzyalex/agent-sessions to browse the session logs and see the interactions, but it's not super clear and I'm not familiar enough with the inner prompt workings/flow in openclaw without spending time digging into it. Any ideas? In particular I want to avoid a repeat of the deletion issue and understand how/when the plugin is used in the flow.
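A rough sketch of what the proposed `searchPrompt` config var could look like when folded into the /api/v1/search request. The payload field names below are assumptions for illustration, not cognee's actual request schema:

```python
# Hypothetical sketch of a user-configurable search prompt. Today the
# integration sends no system prompt; this adds one only when configured.

DEFAULT_SEARCH_PROMPT = ""  # current behaviour: empty / absent

def build_search_payload(query: str, config: dict) -> dict:
    """Attach the user-configured search prompt, if any, to the request
    body sent to /api/v1/search. Field names are assumed, not verified."""
    payload = {"query": query}
    search_prompt = config.get("searchPrompt", DEFAULT_SEARCH_PROMPT)
    if search_prompt:
        payload["systemPrompt"] = search_prompt
    return payload
```

The design choice here is that an unset `searchPrompt` leaves the request identical to today's, so the option is purely additive for existing users.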
-
PR for delete support is here: topoteretes/cognee-integrations#6
-
thanks Kate!! 🙌 we'll review and merge it shortly! A new release is also coming soon with all the changes.
-
<@936247882292551690> hey Kate, the update & delete logic is improved together with some other fixes, and a new version is released. Let me know how it works for you now. For the hash logic, I haven't changed it, but if you see it working better, open a PR! Thanks a lot for your contribution already! 🙌
-
Thanks <@778635401958522961>! I'm just looking at the hash logic and replacing it, also adding another layer of sync that ensures the docs in cognee actually match what is expected, i.e. removal of additional docs that get orphaned when an update fails.
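The extra sync layer described above could be sketched roughly like this: diff the set of docs the integration expects against what cognee actually reports, and plan deletes for the orphans. All names here (`plan_sync`, `doc_hash`) are hypothetical and not from the plugin code:

```python
import hashlib

# Illustrative sketch of a reconciliation pass: compare expected local
# docs against what the remote store reports, and flag orphans left
# behind by a failed update.

def doc_hash(content: str) -> str:
    """Stable content hash for change detection."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def plan_sync(expected: dict[str, str], remote_ids: set[str]) -> dict:
    """expected maps doc id -> content; remote_ids is what the store
    currently holds. Returns ids to upload and orphaned ids to delete."""
    expected_ids = set(expected)
    return {
        "upload": sorted(expected_ids - remote_ids),
        "delete_orphans": sorted(remote_ids - expected_ids),
    }
```

Running this after every update (rather than only on success) is what keeps a mid-update failure from leaving stale docs behind.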
-
Just wondering how people are using the 'systemPrompt' with /api/v1/search. In the openclaw integration it's kept empty, and prior to the recent bugfix the lack of a prompt has, I think, caused the occasional context issue: one serious problem and some weirdness. My bot had created a file that it shouldn't have, and I told it to delete the file, the context being the file in the previous prompt; however, it came back and told me it had deleted MEMORY.md and that a clean start was a great idea!!!
Turns out that when that went to the cognee plugin, the graph came back and said something along the lines of "nothing relevant in MEMORY.md", and so with that in the context for my delete-the-file prompt, away went MEMORY.md. Thankfully I had a nightly backup to restore from, but it got me thinking that the systemPrompt should probably be used to stop this, along the lines of "If there is nothing relevant to say then return an empty string".
I've had Grok write me a simple frontend for cognee (https://github.com/KateWilkins/cquery) so I can select datasets, view data files, run memory searches, etc. Basic, but functional for testing.
"You are the deep memory for an AI agent called Lyra. The prompt to the agent is being passed to you to see if there is any relevant memory context that should be provided to the agent when trying to interpret the prompt. If there is nothing of relevance, return an empty string; otherwise you must ALWAYS prefix your response with Memory Context."
When you don't provide a system prompt in cognee, what is the default prompt used? Would it be of benefit to add this to the integration, or am I wasting my time?
Thx
This discussion was automatically pulled from Discord.