AI memory
planned
T
Thaendril
Having the AI remember previous conversations and data would tie in well with a continued AI existence for, say, your Star Citizen ships: maintaining knowledge of the ships you have, where they last were (so long as you told it), and so forth. I know you can do this with the backstory a little, but being able to dynamically update as you go without constantly changing the backstory data would be nice.
ShipBit
Merged in a post:
Captain's log (memory)
ShipBit
Use a Captain's log to summarize longer conversations and restore it on load.
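The Captain's log idea could work roughly like this: once the history grows past a threshold, condense the older turns into a single summary message and keep only the most recent turns verbatim. A minimal sketch; the file location, thresholds, and the `summarize` function are illustrative, not Wingman AI's actual API (in practice `summarize` would be an LLM request):

```python
import json
from pathlib import Path

LOG_FILE = Path("captains_log.json")  # illustrative location
MAX_MESSAGES = 20  # summarize once history grows past this
KEEP_RECENT = 5    # most recent turns kept verbatim

def summarize(messages):
    # Placeholder: in practice this would be an LLM call, e.g.
    # "Summarize the following conversation as a captain's log entry."
    return "Captain's log: " + " / ".join(m["content"] for m in messages[:3]) + " ..."

def compact_history(messages):
    """Replace an overlong history with one summary message plus recent turns."""
    if len(messages) <= MAX_MESSAGES:
        return messages
    summary = summarize(messages[:-KEEP_RECENT])
    return [{"role": "system", "content": summary}] + messages[-KEEP_RECENT:]

def save_log(messages):
    """Persist the (compacted) conversation so it can be restored on load."""
    LOG_FILE.write_text(json.dumps(compact_history(messages), indent=2))

def restore_log():
    """Restore the log on launch, or start fresh if none exists."""
    if LOG_FILE.exists():
        return json.loads(LOG_FILE.read_text())
    return []
```

Compacting keeps token usage roughly constant no matter how long the relationship with a Wingman gets, at the cost of losing detail from older turns.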
ShipBit
Merged in a post:
Ability to load chat/log history/previous conversations
B
Bukit Sorrento
ShipBit
Merged in a post:
Persist conversation history
ShipBit
Allow users to pick up previous conversations with Wingmen instead of always discarding after relaunching Wingman AI Core.
ShipBit
planned
Thaendril is planning to take a first shot at a simple "restore conversation history" Skill.
ShipBit
There are different angles to this:
- Save/restore conversation history - easy, but at some point you'll hit the LLM's context window restriction, and it will be more expensive in terms of tokens because you have more input.
- Make a Wingman an assistant. Assistants have "threads" and common knowledge shared across threads, but this won't work with all LLMs.
- RAG. This would mean vectorizing and embedding previous convos. It's something we want in Wingman anyway because it would also allow you to drop a bunch of files into the LLM to "train" it with your data. But it doesn't work exactly like direct conversation history: it would "know" stuff you talked about before but wouldn't necessarily process it like a direct message in the history.
We haven't decided yet which one we prefer but we have been thinking about this topic for a while.
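For reference, the RAG angle boils down to: turn past exchanges into vectors, then at query time retrieve the most similar ones and inject them as context rather than as literal history. A toy sketch, with a bag-of-words counter standing in for a real embedding model (a real setup would call an embedding API and a vector store):

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a word-count vector. Real RAG would use an embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, past_messages, k=2):
    """Return the k past messages most similar to the query."""
    q = embed(query)
    ranked = sorted(past_messages, key=lambda m: cosine(q, embed(m)), reverse=True)
    return ranked[:k]

# The retrieved snippets would then be prepended as context, not replayed as turns:
# prompt = "Relevant past notes:\n" + "\n".join(retrieve(user_input, history)) + ...
```

This illustrates the trade-off described above: the Wingman can recall *facts* from old conversations without paying tokens for the whole history, but the model never "re-reads" those turns as actual messages.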
T
Toxichail
ShipBit GPT-4o, I guess, has that capability using RAG, I believe.
T
Thaendril
I may tentatively start working on this myself as a personal assistant skill :)
A
Adi Solar
I love the idea of a rich context being saved over a long period, the question is - where? Interested to brainstorm with the devs about that.
T
Thaendril
I think in the appdata folder, where it mostly seems to keep information anyway. Since this would in theory be a text file, I wouldn't see it ever getting that large. Could also keep the current backstory box for fast edits or cleanup as you go.
If you wanted to be super fancy, backstory would just turn into its own settings tab and the user would simply tell wingmen where to store the file. In that case I think it's just a matter of making sure it's in a place wingmen can edit it.
A
Adi Solar
Thaendril Right, but then it would be more like training than "memory". It's actually a better and cheaper solution anyway, and more suited to the long term. Nice. EDIT: Ok lol, I see that ShipBit answered pretty much the same...
T
Toxichail
Adi Solar maybe set up a database? mongoDB maybe?
A
Adi Solar
Toxichail I mean, this is more of the "how", which should be chosen once ShipBit figures out what's best product-wise. We know that most probably the solution will be implemented on the end user's side, locally. Implementing Mongo locally can get overcomplicated. If done remotely (in the cloud, per customer), it could involve a shared instance of a graph DB like Neo4j, Cosmos, or Arango for generic, static data, plus a personal instance of a vector DB - all of which are complicated and expensive, and I don't see them going to all that trouble. In the end, local files will most probably be the most reasonable solution.
ShipBit
Toxichail Please don't. If it's really required, then it would have to be something that runs locally with almost zero-config (like sqlite etc.). But having that also makes it harder for people to edit it if they want to. File-based is kind of old-school but has the benefit of being very "plain" and readable. If we just save and restore a history per config/wingman, I think a file would suffice. But we can discuss this more if you want.
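The file-based approach described above could be as small as one plain, hand-editable JSON file per wingman. A sketch assuming a history directory under the user's appdata folder; the paths and names are made up for illustration and are not Wingman AI's actual layout:

```python
import json
import os
from pathlib import Path

# Illustrative location; Wingman AI's actual config directory may differ.
HISTORY_DIR = Path(os.getenv("APPDATA", Path.home())) / "WingmanAI" / "history"

def history_file(wingman_name):
    return HISTORY_DIR / f"{wingman_name}.json"

def save_history(wingman_name, messages):
    """Persist the conversation as plain, readable JSON (easy to edit by hand)."""
    HISTORY_DIR.mkdir(parents=True, exist_ok=True)
    history_file(wingman_name).write_text(json.dumps(messages, indent=2), encoding="utf-8")

def load_history(wingman_name):
    """Restore the previous conversation on launch, or start fresh if none exists."""
    f = history_file(wingman_name)
    if f.exists():
        return json.loads(f.read_text(encoding="utf-8"))
    return []
```

Zero configuration, no database process, and users can open the file in any text editor to trim or correct what their Wingman "remembers".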
T
Toxichail
ShipBit That actually makes complete sense and would be more effective in the long run, possibly preventing issues down the line.
S
Stefan
Fantastic idea!