2 points | by RaiyanYahya 3 hours ago
2 comments
Looks good. Are you sharing context every time, or do you force it?
Thank you again. The context sharing is done using flags and is forced; otherwise the token cost would be too high. I'm adding local LLM support this week, so this problem will go away, but the context will be huge.