fix(vercel-ai): prevent tool call span map memory leak #19328
lithdew wants to merge 1 commit into getsentry:develop
Conversation
Any chance for a review? cc @nicohrubec @sergical @RulaKhaled
nicohrubec left a comment
Thanks for contributing! Generally looks good to me. The cleanup on successful tool results is definitely missing and there also seems to be no reason to store the full spans in the map. Do you think we really need the LRUCache? I get that the idea is to cap the amount of context we store, but I would rather not impose any arbitrary limits.
Would you like to work on this or should we take over?
The LRU cache isn't strictly necessary — it was just a safeguard to cap memory usage in case spans aren't cleaned up for some reason. Happy to drop it in favor of a plain map with proper cleanup on both success and error paths. I can make the change, or feel free to take it over — either works for me.
@lithdew If you could get rid of the LRUCache that would be great. I'll give it another look then |
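The change the reviewer is asking for can be sketched roughly as below: a plain `Map` keyed by tool call ID, with the entry removed on both the success and the error path so nothing accumulates. All names here are illustrative, not the SDK's actual API.

```typescript
// Hypothetical sketch of the plain-Map variant (no LRU cap).
// The key point: the entry is deleted on BOTH success and error paths.
interface SpanContext {
  traceId: string;
  spanId: string;
}

const toolCallSpanContexts = new Map<string, SpanContext>();

// Record the span context when a tool call starts.
function onToolCallStart(toolCallId: string, ctx: SpanContext): void {
  toolCallSpanContexts.set(toolCallId, ctx);
}

// Called for both successful tool results and tool errors, so entries
// never outlive the tool call, even in tool-heavy apps.
function onToolCallEnd(toolCallId: string): SpanContext | undefined {
  const ctx = toolCallSpanContexts.get(toolCallId);
  toolCallSpanContexts.delete(toolCallId);
  return ctx;
}
```

With cleanup guaranteed on every exit path, the arbitrary size cap becomes unnecessary, which is the reviewer's point.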
Tool calls were stored in a global map and only cleaned up on tool errors, causing unbounded retention in tool-heavy apps (and potential OOMs when inputs/outputs were recorded). Store only span context in a bounded LRU cache and clean up on successful tool results; add tests for caching/eviction.
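The bounded LRU cache described above can be sketched with a plain `Map`, relying on its insertion-order iteration for eviction. This is a hypothetical illustration of the approach, not the SDK's actual implementation; class and method names are made up.

```typescript
// Illustrative LRU map storing only lightweight span context
// (not full span objects), capped at a fixed size.
interface SpanContext {
  traceId: string;
  spanId: string;
}

class LruSpanContextMap {
  private readonly map = new Map<string, SpanContext>();

  constructor(private readonly maxSize: number) {}

  set(toolCallId: string, ctx: SpanContext): void {
    // Refresh recency by re-inserting; Map preserves insertion order.
    this.map.delete(toolCallId);
    this.map.set(toolCallId, ctx);
    if (this.map.size > this.maxSize) {
      // Evict the least recently used entry (first key in iteration order).
      const oldest = this.map.keys().next().value as string;
      this.map.delete(oldest);
    }
  }

  // Retrieve and remove in one step, so entries are cleaned up on
  // successful tool results as well as on errors.
  take(toolCallId: string): SpanContext | undefined {
    const ctx = this.map.get(toolCallId);
    this.map.delete(toolCallId);
    return ctx;
  }

  get size(): number {
    return this.map.size;
  }
}
```

The cap bounds memory even if some entry is never taken; cleanup on both paths is what prevents the leak in the common case.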
Made the changes — would appreciate the re-review 🙏. cc @nicohrubec |
Before submitting a pull request, please take a look at our Contributing guidelines and verify that your code lints and the test suite passes (`yarn lint` & `yarn test`).
Closes #issue_link_here