fix(core): Fix truncation to only keep last message in vercel #19080

nicohrubec wants to merge 15 commits into develop
Conversation
```ts
const p = JSON.parse(prompt);
if (!!p && typeof p === 'object') {
  // Handle messages array format: { messages: [...] }
  const { messages } = p as { messages?: unknown };
```
m-h: Function expects prompt to be of type string, but we cast it here to unknown, which could cause some unexpected behavior.
q: is it ever a stringified messages array?
I removed the cast and reorganized this method a bit, because I noticed that in the previous version I would not add the system instruction when we get a messages array.
I don't think it should be a stringified messages array at that point, but I added an additional explicit check to be safe.
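For illustration, here is a minimal sketch of how a helper along the lines of `convertPromptToMessages` could handle both a plain prompt string and a stringified `{ messages: [...] }` object while still prepending the system instruction. The signature, the `system` parameter, and the message shape are assumptions for this sketch, not the SDK's actual code:

```ts
// Illustrative sketch only, not the SDK's actual implementation: it assumes a
// `convertPromptToMessages(prompt, system)` signature and a simple message shape.
interface Message {
  role: 'system' | 'user' | 'assistant';
  content: unknown;
}

function convertPromptToMessages(prompt: string, system?: string): Message[] {
  const messages: Message[] = [];

  // Add the system instruction up front regardless of the prompt format,
  // so it is not dropped when the input is already a messages array.
  if (system) {
    messages.push({ role: 'system', content: system });
  }

  let parsed: unknown;
  try {
    parsed = JSON.parse(prompt);
  } catch {
    parsed = undefined;
  }

  // Handle the messages array format: '{ "messages": [...] }'.
  // The Array.isArray check is the "additional explicit check" from the thread.
  if (parsed && typeof parsed === 'object') {
    const { messages: parsedMessages } = parsed as { messages?: unknown };
    if (Array.isArray(parsedMessages)) {
      messages.push(...(parsedMessages as Message[]));
      return messages;
    }
  }

  // Otherwise treat the prompt as a plain user message.
  messages.push({ role: 'user', content: prompt });
  return messages;
}
```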
```ts
span.setAttributes({
  [GEN_AI_INPUT_MESSAGES_ATTRIBUTE]: getTruncatedJsonString(filteredMessages),
  [AI_PROMPT_ATTRIBUTE]: truncatedMessages,
```
I thought we were going to get rid of `AI_PROMPT_ATTRIBUTE` as it's deprecated, right?
We are getting rid of `GEN_AI_PROMPT_ATTRIBUTE` in our namespace, but `AI_PROMPT_ATTRIBUTE` is the original Vercel attribute. If we don't overwrite it here, we send the full, non-truncated message.
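For illustration, roughly what the overwrite discussed above amounts to. `getTruncatedJsonString` and the two attribute constants appear in the diff; their literal string values and the stand-in truncation helper below are assumptions of this sketch:

```ts
import type { Span } from '@opentelemetry/api';

// Stand-in for the SDK helper referenced in the diff; the real one lives in @sentry/core.
function getTruncatedJsonString(value: unknown, maxChars = 20_000): string {
  const json = JSON.stringify(value) ?? '';
  return json.length > maxChars ? json.slice(0, maxChars) : json;
}

// Attribute keys as used in the diff; the literal values here are assumptions.
const GEN_AI_INPUT_MESSAGES_ATTRIBUTE = 'gen_ai.input.messages';
const AI_PROMPT_ATTRIBUTE = 'ai.prompt';

function setTruncatedPromptAttributes(span: Span, filteredMessages: unknown[]): void {
  const truncatedMessages = getTruncatedJsonString(filteredMessages);

  span.setAttributes({
    // Our own (non-deprecated) attribute gets the truncated messages.
    [GEN_AI_INPUT_MESSAGES_ATTRIBUTE]: truncatedMessages,
    // Overwrite the original Vercel attribute too; otherwise the span keeps
    // carrying the full, non-truncated prompt that Vercel recorded.
    [AI_PROMPT_ATTRIBUTE]: truncatedMessages,
  });
}
```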
Cursor Bugbot has reviewed your changes and found 1 potential issue.
@RulaKhaled Updated based on what we discussed offline:
- Removed `GEN_AI_PROMPT_ATTRIBUTE`
- `convertPromptToMessages` did not handle inputs that were already in messages format (fallback to `[]`), which was seemingly the culprit for truncation failing
- Set `AI_PROMPT_ATTRIBUTE` to the truncated messages format irrespective of the input as well, so that we get truncation for the original Vercel attribute too (before, we sent all messages in the original namespace)

Tests:

- `convertPromptToMessages` properly handles inputs that are already in messages format

Closes #19060
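For context on the behavior the PR title describes, here is a purely illustrative sketch of "keep only the most recent message(s) that fit": older messages are dropped until the serialized payload fits a character budget, and in the worst case only the last message is kept. The helper name and the limit are made up and do not reflect the SDK's actual `getTruncatedJsonString`:

```ts
// Illustrative only -- not the SDK's truncation code. Assumed character budget.
const MAX_SERIALIZED_CHARS = 20_000;

function truncateToFit<T>(messages: T[], maxChars = MAX_SERIALIZED_CHARS): T[] {
  const kept: T[] = [];

  // Walk backwards from the most recent message and keep as many as fit.
  for (let i = messages.length - 1; i >= 0; i--) {
    const candidate = [messages[i], ...kept];
    if (kept.length > 0 && JSON.stringify(candidate).length > maxChars) {
      break;
    }
    kept.unshift(messages[i]);
  }

  // In the worst case this still returns the last message, even if it alone
  // exceeds the budget -- i.e. truncation keeps only the last message.
  return kept;
}
```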