
Tools/custom events dispatch #293

Open
jherr wants to merge 8 commits into main from tools/custom-events-dispatch

Conversation

@jherr
Contributor

@jherr jherr commented Feb 17, 2026

🎯 Changes

Adds a mechanism for tools to dispatch custom events to the client in real time.

✅ Checklist

  • I have followed the steps in the Contributing guide.
  • I have tested this code locally with pnpm run test:pr.

🚀 Release Impact

  • This change affects published code, and I have generated a changeset.
  • This change is docs/CI/dev-only (no release).

Summary by CodeRabbit

  • New Features
    • Tools can now emit custom progress events during execution, such as status updates or intermediate results.
    • Chat clients receive custom events through an onCustomEvent callback, enabling real-time progress tracking and feedback from long-running tool operations.
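For orientation, a minimal sketch of the client-side handler, assuming the callback signature discussed in the review below, (eventName, data, { toolCallId }); the handler body and surrounding wiring are illustrative, and the rest of the chat hook configuration is omitted.

// Sketch only: the (eventName, data, { toolCallId }) signature follows the
// review discussion below; everything else here is illustrative.
const onCustomEvent = (
  eventName: string,
  data: unknown,
  context: { toolCallId?: string },
) => {
  // e.g. drive a per-tool-call progress indicator in the UI
  console.log(
    `custom event "${eventName}" from tool call ${context.toolCallId ?? 'unknown'}`,
    data,
  )
}

// Passed alongside the existing chat hook options, e.g.:
// const chat = useChat({ ...otherOptions, onCustomEvent })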

@coderabbitai
Contributor

coderabbitai bot commented Feb 17, 2026

No actionable comments were generated in the recent review. 🎉


📝 Walkthrough

This PR introduces custom event dispatch functionality for tools, enabling them to emit progress updates and intermediate results during execution. The system pipes custom events through an async generator-based execution pathway, streams them via the processor, and surfaces them to clients through a new onCustomEvent callback across all framework integrations (React, Vue, Svelte, Solid).

Changes

  • Core ChatClient Infrastructure (packages/typescript/ai-client/src/types.ts, packages/typescript/ai-client/src/chat-client.ts, packages/typescript/ai-client/tests/test-utils.ts, packages/typescript/ai-client/tests/chat-client.test.ts): Added a new onCustomEvent callback to the ChatClientOptions interface; wired the callback through the internal callbacksRef; forwarded custom events from StreamProcessor to the client; added the test helper createCustomEventChunks and a test suite validating event handling, toolCallId extraction, and event sequencing.
  • Framework Integration (packages/typescript/ai-react/src/use-chat.ts, packages/typescript/ai-solid/src/use-chat.ts, packages/typescript/ai-svelte/src/create-chat.svelte.ts, packages/typescript/ai-vue/src/use-chat.ts): Wired the onCustomEvent option from UseChatOptions/CreateChatOptions into the ChatClient constructor across all framework adapters, enabling custom event callback support in the React, Solid, Svelte, and Vue hooks.
  • Core AI Tool Execution Pipeline (packages/typescript/ai/src/activities/chat/tools/tool-calls.ts, packages/typescript/ai/src/activities/chat/tools/tool-definition.ts, packages/typescript/ai/src/types.ts): Converted executeToolCalls from a Promise to an AsyncGenerator yielding CustomEvent chunks; added ToolExecutionContext with an emitCustomEvent() method; introduced the executeWithEventPolling helper for real-time event emission during tool execution; updated tool-definition signatures to accept an optional context parameter. A hedged sketch of the tool-side usage follows this list.
  • Stream Processing & Event Handling (packages/typescript/ai/src/activities/chat/index.ts, packages/typescript/ai/src/activities/chat/stream/processor.ts): Added TextEngine helpers (drainToolCallGenerator, createCustomEventChunk); imported the CustomEvent type; implemented handleCustomEvent in StreamProcessor with early returns for system events (tool-input-available, approval-requested) and forwarding of user custom events to the onCustomEvent callback with toolCallId context extraction.
  • Core Tests (packages/typescript/ai/tests/chat.test.ts, packages/typescript/ai/tests/custom-events-integration.test.ts, packages/typescript/ai/tests/stream-processor.test.ts, packages/typescript/ai/tests/tool-call-manager.test.ts): Updated tool execution assertions to expect a context parameter with toolCallId; added an integration test suite for custom event emission, propagation, and toolCallId extraction; added stream processor tests for custom event dispatch and system event filtering; introduced the drainExecuteToolCalls helper for async generator consumption.
  • Examples (examples/ts-react-chat/src/lib/guitar-tools.ts, examples/ts-react-chat/src/routes/api.tanchat.ts, examples/ts-react-chat/src/routes/index.tsx): Updated getGuitars and addToCartToolServer to accept a context parameter and emit progress events via context.emitCustomEvent(); added an onCustomEvent handler to the AI chat hook configuration to log custom events with toolCallId context.
  • Versioning (.changeset/custom-events-dispatch.md): Documented the custom event dispatch feature; bumped @tanstack/ai to a minor version and @tanstack/ai-react/solid/svelte/vue to patch versions.
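As noted in the tool-execution row above, here is a minimal tool-side sketch. The ToolExecutionContext shape and the emitCustomEvent(eventName, data) signature are assumptions inferred from this walkthrough (the exact argument shape is not shown in this thread), and fetchCatalog is a hypothetical stand-in for the example app's data source.

// Sketch, not repository code: the context shape and emitCustomEvent signature are assumed.
interface ToolExecutionContext {
  toolCallId: string
  emitCustomEvent: (eventName: string, data: Record<string, unknown>) => void
}

// A tool's execute function receives an optional context and can emit
// progress events while it is still running.
async function getGuitarsExecute(
  _args: Record<string, never>,
  context?: ToolExecutionContext,
): Promise<Array<{ id: string; name: string }>> {
  context?.emitCustomEvent('progress', { step: 'loading-catalog', percent: 0 })
  const guitars = await fetchCatalog() // hypothetical data source
  context?.emitCustomEvent('progress', {
    step: 'loaded',
    percent: 100,
    count: guitars.length,
  })
  return guitars
}

// Hypothetical stand-in so the sketch is self-contained.
async function fetchCatalog(): Promise<Array<{ id: string; name: string }>> {
  return [{ id: '1', name: 'Stratocaster' }]
}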

Sequence Diagram

sequenceDiagram
    participant App as Client Application
    participant TC as Tool Execution<br/>(executeToolCalls)
    participant Proc as StreamProcessor
    participant CC as ChatClient
    participant CB as App Callback<br/>(onCustomEvent)
    
    App->>TC: Invoke tool with context
    activate TC
    TC->>TC: Poll for pending events
    Note over TC: emitCustomEvent() queues<br/>CustomEvent chunks
    TC->>Proc: Yield CustomEvent chunk
    deactivate TC
    
    activate Proc
    Proc->>Proc: handleCustomEvent()
    alt System event?
        Proc->>Proc: Early return (tool-input-available,<br/>approval-requested)
    else User custom event
        Proc->>CC: Invoke onCustomEvent
    end
    deactivate Proc
    
    activate CC
    CC->>CB: Forward event with data<br/>and toolCallId context
    deactivate CC
    
    activate CB
    CB->>App: Display progress/update UI
    deactivate CB

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

Suggested reviewers

  • AlemTuzlak

Poem

🐰 Custom events now flow so free,
Tools can chat progressively!
From promise lands to generators' call,
We stream the progress, data and all!

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): Docstring coverage is 54.55%, which is below the required 80.00% threshold. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (2 passed)

  • Title check (✅ Passed): The title 'Tools/custom events dispatch' clearly and concisely summarizes the main change: adding custom event dispatch functionality for tools, which is the primary objective of the PR.
  • Description check (✅ Passed): The PR description follows the template structure with all required sections completed: the Changes section explains the custom events feature, the Checklist is fully marked, and Release Impact indicates a changeset was generated.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



@jherr jherr marked this pull request as ready for review February 17, 2026 03:16
@nx-cloud

nx-cloud bot commented Feb 17, 2026

View your CI Pipeline Execution ↗ for commit 356097d

  • nx affected --targets=test:sherif,test:knip,tes... ✅ Succeeded in 3m 19s (View ↗)
  • nx run-many --targets=build --exclude=examples/** ✅ Succeeded in 1m 18s (View ↗)

☁️ Nx Cloud last updated this comment at 2026-02-17 18:40:34 UTC

@pkg-pr-new

pkg-pr-new bot commented Feb 17, 2026

Open in StackBlitz

  • @tanstack/ai: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai@293
  • @tanstack/ai-anthropic: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-anthropic@293
  • @tanstack/ai-client: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-client@293
  • @tanstack/ai-devtools-core: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-devtools-core@293
  • @tanstack/ai-fal: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-fal@293
  • @tanstack/ai-gemini: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-gemini@293
  • @tanstack/ai-grok: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-grok@293
  • @tanstack/ai-ollama: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-ollama@293
  • @tanstack/ai-openai: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-openai@293
  • @tanstack/ai-openrouter: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-openrouter@293
  • @tanstack/ai-preact: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-preact@293
  • @tanstack/ai-react: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-react@293
  • @tanstack/ai-react-ui: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-react-ui@293
  • @tanstack/ai-solid: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-solid@293
  • @tanstack/ai-solid-ui: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-solid-ui@293
  • @tanstack/ai-svelte: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-svelte@293
  • @tanstack/ai-vue: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-vue@293
  • @tanstack/ai-vue-ui: npm i https://pkg.pr.new/TanStack/ai/@tanstack/ai-vue-ui@293
  • @tanstack/preact-ai-devtools: npm i https://pkg.pr.new/TanStack/ai/@tanstack/preact-ai-devtools@293
  • @tanstack/react-ai-devtools: npm i https://pkg.pr.new/TanStack/ai/@tanstack/react-ai-devtools@293
  • @tanstack/solid-ai-devtools: npm i https://pkg.pr.new/TanStack/ai/@tanstack/solid-ai-devtools@293

commit: 356097d

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
.claude/settings.local.json (1)

1-19: ⚠️ Potential issue | 🟡 Minor

Add .claude/settings.local.json to .gitignore and remove from this commit.

This file contains developer-specific machine paths (line 17: /Users/jherr/projects/tanstack/ai) and custom permission sets that should not be shared across the team. Following the existing pattern in .gitignore for .env.local files, .claude/settings.local.json should be gitignored locally rather than committed to the repository.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.claude/settings.local.json around lines 1 - 19, The committed file
.claude/settings.local.json contains machine-specific paths and permission sets
(see permissions.allow and the path "/Users/jherr/projects/tanstack/ai"); add
".claude/settings.local.json" to .gitignore, remove the file from the repo index
without deleting the local copy (so it remains on your machine), and update the
commit (amend or create a new commit) to remove the file from the repository so
those sensitive entries are not tracked or shared.
🧹 Nitpick comments (6)
packages/typescript/ai/src/activities/chat/tools/tool-calls.ts (1)

259-290: Polling-based event flushing works but has a subtle 10ms busy-wait.

The executeWithEventPolling approach is functional and correctly handles both success and error cases (rejections propagate through Promise.race in subsequent iterations). The 10ms polling interval is a reasonable trade-off for simplicity, though it means ~100 wake-ups/second per executing tool even when no events are emitted.

An alternative would be a push-based notification (e.g., resolve a "signal" promise from within emitCustomEvent so the poller wakes immediately), which would eliminate unnecessary wake-ups for long-running tools. Not critical for correctness, but worth considering if tools frequently take many seconds.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/src/activities/chat/tools/tool-calls.ts` around lines
259 - 290, The current executeWithEventPolling loop uses a 10ms timeout causing
busy-wait; change it to a push-based wakeup by introducing a reusable "signal"
promise and resolver that emitCustomEvent calls when it pushes into
pendingEvents, pass that signal into executeWithEventPolling (or make it
accessible alongside pendingEvents), and replace the setTimeout race participant
with awaiting the signal (and executionWithFlag) so the loop wakes immediately
on new events; ensure the signal is re-created/reset after being consumed so
subsequent events continue to trigger the resolver and preserve existing
behavior for executionPromise resolution and final flush.
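To make the suggested alternative concrete, a self-contained sketch of the push-based wakeup under the assumptions above; the names createEventChannel, executeWithEventSignal, and ToolEvent are illustrative, and the PR's actual helper remains executeWithEventPolling.

// Sketch of the push-based alternative to the 10ms polling loop; all names are
// illustrative, not the PR's code.
type ToolEvent = { name: string; data: unknown }

function createEventChannel() {
  const pendingEvents: Array<ToolEvent> = []
  let wake: (() => void) | null = null

  return {
    pendingEvents,
    emitCustomEvent(name: string, data: unknown) {
      pendingEvents.push({ name, data })
      // Wake the consumer immediately instead of waiting for a poll tick.
      wake?.()
      wake = null
    },
    nextSignal() {
      // A fresh promise per wait; resolved by the next emit.
      return new Promise<void>((resolve) => {
        wake = resolve
      })
    },
  }
}

async function* executeWithEventSignal<T>(
  execution: Promise<T>,
  channel: ReturnType<typeof createEventChannel>,
): AsyncGenerator<ToolEvent, T> {
  let done = false
  const executionWithFlag = execution.finally(() => {
    done = true
  })

  while (!done) {
    // Wait until either a new event arrives or the tool finishes; a rejected
    // execution promise propagates out of the race, as in the polling version.
    await Promise.race([channel.nextSignal(), executionWithFlag])
    while (channel.pendingEvents.length > 0) {
      yield channel.pendingEvents.shift()!
    }
  }
  // Final flush after completion, then return the tool result.
  while (channel.pendingEvents.length > 0) {
    yield channel.pendingEvents.shift()!
  }
  return await executionWithFlag
}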
packages/typescript/ai/tests/custom-events-integration.test.ts (2)

87-104: Test manually simulates the event emission pipeline — consider a note for clarity.

The mock context on Lines 90–101 manually injects toolCallId into data (via spread { ...data, toolCallId: 'tc-1' }), which means this test validates the processor's forwarding behavior but not the actual emitCustomEvent implementation that would run in production. The (testTool as any).execute cast is also fragile.

This is acceptable for integration-level coverage, but worth noting that the real emitCustomEvent → CUSTOM chunk pipeline is implicitly trusted here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/tests/custom-events-integration.test.ts` around lines
87 - 104, Add a short clarifying comment above the mockContext block explaining
that the test simulates the real emitCustomEvent → CUSTOM chunk pipeline by
manually injecting toolCallId and thus validates processor forwarding but not
the emitCustomEvent implementation; also tighten the fragile cast by replacing
"(testTool as any).execute" with a safer access (e.g., check for
testTool.execute via optional chaining or a typed helper) and reference the
symbols toolExecuteFunc, testTool, mockContext, emitCustomEvent, and
processor.processChunk so reviewers can easily locate and understand the intent
of the simulation.

1-4: Import ordering: zod should be imported before local imports per ESLint.

Fix
 import { describe, expect, it, vi } from 'vitest'
+import { z } from 'zod'
 import { toolDefinition } from '../src/activities/chat/tools/tool-definition'
 import { StreamProcessor } from '../src/activities/chat/stream/processor'
-import { z } from 'zod'
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/tests/custom-events-integration.test.ts` around lines
1 - 4, Imports are out of order: import z from 'zod' should appear before local
module imports to satisfy ESLint; reorder the import statements so "import { z }
from 'zod'" comes before the local imports (toolDefinition and StreamProcessor)
in the file containing those symbols to fix the lint error.
packages/typescript/ai/src/activities/chat/stream/processor.ts (1)

864-868: (chunk.data as any)?.toolCallId — consider a narrower type assertion.

The as any works but loses type safety. A minor improvement would be a type guard or inline check:

toolCallId: (chunk.data != null && typeof chunk.data === 'object' && 'toolCallId' in chunk.data)
  ? (chunk.data as { toolCallId?: string }).toolCallId
  : undefined,

That said, since chunk.data is unknown, the as any approach is pragmatic and the optional chaining provides runtime safety.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/src/activities/chat/stream/processor.ts` around lines
864 - 868, The current use of (chunk.data as any)?.toolCallId in the
onCustomEvent call loses type safety; replace it with a narrow runtime type
check (or small type-guard helper like isToolCallData) that ensures chunk.data
is a non-null object and has a 'toolCallId' property, then cast to {
toolCallId?: string } to read and pass toolCallId into
this.events.onCustomEvent?.(chunk.name, chunk.data, { toolCallId: ... }); this
updates the extraction in the processor (referenced by chunk.data and the
onCustomEvent call) without using as any.
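A possible shape for the type-guard helper mentioned above; isToolCallData and the commented usage are illustrative rather than code from this PR.

// Illustrative type guard; the helper name and usage are assumptions.
function isToolCallData(value: unknown): value is { toolCallId: string } {
  return (
    typeof value === 'object' &&
    value !== null &&
    'toolCallId' in value &&
    typeof (value as { toolCallId: unknown }).toolCallId === 'string'
  )
}

// Sketch of how the processor's handler could use it instead of `as any`:
// this.events.onCustomEvent?.(chunk.name, chunk.data, {
//   toolCallId: isToolCallData(chunk.data) ? chunk.data.toolCallId : undefined,
// })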
packages/typescript/ai-client/tests/chat-client.test.ts (1)

3-9: Import ordering: createCustomEventChunks should be sorted alphabetically per ESLint.

Per the static analysis hint, the import should be sorted.

Fix
 import {
+  createCustomEventChunks,
   createMockConnectionAdapter,
   createTextChunks,
   createThinkingChunks,
   createToolCallChunks,
-  createCustomEventChunks,
 } from './test-utils'
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-client/tests/chat-client.test.ts` around lines 3 - 9,
The import list in chat-client.test.ts is out of alphabetical order: move
createCustomEventChunks so the imported identifiers are sorted alphabetically
(i.e., createCustomEventChunks, createMockConnectionAdapter, createTextChunks,
createThinkingChunks, createToolCallChunks) to satisfy ESLint; update the
import statement that currently contains createMockConnectionAdapter, createTextChunks,
createThinkingChunks, createToolCallChunks, createCustomEventChunks accordingly.
packages/typescript/ai/src/activities/chat/index.ts (1)

1095-1106: createCustomEventChunk — minor: data parameter could be widened to unknown.

The data parameter is typed as Record<string, any>, but custom events could theoretically carry any data shape. Since this is an internal helper and the callers (tool execution context) always pass records, this is fine for now — just noting for future flexibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai/src/activities/chat/index.ts` around lines 1095 -
1106, The createCustomEventChunk helper currently types its data parameter as
Record<string, any>; widen this parameter to unknown so the function can accept
any payload shape (change the signature of createCustomEventChunk(eventName:
string, data: unknown)), and update the returned CustomEvent's data field typing
if necessary to accept unknown; ensure any callers (e.g. the tool execution
contexts) that rely on specific properties narrow or assert the unknown before
use.
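A small sketch of the widened helper described above; the chunk's type literal and field names are assumptions (this thread only confirms a custom_event stream chunk carrying a name and data), so treat it as illustrative rather than the package's actual types.

// Illustrative chunk shape and helper; not the package's actual definitions.
interface CustomEventChunk {
  type: 'custom_event'
  name: string
  data: unknown
}

function createCustomEventChunk(eventName: string, data: unknown): CustomEventChunk {
  return { type: 'custom_event', name: eventName, data }
}

// Callers that depend on a specific payload shape narrow the unknown first:
const chunk = createCustomEventChunk('progress', { percent: 40 })
if (
  typeof chunk.data === 'object' &&
  chunk.data !== null &&
  'percent' in chunk.data
) {
  console.log((chunk.data as { percent: number }).percent)
}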
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.claude/settings.local.json:
- Line 17: The setting contains a hardcoded absolute path "Bash(git -C
/Users/jherr/projects/tanstack/ai checkout:*)" that leaks a developer-local
environment; update this entry to use a relative/generic path or placeholder
(e.g., "Bash(git -C . checkout:*)" or "Bash(git checkout:*)") or replace the
path with a variable like "${PROJECT_ROOT}" so it no longer references
/Users/jherr/projects/tanstack/ai.


packages/typescript/ai-react/src/use-chat.ts

optionsRef.current.onError?.(error)
},
tools: optionsRef.current.tools,
onCustomEvent: optionsRef.current.onCustomEvent,

⚠️ Potential issue | 🟠 Major

onCustomEvent is captured at memo-creation time, unlike onFinish/onError which dereference optionsRef at call time.

onFinish and onError (Lines 60–65) are wrapped in closures that read optionsRef.current when invoked, so they always use the latest callback. onCustomEvent is evaluated once when useMemo runs, meaning subsequent changes to options.onCustomEvent won't take effect until clientId changes.

Suggested fix: wrap in a closure for consistency
-      onCustomEvent: optionsRef.current.onCustomEvent,
+      onCustomEvent: (eventType: string, data: unknown, context: { toolCallId?: string }) => {
+        optionsRef.current.onCustomEvent?.(eventType, data, context)
+      },
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/typescript/ai-react/src/use-chat.ts` at line 67, onCustomEvent is
being captured at useMemo creation (optionsRef.current.onCustomEvent) so updates
to options.onCustomEvent won't be used until clientId changes; change the
onCustomEvent property in the object created by useMemo to be a closure that
calls optionsRef.current.onCustomEvent(...) at invocation time (matching how
onFinish/onError are implemented) so it always dereferences the latest callback
while leaving useMemo and its clientId dependency intact.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (1)
.changeset/custom-events-dispatch.md (1)

4-7: Consider bumping framework packages to minor instead of patch.

The framework integrations (@tanstack/ai-react, @tanstack/ai-solid, @tanstack/ai-svelte, @tanstack/ai-vue) are receiving new public API surface: the onCustomEvent callback in their hook options. While the addition is backward-compatible (optional parameter), semantic versioning typically treats new functionality as a minor change rather than a patch.

📦 Suggested versioning adjustment
-'@tanstack/ai-react': patch
-'@tanstack/ai-solid': patch
-'@tanstack/ai-svelte': patch
-'@tanstack/ai-vue': patch
+'@tanstack/ai-react': minor
+'@tanstack/ai-solid': minor
+'@tanstack/ai-svelte': minor
+'@tanstack/ai-vue': minor
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.changeset/custom-events-dispatch.md around lines 4 - 7, The changeset
currently marks the framework integration packages as a patch but they introduce
a new public API option (onCustomEvent) so update the version bumps for
'@tanstack/ai-react', '@tanstack/ai-solid', '@tanstack/ai-svelte', and
'@tanstack/ai-vue' from "patch" to "minor" in the .changeset entry to reflect
the new optional onCustomEvent hook option being added to their hook options.


.changeset/custom-events-dispatch.md

feat: add custom event dispatch support for tools

Tools can now emit custom events during execution via `dispatchEvent()`. Custom events are streamed to clients as `custom_event` stream chunks and surfaced through the client chat hook's `onCustomEvent` callback. This enables tools to send progress updates, intermediate results, or any structured data back to the UI during long-running operations.

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

cat .changeset/custom-events-dispatch.md

Repository: TanStack/ai

Length of output: 626


🏁 Script executed:

# Search for method definitions and usage patterns
rg -n "emitCustomEvent|dispatchEvent" --type ts --type js -A 3 -B 1

Repository: TanStack/ai

Length of output: 5790


🏁 Script executed:

# Search for ToolExecutionContext interface definition
rg -n "interface ToolExecutionContext|class ToolExecutionContext" -A 20 --type ts

Repository: TanStack/ai

Length of output: 1569


🏁 Script executed:

# Look for any tool execution context type definitions
fd -t f -e ts -e d.ts | xargs grep -l "ToolExecutionContext" | head -5

Repository: TanStack/ai

Length of output: 221


Incorrect API name in documentation.

The changeset states tools emit events via dispatchEvent(), but the actual API is context.emitCustomEvent() where context is the ToolExecutionContext passed to the tool's execute function. The implementation, tests, and all usage examples consistently use emitCustomEvent().

Update the documentation to reflect the correct API name:

Suggested fix
-Tools can now emit custom events during execution via `dispatchEvent()`. Custom events are streamed to clients as `custom_event` stream chunks and surfaced through the client chat hook's `onCustomEvent` callback. This enables tools to send progress updates, intermediate results, or any structured data back to the UI during long-running operations.
+Tools can now emit custom events during execution via `context.emitCustomEvent()` where `context` is the `ToolExecutionContext` passed to the tool's execute function. Custom events are streamed to clients as `custom_event` stream chunks and surfaced through the client chat hook's `onCustomEvent` callback. This enables tools to send progress updates, intermediate results, or any structured data back to the UI during long-running operations.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.changeset/custom-events-dispatch.md at line 12, The docs incorrectly
reference dispatchEvent(); update the text to use the real API
context.emitCustomEvent() and clarify that emitCustomEvent is called on the
ToolExecutionContext passed into a tool's execute function (e.g.,
context.emitCustomEvent({...})), and ensure mentions of the streaming chunk and
client callback refer to the existing `custom_event` stream chunk and the chat
hook `onCustomEvent` so examples, tests and docs match the implementation.
