fix(ai-chat): abort/cancel support for streaming responses #983
Merged
threepointone merged 1 commit into main on Feb 24, 2026
Conversation
- Pass `abortSignal` from options to `streamText` in ai-chat example
- Framework safety net: cancel reader loop on abort signal, send done signal
- Warning log when cancel arrives but stream still active (missing `abortSignal`)
- Add cancel-request tests (plaintext, SSE, abort cleanup, full completion)
- Fix vitest configs to scope test file discovery (prevents e2e/react cross-pickup)
🦋 Changeset detected. Latest commit: 0ac1fa2. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Summary
Fixes the stop button in AI Chat so it properly aborts streaming responses, and adds a framework-level safety net for cancellation.
What changed
Framework (`@cloudflare/ai-chat`)

- `_reply`, `_streamSSEReply`, `_sendPlaintextReply`: when the abort signal fires, the reader loop is now cancelled and a done signal is sent to the client. Previously, cancelling a request would send the cancel message, but the server-side stream would keep running until it finished naturally.
- A warning is now logged when a cancel arrives while the stream is still active, which usually means the handler never passed `abortSignal` to its LLM call (e.g. `streamText`). This helps catch a common misconfiguration.

Example (`examples/ai-chat`)

- Pass `options.abortSignal` through to `streamText` in `server.ts` so the upstream LLM call is actually cancelled when the user hits stop.
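To make the safety net concrete, here is a minimal sketch using only standard Web Streams APIs. The function name and the `send`/`sendDone` callbacks are hypothetical; this is not the actual `@cloudflare/ai-chat` implementation, just an illustration of the cancel-the-reader-and-signal-done pattern described above.

```typescript
// Hypothetical sketch: pump a streaming response to a client, but bail out
// and emit a final "done" signal as soon as the request's AbortSignal fires.
async function pumpWithAbort(
  stream: ReadableStream<string>,
  signal: AbortSignal,
  send: (chunk: string) => void, // hypothetical "send chunk to client"
  sendDone: () => void, // hypothetical "send done signal to client"
): Promise<void> {
  const reader = stream.getReader();
  if (signal.aborted) {
    await reader.cancel();
    sendDone();
    return;
  }
  const onAbort = () => {
    // Cancelling the reader makes any pending read() resolve with done: true,
    // so the loop below exits instead of draining the upstream stream.
    void reader.cancel();
    console.warn(
      "cancel received while stream still active; did the handler forward abortSignal to its LLM call?",
    );
  };
  signal.addEventListener("abort", onAbort, { once: true });
  try {
    for (;;) {
      const { done, value } = await reader.read();
      if (done || signal.aborted) break;
      send(value);
    }
  } finally {
    signal.removeEventListener("abort", onAbort);
    sendDone(); // the client always gets a terminal signal
  }
}
```

The key property is that `sendDone` runs in `finally`, so the client gets a terminal signal whether the stream completed naturally or was aborted mid-flight.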
Test infrastructure

- `cancel-request.test.ts` with 5 tests covering: plaintext cancel, SSE cancel, abort controller cleanup, cancel with `abortSignal` wired, and full stream completion without cancel.
- A `SlowStreamAgent` test fixture that streams chunks with configurable delays.
- Fixed vitest configs (`src/tests/vitest.config.ts`, `src/react-tests/vitest.config.ts`) to add explicit `include` patterns. Without these, vitest's default glob was picking up `e2e/*.spec.ts` files and trying to run them in the Workers pool (causing `No such module "node:child_process"` errors) and react-test files in the wrong runner.
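The vitest scoping fix is conceptually just an explicit `include` (and, if needed, `exclude`) per config. The globs below are illustrative, not the actual patterns from the PR:

```typescript
// vitest.config.ts (illustrative): restrict discovery to this suite's tests
// so vitest's default glob no longer picks up e2e/*.spec.ts (Playwright) or
// react-test files meant for a different runner.
import { configDefaults, defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    // Hypothetical glob; the PR scopes each config to its own directory.
    include: ["**/*.test.ts"],
    // Extend (don't replace) vitest's default excludes such as node_modules.
    exclude: [...configDefaults.exclude, "**/e2e/**", "**/react-tests/**"],
  },
});
```

Spreading `configDefaults.exclude` matters: setting `exclude` outright replaces vitest's defaults, which would re-include `node_modules`.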
Reviewer notes

- The core change is in `packages/ai-chat/src/index.ts` around the `_reply`, `_streamSSEReply`, and `_sendPlaintextReply` methods. Look for the `abortSignal` event listener pattern: when abort fires, we cancel the reader, send the done signal, and log a warning if the stream was still open.
- The vitest config change is small (just `include` globs) but important. Without it, `npm run test` from the ai-chat package root would fail on CI because Playwright e2e tests and browser-mode react tests got picked up by the Workers pool. The e2e tests are intentionally separate and run via `npm run test:e2e` / the nightly CI workflow.
- `SlowStreamAgent` in `worker.ts` is a test-only Durable Object. It simulates a slow LLM by yielding chunks with `setTimeout` delays, which gives the cancel tests enough time to send a cancel message mid-stream.

Testing
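As an illustration of what the cancel tests exercise, here is a self-contained sketch of a slow stream that can be cancelled mid-flight. It is a plain stand-in (the names `slowStream` and `demo` are hypothetical), not the actual `SlowStreamAgent` Durable Object fixture:

```typescript
// Hypothetical stand-in for SlowStreamAgent: a ReadableStream that yields
// chunks with a configurable delay, leaving a window in which a cancel
// can arrive mid-stream (as the real fixture does with setTimeout delays).
function slowStream(chunks: string[], delayMs: number): ReadableStream<string> {
  let i = 0;
  let cancelled = false;
  return new ReadableStream<string>({
    async pull(controller) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
      if (cancelled) return; // a cancel arrived during the delay; stop producing
      controller.enqueue(chunks[i++]);
      if (i >= chunks.length) controller.close();
    },
    cancel() {
      cancelled = true;
    },
  });
}

// Cancel mid-stream: read one chunk, then cancel the reader; subsequent
// reads resolve with done: true and no further chunks are produced.
async function demo(): Promise<string[]> {
  const reader = slowStream(["Hel", "lo, ", "world"], 5).getReader();
  const seen: string[] = [];
  const first = await reader.read();
  if (!first.done) seen.push(first.value);
  await reader.cancel(); // the "stop button" path
  const after = await reader.read();
  if (!after.done) seen.push(after.value);
  return seen;
}
```

The delay between chunks is what makes the cancel deterministic to test: the consumer can reliably issue the cancel while the producer is still mid-stream.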