Changes from 2 commits
9 changes: 9 additions & 0 deletions .changeset/cjs-output-and-json-response.md
@@ -0,0 +1,9 @@
---
'@tanstack/ai': minor
'@tanstack/ai-client': minor
'@tanstack/ai-event-client': patch
---

**Dual ESM + CJS output.** `@tanstack/ai`, `@tanstack/ai-client`, and `@tanstack/ai-event-client` now ship both ESM and CJS builds with type-aware dual `exports` maps (`import` → `./dist/esm/*.js`, `require` → `./dist/cjs/*.cjs`), plus a `main` field pointing at CJS. Fixes Metro / Expo / CJS-only resolvers that previously couldn't find `@tanstack/ai/adapters` or `@tanstack/ai-client` because the packages were ESM-only (#308).

**New `toJSONResponse(stream, init?)` on `@tanstack/ai`.** Drains the chat stream fully and returns a JSON-array `Response` with `Content-Type: application/json`. Use on server runtimes that can't emit `ReadableStream` responses (Expo's `@expo/server`, some edge proxies). Pair with the new `fetchJSON(url, options?)` connection adapter on `@tanstack/ai-client` — it fetches the array and replays each chunk into the normal `ChatClient` pipeline. Trade-off: no incremental rendering (every chunk arrives at once when the request resolves). Closes #309.
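The round trip described above can be sketched with plain fetch-API primitives. This is a hedged illustration only: `drainToResponse`, `replay`, `demoStream`, and the simplified chunk shape are stand-ins mirroring what `toJSONResponse`/`fetchJSON` do, not the package's actual API.

```typescript
// Illustrative chunk shape; the real StreamChunk union is richer.
type Chunk = { type: string; delta?: string }

// Server side (mirrors toJSONResponse): drain the stream, emit a JSON array.
async function drainToResponse(
  stream: AsyncIterable<Chunk>,
): Promise<Response> {
  const chunks: Array<Chunk> = []
  for await (const chunk of stream) chunks.push(chunk)
  return new Response(JSON.stringify(chunks), {
    headers: { 'Content-Type': 'application/json' },
  })
}

// Client side (mirrors fetchJSON): parse the array, replay chunk-by-chunk.
async function* replay(response: Response): AsyncGenerator<Chunk> {
  const payload = (await response.json()) as Array<Chunk>
  for (const chunk of payload) yield chunk
}

async function* demoStream(): AsyncGenerator<Chunk> {
  yield { type: 'TEXT_MESSAGE_CONTENT', delta: 'Hello' }
  yield { type: 'RUN_FINISHED' }
}

async function roundtrip(): Promise<Array<string>> {
  const seen: Array<string> = []
  for await (const chunk of replay(await drainToResponse(demoStream()))) {
    seen.push(chunk.type)
  }
  return seen
}

roundtrip().then((types) => console.log(types.join(',')))
```

Every chunk arrives in one body, so the client still walks the same pipeline; it just does so after the request resolves.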
11 changes: 9 additions & 2 deletions packages/typescript/ai-client/package.json
@@ -18,12 +18,19 @@
     "streaming"
   ],
   "type": "module",
+  "main": "./dist/cjs/index.cjs",
   "module": "./dist/esm/index.js",
   "types": "./dist/esm/index.d.ts",
   "exports": {
     ".": {
-      "types": "./dist/esm/index.d.ts",
-      "import": "./dist/esm/index.js"
+      "import": {
+        "types": "./dist/esm/index.d.ts",
+        "default": "./dist/esm/index.js"
+      },
+      "require": {
+        "types": "./dist/cjs/index.d.cts",
+        "default": "./dist/cjs/index.cjs"
+      }
     }
   },
   "files": [
75 changes: 75 additions & 0 deletions packages/typescript/ai-client/src/connection-adapters.ts
@@ -424,6 +424,81 @@ export function fetchHttpStream(
}
}

/**
* Create a JSON-array connection adapter for server runtimes that cannot
* stream `ReadableStream` responses (e.g. Expo's `@expo/server`, certain
* edge proxies). Pair with `toJSONResponse(stream)` on the server: the
* server drains the chat stream fully, JSON-serialises the collected
* chunks into an array, and this adapter fetches the array and replays
* each chunk one-by-one into the normal client pipeline.
*
* Trade-off: you lose incremental rendering — the UI sees every chunk
* only after the request resolves. Use SSE/HTTP-stream adapters when the
* runtime supports them.
*
* @param url - The API endpoint URL (or a function that returns the URL)
* @param options - Fetch options (headers, credentials, body, etc.) or a function that returns options (can be async)
* @returns A connection adapter for JSON-array responses
*
* @example
* ```typescript
* // Expo / RN client that hits an Expo API route returning toJSONResponse(stream)
* const connection = fetchJSON('/api/chat')
*
* const client = new ChatClient({ connection })
* ```
*/
export function fetchJSON(
Contributor: You should add an e2e test for the roundtrip toJSONResponse → fetchJSON.
  url: string | (() => string),
  options:
    | FetchConnectionOptions
    | (() => FetchConnectionOptions | Promise<FetchConnectionOptions>) = {},
): ConnectConnectionAdapter {
  return {
    async *connect(messages, data, abortSignal) {
      const resolvedUrl = typeof url === 'function' ? url() : url
      const resolvedOptions =
        typeof options === 'function' ? await options() : options

      const requestHeaders: Record<string, string> = {
        'Content-Type': 'application/json',
        ...mergeHeaders(resolvedOptions.headers),
      }

      const requestBody = {
        messages,
        data,
        ...resolvedOptions.body,
      }

      const fetchClient = resolvedOptions.fetchClient ?? fetch
      const response = await fetchClient(resolvedUrl, {
        method: 'POST',
        headers: requestHeaders,
        body: JSON.stringify(requestBody),
        credentials: resolvedOptions.credentials || 'same-origin',
        signal: abortSignal || resolvedOptions.signal,
      })

      if (!response.ok) {

Contributor: Servers usually put the actual diagnostic information in the body, e.g. an OpenAI/Anthropic upstream rate limit: {"error":{"type":"rate_limit_error","message":"Rate limit exceeded for...","retryAfter":42}}

Suggested change
-      if (!response.ok) {
+      if (!response.ok) {
+        let bodySnippet = ''
+        try {
+          const text = await response.text()
+          bodySnippet = text.length > 500 ? `${text.slice(0, 500)}…` : text
+        } catch {
+          // body unreadable, fall through with status only
+        }
+        throw new Error(
+          `HTTP error! status: ${response.status} ${response.statusText}${
+            bodySnippet ? ` — ${bodySnippet}` : ''
+          }`,
+        )
+      }

        throw new Error(
          `HTTP error! status: ${response.status} ${response.statusText}`,
        )
      }
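The truncation behaviour in the suggestion can be exercised in isolation. A sketch only: `formatHttpError` is a hypothetical helper name, not part of the diff; it mirrors the suggested error construction.

```typescript
// Keep at most 500 characters of the body, append it to the status line.
function formatHttpError(
  status: number,
  statusText: string,
  body: string,
): string {
  const snippet = body.length > 500 ? `${body.slice(0, 500)}…` : body
  return `HTTP error! status: ${status} ${statusText}${
    snippet ? ` — ${snippet}` : ''
  }`
}

// A short rate-limit body survives intact in the message.
const rateLimited = formatHttpError(
  429,
  'Too Many Requests',
  '{"error":{"type":"rate_limit_error","message":"Rate limit exceeded"}}',
)
console.log(rateLimited)
```

With a long body (say a 600-character HTML page), only the first 500 characters plus an ellipsis land in the message, keeping logs readable.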

      const payload = (await response.json()) as unknown

Contributor: You need to wrap this in try/catch, otherwise the user will get an annoying "Unexpected token < in JSON at position 0" error instead of what probably happened, e.g. a gateway error or something that returned HTML.

Suggested change
-      const payload = (await response.json()) as unknown
+      let payload: unknown
+      try {
+        payload = await response.json()
+      } catch (err) {
+        const cause = err instanceof Error ? err.message : String(err)
+        throw new Error(
+          `fetchJSON: failed to parse response body as JSON from ${resolvedUrl} (status ${response.status}): ${cause}`,
+          { cause: err },
+        )
+      }

      if (!Array.isArray(payload)) {
        throw new Error(
          'fetchJSON: expected response body to be a JSON array of StreamChunks. Did you forget to use `toJSONResponse(stream)` on the server?',
        )
      }
      for (const chunk of payload) {
        yield chunk as StreamChunk

Contributor: You never check for the abort signal in this yield loop, despite adding it in toJSONResponse.

      }
    },
  }
}
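A minimal sketch of the check the reviewer is asking for. Standalone illustration only: the real fix would live inside the `connect` generator above, and `replayWithAbort` is a hypothetical name.

```typescript
// Replay an already-fetched chunk array, bailing out between chunks once the
// caller has aborted; throwing an AbortError matches how fetch itself
// surfaces cancellation.
function* replayWithAbort<T>(
  chunks: Array<T>,
  abortSignal?: AbortSignal,
): Generator<T> {
  for (const chunk of chunks) {
    if (abortSignal?.aborted) {
      throw new DOMException('The operation was aborted.', 'AbortError')
    }
    yield chunk
  }
}

const controller = new AbortController()
const iterator = replayWithAbort(['a', 'b', 'c'], controller.signal)
console.log(iterator.next().value)
controller.abort()
try {
  iterator.next()
} catch (err) {
  console.log((err as Error).name)
}
```

Checking between yields means an abort fired mid-replay stops delivery promptly instead of flushing the whole array into the client.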

/**
* Create a direct stream connection adapter (for server functions or direct streams)
*
1 change: 1 addition & 0 deletions packages/typescript/ai-client/src/index.ts
@@ -54,6 +54,7 @@ export type {
 export {
   fetchServerSentEvents,
   fetchHttpStream,
+  fetchJSON,
   stream,
   rpcStream,
   type ConnectConnectionAdapter,
2 changes: 1 addition & 1 deletion packages/typescript/ai-client/vite.config.ts
@@ -31,6 +31,6 @@ export default mergeConfig(
   tanstackViteConfig({
     entry: ['./src/index.ts'],
     srcDir: './src',
-    cjs: false,
+    cjs: true,
   }),
 )
11 changes: 9 additions & 2 deletions packages/typescript/ai-event-client/package.json
@@ -10,12 +10,19 @@
     "directory": "packages/typescript/ai-event-client"
   },
   "type": "module",
+  "main": "./dist/cjs/index.cjs",
   "module": "./dist/esm/index.js",
   "types": "./dist/esm/index.d.ts",
   "exports": {
     ".": {
-      "types": "./dist/esm/index.d.ts",
-      "import": "./dist/esm/index.js"
+      "import": {
+        "types": "./dist/esm/index.d.ts",
+        "default": "./dist/esm/index.js"
+      },
+      "require": {
+        "types": "./dist/cjs/index.d.cts",
+        "default": "./dist/cjs/index.cjs"
+      }
     }
   },
   "sideEffects": false,
2 changes: 1 addition & 1 deletion packages/typescript/ai-event-client/vite.config.ts
@@ -31,6 +31,6 @@ export default mergeConfig(
   tanstackViteConfig({
     entry: ['./src/index.ts'],
     srcDir: './src',
-    cjs: false,
+    cjs: true,
   }),
 )
31 changes: 25 additions & 6 deletions packages/typescript/ai/package.json
@@ -10,20 +10,39 @@
     "directory": "packages/typescript/ai"
   },
   "type": "module",
+  "main": "./dist/cjs/index.cjs",
   "module": "./dist/esm/index.js",
   "types": "./dist/esm/index.d.ts",
   "exports": {
     ".": {
-      "types": "./dist/esm/index.d.ts",
-      "import": "./dist/esm/index.js"
+      "import": {
+        "types": "./dist/esm/index.d.ts",
+        "default": "./dist/esm/index.js"
+      },
+      "require": {
+        "types": "./dist/cjs/index.d.cts",
+        "default": "./dist/cjs/index.cjs"
+      }
     },
     "./adapters": {
-      "types": "./dist/esm/activities/index.d.ts",
-      "import": "./dist/esm/activities/index.js"
+      "import": {
+        "types": "./dist/esm/activities/index.d.ts",
+        "default": "./dist/esm/activities/index.js"
+      },
+      "require": {
+        "types": "./dist/cjs/activities/index.d.cts",
+        "default": "./dist/cjs/activities/index.cjs"
+      }
     },
     "./middlewares": {
-      "types": "./dist/esm/middlewares/index.d.ts",
-      "import": "./dist/esm/middlewares/index.js"
+      "import": {
+        "types": "./dist/esm/middlewares/index.d.ts",
+        "default": "./dist/esm/middlewares/index.js"
+      },
+      "require": {
+        "types": "./dist/cjs/middlewares/index.d.cts",
+        "default": "./dist/cjs/middlewares/index.cjs"
+      }
     }
   },
   "sideEffects": false,
1 change: 1 addition & 0 deletions packages/typescript/ai/src/index.ts
@@ -58,6 +58,7 @@ export {
   toServerSentEventsResponse,
   toHttpStream,
   toHttpResponse,
+  toJSONResponse,
 } from './stream-to-response'

// Tool call management
49 changes: 49 additions & 0 deletions packages/typescript/ai/src/stream-to-response.ts
@@ -250,3 +250,52 @@ export function toHttpResponse(
     ...init,
   })
 }

/**
* Drain a StreamChunk async iterable fully, then return the collected chunks
* as a single JSON-array `Response`.
*
* Use this when the target runtime does not support streaming
* `ReadableStream` responses — for example Expo's `@expo/server` runtime,
* Vercel Edge/Node hybrids behind certain proxies, or Cloudflare setups
* without streaming enabled. The consumer pairs with
* `fetchJSON` on the client, which decodes the array and yields each
* chunk back into the normal streaming pipeline — so the on-screen UX
* becomes "render everything at once when the request resolves" instead
* of incremental streaming, but the rest of the chat pipeline is unchanged.
*
* Trade-off: you lose the incremental rendering. Use only when you can't
* ship SSE / HTTP-stream responses.
*
* @param stream - AsyncIterable of StreamChunks from chat()
* @param init - Optional Response initialization options (including `abortController`)
* @returns Response with `Content-Type: application/json` containing an array of StreamChunks
*
* @example
* ```typescript
* // Expo API route where streaming responses aren't supported
* export async function POST(request: Request) {
* const stream = chat({ adapter: openaiText(), messages: [...] })
* return toJSONResponse(stream)
* }
* ```
*/
export async function toJSONResponse(
  stream: AsyncIterable<StreamChunk>,
  init?: ResponseInit & { abortController?: AbortController },
): Promise<Response> {
  const { abortController, headers, ...rest } = init ?? {}
  const chunks: Array<StreamChunk> = []
  try {
    for await (const chunk of stream) {
      chunks.push(chunk)
    }
  } catch (error) {
    abortController?.abort()
    throw error
  }
  const merged = new Headers(headers)
  if (!merged.has('Content-Type'))
    merged.set('Content-Type', 'application/json')
  return new Response(JSON.stringify(chunks), { ...rest, headers: merged })
}
73 changes: 73 additions & 0 deletions packages/typescript/ai/tests/stream-to-response.test.ts
@@ -2,6 +2,7 @@ import { describe, it, expect, vi } from 'vitest'
 import {
   toServerSentEventsStream,
   toServerSentEventsResponse,
+  toJSONResponse,
 } from '../src/stream-to-response'
import type { StreamChunk } from '../src/types'

@@ -870,3 +871,75 @@ describe('SSE Round-Trip (Encode → Decode)', () => {
)
})
})

describe('toJSONResponse', () => {
  it('drains the stream and returns a JSON-array Response', async () => {
    const chunks: Array<Record<string, unknown>> = [
      {
        type: 'RUN_STARTED',
        runId: 'r1',
        model: 'test',
        timestamp: 1,
      },
      {
        type: 'TEXT_MESSAGE_CONTENT',
        messageId: 'm1',
        model: 'test',
        timestamp: 2,
        delta: 'Hello',
        content: 'Hello',
      },
      {
        type: 'RUN_FINISHED',
        runId: 'r1',
        model: 'test',
        timestamp: 3,
      },
    ]
    const response = await toJSONResponse(createMockStream(chunks))

    expect(response.status).toBe(200)
    expect(response.headers.get('Content-Type')).toBe('application/json')
    expect(await response.json()).toEqual(chunks)
  })

  it('defers to caller-provided headers and preserves extra init', async () => {
    const response = await toJSONResponse(createMockStream([]), {
      status: 201,
      headers: { 'X-Custom': '1' },
    })

    expect(response.status).toBe(201)
    expect(response.headers.get('X-Custom')).toBe('1')
    expect(response.headers.get('Content-Type')).toBe('application/json')
  })

  it('does not override an explicit Content-Type', async () => {
    const response = await toJSONResponse(createMockStream([]), {
      headers: { 'Content-Type': 'application/vnd.tanstack-ai+json' },
    })

    expect(response.headers.get('Content-Type')).toBe(
      'application/vnd.tanstack-ai+json',
    )
  })

  it('aborts the supplied controller and rethrows if the upstream errors', async () => {
    const abortController = new AbortController()
    const abortSpy = vi.spyOn(abortController, 'abort')
    async function* failing(): AsyncGenerator<StreamChunk> {
      yield {
        type: 'RUN_STARTED',
        runId: 'r1',
        model: 'test',
        timestamp: 1,
      } as StreamChunk
      throw new Error('upstream failure')
    }

    await expect(
      toJSONResponse(failing(), { abortController }),
    ).rejects.toThrow('upstream failure')
    expect(abortSpy).toHaveBeenCalledOnce()
  })
})
2 changes: 1 addition & 1 deletion packages/typescript/ai/vite.config.ts
@@ -35,6 +35,6 @@ export default mergeConfig(
       './src/middlewares/index.ts',
     ],
     srcDir: './src',
-    cjs: false,
+    cjs: true,
   }),
 )