
HTTP Batch Link

httpBatchLink is a terminating link that batches an array of individual tRPC operations into a single HTTP request that's sent to a single tRPC procedure.

Usage

You can import and add the httpBatchLink to the links array as such:

client/index.ts
ts
import { createTRPCProxyClient, httpBatchLink } from '@trpc/client';
import type { AppRouter } from '../server';

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpBatchLink({
      url: 'http://localhost:3000',
    }),
  ],
});

After that, you can take advantage of batching by wrapping your procedure calls in a Promise.all. The code below produces exactly one HTTP request and, on the server, exactly one database query:

ts
const somePosts = await Promise.all([
  trpc.post.byId.query(1),
  trpc.post.byId.query(2),
  trpc.post.byId.query(3),
]);
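
For context, here is a minimal sketch of a server router that the snippet above could be calling. The post.byId procedure and its numeric input are taken from the example; the in-memory Map is a hypothetical stand-in for a real database:

server.ts
ts
import { initTRPC } from '@trpc/server';
import { z } from 'zod';

const t = initTRPC.create();

// Hypothetical data source standing in for a real database.
const posts = new Map<number, { id: number; title: string }>();

export const appRouter = t.router({
  post: t.router({
    byId: t.procedure
      .input(z.number())
      // All three calls from the Promise.all above arrive in one batched HTTP request.
      .query(({ input }) => posts.get(input) ?? null),
  }),
});

export type AppRouter = typeof appRouter;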

The httpBatchLink function takes an options object of the HttpBatchLinkOptions shape.

ts
export interface HttpBatchLinkOptions extends HTTPLinkOptions {
  maxURLLength?: number;
}

export interface HTTPLinkOptions {
  url: string;
  /**
   * Add ponyfill for fetch
   */
  fetch?: typeof fetch;
  /**
   * Add ponyfill for AbortController
   */
  AbortController?: typeof AbortController | null;
  /**
   * Headers to be set on outgoing requests, or a callback that returns said headers
   * @link http://trpc.io/docs/v10/header
   */
  headers?:
    | HTTPHeaders
    | ((opts: { opList: Operation[] }) => HTTPHeaders | Promise<HTTPHeaders>);
}
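
The headers callback form is handy for attaching per-batch credentials. A hedged sketch, assuming a hypothetical getAuthToken helper (not part of tRPC):

client/index.ts
ts
import { createTRPCProxyClient, httpBatchLink } from '@trpc/client';
import type { AppRouter } from '../server';

// Hypothetical helper; substitute your own token retrieval.
declare function getAuthToken(): Promise<string | undefined>;

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpBatchLink({
      url: 'http://localhost:3000',
      // Called once per batch; opts.opList lists the operations included in it.
      async headers(opts) {
        const token = await getAuthToken();
        return token ? { authorization: `Bearer ${token}` } : {};
      },
    }),
  ],
});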

Setting a maximum URL length

When sending batch requests, the URL can sometimes become too long, causing HTTP errors like 413 Payload Too Large, 414 URI Too Long, and 404 Not Found. The maxURLLength option will limit the number of requests that can be sent together in a batch.

client/index.ts
ts
import { createTRPCProxyClient, httpBatchLink } from '@trpc/client';
import type { AppRouter } from '../server';

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpBatchLink({
      url: 'http://localhost:3000',
      maxURLLength: 2083, // a suitable size
    }),
  ],
});
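
With the cap in place, a large fan-out like the sketch below should be split across however many HTTP requests are needed to keep each URL under the limit, rather than failing outright (the exact split depends on the encoded size of each input):

ts
// 100 parallel calls; the link batches them into multiple requests
// so that no single URL exceeds maxURLLength.
const posts = await Promise.all(
  Array.from({ length: 100 }, (_, i) => trpc.post.byId.query(i + 1)),
);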

Disabling request batching

1. Disable batching on your server:

server.ts
ts
import { createHTTPServer } from '@trpc/server/adapters/standalone';

createHTTPServer({
  // [...]
  // 👇 disable batching
  batching: {
    enabled: false,
  },
});

or, if you're using Next.js:

pages/api/trpc/[trpc].ts
ts
export default trpcNext.createNextApiHandler({
  // [...]
  // 👇 disable batching
  batching: {
    enabled: false,
  },
});
2. Replace httpBatchLink with httpLink in your tRPC client:

client/index.ts
ts
import { createTRPCProxyClient, httpLink } from '@trpc/client';
import type { AppRouter } from '../server';

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpLink({
      url: 'http://localhost:3000',
    }),
  ],
});

or, if you're using Next.js:

utils/trpc.ts
tsx
import type { AppRouter } from '@/server/routers/app';
import { httpLink } from '@trpc/client';
import { createTRPCNext } from '@trpc/next';

export const trpc = createTRPCNext<AppRouter>({
  config() {
    return {
      links: [
        httpLink({
          url: '/api/trpc',
        }),
      ],
    };
  },
});

Streaming requester (experimental)

⚠️ streamRequester is experimental and may change in the future.

When requests are batched, the default behavior is to wait for all of them to finish before sending the response; this standard behavior uses the batchRequester under the hood. If you want each response to be sent as soon as it's ready, use the streamRequester instead. This is useful for long-running requests.

client/index.ts
ts
import {
  createTRPCProxyClient,
  httpBatchLink,
  unstable_streamRequester,
} from '@trpc/client';
import type { AppRouter } from '../server';

const client = createTRPCProxyClient<AppRouter>({
  links: [
    httpBatchLink({
      url: 'http://localhost:3000',
      // 👇 enable streaming mode
      requester: unstable_streamRequester,
    }),
  ],
});

Using the streamRequester will:

  • cause requests to be sent with an X-Trpc-Batch-Mode: stream header
  • cause responses to be sent with Transfer-Encoding: chunked and Vary: x-trpc-batch-mode headers
  • remove the data key from the argument object passed to responseMeta, because with a streamed response the headers are sent before the data is available (see the sketch after this list)
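
A hedged sketch of a responseMeta callback that stays streaming-compatible by relying only on paths, type, and errors (never data); the public. path prefix is a hypothetical naming convention:

pages/api/trpc/[trpc].ts
ts
export default trpcNext.createNextApiHandler({
  // [...]
  responseMeta({ paths, type, errors }) {
    // With streaming enabled there is no `data` key here, so derive
    // headers from operation metadata alone.
    const allPublic = paths?.every((path) => path.startsWith('public.'));
    if (allPublic && type === 'query' && errors.length === 0) {
      return {
        headers: { 'cache-control': 'max-age=60' },
      };
    }
    return {};
  },
});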

If you are overriding the fetch implementation in the httpBatchLink options, make sure that it supports streaming: the response.body returned by the fetch implementation should be of type ReadableStream<Uint8Array> | NodeJS.ReadableStream, meaning that (a runtime check is sketched after this list):

  • either response.body.getReader is a function that returns a ReadableStreamDefaultReader<Uint8Array> object
  • or response.body is a Uint8Array Buffer
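
A small guard expressing those two cases, useful for validating a custom fetch ponyfill before wiring it in (a sketch, not part of tRPC's API):

ts
// Returns true if a fetch Response body meets the streaming requirements:
// either a web ReadableStream (exposes getReader) or a Uint8Array/Buffer.
function supportsStreaming(body: unknown): boolean {
  if (body == null) return false;
  if (typeof (body as ReadableStream<Uint8Array>).getReader === 'function') {
    return true;
  }
  return body instanceof Uint8Array;
}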

⚠️ For AWS Lambda, streamRequester is not supported. Enabling it should not break anything, but it will have no effect.

⚠️ For Cloudflare Workers, you need to enable the ReadableStream API through a feature flag: streams_enable_constructors.

Reference

You can check out the source code for this link on GitHub.