# Pagination

Handle large datasets efficiently with pagination.

## Pagination Types

| Type | Use Case | Pros | Cons |
| --- | --- | --- | --- |
| Offset | Simple lists | Easy to implement | Slow on large datasets |
| Cursor | Real-time data | Fast, consistent | Can’t jump to a page |
| Keyset | Time-series | Very fast | Requires sorted data |

:::info
The SDK uses cursor-based pagination by default for optimal performance.
:::
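To make the comparison concrete, here is a purely conceptual sketch (not the SDK API) of how offset and keyset paging pick the next page from an already-sorted list. Cursor paging, which the rest of this page demonstrates, hides this bookkeeping behind an opaque token.

```typescript
// Conceptual sketch only - not the SDK API. `rows` stands in for a dataset
// sorted by createdAt ascending.
type Row = { id: string; createdAt: string };
const rows: Row[] = []; // Assume this holds the full, sorted dataset.
const pageSize = 20;

// Offset: skip a fixed count. Simple, but the skip gets slower as it grows.
const offsetPage = rows.slice(40, 40 + pageSize);

// Keyset: resume strictly after the last value already seen. Fast, but the
// data must stay sorted on the key you paginate by.
const lastSeenCreatedAt = '2026-01-27T00:00:00Z';
const keysetPage = rows
  .filter(row => row.createdAt > lastSeenCreatedAt)
  .slice(0, pageSize);
```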

## Basic Pagination

```typescript
// Get first page
const page1 = await client.users.list({
  limit: 20,
});

console.log('Users:', page1.data);
console.log('Has more:', page1.hasMore);
console.log('Next cursor:', page1.nextCursor);

// Get next page
if (page1.hasMore) {
  const page2 = await client.users.list({
    limit: 20,
    cursor: page1.nextCursor,
  });
}
```

## Iterating All Pages

```typescript
async function getAllUsers(): Promise<User[]> {
  const allUsers: User[] = [];
  let cursor: string | undefined;

  do {
    const page = await client.users.list({
      limit: 100,
      cursor,
    });
    allUsers.push(...page.data);
    cursor = page.hasMore ? page.nextCursor : undefined;
  } while (cursor);

  return allUsers;
}
```

:::warning
Be careful when fetching all pages. Large datasets can consume significant memory.
:::

## Async Iterator

```typescript
// Elegant iteration with for-await
for await (const user of client.users.iterate({ limit: 50 })) {
  console.log(user.name);

  // Process each user as it arrives
  await processUser(user);
}
```

:::tip
Use iterate() for memory-efficient processing of large datasets.
:::
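If you are curious how such an iterator can be built on top of list(), here is a minimal sketch assuming the PaginatedResponse shape documented below. It is not the SDK's actual implementation; the iterateUsers name and the injected list parameter are illustrative.

```typescript
// Minimal sketch of a cursor-driven async generator (illustrative, not the
// SDK's implementation). `User` and `PaginatedResponse<T>` are the types
// documented on this page.
async function* iterateUsers(
  list: (params: { limit: number; cursor?: string }) => Promise<PaginatedResponse<User>>,
  limit = 50,
): AsyncGenerator<User> {
  let cursor: string | undefined;

  do {
    const page = await list({ limit, cursor });
    for (const user of page.data) {
      yield user; // Hand out one item at a time; only one page is held in memory.
    }
    cursor = page.hasMore ? page.nextCursor : undefined;
  } while (cursor);
}
```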

## Pagination Response

```typescript
interface PaginatedResponse<T> {
  data: T[];
  hasMore: boolean;
  nextCursor?: string;
  total?: number; // Only with includeTotalCount
}
```

{ "data": [ { "id": "user_1", "name": "Alice" }, { "id": "user_2", "name": "Bob" } ], "hasMore": true, "nextCursor": "eyJpZCI6InVzZXJfMiJ9", "total": 150 }

## Filtering with Pagination

```typescript
// Combine filters with pagination
const activeAdmins = await client.users.list({
  limit: 20,
  filter: {
    status: 'active',
    role: 'admin',
  },
  orderBy: 'createdAt',
  orderDirection: 'desc',
});
```

:::note
Filters are applied before pagination. The cursor only tracks position within filtered results.
:::
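In practice that means you keep sending the same filter and ordering options with each request while following the cursor. A short sketch continuing the query above, under the assumption that the filter is not encoded in the cursor itself:

```typescript
// Next page of the same filtered query (sketch - assumes the filter and
// ordering must be repeated alongside the cursor).
if (activeAdmins.hasMore) {
  const moreActiveAdmins = await client.users.list({
    limit: 20,
    filter: { status: 'active', role: 'admin' },
    orderBy: 'createdAt',
    orderDirection: 'desc',
    cursor: activeAdmins.nextCursor,
  });
}
```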

## Parallel Fetching

```typescript
// Fetch multiple pages in parallel (use with caution)
async function fetchPagesParallel(cursors: string[]): Promise<User[]> {
  const pages = await Promise.all(
    cursors.map(cursor => client.users.list({ limit: 100, cursor })),
  );
  return pages.flatMap(page => page.data);
}
```

:::danger
Parallel fetching can trigger rate limits. Use sequential fetching for large datasets.
:::
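If you do need many pages, a sequential loop with a short pause between requests is the safer pattern. A minimal sketch; the 200 ms delay is an arbitrary illustrative value, not an SDK recommendation:

```typescript
// Sequential fetching with a pause between requests. The 200 ms delay is
// arbitrary - tune it to your actual rate limits.
async function fetchAllSequentially(delayMs = 200): Promise<User[]> {
  const users: User[] = [];
  let cursor: string | undefined;

  do {
    const page = await client.users.list({ limit: 100, cursor });
    users.push(...page.data);
    cursor = page.hasMore ? page.nextCursor : undefined;

    if (cursor) {
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  } while (cursor);

  return users;
}
```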

## Counting Total Items

```typescript
// Include total count (slower query)
const page = await client.users.list({
  limit: 20,
  includeTotalCount: true,
});

console.log(`Showing ${page.data.length} of ${page.total} users`);
```

:::warning
includeTotalCount requires a separate COUNT query. Avoid on large tables.
:::
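If you only need to tell users whether more results exist, hasMore already answers that and avoids the extra COUNT query. A small sketch of that pattern:

```typescript
// Skip includeTotalCount when an exact total isn't needed - hasMore is enough
// to decide whether to show an open-ended count or a "Load more" control.
const firstPage = await client.users.list({ limit: 20 });

const label = firstPage.hasMore
  ? `Showing ${firstPage.data.length}+ users`
  : `Showing ${firstPage.data.length} users`;
console.log(label);
```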

## Cursor Structure

Cursors are opaque strings - don’t parse or construct them manually:

```typescript
// Good - use cursor from response
const page2 = await client.users.list({
  cursor: page1.nextCursor,
});

// Bad - never construct cursors
const fakeCursor = btoa('{"id":"user_123"}'); // Don't do this!
```

## Error Handling

```typescript
try {
  const page = await client.users.list({
    cursor: 'invalid_cursor',
  });
} catch (error) {
  if (error instanceof ValidationError) {
    // Invalid or expired cursor
    console.error('Invalid cursor, starting from beginning');
    const page = await client.users.list({ limit: 20 });
  }
}
```

:::note
Cursors may expire after 24 hours. Always handle invalid cursor errors gracefully.
:::
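For jobs that persist cursors and may resume after the expiry window, the same fallback can be wrapped in a small helper. The listWithCursorFallback name is illustrative, not part of the SDK:

```typescript
// Illustrative helper (not part of the SDK): restart from the first page
// whenever a stored cursor turns out to be invalid or expired.
async function listWithCursorFallback(
  params: { limit: number; cursor?: string },
): Promise<PaginatedResponse<User>> {
  try {
    return await client.users.list(params);
  } catch (error) {
    if (error instanceof ValidationError) {
      // Drop the stale cursor and start over from the beginning.
      return client.users.list({ limit: params.limit });
    }
    throw error; // Anything else is a real failure; surface it.
  }
}
```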