I have found these related issues/pull requests
I haven’t seen any relevant issues or PRs.
Description
Postgres 17 introduces a chunked-rows result type, which has less overhead than row-at-a-time retrieval while still supporting very large result sets.
I have implemented something similar in my own app using sqlx and stream adapters to obtain batching, but the performance is of course inherently limited by the API. It would be great if sqlx could support chunked rows (see the sketch below).
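For context, the client-side approximation I mean looks roughly like this (a minimal sketch, assuming a `PgPool` and a placeholder `users` table; names and the chunk size are illustrative):

```rust
use futures::StreamExt;
use sqlx::postgres::{PgPool, PgRow};

/// Client-side batching over the existing row-at-a-time `fetch` stream.
async fn process_in_batches(pool: &PgPool) -> Result<(), sqlx::Error> {
    let mut batches = sqlx::query("SELECT id, name FROM users")
        .fetch(pool)
        // Groups rows only *after* each one was decoded and yielded individually.
        .chunks(1024);

    while let Some(batch) = batches.next().await {
        // Each item is a Result<PgRow, Error>; surface the first error, if any.
        let rows: Vec<PgRow> = batch.into_iter().collect::<Result<_, _>>()?;
        // ... process `rows` as a unit ...
        println!("processed a batch of {} rows", rows.len());
    }
    Ok(())
}
```

This works, but every row still flows through the single-row stream machinery before being grouped, which is exactly the overhead a native chunked-rows path would avoid.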
Preferred solution
I would propose that the current row-at-a-time interface (`fetch`) could be configured (by some Postgres-specific method?) to fetch a chunk at a time and dispense rows one by one through the current stream API; and an additional `fetch_chunked` API producing a stream of `&[Row]` could be added to `Executor`, implemented natively for Postgres and as a stream-adapter polyfill for other drivers. A rough sketch of the shape I have in mind follows.
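To make that concrete, here is an entirely hypothetical sketch; `ChunkedExecutor`, `fetch_chunked`, and `chunk_size` are invented names, and it yields owned `Vec<Row>` chunks rather than the `&[Row]` borrows mentioned above purely to keep lifetimes out of the illustration:

```rust
use futures::stream::BoxStream;
use sqlx::{Database, Error, Executor};

/// Hypothetical extension trait; nothing like this exists in sqlx today.
pub trait ChunkedExecutor<'c>: Executor<'c> {
    /// Stream rows in chunks: natively via the Postgres chunked-rows mode,
    /// or as a stream-adapter polyfill over `fetch` for other drivers.
    fn fetch_chunked<'e, 'q: 'e>(
        self,
        sql: &'q str,
        chunk_size: usize,
    ) -> BoxStream<'e, Result<Vec<<Self::Database as Database>::Row>, Error>>
    where
        'c: 'e;
}
```

For non-Postgres drivers, the polyfill could simply wrap the existing `fetch` stream with the same kind of `chunks` adapter shown earlier.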
Is this a breaking change? Why or why not?
No. Even if `fetch` is amended in some way to take advantage of chunked responses, it can still offer the same row-at-a-time interface.