
stream: add fast paths for webstreams read and pipeTo #61807

Merged
nodejs-github-bot merged 1 commit into nodejs:main from mcollina:webstreams-fast-paths on Feb 19, 2026
Conversation


@mcollina (Member) commented on Feb 13, 2026

Add internal fast paths to improve webstreams performance without changing the public API or breaking spec compliance.

  1. ReadableStreamDefaultReader.read() fast path: When data is already buffered in the controller's queue, return PromiseResolve() directly without creating a DefaultReadRequest object. This is spec-compliant because read() returns a Promise, and resolved promises still run their callbacks on the microtask queue. (See the first sketch below.)

  2. pipeTo() batch read fast path: When data is buffered, batch reads directly from the controller queue up to highWaterMark without creating PipeToReadableStreamReadRequest objects per chunk. Backpressure is respected by checking desiredSize after each write. (See the second sketch below.)
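
As a rough illustration of point 1, here is a minimal, hypothetical sketch of the buffered-read fast path. The controller shape and helper names are invented for illustration only; the real logic lives in lib/internal/webstreams/readablestream.js and uses primordials rather than bare globals.

'use strict';

// Hypothetical, simplified sketch of the read() fast path; the queue layout
// and the slow-path plumbing are stand-ins, not the actual internals.
function readChunk(controller) {
  if (controller.queue.length > 0) {
    // Fast path: a chunk is already buffered, so hand back an already
    // resolved promise instead of allocating a read-request object.
    const chunk = controller.queue.shift();
    return Promise.resolve({ value: chunk, done: false });
  }
  // Slow-path stand-in: no buffered data, so register a pending read that
  // resolves once the underlying source produces a chunk.
  const { promise, resolve } = Promise.withResolvers();
  controller.pendingReads.push(resolve);
  return promise;
}

// Usage with a fake controller:
const fakeController = { queue: ['hello'], pendingReads: [] };
readChunk(fakeController).then(({ value, done }) => console.log(value, done));
// Prints: hello false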

Benchmark results:

  • pipeTo: ~11% faster (***)
  • buffered read(): ~17-20% faster (***)

This was done in partnership with Vercel to improve the performance of React and Next.js, and from a conversation on X with @cramforce.
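
In the same spirit, a hedged sketch of the pipeTo() batching idea from point 2. The writer and controller shapes below are hypothetical; the real code drains the internal controller queue directly and consults the destination's desiredSize, but this helper is not the actual implementation.

// Hypothetical sketch: drain already-buffered chunks in a batch during
// pipeTo(), checking backpressure after every write instead of allocating
// a read-request object per chunk.
async function drainBufferedChunks(sourceController, destWriter) {
  while (sourceController.queue.length > 0) {
    const chunk = sourceController.queue.shift();
    await destWriter.write(chunk);
    // Stop batching once the destination signals backpressure; a null
    // desiredSize means the destination can no longer be written to.
    if (destWriter.desiredSize === null || destWriter.desiredSize <= 0) {
      break;
    }
  }
}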

@nodejs-github-bot (Collaborator)

Review requested:

  • @nodejs/performance
  • @nodejs/web-standards

@nodejs-github-bot added the needs-ci and web streams labels on Feb 13, 2026
codecov bot commented Feb 13, 2026

Codecov Report

❌ Patch coverage is 98.71795% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 89.71%. Comparing base (ae2ffce) to head (76ccbaf).
⚠️ Report is 101 commits behind head on main.

Files with missing lines:
  • lib/internal/webstreams/readablestream.js: patch coverage 98.71%, 1 line missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main   #61807      +/-   ##
==========================================
- Coverage   89.75%   89.71%   -0.05%     
==========================================
  Files         674      675       +1     
  Lines      204416   204884     +468     
  Branches    39285    39377      +92     
==========================================
+ Hits       183472   183807     +335     
- Misses      13227    13338     +111     
- Partials     7717     7739      +22     
Files with missing lines (coverage Δ):
  • lib/internal/webstreams/readablestream.js: 98.45% <98.71%> (+<0.01%) ⬆️

... and 70 files with indirect coverage changes


@mcollina added the request-ci label on Feb 13, 2026
@github-actions bot removed the request-ci label on Feb 13, 2026
@mcollina force-pushed the webstreams-fast-paths branch from 1ad0edd to 080e458 on February 13, 2026, 20:36

@mcollina force-pushed the webstreams-fast-paths branch from 080e458 to 5e0474e on February 13, 2026, 20:37
@mcollina (Member, Author)

@jasnell @Qard Can I get another approval?

Add internal fast paths to improve webstreams performance without
changing the public API or breaking spec compliance.

1. ReadableStreamDefaultReader.read() fast path:
   When data is already buffered in the controller's queue, return
   PromiseResolve() directly without creating a DefaultReadRequest
   object. This is spec-compliant because read() returns a Promise,
   and resolved promises still run callbacks in the microtask queue.

2. pipeTo() batch read fast path:
   When data is buffered, batch reads directly from the controller
   queue up to highWaterMark without creating
   PipeToReadableStreamReadRequest objects per chunk. Respects
   backpressure by checking desiredSize after each write.

Benchmark results:
  - pipeTo:          ~11% faster (***)
  - buffered read(): ~17-20% faster (***)

Co-Authored-By: Malte Ubl <malte@vercel.com>
@mcollina force-pushed the webstreams-fast-paths branch from 5e0474e to 76ccbaf on February 18, 2026, 17:34
@gurgunday (Member) left a comment


lgtm

@mertcanaltin (Member) left a comment


LGTM

@mcollina added the request-ci label on Feb 19, 2026
@github-actions bot removed the request-ci label on Feb 19, 2026

@mcollina added the author ready and commit-queue labels and removed the needs-ci label on Feb 19, 2026
@nodejs-github-bot removed the commit-queue label on Feb 19, 2026
@nodejs-github-bot nodejs-github-bot merged commit 199daab into nodejs:main Feb 19, 2026
74 of 76 checks passed
@nodejs-github-bot (Collaborator)

Landed in 199daab

readableStreamDefaultControllerCallPullIfNeeded(controller);
}

return PromiseResolve({ value: chunk, done: false });
Contributor

It looks like the only difference between the fast and slow paths is that the fast path calls PromiseResolve() while the slow path calls [kChunk] which calls PromiseWithResolvers().resolve(). The rest of the code is just a copy-paste of readableStreamDefaultControllerPullSteps.

Comparing the specification for PromiseResolve and Promise.withResolvers, they seem identical? The first calls promiseCapability.[[Resolve]] immediately, while the second exposes that same promiseCapability.[[Resolve]] as a function on the resulting object. So there shouldn't be a difference in the number of microtasks: both resolve the promise immediately.

So where is the speed up coming from?

@MattiasBuelens (Contributor) commented on Feb 19, 2026

Okay, so Promise.withResolvers is really just slower... Huh. 🤔

'use strict';
const common = require('../common.js');

const bench = common.createBenchmark(main, {
  kind: ['promise-resolve', 'with-resolvers', 'new-promise'],
  n: [1e5, 1e6, 1e7],
});

const PromiseResolve = Promise.resolve.bind(Promise);
const PromiseWithResolvers = Promise.withResolvers.bind(Promise);

async function main({kind, n}) {
  switch (kind) {
    case 'promise-resolve': {
      bench.start();
      for (let i = 0; i < n; ++i) {
        await PromiseResolve({value: 'a', done: false});
      }
      bench.end(n);
      break;
    }
    case 'with-resolvers': {
      bench.start();
      for (let i = 0; i < n; ++i) {
        const resolvers = PromiseWithResolvers();
        resolvers.resolve({value: 'a', done: false});
        await resolvers.promise;
      }
      bench.end(n);
      break;
    }
    case 'new-promise': {
      bench.start();
      for (let i = 0; i < n; ++i) {
        await new Promise(resolve => resolve({value: 'a', done: false}));
      }
      bench.end(n);
      break;
    }
  }
}

Results:

promise-resolve-vs-with-resolvers.js n=100000 kind="promise-resolve": 20,311,167.079660397
promise-resolve-vs-with-resolvers.js n=1000000 kind="promise-resolve": 29,780,074.152384643
promise-resolve-vs-with-resolvers.js n=10000000 kind="promise-resolve": 32,482,982.977292772
promise-resolve-vs-with-resolvers.js n=100000 kind="with-resolvers": 14,021,705.600269217
promise-resolve-vs-with-resolvers.js n=1000000 kind="with-resolvers": 18,045,948.594310835
promise-resolve-vs-with-resolvers.js n=10000000 kind="with-resolvers": 21,435,400.134614315
promise-resolve-vs-with-resolvers.js n=100000 kind="new-promise": 14,197,285.479016412
promise-resolve-vs-with-resolvers.js n=1000000 kind="new-promise": 21,022,403.5754904
promise-resolve-vs-with-resolvers.js n=10000000 kind="new-promise": 24,266,573.159471426
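
To make the comparison concrete, here is a hedged, boiled-down illustration of the two promise-creation shapes being measured above. The actual readable-stream code goes through the [kChunk] read-request machinery and primordials, so this only isolates the promise-creation difference.

// Fast-path shape: produce an already-resolved promise directly.
function fastRead(chunk) {
  return Promise.resolve({ value: chunk, done: false });
}

// Slow-path shape: create a promise capability first, then resolve it.
function slowRead(chunk) {
  const { promise, resolve } = Promise.withResolvers();
  resolve({ value: chunk, done: false });
  return promise;
}

// Both promises are resolved before being returned, and both run their
// .then() callbacks on the microtask queue; the measured gap comes from
// allocation and call overhead, as the numbers above suggest.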
