Fix chunk load handling during prerendering #75132
Closed
Conversation
Warning: This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite. This stack of pull requests is managed by Graphite.
"use cache"
Tests Passed

Stats from current PR

Default Build (Increase detected)
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
buildDuration | 21.8s | 19.2s | N/A |
buildDurationCached | 18s | 15.5s | N/A |
nodeModulesSize | 419 MB | 419 MB | |
nextStartRea..uration (ms) | 474ms | 474ms | ✓ |
Client Bundles (main, webpack)
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
5306-HASH.js gzip | 54.1 kB | 54 kB | N/A |
8276.HASH.js gzip | 169 B | 168 B | N/A |
8377-HASH.js gzip | 5.46 kB | 5.46 kB | N/A |
bccd1874-HASH.js gzip | 52.9 kB | 52.9 kB | N/A |
framework-HASH.js gzip | 57.5 kB | 57.5 kB | N/A |
main-app-HASH.js gzip | 240 B | 242 B | N/A |
main-HASH.js gzip | 34.6 kB | 34.6 kB | N/A |
webpack-HASH.js gzip | 1.71 kB | 1.71 kB | N/A |
Overall change | 0 B | 0 B | ✓ |
Legacy Client Bundles (polyfills)
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
polyfills-HASH.js gzip | 39.4 kB | 39.4 kB | ✓ |
Overall change | 39.4 kB | 39.4 kB | ✓ |
Client Pages
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
_app-HASH.js gzip | 193 B | 193 B | ✓ |
_error-HASH.js gzip | 193 B | 193 B | ✓ |
amp-HASH.js gzip | 512 B | 510 B | N/A |
css-HASH.js gzip | 343 B | 342 B | N/A |
dynamic-HASH.js gzip | 1.84 kB | 1.84 kB | ✓ |
edge-ssr-HASH.js gzip | 265 B | 265 B | ✓ |
head-HASH.js gzip | 363 B | 362 B | N/A |
hooks-HASH.js gzip | 393 B | 392 B | N/A |
image-HASH.js gzip | 4.59 kB | 4.58 kB | N/A |
index-HASH.js gzip | 268 B | 268 B | ✓ |
link-HASH.js gzip | 2.35 kB | 2.35 kB | N/A |
routerDirect..HASH.js gzip | 328 B | 328 B | ✓ |
script-HASH.js gzip | 397 B | 397 B | ✓ |
withRouter-HASH.js gzip | 323 B | 326 B | N/A |
1afbb74e6ecf..834.css gzip | 106 B | 106 B | ✓ |
Overall change | 3.59 kB | 3.59 kB | ✓ |
Client Build Manifests
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
_buildManifest.js gzip | 748 B | 747 B | N/A |
Overall change | 0 B | 0 B | ✓ |
Rendered Page Sizes
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
index.html gzip | 523 B | 522 B | N/A |
link.html gzip | 538 B | 537 B | N/A |
withRouter.html gzip | 520 B | 520 B | ✓ |
Overall change | 520 B | 520 B | ✓ |
Edge SSR bundle Size Overall increase ⚠️
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
edge-ssr.js gzip | 129 kB | 129 kB | N/A |
page.js gzip | 209 kB | 209 kB | |
Overall change | 209 kB | 209 kB |
Middleware size
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
middleware-b..fest.js gzip | 670 B | 667 B | N/A |
middleware-r..fest.js gzip | 155 B | 156 B | N/A |
middleware.js gzip | 31.3 kB | 31.3 kB | N/A |
edge-runtime..pack.js gzip | 844 B | 844 B | ✓ |
Overall change | 844 B | 844 B | ✓ |
Next Runtimes Overall increase ⚠️
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
274-experime...dev.js gzip | 322 B | 322 B | ✓ |
274.runtime.dev.js gzip | 314 B | 314 B | ✓ |
app-page-exp...dev.js gzip | 376 kB | 377 kB | |
app-page-exp..prod.js gzip | 131 kB | 131 kB | N/A |
app-page-tur..prod.js gzip | 144 kB | 144 kB | |
app-page-tur..prod.js gzip | 140 kB | 140 kB | |
app-page.run...dev.js gzip | 364 kB | 364 kB | |
app-page.run..prod.js gzip | 127 kB | 127 kB | |
app-route-ex...dev.js gzip | 37.6 kB | 37.6 kB | ✓ |
app-route-ex..prod.js gzip | 25.6 kB | 25.6 kB | ✓ |
app-route-tu..prod.js gzip | 25.6 kB | 25.6 kB | ✓ |
app-route-tu..prod.js gzip | 25.4 kB | 25.4 kB | ✓ |
app-route.ru...dev.js gzip | 39.2 kB | 39.2 kB | ✓ |
app-route.ru..prod.js gzip | 25.4 kB | 25.4 kB | ✓ |
pages-api-tu..prod.js gzip | 9.69 kB | 9.69 kB | ✓ |
pages-api.ru...dev.js gzip | 11.6 kB | 11.6 kB | ✓ |
pages-api.ru..prod.js gzip | 9.68 kB | 9.68 kB | ✓ |
pages-turbo...prod.js gzip | 21.9 kB | 21.9 kB | ✓ |
pages.runtim...dev.js gzip | 27.7 kB | 27.7 kB | ✓ |
pages.runtim..prod.js gzip | 21.9 kB | 21.9 kB | ✓ |
server.runti..prod.js gzip | 916 kB | 916 kB | ✓ |
Overall change | 2.35 MB | 2.35 MB |
build cache Overall increase ⚠️
| | vercel/next.js canary | vercel/next.js hl/fix-inner-use-cache-chunk-loading-warmup | Change |
| --- | --- | --- | --- |
0.pack gzip | 2.1 MB | 2.11 MB | |
index.pack gzip | 75.8 kB | 76.1 kB | |
Overall change | 2.18 MB | 2.18 MB |
Diff details
Diff for edge-ssr.js
Diff too large to display
Diff for main-HASH.js
Diff too large to display
Diff for app-page-exp..ntime.dev.js
failed to diff
Diff for app-page-exp..time.prod.js
Diff too large to display
Diff for app-page-tur..time.prod.js
Diff too large to display
Diff for app-page-tur..time.prod.js
Diff too large to display
Diff for app-page.runtime.dev.js
failed to diff
Diff for app-page.runtime.prod.js
Diff too large to display
eps1lon pushed a commit that referenced this pull request on May 7, 2025:
This PR introduces some extra tracking which makes us treat `import(...)` like a call to a cached function. Thanks to this, `await import(...)` will no longer cause dynamicity errors in `dynamicIO`. The motivation is the same as allowing `fs.readFileSync`: if something is available on the server at prerender time, we don't consider it IO.

Fixes #72589
Closes #75132

### Implementation notes

The tracking is implemented via an SWC transform (`track_dynamic_imports.ts`) that turns `import(...)` into `trackDynamicImport(import(...))`. `trackDynamicImport(promise)` tracks the promise globally, without using `workUnitStore.cacheSignal`. The prospective render subscribes to pending modules using `trackPendingModules(cacheSignal)`, which causes the prospective render to wait for all `import()`s to finish before proceeding to the final render. The mechanism is analogous to `'use cache'`, but the "result" of an `import()` is stored *in the module cache* instead; when we invoke the `import()` again in the actual prerender, it will resolve at microtask speed, like we need it to.

The transform is enabled for all modules that run server-side, both RSC and SSR, because the prerender runs both. This also includes route handlers, because we also use a `cacheSignal` when prerendering those. Notably, we also instrument `import()` in `node_modules` to account for libraries that do lazy initialization.

I've also had to adjust the prospective client render in PPR mode to wait for `cacheSignal.cacheReady()`; otherwise, it wouldn't wait for `import()`s to resolve before doing the final client render, which would subsequently cause a dynamicity error. (We didn't need to wait for `cacheSignal` before, because all cached promises would've already been awaited in the server prerender.)

Finally, we no longer need `warmFlightResponse`, because `trackPendingModules` already makes the `cacheSignal` wait for all loading chunks to finish, so we don't need to do it ahead of time.
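For illustration only, here is a minimal sketch of how the transformed output and the runtime helpers could fit together. The `trackDynamicImport` and `trackPendingModules` names come from the commit above; everything else (the global set, the `beginRead`/`endRead` shape of the cache signal) is an assumption, not the actual implementation:

```ts
// Before the transform (user code):
//   const { format } = await import('date-fns')
//
// After the transform (conceptually):
//   const { format } = await trackDynamicImport(import('date-fns'))

// Hypothetical, simplified runtime helper: keep a global set of pending
// dynamic imports so a prerender can wait for all of them to settle.
const pendingImports = new Set<Promise<unknown>>()

export function trackDynamicImport<T>(modulePromise: Promise<T>): Promise<T> {
  pendingImports.add(modulePromise)
  const cleanup = () => pendingImports.delete(modulePromise)
  modulePromise.then(cleanup, cleanup)
  return modulePromise
}

// Called by the prospective prerender so that it does not finish before
// every pending import() has settled (analogous to waiting on 'use cache').
export async function trackPendingModules(cacheSignal: {
  beginRead(): void
  endRead(): void
}): Promise<void> {
  while (pendingImports.size > 0) {
    const snapshot = [...pendingImports]
    cacheSignal.beginRead()
    // Wait for the current batch; new imports started in the meantime are
    // picked up by the next loop iteration.
    await Promise.allSettled(snapshot)
    cacheSignal.endRead()
  }
}
```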
It turns out that the heuristic in `warmFlightResponse` is not sufficient for loading all necessary SSR chunks before handing off to prerendering.

An example demonstrating this issue has been added as a test case in this PR. In the example, we have a layout with `"use cache"` and a page with `"use cache"` that renders a client component. With Turbopack in particular, it often happens that the client component's chunk hasn't started loading before `warmFlightResponse` is resolved. As a result, dynamic I/O errors are triggered during subsequent SSR prerendering because the chunk may not resolve within microtasks.
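For reference, a fixture along these lines reproduces the setup described above; the file names and contents are an illustrative sketch rather than the exact test case added in this PR:

```tsx
// app/layout.tsx
import type { ReactNode } from 'react'

export default async function RootLayout({ children }: { children: ReactNode }) {
  'use cache'
  return (
    <html>
      <body>{children}</body>
    </html>
  )
}

// app/page.tsx
import { Client } from './client'

export default async function Page() {
  'use cache'
  // Rendering a client component here is what requires an SSR chunk load.
  return <Client />
}

// app/client.tsx
'use client'

export function Client() {
  return <p>hello</p>
}
```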
A more robust approach is to avoid a separate `warmFlightResponse` and instead rely on the cache signal during the initial prerender to track these chunk loads.

However, using the cache signal in the existing `__next_chunk_load__` tracking is not sufficient, because React caches loaded chunks in a module-scoped map and skips calling `__next_chunk_load__` if a cached promise for that chunk already exists. This becomes problematic when a chunk starts loading during a normal dev request (i.e., not a prerender) while, in parallel, a prerender request (for dynamic validation) also requires the same chunk. In that scenario, we have no way to track the chunk load via the prerender cache signal, causing the initial prerender to complete prematurely and trigger a dynamic I/O error during the final prerender.
To address this limitation, we need a way to hook into React's `preloadModule` function so that we can track any module loading regardless of React's internal chunk caching. This would also allow us to cover the resolution phase of async modules. Unfortunately, React does not yet offer this API. In the meantime, we are patching `preloadModule` when generating the vendored packages to include our tracking logic.

An additional benefit of this approach is that it should be faster during `next build`, because pages that are rendered concurrently don't need to wait for each other's chunks to finish loading, even if they don't necessarily need them. With the `trackChunkLoading` approach, this is currently an accepted limitation.
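As a rough illustration of the idea (not the actual patch applied to the vendored packages), the vendored `preloadModule` could be wrapped so that any chunk-load thenable it returns keeps the prerender's cache signal open until it settles. The `originalPreloadModule` and `getCurrentCacheSignal` helpers below are placeholders, and the `beginRead`/`endRead` signal shape is an assumption:

```ts
type Thenable<T> = PromiseLike<T> | null

function patchPreloadModule(
  originalPreloadModule: (metadata: unknown) => Thenable<unknown>,
  getCurrentCacheSignal: () => {
    beginRead(): void
    endRead(): void
  } | null
) {
  return function preloadModule(metadata: unknown): Thenable<unknown> {
    const thenable = originalPreloadModule(metadata)
    const cacheSignal = getCurrentCacheSignal()

    if (thenable && cacheSignal) {
      // Keep the prerender open until this chunk (including the resolution
      // phase of async modules) has settled, even if React has already
      // cached the underlying chunk promise and skips __next_chunk_load__.
      cacheSignal.beginRead()
      const done = () => cacheSignal.endRead()
      thenable.then(done, done)
    }

    return thenable
  }
}
```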