Virtualize is very inefficient if you've already scrolled way beyond the end of the dataset #37245

@SteveSandersonMS

Description

With <Virtualize>, you can supply an ItemsProvider and return data programmatically instead of having it pre-instantiated in memory. Each time the data is refreshed, you can return a different totalItemCount value to represent changes in the dataset length.
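For context, an ItemsProvider is just a delegate that returns a slice of data plus the current total count. A minimal sketch (the Product type and the CountProductsAsync/FetchProductsAsync helpers are hypothetical stand-ins for whatever data access you use):

```csharp
// Inside a component that renders <Virtualize ItemsProvider="LoadProducts">.
// Product, CountProductsAsync, and FetchProductsAsync are hypothetical.
private async ValueTask<ItemsProviderResult<Product>> LoadProducts(ItemsProviderRequest request)
{
    // The total can change between refreshes; Virtualize uses it to size the scroll range
    var totalItemCount = await CountProductsAsync();

    // Fetch only the requested window of items
    var items = await FetchProductsAsync(request.StartIndex, request.Count);

    return new ItemsProviderResult<Product>(items, totalItemCount);
}
```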

This all works well except in the following case:

  1. The component is displaying a large dataset
  2. The user scrolls to the end
  3. The component changes to display a much smaller dataset

... or, more generally, any case where the dataset becomes shorter while the user is already scrolled beyond the new dataset's maximum extent.

What should happen in this case is that Virtualize sees that its scroll offset is too large to be valid and immediately clamps it to the new maximum. What actually happens, however, is:

  • It requests a chunk of data that's beyond the end of the list. Call this page N. The provider will return an empty set.
  • It renders the empty set.
  • It detects that the user has scrolled too far, but because the offset is outside the valid range, the existing logic can't calculate the new scroll offset. Instead, it falls back to the previous page, N-1, and repeats this loop until it reaches a nonempty page of data.

So if you're scrolled to offset 100000 (say) and then the dataset shrinks so the new maximum offset is 2000, then instead of just jumping back to 2000 it renders at scroll offset 99800, then 99600, then 99400, ... all the way until it gets to 2000. On each iteration it calls the data provider, which may make a DB query, and performs a UI refresh. This is absurdly inefficient.
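For a rough sense of the cost, here's an illustrative simulation of that retreat (a sketch only, using the offsets quoted above and a 200-unit page step; the real component's step depends on item and viewport size):

```csharp
// Illustrative only: the page-by-page retreat described above.
var scrollOffset = 100_000;
var maxValidOffset = 2_000;
const int pageStep = 200; // matches the 99800, 99600, ... sequence above

var providerCalls = 0;
while (scrollOffset > maxValidOffset)
{
    scrollOffset -= pageStep; // fall back one page...
    providerCalls++;          // ...at the cost of a provider call (maybe a DB query) and a render
}

Console.WriteLine(providerCalls); // 490 round trips where 1 would do
```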

Fixing it

This is a really straightforward fix. The component already knows the maximum valid offset. It simply needs to check whether the calculated offset exceeds that maximum and, if so, clamp it back down. Then it completes this long iterative process in a single step.
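A minimal sketch of that clamp, assuming a fixed item size (the names here are illustrative, not the actual Virtualize internals):

```csharp
// Sketch only: clamp the requested scroll offset before computing which page to load.
// itemSize and viewportHeight are illustrative names, not real Virtualize fields.
static double ClampScrollOffset(double scrollOffset, int totalItemCount, double itemSize, double viewportHeight)
{
    var maxValidOffset = Math.Max(0, totalItemCount * itemSize - viewportHeight);
    return Math.Min(scrollOffset, maxValidOffset);
}
```

With that check in place, the example above jumps from offset 100000 straight to 2000, costing one data provider call instead of hundreds.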

It's surprising we didn't impose this rule originally. I think the reason is that, for sufficiently small datasets, the iterative process completes near-instantly anyway, so we may not have noticed it.

Metadata

Labels

Done (This issue has been fixed) · area-blazor (Includes: Blazor, Razor Components) · bug (This issue describes a behavior which is not expected - a bug)
