[GH-ISSUE #173] Slow when listing folders with a lot of files, e.g. millions. #90

Closed
opened 2026-04-08 16:50:16 +03:00 by zhus · 1 comment
Owner

Originally created by @zanderzhng on GitHub (Feb 14, 2023).
Original GitHub issue: https://github.com/sigoden/dufs/issues/173

Specific Demand

I have several folders with a lot of files; some of them contain around a million files. The webpage loads very slowly.

Implementation Suggestion

Could the listing be paginated, showing only a certain number of files per page?

zhus closed this issue 2026-04-08 16:50:16 +03:00
Author
Owner

@sigoden commented on GitHub (Feb 20, 2023):

If there are that many files in a directory, even a local file explorer will struggle to load it, let alone the dufs web file explorer.

Since dufs does not maintain a database or index of files, it must query the local file system on every request. Pagination would only reduce the amount of data transferred; it would not reduce the time spent scanning the file system, so the optimization would not be significant.

For this million-file scenario, you should use an object storage system such as Amazon S3, which builds an index/database of objects, so queries and pagination are very efficient.

If the listing is not important, you can use `--render-index` together with a custom `index.html` so that dufs serves only files, without rendering directory listings.
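A scaled-down sketch of why pagination alone would not help much: the directory still has to be enumerated in full before any "page" can be produced, so the scan cost depends on the total entry count, not on how many entries are shown. The file count and names below are illustrative only.

```shell
# Create a throwaway directory with many files (5000 stands in for "millions")
dir=$(mktemp -d)
for i in $(seq 1 5000); do : > "$dir/file_$i"; done

# Even when only 50 entries are kept, ls must still read the whole
# directory first -- trimming the output does not shrink the scan.
total=$(ls -1 "$dir" | wc -l | tr -d ' ')
page=$(ls -1 "$dir" | head -n 50 | wc -l | tr -d ' ')
echo "total=$total page=$page"

rm -rf "$dir"
```

The same applies to a paginated dufs listing: the server would still `readdir` every entry per request, only sending less of the result to the browser.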
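A minimal sketch of that workaround, assuming the `--render-index` flag described above; the `index.html` contents and the share path are placeholders, and the dufs invocation is shown rather than executed:

```shell
# Hypothetical share directory with a placeholder landing page
share=$(mktemp -d)
cat > "$share/index.html" <<'EOF'
<!doctype html>
<p>Direct downloads only; directory listing is disabled.</p>
EOF

# With --render-index, dufs serves index.html for directory requests
# instead of generating a file listing (command shown, not run here).
echo dufs --render-index "$share"
```

Clients then fetch files by direct URL (e.g. `/path/to/file`), and no directory ever has to be enumerated for display.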

<!-- gh-comment-id:1437181093 -->

Reference: sigoden/dufs#90