Mirror of https://github.com/sigoden/dufs.git, synced 2026-04-08 16:49:02 +03:00
[GH-ISSUE #173] Slow when listing folders with a lot of files, e.g. millions. #90
Originally created by @zanderzhng on GitHub (Feb 14, 2023).
Original GitHub issue: https://github.com/sigoden/dufs/issues/173
Specific Demand
I have several folders containing a lot of files; some might have a million files. The webpage is very slow to load these listings.
Implement Suggestion
Could the listing be split into pages, showing only a fixed number of files per page?
@sigoden commented on GitHub (Feb 20, 2023):
With that many files in a directory, even a local file explorer struggles to load the listing, let alone dufs's web-based file explorer.
Since dufs keeps no database or index of files, it must query the local file system on every request. Pagination would only reduce the amount of data transferred; it would not reduce the time spent querying the file system, so the optimization would be marginal.
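To see why pagination alone would not help much, consider a minimal sketch (in Python, purely illustrative; dufs itself is written in Rust and `list_page` is a hypothetical helper, not dufs code): to return a stable, sorted page, the server still has to enumerate every entry in the directory, so only the response size shrinks, not the file-system work.

```python
import os

def list_page(path: str, page: int, per_page: int) -> list[str]:
    """Return one sorted page of directory entry names.

    Illustrative only: note that scandir must walk ALL entries
    before sorting, regardless of how small the page is.
    """
    entries = sorted(e.name for e in os.scandir(path))
    start = page * per_page
    return entries[start:start + per_page]
```

Even with `per_page=50`, a directory of a million files still costs a million `scandir` entries plus a full sort on every request.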
For this million-file scenario, you should use an object storage system such as Amazon S3, which builds an index/database for the files, so queries and paging are very efficient.
If the listing is not important, you can use `--render-index` in conjunction with a custom `index.html` so that dufs serves only the files, without generating the directory listing.
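A minimal setup might look like the following (a sketch, assuming dufs is installed and that `/srv/files` is the directory being served; the landing page content is your own):

```sh
# Provide a hand-written landing page in the served directory.
# With --render-index, dufs renders this index.html for directory
# requests instead of generating a file listing.
cat > /srv/files/index.html <<'EOF'
<p>Direct file links only; no listing.</p>
EOF

dufs /srv/files --render-index
```

Files remain downloadable by direct URL, so clients that already know the paths are unaffected; only the expensive directory enumeration is skipped.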