Speaking of files, if you feel like testing just how badly your system deals with, say, 1 million files… this is a great way to generate them. I find that copying 1 million files gives me a good idea of how well a system does at processing lots and lots of files…
In the case of ZFS, for example, I can literally see my scrub time go up by an hour or two every time I add 1 million files to the pool… Copying them can also be a bit of a challenge; with my present setup I can manage it in less than 3 minutes,
though I am running a 512K recordsize now. My record, I think, was with a 32K recordsize: less than 1 minute to copy the 1 million files on the same pool, sync=standard in all cases… I have tried sync=always, but the throughput hit kills that for me.
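If you want to put a number on the copy step yourself, here is a minimal sketch of how I'd time it (the paths are just placeholders, not my actual dataset layout):

```python
import shutil
import time
from pathlib import Path

# Hypothetical paths for illustration; point these at your own datasets.
SRC = Path("/mnt/tank/manyfiles")
DST = Path("/mnt/tank/manyfiles-copy")

start = time.perf_counter()
shutil.copytree(SRC, DST)          # walks the tree and copies every file
elapsed = time.perf_counter() - start

file_count = sum(1 for p in DST.rglob("*") if p.is_file())
print(f"copied {file_count} files in {elapsed:.1f} s "
      f"({file_count / elapsed:.0f} files/s)")
```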
Anyway, I figured someone might be interested in just how much of an effect working with large numbers of files has on a system. They are easy to generate… it will take a bit, though.
Right, I have to remember to add the link.
Oh yeah, and his math is a bit off… he forgets a 0 here and there. I usually make 1 million files so I get a more accurate result; of course you can manage with 100K files, which will most likely still put a good bit of strain on your system anyway.
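Until I dig up that link, here is a rough sketch of one way to generate a pile of small files. This is just my own illustration, not necessarily the method from the linked post; the target path, file size, and directory layout are made up:

```python
import os
from pathlib import Path

# Hypothetical target directory and counts; adjust to taste.
ROOT = Path("/mnt/tank/manyfiles")
NUM_FILES = 1_000_000       # drop to 100_000 for a lighter run
PER_DIR = 1_000             # spread files over subdirs so no single dir gets huge
PAYLOAD = os.urandom(4096)  # 4 KiB of random data, reused for every file

for i in range(NUM_FILES):
    subdir = ROOT / f"dir{i // PER_DIR:04d}"
    subdir.mkdir(parents=True, exist_ok=True)  # no-op once the dir exists
    (subdir / f"file{i:07d}.bin").write_bytes(PAYLOAD)

print(f"wrote {NUM_FILES} files under {ROOT}")
```

Splitting the files across subdirectories keeps any single directory from growing into the hundreds of thousands of entries, which some tools handle poorly even when the filesystem itself copes fine.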