Anyone got a script to extract filewalker data from logs?

Hi, does anyone have a script that can compile a report from the node’s log file?

like: time the filewalker started, time it ended, how long it lasted, whether it was a success
(e.g.
u1.satellite:
started 27.01.2024 14:30:09, ended 27.01.2024 23:30:09, lasted 9h, type: used-space filewalker, status: error (or success; if error, which error)

there are 4 types of filewalkers, but I’m mostly interested in just one: lazyfilewalker.used-space-filewalker

I’m asking because my node has 291h of uptime at nearly 100% online and holds about 3TB, yet the dashboard still shows 1.3TB average disk used and 3.2TB used, and the filewalker is still going. Disk latency is decent, so I don’t know why those filewalkers never finish or what happens, while the node is NOT being restarted or interrupted or anything.

I’m using the log-rotate script from this forum, so I’ve got a few GB of logs across several files. I need a script that searches those files, pulls the filewalker information out of them in chronological order, and shows how long each walk took and whether it finished (and if not, why). Ideally it should work in Windows PowerShell.

PowerShell:

sls "gc-filewalker" "C:\Program Files\Storj\Storage Node\*.log" | sls "started|finished|error|failed"

For Linux:
grep -w "filewalker" path/to/log/file
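The one-liners above find the matching lines but don’t pair starts with finishes or compute durations. Here’s a rough cross-platform Python sketch that does that pairing. Note the patterns are assumptions: the timestamp regex, the message keywords (“start”, “finish”, “error”), and the `"satelliteID"` JSON field are based on typical storagenode log output, so check a few of your own log lines and adjust them to match your version’s exact format.

```python
import re
from datetime import datetime, timedelta

WALKER = "lazyfilewalker.used-space-filewalker"
# ISO-style timestamp at the start of the line -- an assumption,
# adjust the pattern if your log lines look different.
TS_RE = re.compile(r"^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})")

def parse_events(lines):
    """Yield (timestamp, event, satellite) for used-space-filewalker lines."""
    for line in lines:
        if WALKER not in line:
            continue
        m = TS_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), "%Y-%m-%dT%H:%M:%S")
        sm = re.search(r'"satelliteID":\s*"([^"]+)"', line)
        sat = sm.group(1) if sm else None
        low = line.lower()
        if "start" in low:
            yield ts, "start", sat
        elif "error" in low or "failed" in low:
            yield ts, "error", sat
        elif "finish" in low or "completed" in low:
            yield ts, "finish", sat

def report(lines):
    """Pair each start with the next finish/error for the same satellite."""
    open_runs = {}   # satellite -> start timestamp
    rows = []
    for ts, event, sat in parse_events(lines):
        if event == "start":
            open_runs[sat] = ts
        elif sat in open_runs:
            start = open_runs.pop(sat)
            status = "success" if event == "finish" else "error"
            rows.append((sat, start, ts, ts - start, status))
    # Anything still open never logged a finish/error line at all.
    for sat, start in open_runs.items():
        rows.append((sat, start, None, None, "still running / interrupted"))
    return rows
```

Feed it all your rotated files in chronological order (e.g. read `sorted(glob.glob(r"C:\Program Files\Storj\Storage Node\*.log"))` and concatenate the lines) so start/finish pairs that span a rotation still match up.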