Checking log after error, but file is too big to open - 17GB

Hi there,

I just checked my node this morning and found that it was offline with an error!

So, I restarted it and it now seems to be running fine BUT I’m keen to check the log to see what the error was (in case it requires action).

However, it seems the log file is now far too big to open - 17GB.

So, two questions:

1 - Is there any way to cap the log at a maximum size so that it starts overwriting itself?

2 - Is there any way for me to open this file?

Thanks so much,
Ben


Which operating system are you using? I’m guessing Windows.
In PowerShell you can run Get-Content storagenode.txt -Tail 30 to get the last 30 lines of that file. Increase the number if necessary.
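
For example, from the folder that holds the log (the file name and the ERROR filter are assumptions - adjust them to your setup):

 Get-Content .\storagenode.txt -Tail 30
 # Pre-filter so only error lines from the last 1000 lines are shown:
 Get-Content .\storagenode.txt -Tail 1000 | Select-String "ERROR"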

2 Likes

Thanks! However, I’m having an issue with this - I need to set the tail to a large number, and while I’m trying to scroll up to find the error, the node is constantly writing to the file, so it keeps pulling me back down to the bottom.

Is there a way to get it to stop doing this? Or to export the output instead of viewing it live?

Hi @lookitsbenfb
Stop the node. Rename the file. Start the node. Run Get-Content with -Tail on the renamed file.
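
As concrete commands, something like this (a sketch - the service name storagenode and the paths assume a typical Windows GUI install):

 Stop-Service storagenode
 Rename-Item "C:\Program Files\Storj\Storage Node\storagenode.log" storagenode.old.log
 Start-Service storagenode
 # The renamed file is no longer being written to, so the output stays put
 Get-Content "C:\Program Files\Storj\Storage Node\storagenode.old.log" -Tail 1000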

2 Likes

Thanks! This sort of works, but I’m still having an issue - PowerShell’s console only keeps a limited number of lines in its scrollback buffer, so when I scroll up, I can’t get back to the time of the issue to check it.

Is there a way to export the tail into a file so that I can see all the records?

I use this PowerShell script to split my large log files:

# Source log file and the name prefix for the split chunks
$from = "C:\Program Files\Storj\Storage Node\Logs\storagenode.log"
$rootName = "C:\Program Files\Storj\Storage Node\Logs\storagenode.2022.09"
$ext = "log"
# Size of each chunk (and of the read buffer)
$upperBound = 1000MB

$fromFile = [io.file]::OpenRead($from)
$buff = New-Object byte[] $upperBound
$count = $idx = 0
try {
    do {
        "Reading $upperBound"
        # Read up to one chunk's worth of bytes; $count is 0 at end of file
        $count = $fromFile.Read($buff, 0, $buff.Length)
        if ($count -gt 0) {
            # Write this chunk out as e.g. storagenode.2022.09.0.log
            $to = "{0}.{1}.{2}" -f ($rootName, $idx, $ext)
            $toFile = [io.file]::OpenWrite($to)
            try {
                "Writing $count to $to"
                $toFile.Write($buff, 0, $count)
            } finally {
                $toFile.Close()
            }
        }
        $idx++
    } while ($count -gt 0)
}
finally {
    $fromFile.Close()
}

Just change the $from and $rootName to your liking and you’ll get 1GB files.

Note - it doesn’t care about line boundaries, so it will split the files in the middle of a line.
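
Once it’s split, you can also search the chunks directly instead of scrolling (Select-String accepts wildcards in -Path; the ERROR pattern is just an example):

 Select-String -Path "C:\Program Files\Storj\Storage Node\Logs\storagenode.2022.09.*.log" -Pattern "ERROR"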

4 Likes

That worked amazingly, thank you!!

I’ve found the error, and have created a separate post here to see if anyone can help with it.

Thanks again for your help and for providing that code, it worked great!

Here’s another option if you want to keep working on the same log file.

 Get-Content .\storagenode.txt | sls 2022-10-20 | Out-Host -Paging

It searches the log for lines containing that date (sls is an alias for Select-String). It will show the results in pages, with the bottom of the screen showing this:

 <SPACE> next page; <CR> next line; Q quit 

CR = carriage return (the ENTER key); it shows the next line.
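
If you’d rather capture the matches in a file than page through them, you can redirect the same pipeline (the output file name is just an example):

 Get-Content .\storagenode.txt | sls 2022-10-20 | Out-File matches.txt

Letting Select-String read the file itself (sls -Path .\storagenode.txt -Pattern 2022-10-20) is usually faster on a file this size, since it skips the line-by-line Get-Content pipeline.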

1 Like