Node keeps stopping without logging the event

Very frustrating that Storj (Windows) keeps stopping without explaining or logging why

I need this node to be bulletproof. The last two months were rock solid, but since August’s update it has not been running well.

The uptime scores are terrible this month; there’s no event data to tell me why, and I can’t spend every waking moment babysitting the server.

The previous log entries were lost because the logs don’t roll over on their own (a 4 GB log file is too big to open).
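(A tail read at least gets the recent entries without opening the whole 4 GB file - a minimal sketch, assuming the default install path for the log:)

# Read only the last 100 lines of the main node log
Get-Content -Tail 100 "C:\Program Files\Storj\Storage Node\storagenode.log"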

Anyway, 8-21 is the date of the last entry in the log.
8-22 is when I did the health check and restarted the service after I saw it was offline yet again:


2021-08-21T17:47:49.369-0400	INFO	piecestore	uploaded	{"Piece ID": "A5VGTG4T4OVF3WR25TGIKO3H3B44MF4RN3JMUSPEKY2HIG3LILIA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Size": 21504}
2021-08-21T17:47:49.528-0400	INFO	piecestore	download started	{"Piece ID": "IWHQCKKCLZMRDA33WXEO3HE3NYLPGPEC7AVBSE3FSX3J3L2GC6PQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "GET"}
2021-08-22T19:38:31.405-0400	INFO	Configuration loaded	{"Location": "C:\\Program Files\\Storj\\Storage Node\\config.yaml"}
2021-08-22T19:38:35.438-0400	INFO	Telemetry enabled
2021-08-22T19:38:35.577-0400	INFO	db.migration	Database Version	{"version": 53}
2021-08-22T19:38:35.787-0400	INFO	preflight:localtime	start checking local system clock with trusted satellites' system clock.
2021-08-22T19:38:36.278-0400	INFO	preflight:localtime	local system clock is in sync with trusted satellites' system clock.
2021-08-22T19:38:36.278-0400	INFO	trust	Scheduling next refresh	{"after": "4h50m55.877354181s"}
2021-08-22T19:38:36.279-0400	INFO	bandwidth	Performing bandwidth usage rollups
2021-08-22T19:38:36.281-0400	INFO	Node started
2021-08-22T19:38:36.281-0400	INFO	Public server started on [::]:28967
2021-08-22T19:38:36.281-0400	INFO	Private server started on 127.0.0.1:7778
2021-08-22T19:38:38.032-0400	ERROR	collector	unable to delete piece	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "G2IOC3TEFGJVWXPOLMNP3ENJSNHVCGUS2XBB72PUFT5IWLKR7Y7A", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:103\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:239\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Delete:220\n\tstorj.io/storj/storagenode/pieces.(*Store).Delete:299\n\tstorj.io/storj/storagenode/collector.(*Service).Collect:97\n\tstorj.io/storj/storagenode/collector.(*Service).Run.func1:57\n\tstorj.io/common/sync2.(*Cycle).Run:92\n\tstorj.io/storj/storagenode/collector.(*Service).Run:53\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:40\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-08-22T19:38:38.352-0400	ERROR	collector	unable to delete piece	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "YHURXVS34GKO65CGGJ6E4ROUMXSCVKQSMJK3E2XLRJ4WU4S3KIFQ", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:103\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:239\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Delete:220\n\tstorj.io/storj/storagenode/pieces.(*Store).Delete:299\n\tstorj.io/storj/storagenode/collector.(*Service).Collect:97\n\tstorj.io/storj/storagenode/collector.(*Service).Run.func1:57\n\tstorj.io/common/sync2.(*Cycle).Run:92\n\tstorj.io/storj/storagenode/collector.(*Service).Run:53\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:40\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-08-22T19:38:38.582-0400	ERROR	collector	unable to delete piece	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "3MPRZTFLBZJB3WM7LAHEWY3EKNJZFBVIULPDR6PSA7OLDOQMCFAA", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:103\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:239\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Delete:220\n\tstorj.io/storj/storagenode/pieces.(*Store).Delete:299\n\tstorj.io/storj/storagenode/collector.(*Service).Collect:97\n\tstorj.io/storj/storagenode/collector.(*Service).Run.func1:57\n\tstorj.io/common/sync2.(*Cycle).Run:92\n\tstorj.io/storj/storagenode/collector.(*Service).Run:53\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:40\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-08-22T19:38:38.595-0400	ERROR	collector	unable to delete piece	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "L7CC27RBXVPK4GB6CDJBA5XBERIAJJPW5EOGGWTNOGJUWKPGTHZA", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:103\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:239\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Delete:220\n\tstorj.io/storj/storagenode/pieces.(*Store).Delete:299\n\tstorj.io/storj/storagenode/collector.(*Service).Collect:97\n\tstorj.io/storj/storagenode/collector.(*Service).Run.func1:57\n\tstorj.io/common/sync2.(*Cycle).Run:92\n\tstorj.io/storj/storagenode/collector.(*Service).Run:53\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:40\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-08-22T19:38:38.678-0400	ERROR	collector	unable to delete piece	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "6WL67EG4Q2GHGW26WC6U227YUXNHZ6F6TVY3JLIB4CUA6X2NOCSQ", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:103\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:239\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Delete:220\n\tstorj.io/storj/storagenode/pieces.(*Store).Delete:299\n\tstorj.io/storj/storagenode/collector.(*Service).Collect:97\n\tstorj.io/storj/storagenode/collector.(*Service).Run.func1:57\n\tstorj.io/common/sync2.(*Cycle).Run:92\n\tstorj.io/storj/storagenode/collector.(*Service).Run:53\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:40\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-08-22T19:38:39.118-0400	INFO	collector	delete expired	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "BVN32HS3ILBYJI6JENDDFZJB5NWLWQTGSCWJ4L5ULWWDSSRGVTJA"}
2021-08-22T19:38:39.510-0400	INFO	collector	delete expired	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "25OHTSWUHLUKPY2DGH7QQYCLOKKQ4S377C5XCT7ICOYRUCR3BPCQ"}
2021-08-22T19:38:39.867-0400	INFO	collector	delete expired	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "KUKOADNRUP64LNO5OFLED7PWONQ64LFA2N4YVIMXFAPXULCGARXQ"}
2021-08-22T19:38:40.018-0400	INFO	collector	delete expired	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "Z7D65W7LYGDXONF773ECYEHY3NF4I6FORKW6E4LRDBNKFIKD3IEQ"}
2021-08-22T19:38:40.359-0400	INFO	collector	delete expired	{"Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "72USA7TYEPQX7GG3OWXSARGQJUNL7JD4TOYKOGQPTIOJ7DJVZCFQ"}

If the node log isn’t being written to, then it’s something outside of the node causing the storagenode process to crash/stop. Please check the Windows Event Viewer > Application Log for events around that time - 2021-08-21 17:47:49.
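A quick way to pull that window from PowerShell, in case it helps - the time range below is only an example around the last log entry:

# List Application log events around the time of the last storagenode log entry
$start = Get-Date '2021-08-21 17:00'
$end   = Get-Date '2021-08-21 19:00'
Get-WinEvent -FilterHashtable @{ LogName = 'Application'; StartTime = $start; EndTime = $end } |
    Select-Object TimeCreated, ProviderName, Id, Message |
    Format-List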

Please stop the storagenode service and check the disk that holds your data for errors.
If this is a USB disk, make sure it has an external power supply, check all cables, and check the system journals in Event Viewer - do you have any disconnections of the disk?
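For example - this assumes the service name is storagenode and that D: is the data drive; adjust both if yours differ:

# Stop the node, then check the data disk for filesystem errors
Stop-Service storagenode
chkdsk D: /f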

It’s a connection to an iSCSI storage server, and it’s stable - no disconnects at the times Storj stops.

No Application Log events for the time of Storj stopping, but the System Log has this:

The Storj V3 Storage Node service terminated unexpectedly. It has done this 2 time(s).

Please search for the reason for the termination in the Event Viewer.
It doesn’t look like the storagenode stopped itself; it was terminated by the OS.
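If it helps, a sketch for correlating this in PowerShell - the quoted message corresponds to Service Control Manager event 7034, and the 5-minute look-back window before each termination is arbitrary:

# Find each unexpected termination of the node service, then show what the System log
# recorded in the five minutes leading up to it
$crashes = Get-WinEvent -FilterHashtable @{ LogName = 'System'; ProviderName = 'Service Control Manager'; Id = 7034 } |
    Where-Object { $_.Message -match 'Storj' }
foreach ($c in $crashes) {
    Get-WinEvent -FilterHashtable @{ LogName = 'System'; StartTime = $c.TimeCreated.AddMinutes(-5); EndTime = $c.TimeCreated } |
        Format-Table TimeCreated, ProviderName, Id, LevelDisplayName -AutoSize
}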

Please check the updater logs too:

Get-Content -Tail 10 "C:\Program Files\Storj\Storage Node\storagenode-updater.log"
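If the last 10 lines don’t cover the crash window, Select-String can search the whole file without opening it (the patterns below are just examples):

# Search the updater log for errors or for activity on the date the node stopped
Select-String -Path "C:\Program Files\Storj\Storage Node\storagenode-updater.log" -Pattern 'ERROR', '2021-08-21'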