Loop in the update process

Hi, All

One node regularly starts the update process, which turns into an endless loop.
Any ideas why this is happening?

2024-07-19T15:59:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-19T15:59:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-19T16:14:47Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-19T16:14:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-19T16:14:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-19T16:14:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-19T16:14:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-19T16:29:47Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-19T16:29:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-19T16:29:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-19T16:29:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-19T16:29:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-19T16:44:47Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-19T16:44:47Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-19T16:44:47Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}

Completely normal. As you can see, it checks every 15 minutes. The update process is a rolling one: nodes slowly receive the new version until 100% of nodes are reached. Your node has not yet been selected for the update, as you can see from the “but hasn’t made it to this node yet” part.
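If you want to check the rollout yourself, you can query the version server directly; something like this should show the suggested version and rollout state (the exact JSON field names may differ, and jq is only needed for pretty-printing):

curl -s https://version.storj.io | jq '.processes.storagenode'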


@Stez what’s strange is that the node goes offline in this loop until it is restarted.

You will have to check the logs then to find an error that would cause the node to stop. I see no reason why the updater would stop the node, as there is nothing to update (yet). Chances are the node is experiencing an issue of its own.


Please search for Unrecoverable errors in the node’s logs. Please note that if you redirected the logs, they will not be shown by docker logs; you need to search the redirected file instead.
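For example (the file path below is just a placeholder for your redirected log):

sudo docker logs storagenode 2>&1 | grep -iE "unrecoverable|fatal" | tail

grep -iE "unrecoverable|fatal" /path/to/your/node.log | tail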

Maybe I will just reinstall this buggy node)

storagenode/pieces.(*Store).Writer(0xc0001680e0, {0x1580470, 0xc0074923c0}, {0xa2, 0x8b, 0x4f, 0x4, 0xe1, 0xb, 0xae, ...}, ...)\n\t/go/src/storj.io/storj/storagenode/pieces/store.go:233 +0x27a\nstorj.io/storj/storagenode/piecestore.(*Endpoint).Upload(0xc000202000, {0x1583e30, 0xc0070e3310})\n\t/go/src/storj.io/storj/storagenode/piecestore/endpoint.go:390 +0xfed\nstorj.io/common/pb.DRPCPiecestoreDescription.Method.func1({0x11b0b80?, 0xc000202000}, {0xc00744c440?, 0x1d?}, {0x10fbec0?, 0xc007213380}, {0x156e240?, 0x0?})\n\t/go/pkg/mod/storj.io/common@v0.0.0-20240425113201-9815a85cbc32/pb/piecestore2_drpc.pb.go:294 +0xab\nstorj.io/drpc/drpcmux.(*Mux).HandleRPC(0xc0003cca80?, {0x1580f00, 0xc007213380}, {0xc00744c440, 0x1d})\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmux/handle_rpc.go:33 +0x20d\nstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC(0xc0004425b8, {0x1581280, 0xc007213340}, {0xc00744c440, 0x1d})\n\t/go/pkg/mod/storj.io/common@v0.0.0-20240425113201-9815a85cbc32/rpc/rpctracing/handler.go:61 +0x2e3\nstorj.io/common/experiment.(*Handler).HandleRPC(0xc00044a220, {0x15813c0, 0xc003f82000}, {0xc00744c440, 0x1d})\n\t/go/pkg/mod/storj.io/common@v0.0.0-20240425113201-9815a85cbc32/experiment/import.go:42 +0x167\nstorj.io/drpc/drpcserver.(*Server).handleRPC(0xc004328000?, 0x15801c0?, {0xc00744c440?, 0x1db7fa0?})\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcserver/server.go:167 +0x42\nstorj.io/drpc/drpcserver.(*Server).ServeOne(0xc0000200a0, {0x15808d8, 0xc001be59b0}, {0x157a040?, 0xc003137500?})\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcserver/server.go:109 +0x1e5\nstorj.io/drpc/drpcserver.(*Server).Serve.func2({0x15808d8, 0xc001be59b0})\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcserver/server.go:157 +0x59\nstorj.io/drpc/drpcctx.(*Tracker).track(0xc001be59b0, 0xc009004f10?)\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcctx/tracker.go:35 +0x2e\ncreated by storj.io/drpc/drpcctx.(*Tracker).Run in goroutine 1097\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcctx/tracker.go:30 +0x79\n\ngoroutine 169952 [select]:\nstorj.io/drpc/drpcmanager.(*Manager).manageStreams(0xc0035aab40)\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:319 +0x11d\ncreated by storj.io/drpc/drpcmanager.NewWithOptions in goroutine 169950\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:122 +0x456\n\ngoroutine 182166 [select]:\nstorj.io/drpc/drpcmanager.(*Manager).manageReader(0xc0033ab680)\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:265 +0x325\ncreated by storj.io/drpc/drpcmanager.NewWithOptions in goroutine 182165\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:121 +0x416\n\ngoroutine 174332 [select]:\nstorj.io/drpc/drpcmanager.(*Manager).manageReader(0xc0035aa1e0)\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:265 +0x325\ncreated by storj.io/drpc/drpcmanager.NewWithOptions in goroutine 174331\n\t/go/pkg/mod/storj.io/drpc@v0.0.34/drpcmanager/manager.go:121 +0x416\n"}
2024-07-23T02:35:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T02:35:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T02:35:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T02:35:59Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T02:35:59Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T02:43:16Z	ERROR	pieces:trash	emptying trash failed	{"Process": "storagenode", "error": "pieces error: filestore error: context canceled; context canceled", "errorVerbose": "pieces error: filestore error: context canceled; context canceled\n\tstorj.io/storj/storagenode/blobstore/filestore.(*blobStore).EmptyTrash:193\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).EmptyTrash:319\n\tstorj.io/storj/storagenode/pieces.(*Store).EmptyTrash:429\n\tstorj.io/storj/storagenode/pieces.(*TrashChore).Run.func1.1:84\n\tstorj.io/common/sync2.(*Workplace).Start.func1:89"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "4JNQ2FAQKR43RZUOF553W6QHAEQWMDWDQYZ7ALW6UOKEWIUQU6AQ", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "83.89.250.40:53702"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "BSZJXX24JU26APEGZBN2TRLAWGB2GRWOP5S25VE7ENMTSZ7TD5XA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.241:47764"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "JZWNHN7GCX4ZAYZWGGYBL4CYLC6FLPTNXRPP62CNYAZWYVVCNGUQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.219.43:57510", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "ICLZ2626V5UIKE6L5I3NCEFHUS23RC52SUSDA5XCUZBKBVOB5J6Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.201.209:46398", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "7K4VOO2DKBEZPKDJXNTHUR2LONMWGKR3W3USM2QF5DV4X2C2RVBA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT_REPAIR", "Remote Address": "199.102.71.65:33704", "Size": 0}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "QG5LREH5Y3RJU55QB7MU3S6Q6RJQJHH24PSOCJO2UXNL7UPF2X5Q", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.99:59768", "Size": 327680}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "WDAD7GTHGAKWBMQKDXASPDOYZBN5C6TBXXHYMUMAELZD4IX35S3Q", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.201.212:50072", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "RHLYIBJGTYF3FJUDLGHYKDG6SGPBGSYOUWH25O3Y5F33IBMWN62A", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.99:39552", "Size": 1900544}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "ZH25B3KIBF2JWW7P3LUUCES3DAB2I4APRFVN4MRGJNTWG2X25HMQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT_REPAIR", "Remote Address": "5.161.245.22:12505", "Size": 0}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "G7PFNRMOBCCQ3WMRNMROVRK66IVAXPLMIXE2JHVWR2GSADG74UYA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.243:58576", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "2RHPNJIMYHE757D7PMP5LNSQMU7MEGLWQRZ6XKFES3K5ITR2UYPA", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.102:43988", "Size": 2228224}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "KROO3JYNCGAYKS4YVK33ISSIMQD23O72ICVKIBTDLQ2Z2JD5NR4A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.41:41624", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "2ECNBG5PTU4DJFY3BZTX34GXYQFIDN365SVFUODEGG3MBF3VY2IQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.42:43288", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "O4YHTZTGHBSKZ5ZJH5WSU3VNVMVJNQ2KSBFRAOBXMXXTZL5SFX7A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT_REPAIR", "Remote Address": "199.102.71.63:39178", "Size": 0}
2024-07-23T02:43:16Z	ERROR	retain	retain pieces failed	{"Process": "storagenode", "cachePath": "config/retain", "error": "retain: filewalker: context canceled", "errorVerbose": "retain: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePiecesToTrash:181\n\tstorj.io/storj/storagenode/pieces.(*Store).WalkSatellitePiecesToTrash:568\n\tstorj.io/storj/storagenode/retain.(*Service).retainPieces:373\n\tstorj.io/storj/storagenode/retain.(*Service).Run.func2:259\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "7I7Y7ZP6DP6TJ5Y6PO2WC2FALJPQPBAEHKZXNIAHLKBDB7DXO4AQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.98:57602", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "JUO4K2OSLBB6AWPV2HRGOOJM7OADX75AEFJRLGWTWCMG77RO4RVQ", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT_REPAIR", "Remote Address": "49.13.230.80:46746", "Size": 0}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "SEPAAT7RTSSKD5UO7W36HNCVDDLQITC7LJNE5AXWRF76D76PJY7A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.44:39318"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "NP4643X5GSOG6XZE7KFY47IRH6CV5GNW6W7YXWAD4Z5GLH36FB5Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.34:55642"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "FB6U7HAYRBVKGWLBKMM5CULNKV4UGPJBKRQMNIHRHISAFEYDQKNQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.225:51778", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "JGPHQDD3SJCEN3MVMU3AP3MYXATLNJRNI7GNKH5ZTG2GT2D6RHIQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.33:38110", "Size": 524288}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "VHP5A4Y6ZHNQ3GMU7U7FIY4NEHCWTZZ6ZBXQX6HFAKP2BT7BCM6A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.201.210:42256"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "H6T6RBBXUHZMTBM5AELZDTBBG6XSPXRELJYBND5RFQZLKACMI5MQ", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.99:55856", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "SYZHPISAQYNDN4AAB3J2EYWOABH4PBU6FWUI5KJUAA7T44XYEFGA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.40:60362"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "6NHSLK7EJ7EIBLHGWV6ZC2I2G3ROWCVKQ3H36HEYPB2NYDJEJE3Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.201.213:46386"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "UO4DEBIG7IAFWWMJYQODGCDQUECLBILI6HBQQKRCEX4KZNTE42RQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.101:36482"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "EZYI42AOTCBXKTME4ZI2YWPFKKP2T5BDFGPPJU3II4XOK6EJAZ5Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.41:56250"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "U5QZLJ7HDOQUD5TT4JXHXRG4LX7VPCIXK5XL7N5MOQIZ62LUYD6Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.239:39732"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "ELNGXH4TTRJROYXDDISTCQLTELF2AXNAAHMDNJDVJRRZIKZIUMUA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.35:38080"}
2024-07-23T02:43:16Z	ERROR	retain	retain pieces failed	{"Process": "storagenode", "cachePath": "config/retain", "error": "retain: filewalker: context canceled", "errorVerbose": "retain: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePiecesToTrash:181\n\tstorj.io/storj/storagenode/pieces.(*Store).WalkSatellitePiecesToTrash:568\n\tstorj.io/storj/storagenode/retain.(*Service).retainPieces:373\n\tstorj.io/storj/storagenode/retain.(*Service).Run.func2:259\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "UWHZTI6SGGZG3AUKAO43F5LJR34R326SPETWEZQ45US6VFCES5JA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.234:35176", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "U6PFUEKZSLHUZKZ7ZQ6VJ7BVBCOCE3SVW5KFDP4MMNXE3STQZFDA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.219.36:45444", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "A6J2NQKBRJXXLWUJ6JOPBFEUKTRAQGS6Y7OII63JAAD6LTGOYH5Q", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.97:35346", "Size": 2031616}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "Q6OWCRB4WK3AZ4RZNNOWTNGLHIEHUGS7RMMUWV4BXKQJNLY3DPFA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "173.47.190.119:6150", "Size": 1703936}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "7RBZUIFAZUWXXD55H7ZBVJG3D2X6SX2SIKXEAVRWJHQT4ZMCLGTA", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT", "Remote Address": "79.127.226.98:49586", "Size": 1245184}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "SHPWSYION6JWTP433BQB7ZGONF2R7XO7YSDJTPH6S6ZI6JMJIF6Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.97:44586", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "NCGO7IQCXZLCRTBLMHTWZV4PQAL5QWHVTZKC6NIY332U3LA5DCHA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.243:43292", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "7EPXSXEY53HE3BBTJ6X6FZQ6AOZB2OQQKBWDC72BSIGCT5RDG2VA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.97:41540", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "BLTRQYBBV3FJ4NGUE3TQDFA4SQ26YP4GPOK4X22S3F5NRYZH6VQA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.240:57480"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "ELMXZSNEEFEHOPPVAJEOGCXP7NSRQ6WNRQR3IVOW3FNYBMKFCQRA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.235:59680", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "WZSIDQP73XUICVAKRYT444UHKWBLXMTIE6L5CLOUAET2S7FBGPCA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.219.37:39912", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "EDXO7QBO3NZNZQ2SVPMDPPDZO6HKTEYBBHU7U52KYKVHX555GH5A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.201.213:38016"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "N77VXGRLFV77DR2KW3PQOWJNKN426DOQSBRB2R7C722YC3PNORWQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.201.212:57392", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "D6YETDEE5BF6O6IVWNDVSMST4BR3SRJPKW4IEITAENTHFN55DHYA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.213.34:39976"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "K6S5U6AB6NRSIX6TVLZSUVTRZ2DHGWMYZKZF6IWUGGTBEZMGYKMA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.99:35336", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "4BTCBTG5ECW6ZD5EJKTYO26I5FG4J6CQDEIJG3TCFQFLYDNEVP4A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.33:51616"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "2343KKKNWQODVDZSXTTAHL6HWKLVW2MQLPXPWMY5UICNG7MEPVOQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.230:37988", "Size": 65536}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "MP4W4USJ3CMEENOVRFX675Q2PEQDZMYSPEAVLIGBW63ZQFSZ2EIA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.99:37368"}
2024-07-23T02:43:16Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "KFLQEWLKRB2FU5ZWJAZACKCVP2WJPOCDLNZG5PWQNMOGJ4QM75AQ", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Action": "PUT_REPAIR", "Remote Address": "91.107.232.65:52324", "Size": 0}
2024-07-23T02:43:26Z	ERROR	gracefulexit:chore	error retrieving satellites.	{"Process": "storagenode", "error": "satellitesdb: database is locked", "errorVerbose": "satellitesdb: database is locked\n\tstorj.io/storj/storagenode/storagenodedb.(*satellitesDB).ListGracefulExits:197\n\tstorj.io/storj/storagenode/gracefulexit.(*Service).ListPendingExits:59\n\tstorj.io/storj/storagenode/gracefulexit.(*Chore).AddMissing:55\n\tstorj.io/common/sync2.(*Cycle).Run:160\n\tstorj.io/storj/storagenode/gracefulexit.(*Chore).Run:48\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2.1:87\n\truntime/pprof.Do:51\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func2:86\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78"}
2024-07-23T02:43:26Z	ERROR	orders	cleaning DB archive	{"Process": "storagenode", "error": "ordersdb: database is locked", "errorVerbose": "ordersdb: database is locked\n\tstorj.io/storj/storagenode/storagenodedb.(*ordersDB).CleanArchive:325\n\tstorj.io/storj/storagenode/orders.(*Service).CleanArchive:162\n\tstorj.io/storj/storagenode/orders.(*Service).Run.func2:146\n\tstorj.io/common/sync2.(*Cycle).Run:160\n\tstorj.io/common/sync2.(*Cycle).Start.func1:77\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78"}
2024-07-23T02:50:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T02:50:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T02:50:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T02:50:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T02:50:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T03:05:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T03:05:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T03:05:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T03:05:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T03:05:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T03:20:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T03:20:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T03:20:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T03:20:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T03:20:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T03:35:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T03:35:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T03:35:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T03:35:59Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T03:35:59Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T03:50:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T03:50:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T03:50:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T03:50:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T03:50:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T04:05:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T04:05:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T04:05:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T04:05:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T04:05:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T04:20:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T04:20:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T04:20:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T04:20:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T04:20:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T04:35:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-23T04:35:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}
2024-07-23T04:35:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-23T04:35:58Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}
2024-07-23T04:35:58Z	INFO	New version is being rolled out but hasn't made it to this node yet	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
2024-07-23T04:50:58Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}

loop

Seems it’s hanging here?

Do you have a storagenode process running?

docker top storagenode

Perhaps it’s not strictly needed, but it seems you should move the databases to an SSD: How to move DB’s to SSD on Docker.
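Roughly, the steps from that guide are: stop the node, copy the .db files to the faster volume, bind-mount that volume into the container, and point the database directory option at it. A rough sketch with placeholder paths (please follow the linked guide for the exact paths and option name):

sudo docker stop -t 300 storagenode
sudo rsync -a /volume1/storj/storage/*.db /volumeSSD/storj-dbs/
# add to the docker run command:
#   --mount type=bind,source=/volumeSSD/storj-dbs,destination=/app/dbs
# and in config.yaml:
#   storage2.database-dir: dbs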

For sure I need an SSD and I will get one, I just need to sell some beer bottles for recycling :recycle:
For now I can move the databases to a different drive with RAID 5; maybe that will postpone the purchase by a few days or weeks.

One more log, from another occurrence of the same case:

"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.33:34556", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "GGQ26NS2MR2RTWHX6NTRFW45PABWOL6C65AMYIXH5544RE3MIR4Q", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.201.210:41244", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "Z2RY4J4PK6YSZONDA23Z32DTWASI4VFYS3AI72ITG5KHNLXQDN2A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.228:48804", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "PMVEGZC3NQFNWQV3K6QPC6KD5IEQZ75IZ3O5VWA2TKOYIWTH45OQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.33:34964", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "TWWHJNPALLFSQU5OPEHLXMH2UY4PUNWTEZFCBMDQO2M4MAIKUF4Q", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.40:44670", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "JUD3V4ADIYKWZG6V27JQKS5GDAALGCMCZ76YO7QK6CLXFWFT2F7A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.37:47164", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "IES56EYHKUDODRMMD7UDGMT4D4VSFQ333OV3N7NQDBJU5Y5UJ35A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT_REPAIR", "Remote Address": "199.102.71.55:58820", "Size": 0}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "GQ4NHFGAW4PBQRF3V575ONDMJGTMLGI7CQ7LAZA7MTG4PCVILOKA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.84:45186", "Size": 65536}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "CP2YESSBIF4ANANNFWFHEJCRG74WHQ5IDERIIEYM4YEW5TUXDXPA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT_REPAIR", "Remote Address": "199.102.71.70:59198", "Size": 0}

2024-07-25T00:08:09Z INFO piecestore upload canceled {"Process": "storagenode", "Piece ID": "6HUUKUJ6HFLRRYKBEB2QQPLFJGGAFSL6NQ52TBDTHBEWWS3XUW5Q", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.66:47646", "Size": 65536}

2024-07-25T00:13:55Z INFO Downloading versions. {"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}

2024-07-25T00:13:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}

2024-07-25T00:13:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode"}

2024-07-25T00:13:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}

2024-07-25T00:13:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode-updater"}

2024-07-25T00:28:55Z INFO Downloading versions. {"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}

2024-07-25T00:28:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}

2024-07-25T00:28:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode"}

2024-07-25T00:28:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}

2024-07-25T00:28:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode-updater"}

2024-07-25T00:43:55Z INFO Downloading versions. {"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}

2024-07-25T00:43:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}

2024-07-25T00:43:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode"}

2024-07-25T00:43:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.105.4"}

2024-07-25T00:43:56Z INFO New version is being rolled out but hasn't made it to this node yet {"Process": "storagenode-updater", "Service": "storagenode-updater"}

2024-07-25T00:58:55Z INFO Downloading versions. {"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}

2024-07-25T00:58:56Z INFO Current binary version {"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.105.4"}

2024-07-25T0

Do you still have a storagenode process running there?

Today yes, but previously it needed a restart.

If you see the repeated checks by the updater but no lines from the storagenode process, please execute this command next time:

docker top storagenode

and post the result here.

It just happened again, so I ran it.

Here it is:

There was a huge log entry before this, all on one line:

.HandleRPC:61\n\tstorj.io/common/experiment.(*Handler).HandleRPC:42\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:167\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:109\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:157\n\tstorj.io/drpc/drpcctx.(*Tracker).track:35\n\ngoroutine 194157\n\tsync.runtime_Semacquire:62\n\tsync.(*WaitGroup).Wait:116\n\tgolang.org/x/sync/errgroup.(*Group).Wait:56\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:862\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:302\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:61\n\tstorj.io/common/experiment.(*Handler).HandleRPC:42\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:167\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:109\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:157\n\tstorj.io/drpc/drpcctx.(*Tracker).track:35\n\ngoroutine 194173\n\tsyscall.Syscall:69\n\tsyscall.read:736\n\tsyscall.Read:181\n\tinternal/poll.ignoringEINTRIO:736\n\tinternal/poll.(*FD).Read:160\n\tos.(*File).read:29\n\tos.(*File).Read:118\n\tstorj.io/storj/storagenode/pieces.(*Reader).Read:304\n\tio.ReadAtLeast:335\n\tio.ReadFull:354\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).sendData:876\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download.func7:786\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n"}
2024-07-27T19:53:56Z	INFO	Downloading versions.	{"Process": "storagenode-updater", "Server Address": "https://version.storj.io"}
2024-07-27T19:53:57Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode", "Version": "v1.108.3"}
2024-07-27T19:53:57Z	INFO	Version is up to date	{"Process": "storagenode-updater", "Service": "storagenode"}
2024-07-27T19:53:57Z	INFO	Current binary version	{"Process": "storagenode-updater", "Service": "storagenode-updater", "Version": "v1.108.3"}
2024-07-27T19:53:57Z	INFO	Version is up to date	{"Process": "storagenode-updater", "Service": "storagenode-updater"}
admin@DATAHUB:/volume1/home/admin $ docker top storagenode
permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get "http://%2Fvar%2Frun%2Fdocker.sock/v1.44/containers/storagenode/top": dial unix /var/run/docker.sock: connect: permission denied
admin@DATAHUB:/volume1/home/admin $ sudo docker top storagenode
PID                 USER                TIME                COMMAND
11143               root                0:05                {supervisord} /usr/bin/python2 /usr/bin/supervisord -c /etc/supervisor/supervisord.conf
11179               root                0:00                {stop-supervisor} /bin/bash /bin/stop-supervisor
11180               root                2:26                /app/storagenode run --config-dir config --identity-dir identity --version.server-address=https://version.storj.io --storage.allocated-disk-space=11TB --contact.external-address=XXX:28967 --operator.email=XXXX@gmail.com --operator.wallet=0xXXX --pieces.enable-lazy-filewalker=false
11181               root                0:00                /app/storagenode-updater run --binary-location /app/storagenode --config-dir config --identity-dir identity --version.server-address=https://version.storj.io
admin@DATAHUB:/volume1/home/admin $

Now it starts canceling orders, with some other errors:

|2024-07-27T20:00:04Z|ERROR|pieces:trash|emptying trash failed|{Process: storagenode, error: pieces error: filestore error: context canceled; context canceled, errorVerbose: pieces error: filestore error: context canceled; context canceled\n\tstorj.io/storj/storagenode/blobstore/filestore.(*blobStore).EmptyTrash:193\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).EmptyTrash:361\n\tstorj.io/storj/storagenode/pieces.(*Store).EmptyTrash:430\n\tstorj.io/storj/storagenode/pieces.(*TrashChore).Run.func1.1:84\n\tstorj.io/common/sync2.(*Workplace).Start.func1:89}|
|2024-07-27T20:00:04Z|ERROR|piecestore|error sending hash and order limit|{Process: storagenode, Piece ID: 7WRREC7LYDCDX6YBJPVAKKKJ67XKP4RTWAKVGIEIYTH2EUPKAZDQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 36864, Remote Address: 5.161.74.82:54244, error: context canceled}|
|2024-07-27T20:00:04Z|INFO|piecestore|download canceled|{Process: storagenode, Piece ID: 7WRREC7LYDCDX6YBJPVAKKKJ67XKP4RTWAKVGIEIYTH2EUPKAZDQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 36864, Remote Address: 5.161.74.82:54244}|
|2024-07-27T20:00:05Z|ERROR|piecestore|error sending hash and order limit|{Process: storagenode, Piece ID: YPVTYDALBC7EY42KZFAFBEITWMIO4OICMX2PLMWFCUYTFDYYPAZQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 12288, Remote Address: 199.102.71.54:36666, error: context canceled}|
|2024-07-27T20:00:05Z|INFO|piecestore|download canceled|{Process: storagenode, Piece ID: YPVTYDALBC7EY42KZFAFBEITWMIO4OICMX2PLMWFCUYTFDYYPAZQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 12288, Remote Address: 199.102.71.54:36666}|
|2024-07-27T20:00:06Z|ERROR|piecestore|error sending hash and order limit|{Process: storagenode, Piece ID: KGFMHKRW65GT5RKDKMP3GRHVJAZLFBIBNCP7E5XIWNPFZSU7MWUQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 2048, Remote Address: 199.102.71.57:48388, error: context canceled}|
|2024-07-27T20:00:06Z|INFO|piecestore|download canceled|{Process: storagenode, Piece ID: KGFMHKRW65GT5RKDKMP3GRHVJAZLFBIBNCP7E5XIWNPFZSU7MWUQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 2048, Remote Address: 199.102.71.57:48388}|

The storagenode process is still running.
Could you please search for FATAL errors?

sudo docker logs storagenode 2>&1 | grep FATAL | tail

Funny, but FATAL only appears inside piece IDs or satellite IDs.

admin@DATAHUB:/volume1/home/admin $ sudo docker logs storagenode 2>&1 | grep --color=always FATAL
2024-07-02T20:45:54Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "W6S4LE64K6JOEFFD7KXLFRPYOV4LPLF53UWF5HFATALY2MNRBU4A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.239:57198", "Available Space": 3048512996352}
2024-07-02T20:46:05Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "W6S4LE64K6JOEFFD7KXLFRPYOV4LPLF53UWF5HFATALY2MNRBU4A", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.239:57198"}
2024-07-03T09:49:54Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "4QIZCCNFCVUCKX3ZAPYFATALGHPRYTCF3ZODEQZFH5HC6JKUCHZA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.70:39804", "Available Space": 2954854918390}
2024-07-03T09:49:54Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "4QIZCCNFCVUCKX3ZAPYFATALGHPRYTCF3ZODEQZFH5HC6JKUCHZA", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.70:39804", "Size": 3328}
2024-07-03T15:08:26Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "5HY6FATAL32ETB3SWVR7K3KR2MDL4B44JS3W4E3KLRBHZH5P4GNA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.99:57970", "Available Space": 2918319419392}
2024-07-03T15:08:27Z	INFO	piecestore	upload canceled (race lost or node shutdown)	{"Process": "storagenode", "Piece ID": "5HY6FATAL32ETB3SWVR7K3KR2MDL4B44JS3W4E3KLRBHZH5P4GNA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.99:57970"}
2024-07-08T00:55:48Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "DHILGGIFATALREJEZYTJDKMJ6KZO6GIP2DBIHAEHUX2DEU4K2TXQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.242:59702", "Available Space": 2772676889580}
2024-07-08T00:56:22Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "DHILGGIFATALREJEZYTJDKMJ6KZO6GIP2DBIHAEHUX2DEU4K2TXQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.205.242:59702", "Size": 2319360}
2024-07-08T04:39:06Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "QFATALEK6TMJU3SQAF6IGFAEA5EL53AHOUWEQF4K24EFV4I36ZDQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.43:59170", "Available Space": 2757832108740}
2024-07-08T04:39:07Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "QFATALEK6TMJU3SQAF6IGFAEA5EL53AHOUWEQF4K24EFV4I36ZDQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.219.43:59170", "Size": 291328}
2024-07-12T08:21:00Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "NALHXRRVAAKU5ITBKSUMEUVW6QI7ZLAVR2ID7FATALFBGE334S3A", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.81:58722", "Available Space": 2450541814006}
2024-07-12T08:21:00Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "NALHXRRVAAKU5ITBKSUMEUVW6QI7ZLAVR2ID7FATALFBGE334S3A", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.81:58722", "Size": 197376}
2024-07-13T00:04:07Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "YZVHQNYDVL2NICBDTUZLJCFATALSMJR4AP2BEYDIC4X6E5WV6RNQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.234:33238", "Available Space": 2364744232704}
2024-07-13T00:04:08Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "YZVHQNYDVL2NICBDTUZLJCFATALSMJR4AP2BEYDIC4X6E5WV6RNQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "79.127.205.234:33238", "Size": 197376}
2024-07-16T09:01:48Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "HTBSC4E4IJXMWALQJLPCV6ICOSBLT2ZKZPX3IQXEXT3PFATALQUA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "109.61.92.81:38612", "Available Space": 2526817329920}
2024-07-16T09:01:49Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "HTBSC4E4IJXMWALQJLPCV6ICOSBLT2ZKZPX3IQXEXT3PFATALQUA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "109.61.92.81:38612", "Size": 1024}
2024-07-16T09:01:55Z	INFO	piecestore	download started	{"Process": "storagenode", "Piece ID": "HTBSC4E4IJXMWALQJLPCV6ICOSBLT2ZKZPX3IQXEXT3PFATALQUA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "GET", "Offset": 0, "Size": 768, "Remote Address": "109.61.92.75:56074"}
2024-07-16T09:01:55Z	INFO	piecestore	downloaded	{"Process": "storagenode", "Piece ID": "HTBSC4E4IJXMWALQJLPCV6ICOSBLT2ZKZPX3IQXEXT3PFATALQUA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "GET", "Offset": 0, "Size": 768, "Remote Address": "109.61.92.75:56074"}
2024-07-16T20:05:49Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "UVH2YFATALGINU4YV7X6VFB5LRO52B5HQWEY7N4VH7FSP55QGZEQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.76:46390", "Available Space": 2513520758518}
2024-07-16T20:05:49Z	INFO	piecestore	upload canceled	{"Process": "storagenode", "Piece ID": "UVH2YFATALGINU4YV7X6VFB5LRO52B5HQWEY7N4VH7FSP55QGZEQ", "Satellite ID": "1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE", "Action": "PUT", "Remote Address": "109.61.92.76:46390", "Size": 131072}
2024-07-24T19:27:15Z	INFO	piecestore	upload started	{"Process": "storagenode", "Piece ID": "5R3FATALKUBF7RLS3PSESNTVJM2IAYKQI7EXT7XCM4NQ2ZDEKJQA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.97:49332", "Available Space": 2573989160192}
2024-07-24T19:27:16Z	INFO	piecestore	uploaded	{"Process": "storagenode", "Piece ID": "5R3FATALKUBF7RLS3PSESNTVJM2IAYKQI7EXT7XCM4NQ2ZDEKJQA", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "PUT", "Remote Address": "79.127.226.97:49332", "Size": 284160}

Yes, I saw that before.
Is the node reported as offline? If so, is it possible that the external IP has changed?
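You could compare the current external IP with the configured contact address and test the port from outside the network, for example (ifconfig.me is just one example of a what’s-my-IP service; replace the IP and port with your own, and run the nc check from outside your LAN if possible):

curl -s https://ifconfig.me
sudo docker inspect storagenode | grep external-address
nc -vz YOUR.EXTERNAL.IP 28967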

Yes, it is offline. I don’t think the IP could change. I have two WAN routers and the IPs are bound to ports by MAC address. The external IP is also static by default, no dynamic DNS.

The trouble with this node started a few months ago; nothing special happened around then. I think maybe some configuration change came with an Asustor update, or some other settings changed. Another node has more traffic and likewise a single separate HDD, and it has no such issues. That node runs on Debian on a RockPro64, which has far fewer resources than the Asustor. So I think I need to dig into the Asustor customizations, from the OS down to Docker.

Thanks for reading my boring logs)

Today’s log is different:

2024-07-30 18:53:31,295 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:34,298 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:37,302 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:40,305 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:41,306 WARN killing 'storagenode' (12) with SIGKILL
2024-07-30 18:53:43,308 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:46,312 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:49,315 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:51,318 WARN killing 'storagenode' (12) with SIGKILL
2024-07-30 18:53:52,319 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:55,323 INFO waiting for storagenode, processes-exit-eventlistener to die
2024-07-30 18:53:58,326 INFO waiting for storagenode, processes-exit-eventlistener to die

It seems it cannot stop the storagenode process; this is usually an indication of a hardware issue.
Please check the cables, the power supply, the disk, and the RAM.
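For a quick check, the kernel log and SMART data usually reveal a failing disk or cable, for example (the device name is a placeholder; smartctl comes from the smartmontools package, which may need to be installed):

sudo dmesg | grep -iE "ata|i/o error|reset"
sudo smartctl -a /dev/sdX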

Maybe, but there are not too many options with the Asustor; it is just 2 years old, and the main drive is 1 year old.

I decided to let it die without trouble, so I tried to initiate a graceful exit to gain time for a redesign.

I have been trying for the second day with no progress at all. It keeps restarting, and the exit percentages don’t move.
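For reference, the exit progress can also be checked from inside the container with something like this (flags copied from my run command above; the exit-status subcommand name may vary by version):

sudo docker exec -it storagenode /app/storagenode exit-status --config-dir config --identity-dir identity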

Latest logs:

|2024-08-06T17:01:00Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: K7D2T7M6H3KKXM373H5MQKKBMDHCGAG5NIIIZ2WU23YUHPVHIV3A, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET, Offset: 0, Size: 7680, Remote Address: 109.61.92.69:35640}|
|2024-08-06T17:01:00Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: FRUM4OBHTDFQRYJ5JVLIW343JUB74HC554TSLOGWMD7ZFL3BE2AQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 36864, Remote Address: 5.161.236.119:56780}|
|2024-08-06T17:01:03Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: HFPYDDPYVMM5SVH4OY5BIBXMM7JV2H35WPA6L73SKBVMVTCCASYQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 4864, Remote Address: 5.161.74.82:45276}|
|2024-08-06T17:01:04Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: TWTXUREYQWP6EGL7K7TJI5J6OR46IQW7ZD5NTUXOCNDEVYGCGERQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 62720, Remote Address: 5.161.214.55:41636}|
|2024-08-06T17:01:06Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: MZ2CPIT2AFRTZ6O7EIVDHIPLULFK3J3QELPRQIVL227SNG4RO24Q, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 768, Remote Address: 5.161.198.238:56672}|
|2024-08-06T17:01:07Z|INFO|piecestore|download started|{Process: storagenode, Piece ID: CDFUJQR3FTWP5QBW6FBZTXA5J3KJXMUC2LGOORG6QVDR32DEKV2Q, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR, Offset: 0, Size: 181504, Remote Address: 5.161.118.173:34076}|
|2024-08-06T17:01:07Z|ERROR|services|unexpected shutdown of a runner|{Process: storagenode, name: piecestore:monitor, error: piecestore monitor: timed out after 2m0s while verifying writability of storage directory, errorVerbose: piecestore monitor: timed out after 2m0s while verifying writability of storage directory\n\tstorj.io/storj/storagenode/monitor.(*Service).Run.func2.1:175\n\tstorj.io/common/sync2.(*Cycle).Run:160\n\tstorj.io/storj/storagenode/monitor.(*Service).Run.func2:164\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:07Z|ERROR|pieces|used-space-filewalker failed|{Process: storagenode, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Lazy File Walker: false, error: filewalker: context canceled, errorVerbose: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:07Z|INFO|pieces|used-space-filewalker started|{Process: storagenode, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S}|
|2024-08-06T17:01:08Z|ERROR|pieces|used-space-filewalker failed|{Process: storagenode, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Lazy File Walker: false, error: filewalker: context canceled, errorVerbose: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:08Z|INFO|pieces|used-space-filewalker started|{Process: storagenode, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6}|
|2024-08-06T17:01:08Z|ERROR|pieces|used-space-filewalker failed|{Process: storagenode, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Lazy File Walker: false, error: filewalker: context canceled, errorVerbose: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:08Z|INFO|pieces|used-space-filewalker started|{Process: storagenode, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE}|
|2024-08-06T17:01:09Z|ERROR|pieces|used-space-filewalker failed|{Process: storagenode, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Lazy File Walker: false, error: filewalker: context canceled, errorVerbose: filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:09Z|ERROR|piecestore:cache|error getting current used space: |{Process: storagenode, error: filewalker: context canceled; filewalker: context canceled; filewalker: context canceled; filewalker: context canceled, errorVerbose: group:\n--- filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n--- filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n--- filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n--- filewalker: context canceled\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkSatellitePieces:74\n\tstorj.io/storj/storagenode/pieces.(*FileWalker).WalkAndComputeSpaceUsedBySatellite:79\n\tstorj.io/storj/storagenode/pieces.(*Store).SpaceUsedTotalAndBySatellite:724\n\tstorj.io/storj/storagenode/pieces.(*CacheService).Run.func1:71\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78}|
|2024-08-06T17:01:22Z|WARN|servers|service takes long to shutdown|{Process: storagenode, name: server}|
|2024-08-06T17:01:22Z|WARN|services|service takes long to shutdown|{Process: storagenode, name: pieces:trash}|
|2024-08-06T17:01:22Z|WARN|services|service takes long to shutdown|{Process: storagenode, name: retain}|
|2024-08-06T17:01:22Z|WARN|services|service takes long to shutdown|{Process: storagenode, name: gracefulexit:chore}|
|2024-08-06T17:01:22Z|INFO|services|slow shutdown|{Process: storagenode, stack: goroutine 1079\n\tstorj.io/storj/private/lifecycle.(*Group).logStackTrace.func1:107\n\tsync.(*Once).doSlow:74\n\tsync.(*Once).Do:65\n\tstorj.io/storj/private/lifecycle.(*Group).logStackTrace:104\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func1:77\n\truntime/pprof.Do:51\n\ngoroutine 1\n\tsync.runtime_Semacquire:62\n\tsync.(*WaitGroup).Wait:116\n\tgolang.org/x/sync/errgroup.(*Group).Wait:56\n\tstorj.io/storj/storagenode.(*Peer).Run:981\n\tmain.cmdRun:125\n\tmain.newRunCmd.func1:33\n\tstorj.io/common/process.cleanup.func1.4:392\n\tstorj.io/common/process.cleanup.func1:410\n\tgithub.com/spf13/cobra.(*Command).execute:983\n\tgithub.com/spf13/cobra.(*Command).ExecuteC:1115\n\tgithub.com/spf13/cobra.(*Command).Execute:1039\n\tstorj.io/common/process.ExecWithCustomOptions:112\n\tmain.main:34\n\ngoroutine 24\n\tgithub.com/golang/glog.(*fileSink).flushDaemon:351\n\ngoroutine 25\n\tgo.opencensus.io/stats/view.(*worker).start:292\n\ngoroutine 4\n\tstorj.io/monkit-jaeger.(*ThriftCollector).Run:174\n\tstorj.io/common/process.cleanup.func1.2:351\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 84\n\tstorj.io/common/time2.Clock.Sleep:60\n\tstorj.io/common/time2.Sleep:40\n\tstorj.io/common/sync2.Sleep:15\n\tstorj.io/common/telemetry.(*Reporter).Run:43\n\tstorj.io/common/telemetry.(*Client).Run:124\n\ngoroutine 1017\n\tbufio.(*Scanner).Scan:139\n\tstorj.io/storj/private/lifecycle.condenseStack:23\n\tstorj.io/storj/private/lifecycle.(*Group).logStackTrace.func1:114\n\tsync.(*Once).doSlow:74\n\tsync.(*Once).Do:65\n\tstorj.io/storj/private/lifecycle.(*Group).logStackTrace:104\n\tstorj.io/storj/private/lifecycle.(*Group).Run.func1:77\n\truntime/pprof.Do:51\n\ngoroutine 85\n\tstorj.io/eventkit.(*UDPClient).Run:209\n\ngoroutine 5\n\tsync.runtime_Semacquire:62\n\tsync.(*WaitGroup).Wait:116\n\tgolang.org/x/sync/errgroup.(*Group).Wait:56\n\tstorj.io/common/debug.(*Server).Run:205\n\tstorj.io/common/process.initDebug.func1:40\n\ngoroutine 7\n\tos/signal.signal_recv:152\n\tos/signal.loop:23\n\ngoroutine 37\n\tstorj.io/common/process.Ctx.func1:139\n\ngoroutine 26\n\tstorj.io/common/debug.(*Server).Run.func3:184\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 27\n\tstorj.io/drpc/drpcmigrate.(*ListenMux).Run:90\n\tstorj.io/common/debug.(*Server).Run.func4:188\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 28\n\tstorj.io/drpc/drpcmigrate.(*listener).Accept:37\n\tnet/http.(*Server).Serve:3260\n\tstorj.io/common/debug.(*Server).Run.func5:197\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 8\n\tstorj.io/drpc/drpcmigrate.(*ListenMux).monitorContext:106\n\ngoroutine 9\n\tinternal/poll.runtime_pollWait:345\n\tinternal/poll.(*pollDesc).wait:84\n\tinternal/poll.(*pollDesc).waitRead:89\n\tinternal/poll.(*FD).Accept:611\n\tnet.(*netFD).accept:172\n\tnet.(*TCPListener).accept:159\n\tnet.(*TCPListener).Accept:327\n\tstorj.io/drpc/drpcmigrate.(*ListenMux).monitorBase:115\n\ngoroutine 53592\n\tsync.runtime_notifyListWait:569\n\tsync.(*Cond).Wait:70\n\tstorj.io/common/sync2.(*Throttle).ConsumeOrWait:50\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download.func7:813\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 53489\n\tsync.runtime_notifyListWait:569\n\tsync.(*Cond).Wait:70\n\tstorj.io/common/sync2.(*Throttle).ConsumeOrWait:50\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download.func7:813\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:78\n\ngoroutine 
53316\n\tsyscall.Syscall:69\n\tsyscall.read:736\n\tsyscall.Read:181\n\tinternal/poll.ignoringEINTRIO:736\n\tinternal/poll.(*FD).Read:160\n\tos.(*|

Maybe all my issues with this node come from the Btrfs filesystem on the single HDD this node uses, so I will not use Btrfs in the future: BTRFS vs EXT4 vs ZFS Filesystem for storj - #2 by Alexey
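To double-check which filesystem the node’s data volume actually uses, something like this shows it (adjust the path to the real data volume):

df -T /volume1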