12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S@us1.storj.io:7777
This is US1, not Saltlake and not AP1.
Please check the reasons for disqualification: https://support.storj.io/hc/en-us/articles/4403035941780-Why-is-my-node-disqualified-
I’m concerned about Saltlake, by the way - your node is dangerously close to disqualification on it too.
I think you have had a significant data loss, either recently or in the past, and the audit or repair workers have only now discovered it.
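If you want to see exactly how close the scores are, you can query the node’s local dashboard API. This is only a sketch - it assumes the default console address 127.0.0.1:14002 and the current API shape, so adjust it to your setup:
PS C:\Users\USER> Invoke-RestMethod http://127.0.0.1:14002/api/sno/satellites | ConvertTo-Json -Depth 5
The output should include the per-satellite audit, suspension and online scores, so you can see which satellites are close to the disqualification threshold.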
About the errors from the description - my nodes have them too for US1:
X:\storagenode2\storagenode.log:3324762:2021-10-06T04:49:35.329Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"DOEKKSYI356LUJQT5EDLP6DX7IQ5HXBV5FAK734GYBHNBEORX3TA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
X:\storagenode2\storagenode.log:3896268:2021-10-11T01:10:23.487Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"RMMCDK33GEONLWKZLF35PU3FU4Z7IGWRHVY2XGEFKE4Z7AIF36YQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
X:\storagenode2\storagenode.log:4581547:2021-10-15T10:07:53.382Z ERROR piecestore could not get hash and order limit {"error": "v0pieceinfodb: sql: no rows in result set", "errorVerbose": "v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).GetV0PieceInfo:688\n\tstorj.io/storj/storagenode/pieces.(*Store).GetHashAndLimit:468\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:552\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:228\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:104\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:97\n\tstorj.io/drpc/drpcctx.(*Tracker).track:52"}
X:\storagenode2\storagenode.log:4581548:2021-10-15T10:07:53.384Z ERROR piecestore download failed {"Piece ID":"XJUWC2LL74KYPWZO5UV6CILPRUIGSSB7GXGI4XHIJPWDOFWH5XFQ", "Satellite ID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "GET_REPAIR", "error": "v0pieceinfodb: sql: no rows in result set", "errorVerbose": "v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).GetV0PieceInfo:688\n\tstorj.io/storj/storagenode/pieces.(*Store).GetHashAndLimit:468\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:552\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:228\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:104\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:97\n\tstorj.io/drpc/drpcctx.(*Tracker).track:52"}
The other node has them not only for US1, but for AP1 too:
Y:\storagenode3\storagenode.log:727478:2021-10-10T10:50:09.065+0300 ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID":"HMQ7WMAZHXVVDPQAUQYXF7DIXU2SKN3AL2QGSG2KVJWVQNADTFSQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
Y:\storagenode3\storagenode.log:803861:2021-10-11T13:40:30.507+0300 ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID":"FHX7L65SCIQNL5EX6FFY6VAL2FOVZH5VMGCW3IQB736KZVLI3MNQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
PS C:\Users\USER> sls "v0pieceinfodb: sql" w:\storagenode5\storagenode.log | select -last 10
W:\storagenode5\storagenode.log:656374:2021-10-09T05:16:59.315Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"PSWFH5J755T5JZDYGBGQP3ELTWZJZNXL6EHMPKL6Z3E4EI7LJVSA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2022-06-03T08:09:26.009Z ERROR piecedeleter could not send delete piece to trash {"Process": "storagenode", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "UAPDXDA2KRBHO7UN7PPGOXZKZ7CWBFYOXGXMM5GF66YR736Q4CCQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
Have a look at the other log lines for that piece. Usually it is just a piece that was deleted or expired. I’ve had a couple of those and they all turned out to be that scenario.
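For example, you can search the log for that Piece ID to see its whole history (upload, download, delete). A small sketch in the same Select-String style as above, using the repair-failed piece from the earlier log excerpt - substitute your own Piece ID and log path:
PS C:\Users\USER> sls "XJUWC2LL74KYPWZO5UV6CILPRUIGSSB7GXGI4XHIJPWDOFWH5XFQ" X:\storagenode2\storagenode.log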
Do you still get this error? I just found one on one of my nodes. It was restarted yesterday for some maintenance, and today I got one error: “could not send delete piece to trash”. I hadn’t looked for it until now, and the old logs are gone. Is it a normal occurrence?
Do you only log errors? Maybe there is more info. I have similar entries, but they shouldn’t be an issue. Something like this, for example (see the note on the log level after the log lines below):
2023-04-13T18:47:32.156+0200 INFO piecedeleter delete piece sent to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "WTFFJWB6KJVDFREF432YFOZCJ3VFDXTTQOQWO2GUIGH2U7K3LSOQ"}
2023-04-13T18:47:32.156+0200 WARN pieces failed to migrate v0 piece. Piece may not be recoverable
2023-04-13T18:47:32.156+0200 ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "WTFFJWB6KJVDFREF432YFOZCJ3VFDXTTQOQWO2GUIGH2U7K3LSOQ", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:105\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:245\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Trash:290\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:388\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:75"}
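If the node is configured to log only errors, the surrounding INFO/WARN lines like the ones above will simply never be written. A minimal sketch of the relevant setting, assuming you use the default config.yaml (restart the node for it to take effect):
# config.yaml
log.level: info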
Yes, so I mean you may just not see the related info.
If you look at my previous comment: the first line (“info”) shows that the piece was sent to the trash; the second line (“warn”) shows an attempt to recover the piece; and the third line (“error”) shows that the same piece could not be sent to the trash because it had already been sent, as shown in the first line. Note that the “Piece ID” is the same in all three.
If they are all v0 pieces, it likely means that these pieces are either lost or we have a bug here.
Can you confirm that these pieces were uploaded to your node, downloaded, then deleted/moved to the trash, and that the node is now trying to move them to the trash again?
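One way to check is to see whether the piece file is still on disk. The following is only a rough sketch and relies on an assumption about the blob layout - that the file name is the lowercased Piece ID, with its first two characters used as the subfolder name and an .sj1 extension - so verify it against your own node before drawing conclusions (the Piece ID and path below are just examples taken from the logs above):
PS C:\Users\USER> $piece = "UAPDXDA2KRBHO7UN7PPGOXZKZ7CWBFYOXGXMM5GF66YR736Q4CCQ".ToLower()
PS C:\Users\USER> Get-ChildItem -Recurse W:\storagenode5 -Filter ($piece.Substring(2) + ".sj1")
If nothing is found, the piece really is gone from disk; if it turns up under trash rather than blobs, it has already been moved there.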
@Alexey
I don’t know how to do that. I run my logs in error mode, so there is no upload or download info. Is there a way to identify pieces by their file names, maybe?