Piecedeleter could not send delete piece to trash .. "error": "pieces error: v0pieceinfodb: sql: no rows in result set"

On v1.40.4 (docker) the problem is still there.

2021-10-15T21:30:26.733Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "4WUJEXNU4TNVWGS4N2QC4D3EUJWI3HGTZAONEJ3N775CPTENEXPQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:30:30.677Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "NS2L7MSCY35FBXMSMPWEGTYFU4CMRMNYTACJ3LI7PWLHD5CSTVHQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:31:13.290Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "ROEMYFJ3TTLSCODGVGZB6SZCWVTFI7MRQXIU7J2RXF647PUB4E3A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:31:52.444Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "RMOL4MLOYBL4YR7HNY4ZXUHYUDKBWH5IKLX5QNKZSWIYGEYMPRKQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:31:58.634Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "J5TZXW7EJRSSNOJBHKNT6DQYH2CYM42ER7OYTNAS6CZECEMWXM5A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:32:08.528Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "CHJ5PY752DYSBSZU4L7I5A5PICFMT5NTOEMRGU3KDA3XX7VADAPA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:32:09.930Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "CTQZMBOCGIRTUSG5WYYXWJA2AWRQ3YFVONWVIOZ5XAGFAFRUJR4Q", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-10-15T21:32:10.229Z ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "CGYOE33CYVH5X5QYGRANWMCJOVD4CWU7B7ELCXX67WW6DXWHPGFQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}

12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S@us1.storj.io:7777
This is US1, not Saltlake and not AP1

Please check the reasons for disqualification: https://support.storj.io/hc/en-us/articles/4403035941780-Why-is-my-node-disqualified-
I’m concerned about Saltlake, by the way - your node is dangerously close to disqualification on it too.
I think you had a large data loss, recently or in the past, and either the audit or the repair workers have only now discovered it.

About the errors from the description - my nodes have them too, for US1:

X:\storagenode2\storagenode.log:3324762:2021-10-06T04:49:35.329Z        ERROR   piecedeleter    could not send delete piece to trash   {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"DOEKKSYI356LUJQT5EDLP6DX7IQ5HXBV5FAK734GYBHNBEORX3TA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
X:\storagenode2\storagenode.log:3896268:2021-10-11T01:10:23.487Z        ERROR   piecedeleter    could not send delete piece to trash   {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"RMMCDK33GEONLWKZLF35PU3FU4Z7IGWRHVY2XGEFKE4Z7AIF36YQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
X:\storagenode2\storagenode.log:4581547:2021-10-15T10:07:53.382Z        ERROR   piecestore      could not get hash and order limit {"error": "v0pieceinfodb: sql: no rows in result set", "errorVerbose": "v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).GetV0PieceInfo:688\n\tstorj.io/storj/storagenode/pieces.(*Store).GetHashAndLimit:468\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:552\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:228\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:104\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:97\n\tstorj.io/drpc/drpcctx.(*Tracker).track:52"}
X:\storagenode2\storagenode.log:4581548:2021-10-15T10:07:53.384Z        ERROR   piecestore      download failed {"Piece ID":"XJUWC2LL74KYPWZO5UV6CILPRUIGSSB7GXGI4XHIJPWDOFWH5XFQ", "Satellite ID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "GET_REPAIR", "error": "v0pieceinfodb: sql: no rows in result set", "errorVerbose": "v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).GetV0PieceInfo:688\n\tstorj.io/storj/storagenode/pieces.(*Store).GetHashAndLimit:468\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:552\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:228\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:104\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:97\n\tstorj.io/drpc/drpcctx.(*Tracker).track:52"}

The other node has them not only for US1, but also for AP1:

Y:\storagenode3\storagenode.log:727478:2021-10-10T10:50:09.065+0300     ERROR   piecedeleter    could not send delete piece to trash   {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID":"HMQ7WMAZHXVVDPQAUQYXF7DIXU2SKN3AL2QGSG2KVJWVQNADTFSQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
Y:\storagenode3\storagenode.log:803861:2021-10-11T13:40:30.507+0300     ERROR   piecedeleter    could not send delete piece to trash   {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID":"FHX7L65SCIQNL5EX6FFY6VAL2FOVZH5VMGCW3IQB736KZVLI3MNQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
PS C:\Users\USER> sls "v0pieceinfodb: sql" w:\storagenode5\storagenode.log | select -last 10

W:\storagenode5\storagenode.log:656374:2021-10-09T05:16:59.315Z ERROR   piecedeleter    could not send delete piece to trash   {"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID":"PSWFH5J755T5JZDYGBGQP3ELTWZJZNXL6EHMPKL6Z3E4EI7LJVSA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}

Created an issue: Piecedeleter could not send delete piece to trash .. "error": "pieces error: v0pieceinfodb: sql: no rows in result set" · Issue #4225 · storj/storj · GitHub


I am seeing this error too with a good node:

2022-06-03T08:09:26.009Z        ERROR   piecedeleter    could not send delete piece to trash   {"Process": "storagenode", "Satellite ID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Piece ID": "UAPDXDA2KRBHO7UN7PPGOXZKZ7CWBFYOXGXMM5GF66YR736Q4CCQ", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}

Edit: I have just realized that this is the node that had gone over its assigned space. I wonder if that is a coincidence.

It’s worth checking pieceinfo.db for errors. This could explain both issues.

You mean checking it as described here: https://support.storj.io/hc/en-us/articles/360029309111-How-to-fix-a-database-disk-image-is-malformed- ?

Yes, if that one isn’t up to date it might not know your space is full until it’s too late. Worth a try.

Ok, I’ll put that on my to do list.

sqlite3 pieceinfo.db "PRAGMA integrity_check;"
ok

That’s all? It took a millisecond to execute. Or am I doing something wrong?

No, that’s it, and it looks fine. The GitHub issue also still seems to be open. So I dunno, it might be unrelated to the other issue you saw.
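
If you want to go one step beyond the integrity check, you could also count how many v0 piece records pieceinfo.db still holds, since the errors in this thread all come from the v0 migration path. This is only a sketch: the table name pieceinfo_ is an assumption based on the storagenode schema, and the node should be stopped first so the database is not locked.

# Run while the node is stopped; the table name pieceinfo_ is an assumption
sqlite3 pieceinfo.db "SELECT count(*) FROM pieceinfo_;"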


I am seeing this quite often on this node now:

docker logs storagenode | grep -c "no rows in result set"
68
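
To get more than a raw count, a sketch like this extracts the affected piece IDs so they can be cross-referenced against other log lines (it assumes the JSON log format shown above):

# Count occurrences per piece ID, most frequent first
docker logs storagenode 2>&1 | grep "no rows in result set" | grep -o '"Piece ID": "[A-Z0-9]*"' | sort | uniq -c | sort -rn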

I wonder if I should become concerned.

I don’t know if it is connected to the issue, but I am also seeing

2022-06-03T16:03:33.176Z        ERROR   piecestore      download failed {"Process": "storagenode", "Piece ID": "XJVYJNDVZEBMGHGEL2G5HJGIZZBOYLYTGX47DTKME5VCNG4ITMLQ", "Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Action": "GET", "error": "file does not exist", "errorVerbose": "file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:546\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:228\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:122\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:66\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:112\n\tstorj.io/drpc/drpcctx.(*Tracker).track:52"}

on this node.

Have a look at the other log lines for the piece. Usually this is just a piece that was deleted or expired. I’ve had a couple of those, and they were all that scenario.
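
For example, to pull every log line for the piece from the download failure above (adjust the container name to yours):

# Trace a single piece's lifecycle through the retained logs
docker logs storagenode 2>&1 | grep "XJVYJNDVZEBMGHGEL2G5HJGIZZBOYLYTGX47DTKME5VCNG4ITMLQ"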

Do you still get this error? I just found one on one of my nodes. It was restarted yesterday for some maintenance, and today I got one error: “could not send delete piece to trash”. I didn’t look for it until now, and the old logs are gone. Is it a normal occurrence?

Do you only log errors? Maybe there is more info. I have similar entries, but they shouldn’t be an issue. Something like this, for example:

2023-04-13T18:47:32.156+0200	INFO	piecedeleter	delete piece sent to trash	{"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "WTFFJWB6KJVDFREF432YFOZCJ3VFDXTTQOQWO2GUIGH2U7K3LSOQ"}
2023-04-13T18:47:32.156+0200	WARN	pieces	failed to migrate v0 piece. Piece may not be recoverable
2023-04-13T18:47:32.156+0200	ERROR	piecedeleter	could not send delete piece to trash	{"Satellite ID": "12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S", "Piece ID": "WTFFJWB6KJVDFREF432YFOZCJ3VFDXTTQOQWO2GUIGH2U7K3LSOQ", "error": "pieces error: filestore error: file does not exist", "errorVerbose": "pieces error: filestore error: file does not exist\n\tstorj.io/storj/storage/filestore.(*blobStore).Stat:105\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).pieceSizes:245\n\tstorj.io/storj/storagenode/pieces.(*BlobsUsageCache).Trash:290\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:388\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:75"}

Yes, only errors. I get about one every two days on each node.

Yes, so you may just not be seeing the related info.
If you look at my previous comment: the first line (INFO) shows the piece was sent to trash; the second line (WARN) shows an attempt to recover the piece; and the third line (ERROR) shows that the same piece could not be sent to trash, because it had already been sent, as shown in the first line. Note that the “Piece ID” is the same.

You won’t see this if you log only errors.
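
If you want to capture those INFO and WARN lines, raising the log level is enough. A sketch, assuming the standard config.yaml / docker setup:

# In config.yaml:
log.level: info
# or append the flag when starting the container:
# docker run ... storjlabs/storagenode:latest --log.level=info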

If they are all v0, it likely means that these pieces were either lost or we have a bug here.
Can you confirm that these pieces were uploaded to your node, downloaded, then deleted/moved to the trash, and that the node is now trying to move them to the trash again?

@Alexey
I don’t know how to do that. I run logs at the error level, so there is no upload or download info. Is there a way to identify pieces by file name, maybe?

If you have errors but no mentions of v0, you are probably OK. And then this is the wrong topic :wink:
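
For reference, you can locate a piece file on disk from the “Piece ID” in the log: the first two characters of the ID become a subdirectory and the rest becomes the file name, both lowercased, with an .sj1 extension. A sketch, assuming the default docker layout and using a piece ID from earlier in this thread:

# UAPDXDA2KRBHO7UN7PPGOXZKZ7CWBFYOXGXMM5GF66YR736Q4CCQ should live at
# storage/blobs/<satellite dir>/ua/pdxda2krbho7un7ppgoxzkz7cwbfyoxgxmm5gf66yr736q4ccq.sj1
find /app/config/storage/blobs -name "pdxda2krbho7un7ppgoxzkz7cwbfyoxgxmm5gf66yr736q4ccq*"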