Audit scores dropping on ap1; Storj is working on a fix, SNOs don't need to do anything.

ERROR piecedeleter could not send delete piece to trash >> pieces error: v0pieceinfodb: sql: no rows in result set

Already reported here

Yes, this confirms you’re seeing the same issue. I guess we’ll have to wait and see what they’ll find.

I suspect this is more likely an issue on the satellite end, since the initial delete seems to be processed just fine on the storage node end. The node should never have received any subsequent delete requests or audits from the satellite. It seems the satellite thinks the deletes haven't been processed correctly, even though they have. So I'd say it's either on the satellite end or in the communication between the two. Based on the logs, the storage node doesn't appear to be doing anything wrong.

1 Like

What does this mean in detail? Do you have only one piece id “GNEAJQTG4SM4YSEHFLDGIHTQO45YECPE53VLG3MJ6AC7DYKUVWOA” that causes the error?

In my case there are tons of pieces (in total 45072) causing the same error.

We all have tons of those delete errors. But luckily very few of them are audited. That log shows the history of a specific piece that was audited.

1 Like

Just noticed this anomaly today on a couple of my nodes:

So this explains why my logs suddenly exploded. On my bigger node I have seen over 230k entries for “could not send delete piece to trash” in the last 24 hours. Very happy that I set up logrotate at this point.

My main node is now at 93.27% on ap1 and has 21622 pieces that were deleted successfully and have since kept failing to delete because they’re already gone. 3 of these pieces have since been audited, and those audits failed.
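A rough sketch of how counts like these can be pulled from a node log (the sample lines below stand in for a real storagenode log; on a real node, point the greps at your actual log file instead):

```shell
# Create a tiny stand-in log; a real storagenode log contains the same error text.
cat > /tmp/sample.log <<'EOF'
2021-07-22T22:05:06Z ERROR piecedeleter could not send delete piece to trash {"Piece ID": "AAAA"}
2021-07-22T23:05:06Z ERROR piecedeleter could not send delete piece to trash {"Piece ID": "AAAA"}
2021-07-23T00:05:06Z ERROR piecedeleter could not send delete piece to trash {"Piece ID": "BBBB"}
EOF

# Total duplicate-delete error lines:
grep -c 'could not send delete piece to trash' /tmp/sample.log   # → 3

# Unique piece IDs affected (the same piece can fail repeatedly):
grep 'could not send delete piece to trash' /tmp/sample.log \
  | grep -o '"Piece ID": "[A-Z0-9]*"' | sort -u | wc -l
```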

My first piece deleted was:

2021-07-22T20:58:11.444+0100 INFO piecedeleter delete piece sent to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "222B3CET6VBMUH72OSGSXRQMBNGMHUJPR5AUWVZSC4MX5Z6FMDXA"}

And the first duplicate deletion:

2021-07-22T22:05:06.481+0100 ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "222B3CET6VBMUH72OSGSXRQMBNGMHUJPR5AUWVZSC4MX5Z6FMDXA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}

1 Like

As Alexey has previously stated, our dev team is already working on resolving this issue. We are very sorry for this incident. No action from SNOs is needed, and there is no need to post more logs at this time. We already have the node IDs affected by this issue so in the event that any of them should get disqualified due to this issue, we will manually reinstate them asap. We will let you know if we have any update. Thank you for your patience.

18 Likes

Thanks for getting back to us, both you and @Alexey! Good to see you’re all on top of this. Hopefully it won’t cut into your weekends. Let us know if we can do anything else to help.

12 Likes

Dunno if this is related… but I noticed it, and the timing of it seems to correlate pretty closely. Of course the graphs in question aren’t very useful, since they have no reference numbers to gauge the drop against.

Most of the egress and ingress graphs from the satellites seem to drop dead around the last storagenode update deployment.

Might be nothing, or just a broken or changed API, but I figured I’d mention it since it seems odd and could be related, and it’s never easy to know which hints help narrow down a problem.

So here it is:
https://storjstats.info/d/storj/storj-network-statistics?orgId=1

It’s not related. The bad statistics were mentioned elsewhere and they’ve made some corrections.

One small update on this issue. I’m now seeing errors for repairs on these pieces as well. Here’s an example.

2021-07-22T20:08:41.735Z        INFO    piecedeleter    delete piece sent to trash      {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A"}
2021-07-22T21:15:00.416Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-22T22:19:50.397Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-22T23:41:37.863Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T00:29:38.521Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T01:21:32.608Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T02:39:54.231Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T03:39:48.551Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T04:33:47.393Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T05:25:16.466Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T06:47:25.071Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T07:58:43.582Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T09:14:50.025Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T10:24:26.117Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T11:23:04.481Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T12:40:13.502Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T20:00:29.542Z        INFO    piecestore      download started        {"Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Action": "GET_REPAIR"}
2021-07-23T20:00:29.547Z        ERROR   piecestore      download failed {"Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Action": "GET_REPAIR", "error": "file does not exist", "errorVerbose": "file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51"}

This isn’t necessarily surprising considering the source of these issues. But I figured it might be worth mentioning.

Yep, same here.

BTW, a question to @Storjlings, would it make sense to temporarily suspend, or significantly reduce the rate of audits on the affected satellite, so that node operators affected by false failed audits can sleep well? I hope this is a tunable satellite parameter…

I will sleep just fine after reading this message. I wouldn’t worry about it. I know from experience during the beta that Storj Labs will do the right thing in these scenarios.

Stopping audits may not even be so easy, as repairs are also used for reputation as far as I know, and they can’t stop repair on the satellite since that would risk data loss. They’re on it. You can trust them to do the right thing. Just give them some time to fix the underlying issue. If it helps, it’s definitely in their best interest to reinstate good nodes as well.

3 Likes

2 posts were split to a new topic: Audit falling on eu1

This is unlikely to be related to the issue on AP1. If you are seeing similar errors on EU1, please post your logs in a new thread; let’s keep this one focused on the issue on AP1. Thanks.

2 Likes

Thanks for pointing this out - devs are already aware of this, but this will not affect node reputation. They are already working on applying the fix so please stand by.

1 Like

12 hours without an audit failure on any satellite… woop woop.
Seems like the fix is working.

A bit too close for comfort, though of course that’s easy to say…
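For anyone wanting to verify on their own node, a minimal sketch for checking whether any audit downloads failed (the sample lines stand in for a real log; in storagenode logs, audit requests carry the `GET_AUDIT` action):

```shell
# Stand-in sample log; on a real node, use your storagenode log path instead.
cat > /tmp/audit-sample.log <<'EOF'
2021-07-24T01:00:00Z INFO piecestore download started {"Action": "GET_AUDIT"}
2021-07-24T01:00:01Z INFO piecestore downloaded {"Action": "GET_AUDIT"}
2021-07-24T02:00:00Z ERROR piecestore download failed {"Action": "GET_REPAIR"}
EOF

# Failed audits only (repair failures are excluded); prints 0 for this sample:
grep 'download failed' /tmp/audit-sample.log | grep -c 'GET_AUDIT' || true   # → 0
```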

I have this situation as of this morning:

I was looking into the storagenode.log file and found quite a lot of lines like these, 1 or 2 almost every minute:

2021-07-24T12:22:23.248+0200 ERROR piecestore download failed {Piece ID: ZGW2ZEQM44G2PGUU2LINYHVI5HKUIMPSZD4OJNVL3MRVFSFW2YKQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR, error: file does not exist, errorVerbose: file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51}
2021-07-24T12:22:29.947+0200 INFO piecestore upload started {Piece ID: ENIIT5INBZAFCCG5YZ2IK6X3O47JVXZEXKQW32NLDHKPUGOZT46Q, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: PUT_REPAIR, Available Space: 1124694833152}
2021-07-24T12:22:30.584+0200 INFO piecestore uploaded {Piece ID: ENIIT5INBZAFCCG5YZ2IK6X3O47JVXZEXKQW32NLDHKPUGOZT46Q, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: PUT_REPAIR, Size: 2319360}
2021-07-24T12:22:30.978+0200 INFO piecestore download started {Piece ID: AW77ICRZWM5XAWRQ4VQ772BERNBS4XCGIYI4XERBHIY7ROVWHPAQ, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Action: GET_REPAIR}
2021-07-24T12:22:31.052+0200 INFO piecestore downloaded {Piece ID: AW77ICRZWM5XAWRQ4VQ772BERNBS4XCGIYI4XERBHIY7ROVWHPAQ, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Action: GET_REPAIR}
2021-07-24T12:22:31.360+0200 INFO piecestore download started {Piece ID: T32U7HMCFBVGDQPZ6MXTJVHUJNFIRSSQFXNE7T7YLOG7OHTGYHFQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:22:31.442+0200 ERROR piecestore download failed {Piece ID: T32U7HMCFBVGDQPZ6MXTJVHUJNFIRSSQFXNE7T7YLOG7OHTGYHFQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR, error: file does not exist, errorVerbose: file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51}
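To gauge how concentrated these repair failures are, a rough per-satellite count can help. A sketch using stand-in lines in the same unquoted format as the excerpt above (point the greps at your real log file on a node):

```shell
# Stand-in sample in the same unquoted key format as the log excerpt above.
cat > /tmp/repair-sample.log <<'EOF'
ERROR piecestore download failed {Piece ID: AAA, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
ERROR piecestore download failed {Piece ID: BBB, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
INFO piecestore downloaded {Piece ID: CCC, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Action: GET_REPAIR}
EOF

# Count "download failed" lines per satellite ID:
grep 'download failed' /tmp/repair-sample.log \
  | grep -o 'Satellite ID: [A-Za-z0-9]*' | sort | uniq -c
```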

Also, I see these in the log:

2021-07-24T12:17:36.300+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:36.422+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:36.778+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12tRQrMTWUWwzwGh18i7Fqs67kmdhH9t6aToeiwbo5mfS2rUmo, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.080+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.292+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.763+0200 INFO piecestore uploaded {Piece ID: U4HVWXSMWPQQZXW22NR3KSFBMUGO4QB2HJ4POZQ47DFDAOFR3UGA, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT_REPAIR, Size: 2212096}
2021-07-24T12:17:37.824+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:39.848+0200 INFO piecestore download started {Piece ID: C7BKMP5LMBRTRFMXZDFZAUDHCMQXU6G477L36FCEPUMZFNN5U6DQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR}
2021-07-24T12:17:40.658+0200 INFO piecestore download started {Piece ID: SWTKZD53CVH3ZVHOFMESAI6UWWL5SJFOE4NB4B6UZEDET2EYU4TQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: GET_REPAIR}
2021-07-24T12:17:40.807+0200 INFO piecestore downloaded {Piece ID: SWTKZD53CVH3ZVHOFMESAI6UWWL5SJFOE4NB4B6UZEDET2EYU4TQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: GET_REPAIR}
2021-07-24T12:17:40.838+0200 INFO piecestore downloaded {Piece ID: C7BKMP5LMBRTRFMXZDFZAUDHCMQXU6G477L36FCEPUMZFNN5U6DQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR}
2021-07-24T12:17:41.056+0200 INFO piecestore upload started {Piece ID: TTOYNX4I4O6TLBVYDNNZ3N6BCMLSV44ADLHN7RRILQFF64VUZSIQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741660928}
2021-07-24T12:17:42.360+0200 INFO piecestore uploaded {Piece ID: TTOYNX4I4O6TLBVYDNNZ3N6BCMLSV44ADLHN7RRILQFF64VUZSIQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Size: 181504}
2021-07-24T12:17:42.543+0200 INFO piecestore upload started {Piece ID: 77ZAS4SMHPJEF5CRABRYOED6EEAIPUSPAWNYDE7NE5F47S3VIC6Q, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:44.838+0200 INFO piecestore download started {Piece ID: XFTBOE5Y7JNCD4FIYFSPHUOWKFSE25Y3D2EO3Y67S2XW2MOP7NEA, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:17:45.143+0200 INFO piecestore downloaded {Piece ID: XFTBOE5Y7JNCD4FIYFSPHUOWKFSE25Y3D2EO3Y67S2XW2MOP7NEA, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:17:48.128+0200 INFO piecestore upload started {Piece ID: MDI5T6PPSOQUGTKRT65NTYCZB3WDKBS2KXUJVDLEHCI5LLVJOTWQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:48.431+0200 INFO piecestore download started {Piece ID: HRBQDR6PBOUKPMYLOMSE3IQH3XQXXD6RSMCRSM77MGYQGTKFMFGQ, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: GET_REPAIR}
2021-07-24T12:17:48.705+0200 INFO piecestore upload started {Piece ID: EJBQ54NCDPB6UFX6F44IZJWMFFJ5AFYJV4LDV64SGXR5ETOIA5TA, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:48.829+0200 INFO piecestore upload started {Piece ID: K4DK2ZB55XNZFB76U4R26S7EO4UHI267GGVYTTD2IBA7ZDS3A4MQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:49.146+0200 INFO piecestore uploaded {Piece ID: K4DK2ZB55XNZFB76U4R26S7EO4UHI267GGVYTTD2IBA7ZDS3A4MQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: PUT, Size: 145408}

Is it related to this issue?

Spoke too soon… the problem persists.
During the last 4 hours I got another 2 failures; trying to track them down…