Audit scores dropping on ap1; Storj is working on a fix, SNOs don't need to do anything. ERROR piecedeleter could not send delete piece to trash >> pieces error: v0pieceinfodb: sql: no rows in result set

So this explains why my logs suddenly exploded. On my bigger node I have seen over 230k entries for “could not send delete piece to trash” in the last 24 hours. Very happy that I have set up logrotate at this point.

My main node is now at 93.27% on ap1 and has 21,622 pieces that were deleted successfully and have since failed to delete again because they’re already gone. Of those pieces, 3 have been audited and failed (a rough way to pull these numbers out of the log is sketched at the end of this post).

My first piece deleted was:

2021-07-22T20:58:11.444+0100 INFO piecedeleter delete piece sent to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "222B3CET6VBMUH72OSGSXRQMBNGMHUJPR5AUWVZSC4MX5Z6FMDXA"}

And the first duplicate deletion:

2021-07-22T22:05:06.481+0100 ERROR piecedeleter could not send delete piece to trash {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "222B3CET6VBMUH72OSGSXRQMBNGMHUJPR5AUWVZSC4MX5Z6FMDXA", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
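
In case anyone wants to pull similar numbers from their own node, here’s a rough Python sketch of the kind of counting I mean. Assumptions on my part: a plain-text storagenode log at ./node.log with the standard quoted JSON fields, and that audit checks show up as GET_REPAIR-style "download failed" lines with Action GET_AUDIT; adjust the path and satellite ID to your setup.

import re

AP1 = "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6"
PIECE = re.compile(r'"Piece ID": "([A-Z0-9]+)"')

trashed, dup_delete, failed_audits = set(), set(), set()

with open("node.log") as log:  # assumed path and filename
    for line in log:
        if AP1 not in line:
            continue
        m = PIECE.search(line)
        if not m:
            continue
        piece = m.group(1)
        if "delete piece sent to trash" in line:
            trashed.add(piece)
        elif "could not send delete piece to trash" in line and "v0pieceinfodb" in line:
            dup_delete.add(piece)
        elif "download failed" in line and "GET_AUDIT" in line and "file does not exist" in line:
            failed_audits.add(piece)

# pieces that were deleted once and then "deleted" again
affected = trashed & dup_delete
print("deleted, then failed to delete again:", len(affected))
print("of those, also failed an audit:      ", len(affected & failed_audits))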


As Alexey has previously stated, our dev team is already working on resolving this issue. We are very sorry for this incident. No action from SNOs is needed, and there is no need to post more logs at this time. We already have the node IDs affected by this issue, so in the event that any of them get disqualified because of it, we will manually reinstate them ASAP. We will let you know if we have any update. Thank you for your patience.


Thanks to both you and @Alexey for getting back to us! Good to see you’re all on top of this. Hopefully it won’t cut into your weekends. Let us know if we can do anything else to help.


dunno if this is related… but I noticed it, and the timing seems to correlate pretty closely…
of course the graphs in question aren’t very useful, since they have no reference numbers to gauge the drop against…

most of the egress and ingress graphs from the satellites seem to drop dead around the last storagenode update deployment.

might be nothing, or just a broken or changed API, but I figured I would mention it since it seems odd, could be related, and it’s never easy to know which hints help narrow down a problem.

so here it is.
https://storjstats.info/d/storj/storj-network-statistics?orgId=1

It’s not related. The bad statistics were mentioned elsewhere and they’ve made some corrections.

One small update on this issue. I’m now seeing errors for repairs on these pieces as well. Here’s an example.

2021-07-22T20:08:41.735Z        INFO    piecedeleter    delete piece sent to trash      {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A"}
2021-07-22T21:15:00.416Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-22T22:19:50.397Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-22T23:41:37.863Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T00:29:38.521Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T01:21:32.608Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T02:39:54.231Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T03:39:48.551Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T04:33:47.393Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T05:25:16.466Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T06:47:25.071Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T07:58:43.582Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T09:14:50.025Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T10:24:26.117Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T11:23:04.481Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T12:40:13.502Z        ERROR   piecedeleter    could not send delete piece to trash    {"Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "error": "pieces error: v0pieceinfodb: sql: no rows in result set", "errorVerbose": "pieces error: v0pieceinfodb: sql: no rows in result set\n\tstorj.io/storj/storagenode/storagenodedb.(*v0PieceInfoDB).Get:131\n\tstorj.io/storj/storagenode/pieces.(*Store).MigrateV0ToV1:404\n\tstorj.io/storj/storagenode/pieces.(*Store).Trash:348\n\tstorj.io/storj/storagenode/pieces.(*Deleter).deleteOrTrash:185\n\tstorj.io/storj/storagenode/pieces.(*Deleter).work:135\n\tstorj.io/storj/storagenode/pieces.(*Deleter).Run.func1:72\n\tgolang.org/x/sync/errgroup.(*Group).Go.func1:57"}
2021-07-23T20:00:29.542Z        INFO    piecestore      download started        {"Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Action": "GET_REPAIR"}
2021-07-23T20:00:29.547Z        ERROR   piecestore      download failed {"Piece ID": "BPBMRLTWVDS325O6B5NXRDHGT2H5QLCLXHP6ENEA6AVTOGJRLN6A", "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Action": "GET_REPAIR", "error": "file does not exist", "errorVerbose": "file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51"}

This isn’t necessarily surprising considering the source of these issues, but I figured it might be worth mentioning.

Yep, same here.

BTW, a question to @Storjlings: would it make sense to temporarily suspend audits, or significantly reduce the audit rate, on the affected satellite, so that node operators affected by false failed audits can sleep well? I hope this is a tunable satellite parameter…

I will sleep just fine after reading this message. I wouldn’t worry about it. I know from experience during the beta that Storj Labs will do the right thing in these scenarios.

Stopping audits may not even be that easy: as far as I know, repairs are also used for reputation, and they can’t stop repair on the satellite as that would risk data loss. They’re on it. You can trust them to do the right thing. Just give them some time to fix the underlying issue. If it helps, it’s definitely in their best interest to reinstate good nodes as well.


2 posts were split to a new topic: Audit falling on eu1

This is unlikely to be related to the issue on AP1. If you are seeing similar errors on EU1, please post your log in a new thread; let’s keep this one focused on the issue on AP1. Thanks.


Thanks for pointing this out - devs are already aware of this, but it will not affect node reputation. They are already working on applying the fix, so please stand by.


12 hours without an audit failure on any satellite… woop woop
seems like the fix is working.

a bit too close for comfort, though of course that’s easy to say…

I have this situation as of this morning:

I was looking into the storagenode.log file and found quite a lot of lines like these, 1 or 2 almost every minute:

2021-07-24T12:22:23.248+0200 ERROR piecestore download failed {Piece ID: ZGW2ZEQM44G2PGUU2LINYHVI5HKUIMPSZD4OJNVL3MRVFSFW2YKQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR, error: file does not exist, errorVerbose: file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51}
2021-07-24T12:22:29.947+0200 INFO piecestore upload started {Piece ID: ENIIT5INBZAFCCG5YZ2IK6X3O47JVXZEXKQW32NLDHKPUGOZT46Q, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: PUT_REPAIR, Available Space: 1124694833152}
2021-07-24T12:22:30.584+0200 INFO piecestore uploaded {Piece ID: ENIIT5INBZAFCCG5YZ2IK6X3O47JVXZEXKQW32NLDHKPUGOZT46Q, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: PUT_REPAIR, Size: 2319360}
2021-07-24T12:22:30.978+0200 INFO piecestore download started {Piece ID: AW77ICRZWM5XAWRQ4VQ772BERNBS4XCGIYI4XERBHIY7ROVWHPAQ, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Action: GET_REPAIR}
2021-07-24T12:22:31.052+0200 INFO piecestore downloaded {Piece ID: AW77ICRZWM5XAWRQ4VQ772BERNBS4XCGIYI4XERBHIY7ROVWHPAQ, Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Action: GET_REPAIR}
2021-07-24T12:22:31.360+0200 INFO piecestore download started {Piece ID: T32U7HMCFBVGDQPZ6MXTJVHUJNFIRSSQFXNE7T7YLOG7OHTGYHFQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:22:31.442+0200 ERROR piecestore download failed {Piece ID: T32U7HMCFBVGDQPZ6MXTJVHUJNFIRSSQFXNE7T7YLOG7OHTGYHFQ, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR, error: file does not exist, errorVerbose: file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51}

Also, I see these in the log:

2021-07-24T12:17:36.300+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:36.422+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:36.778+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12tRQrMTWUWwzwGh18i7Fqs67kmdhH9t6aToeiwbo5mfS2rUmo, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.080+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.292+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:37.763+0200 INFO piecestore uploaded {Piece ID: U4HVWXSMWPQQZXW22NR3KSFBMUGO4QB2HJ4POZQ47DFDAOFR3UGA, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT_REPAIR, Size: 2212096}
2021-07-24T12:17:37.824+0200 WARN contact:service Your node is still considered to be online but encountered an error. {Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Error: contact: failed to dial storage node (ID: 1FYXCFTDX4DN5R5Y5GHTmvBrttN3sBYWawiWzgx6UNdSdQameJ) at address birtok.myddns.me:28967 using QUIC: rpc: quic: timeout: no recent network activity}
2021-07-24T12:17:39.848+0200 INFO piecestore download started {Piece ID: C7BKMP5LMBRTRFMXZDFZAUDHCMQXU6G477L36FCEPUMZFNN5U6DQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR}
2021-07-24T12:17:40.658+0200 INFO piecestore download started {Piece ID: SWTKZD53CVH3ZVHOFMESAI6UWWL5SJFOE4NB4B6UZEDET2EYU4TQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: GET_REPAIR}
2021-07-24T12:17:40.807+0200 INFO piecestore downloaded {Piece ID: SWTKZD53CVH3ZVHOFMESAI6UWWL5SJFOE4NB4B6UZEDET2EYU4TQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: GET_REPAIR}
2021-07-24T12:17:40.838+0200 INFO piecestore downloaded {Piece ID: C7BKMP5LMBRTRFMXZDFZAUDHCMQXU6G477L36FCEPUMZFNN5U6DQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: GET_REPAIR}
2021-07-24T12:17:41.056+0200 INFO piecestore upload started {Piece ID: TTOYNX4I4O6TLBVYDNNZ3N6BCMLSV44ADLHN7RRILQFF64VUZSIQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741660928}
2021-07-24T12:17:42.360+0200 INFO piecestore uploaded {Piece ID: TTOYNX4I4O6TLBVYDNNZ3N6BCMLSV44ADLHN7RRILQFF64VUZSIQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Size: 181504}
2021-07-24T12:17:42.543+0200 INFO piecestore upload started {Piece ID: 77ZAS4SMHPJEF5CRABRYOED6EEAIPUSPAWNYDE7NE5F47S3VIC6Q, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:44.838+0200 INFO piecestore download started {Piece ID: XFTBOE5Y7JNCD4FIYFSPHUOWKFSE25Y3D2EO3Y67S2XW2MOP7NEA, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:17:45.143+0200 INFO piecestore downloaded {Piece ID: XFTBOE5Y7JNCD4FIYFSPHUOWKFSE25Y3D2EO3Y67S2XW2MOP7NEA, Satellite ID: 121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6, Action: GET_REPAIR}
2021-07-24T12:17:48.128+0200 INFO piecestore upload started {Piece ID: MDI5T6PPSOQUGTKRT65NTYCZB3WDKBS2KXUJVDLEHCI5LLVJOTWQ, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:48.431+0200 INFO piecestore download started {Piece ID: HRBQDR6PBOUKPMYLOMSE3IQH3XQXXD6RSMCRSM77MGYQGTKFMFGQ, Satellite ID: 12rfG3sh9NCWiX3ivPjq2HtdLmbqCrvHVEzJubnzFzosMuawymB, Action: GET_REPAIR}
2021-07-24T12:17:48.705+0200 INFO piecestore upload started {Piece ID: EJBQ54NCDPB6UFX6F44IZJWMFFJ5AFYJV4LDV64SGXR5ETOIA5TA, Satellite ID: 12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:48.829+0200 INFO piecestore upload started {Piece ID: K4DK2ZB55XNZFB76U4R26S7EO4UHI267GGVYTTD2IBA7ZDS3A4MQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: PUT, Available Space: 1124741478912}
2021-07-24T12:17:49.146+0200 INFO piecestore uploaded {Piece ID: K4DK2ZB55XNZFB76U4R26S7EO4UHI267GGVYTTD2IBA7ZDS3A4MQ, Satellite ID: 12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs, Action: PUT, Size: 145408}

Is it related to this issue?

Spoke too soon… the problem persists.
during the last 4 hours I got another 2 failures; trying to track them down…

2 posts were merged into an existing topic: Audit weirdness

The failed deletions seem to have stopped and I don’t seem to be getting audited on them anymore; however, I’m still failing repair traffic for these deleted pieces (which I guess is sort of correct, as they are deleted). I guess that’s the next part of the fixup.

2021-07-24T09:24:03.165+0100 ERROR piecestore download failed {"Piece ID": "ZY6QV3GFEUZBTNSFYF6MWWHNMUXGVV6VDRPU6R2I66DMMEK6XJWQ", "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Action": "GET_REPAIR", "error": "file does not exist", "errorVerbose": "file does not exist\n\tstorj.io/common/rpc/rpcstatus.Wrap:73\n\tstorj.io/storj/storagenode/piecestore.(*Endpoint).Download:534\n\tstorj.io/common/pb.DRPCPiecestoreDescription.Method.func2:217\n\tstorj.io/drpc/drpcmux.(*Mux).HandleRPC:33\n\tstorj.io/common/rpc/rpctracing.(*Handler).HandleRPC:58\n\tstorj.io/drpc/drpcserver.(*Server).handleRPC:102\n\tstorj.io/drpc/drpcserver.(*Server).ServeOne:60\n\tstorj.io/drpc/drpcserver.(*Server).Serve.func2:95\n\tstorj.io/drpc/drpcctx.(*Tracker).track:51"}
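
In case anyone wants to cross-check this on their own node, here’s a quick Python sketch that tests whether each failed GET_REPAIR download on ap1 is for a piece the log previously sent to trash. My assumptions: a plain-text log at ./node.log with quoted JSON fields, and that the log reaches back far enough to cover the original deletions.

import re

AP1 = "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6"
PIECE = re.compile(r'"Piece ID": "([A-Z0-9]+)"')

trashed, failed_repairs = set(), set()

with open("node.log") as log:  # assumed path and filename
    for line in log:
        if AP1 not in line:
            continue
        m = PIECE.search(line)
        if not m:
            continue
        if "delete piece sent to trash" in line:
            trashed.add(m.group(1))
        elif "GET_REPAIR" in line and "download failed" in line and "file does not exist" in line:
            failed_repairs.add(m.group(1))

print("failed repair downloads:   ", len(failed_repairs))
print("  previously sent to trash:", len(failed_repairs & trashed))
print("  unexplained:             ", len(failed_repairs - trashed))

If the log covers the whole period, an empty “unexplained” set would back up the idea that these repair failures are just the satellite asking for pieces it already told us to delete.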

it does seem to be running a lot better… but I’m still seeing issues, even if very rarely… the odd thing is that I cannot find them in my logs… I can only assume either my logs or my dashboard is wrong.

been looking everywhere but I just cannot find the failed audit for ap1, even though the audit score just dropped on the dashboard…

even checking my log consistency, I am recording every minute… and since it exports on a 10-minute schedule… and docker will save the last 10 MB worth of logs…

but yeah, repairs for today sure don’t look to be doing too well; usually that is 99.5% or better

========== REPAIR DOWNLOAD ====
Failed:                4030
Fail Rate:             8.720%
Canceled:              0
Cancel Rate:           0.000%
Successful:            42188
Success Rate:          91.280%
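
For anyone wanting to reproduce a breakdown like the one above straight from the raw log (the table looks like output from one of the community log-analysis scripts), here is a rough Python stand-in. Assumptions: a plain-text log at ./node.log with quoted JSON fields, and that canceled transfers are logged as "download canceled".

import re
from collections import Counter

counts = Counter()
with open("node.log") as log:  # assumed path and filename
    for line in log:
        if '"Action": "GET_REPAIR"' not in line:
            continue
        if "download failed" in line:
            counts["Failed"] += 1
        elif "download canceled" in line:  # assumed message for canceled transfers
            counts["Canceled"] += 1
        elif "downloaded" in line:
            counts["Successful"] += 1

total = sum(counts.values()) or 1
print("========== REPAIR DOWNLOAD ====")
for key in ("Failed", "Canceled", "Successful"):
    print(f"{key + ':':<15}{counts[key]:>8}   {100 * counts[key] / total:.3f}%")

It counts over whatever period the log file covers, so trim the log first if you only want today.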

us2 is having a different issue, discussed here: Audit weirdness (us2 satellite). (Team aware. No additional information is needed.)

But please verify by looking at the piece history. If it’s the same as mentioned there, you can post logs in the appropriate topic. If you’re seeing a different issue, please start a new topic. Let’s try and keep this one focused on the issue on ap1. (Also looking at you @SGC :wink: )
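
For example, to pull a single piece’s history out of the log before deciding where to post, a trivial Python sketch (assuming a plain-text log file; the script name piece_history.py and the default log path are just placeholders):

import sys

# usage: python piece_history.py <Piece ID> [logfile]
piece = sys.argv[1]
path = sys.argv[2] if len(sys.argv) > 2 else "node.log"  # assumed default path

with open(path) as log:
    for line in log:
        if piece in line:
            print(line.rstrip())

If the history shows “delete piece sent to trash” followed by the duplicate-delete error, it matches the ap1 issue in this thread; a different pattern probably belongs in the us2 topic or a new one.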


Moved out to Audit weirdness
