Bloom filters don't work, or: why is my trash empty?

Nodestats, version v1.100.5:
[screenshot]


Trash folder:

[screenshot]

Logs with no errors:

{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker","M":"starting subprocess","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess started","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"migrating trash namespace to use per-day directories","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","namespace":"qstuylguhrn2ozjv4h2c6xpxykd622gtgurhql2k7k75wqaaaaaa"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"trash namespace migration complete","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","namespace":"qstuylguhrn2ozjv4h2c6xpxykd622gtgurhql2k7k75wqaaaaaa"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"migrating trash namespace to use per-day directories","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","namespace":"v4weeab67sbgvnbwd5z7tweqsqqun7qox2agpbxy44mqqaaaaaaa"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"trash namespace migration complete","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","namespace":"v4weeab67sbgvnbwd5z7tweqsqqun7qox2agpbxy44mqqaaaaaaa"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"database does not exist","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","database":"garbage_collection_filewalker_progress","Process":"storagenode"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess.db","M":"database does not exist","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","database":"used_space_per_prefix"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"Database started","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode"}
{"L":"INFO","T":"2024-03-29T15:40:35Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker started","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","createdBefore":"2024-03-25T17:59:59Z","bloomFilterSize":1026686}
{"L":"INFO","T":"2024-03-29T15:41:02Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker completed","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Process":"storagenode","piecesCount":1910003,"piecesSkippedCount":0}
{"L":"INFO","T":"2024-03-29T15:41:02Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess finished successfully","process":"storagenode","satelliteID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6"}
{"L":"INFO","T":"2024-03-30T03:49:01Z","N":"lazyfilewalker.gc-filewalker","M":"starting subprocess","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S"}
{"L":"INFO","T":"2024-03-30T03:49:01Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess started","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S"}
{"L":"INFO","T":"2024-03-30T03:49:01Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"Database started","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S","Process":"storagenode"}
{"L":"INFO","T":"2024-03-30T03:49:01Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker started","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S","bloomFilterSize":4100003,"Process":"storagenode","createdBefore":"2024-03-20T17:59:59Z"}
{"L":"INFO","T":"2024-03-30T03:55:28Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker completed","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S","piecesCount":33579387,"piecesSkippedCount":0,"Process":"storagenode"}
{"L":"INFO","T":"2024-03-30T03:55:29Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess finished successfully","Process":"storagenode","satelliteID":"12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S"}
{"L":"INFO","T":"2024-03-31T09:12:06Z","N":"lazyfilewalker.gc-filewalker","M":"starting subprocess","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}
{"L":"INFO","T":"2024-03-31T09:12:06Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess started","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}
{"L":"INFO","T":"2024-03-31T09:12:06Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"Database started","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs","Process":"storagenode"}
{"L":"INFO","T":"2024-03-31T09:12:06Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker started","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs","Process":"storagenode","createdBefore":"2024-03-26T17:59:59Z","bloomFilterSize":3607706}
{"L":"INFO","T":"2024-03-31T09:15:00Z","N":"lazyfilewalker.gc-filewalker.subprocess","M":"gc-filewalker completed","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs","Process":"storagenode","piecesCount":7006620,"piecesSkippedCount":0}
{"L":"INFO","T":"2024-03-31T09:15:01Z","N":"lazyfilewalker.gc-filewalker","M":"subprocess finished successfully","Process":"storagenode","satelliteID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}

The pieces are biodegradable. They don't reach the trash. They just vanish.


Also, the satellite pays less for storage: du -sBG --apparent-size shows 11603G (GiB, which is about 12458 GB), but the satellite pays for only 11.26 TB (only about 90% of the stored data).
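For anyone who wants to repeat this comparison, here is a rough sketch (the blobs path is a placeholder, substitute your own storage directory):

# Apparent size of the stored pieces, in GiB (path is a placeholder)
du -sBG --apparent-size /mnt/storj/storage/blobs

# Convert GiB to GB and compare with what the satellite reports as paid storage
echo "11603 * 1024^3 / 1000^3" | bc          # ~12458 GB on disk
echo "scale=2; 11.26 / 12.458 * 100" | bc    # ~90% of the stored data paid for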


Yep, something is broken globally in the Storj network:
[screenshot]

The data is still on the disk, but it's not being counted on the satellite side. You can see this on the disk space usage graph; in my case the delta is about 10 TB.


So, welcome to the era of the unpaid SNO =)


My trash has tripled over the last day or two.

The problem could stem from the recent changes to the bloom filter push mechanism. It seems the bloom filters currently generated on the satellite side are not compatible with node versions higher than v1.95. I am currently running v1.100.3.

What version of the storage node software are you currently using?

I am seeing 1.95.1 / 1.96.6 / 1.97.3 on my nodes.


I don't think that's the case.
I'm running 1.100.4 and received a bloom filter today:

storagenode version
2024-03-31T11:30:50Z    INFO    Anonymized tracing enabled      {"Process": "storagenode"}
Release build
Version: v1.100.4
Build timestamp: 19 Mar 24 09:37 UTC
Git commit: 8e38bd71fb047b9cb00149fd90bb5f94ff21d9aa

grep walk /var/log/storj/storj.log 
2024-03-31T04:45:38Z    INFO    lazyfilewalker.gc-filewalker    starting subprocess     {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}
2024-03-31T04:45:38Z    INFO    lazyfilewalker.gc-filewalker    subprocess started      {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}
2024-03-31T04:45:38Z    INFO    lazyfilewalker.gc-filewalker.subprocess Database started        {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Process": "storagenode"}
2024-03-31T04:45:38Z    INFO    lazyfilewalker.gc-filewalker.subprocess gc-filewalker started   {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Process": "storagenode", "createdBefore": "2024-03-26T17:59:59Z", "bloomFilterSize": 813386}
2024-03-31T04:51:25Z    INFO    lazyfilewalker.gc-filewalker.subprocess gc-filewalker completed {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs", "Process": "storagenode", "piecesCount": 1528491, "piecesSkippedCount": 0}
2024-03-31T04:51:25Z    INFO    lazyfilewalker.gc-filewalker    subprocess finished successfully        {"Process": "storagenode", "satelliteID": "12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs"}
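A received bloom filter is only half of the picture, though: the retain summary that follows a GC run shows whether pieces were actually moved to trash. Something like this should surface it (same log path as above):

# Show the retain summaries; the "Deleted pieces" / "Failed to delete" fields
# tell you how many pieces were actually moved to trash after the bloom filter ran
grep "Moved pieces to trash during retain" /var/log/storj/storj.log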

This is strange to see: it looks like there’s a healthy amount of client uploads happening right now… but used space is still going down. Can’t outrun the trashman! :wink:

Quadrupled now. :face_vomiting:

You’re lucky; I, on the other hand, have no idea what will happen when the Bloom filter is fixed and I end up with a couple of dozen terabytes of small files that will be deleted more or less at the same time. That’s when my disk array is really going to suffer.

I got 2 TB of trash overnight.

[screenshot]
Incredible

And we have to keep them for a week.

Adding that I got about 10 TB of trash overnight on my side.

Yep, that drop seems to be a huge deletion.

Is this where we scream “Hold the line!”?


v1.100 and v1.101 have problems with GC. All of the runs end with this:

2024-03-29T13:06:47+01:00       INFO    retain  Moved pieces to trash during retain     {"Process": "storagenode", "cachePath": "[...]", "Deleted pieces": 0, "Failed to delete": 1, "Pieces failed to read": 0, "Pieces count": 0, "Satellite ID": "121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6", "Duration": "1.615540737s", "Retain Status": "enabled"}
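One way to see how widespread this is on a node is to count the retain runs that reported at least one failed delete, e.g. with a sketch like this (plain-text log format as above; the log path is a placeholder):

# Count retain summaries where at least one piece failed to be deleted
grep "Moved pieces to trash during retain" /var/log/storj/storj.log \
  | grep -c -v '"Failed to delete": 0'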

Node on 1.96.6:

{"L":"INFO","T":"2024-03-29T18:08:00Z","N":"retain","M":"Moved pieces to trash during retain","process":"storagenode","Deleted pieces":1,"Failed to delete":0,"Pieces failed to read":0,"Pieces count":0,"Satellite ID":"121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6","Duration":"58.722145294s","Retain Status":"enabled"}
{"L":"INFO","T":"2024-03-31T04:12:41Z","N":"retain","M":"Moved pieces to trash during retain","process":"storagenode","Deleted pieces":1,"Failed to delete":0,"Pieces failed to read":0,"Pieces count":0,"Satellite ID":"12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs","Duration":"1m56.679891372s","Retain Status":"enabled"}