Error creating revocation database after config change

Hi SNOs,

I am getting an error after changing my payment address in the config on my CentOS 7 machine. I stopped the node and started it again with the new config. Since then it shows OFFLINE and gives the following error:

2020-01-19T13:14:24.810Z FATAL Unrecoverable error {"error": "Error creating revocation database: revocation database error: boltdb error: timeout\n\tmain.cmdRun:162\n\t(*Command).execute:826\n\t(*Command).ExecuteC:914\n\t(*Command).Execute:864\n\tmain.main:315\n\truntime.main:203", "errorVerbose": "Error creating revocation database: revocation database error: boltdb error: timeout\n\tmain.cmdRun:162\n\t(*Command).execute:826\n\t(*Command).ExecuteC:914\n\t(*Command).Execute:864\n\tmain.main:315\n\truntime.main:203\n\tmain.cmdRun:164\n\t(*Command).execute:826\n\t(*Command).ExecuteC:914\n\t(*Command).Execute:864\n\tmain.main:315\n\truntime.main:203"}

Can anyone make sense of this error?
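For anyone hitting this later: a boltdb "timeout" when opening revocation.db usually means another process already holds an exclusive lock on that file — bolt takes a flock() on the database and gives up after its lock timeout. A minimal sketch of the effect, using a temp file as a stand-in for revocation.db (the paths and messages here are illustrative, not storagenode output):

```shell
db=$(mktemp)                         # stand-in for revocation.db
flock -x "$db" sleep 2 &             # first "node" holds an exclusive lock
holder=$!
sleep 0.2
if flock -xn "$db" true; then        # second "node": non-blocking lock attempt
  status="lock acquired"
else
  status="timeout: database locked"  # what the FATAL error reflects
fi
echo "$status"
wait "$holder"
rm -f "$db"
```

So if a second storagenode process (or a second container pointed at the same storage directory) is running, the second one will fail exactly like this.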


Welcome to the forum @XenonOrion!

Are you using Windows or Linux? How is your HDD connected?

CentOS 7; the internal drive is mounted via a mount point created with the Disks utility.

Node settings:

Storage Node Dashboard ( Node Version: v0.29.3 )


ID 12idSMzaHLhRQmUB7kk5GeRBAof6qsQpYCa2nYR7DCk9up5Q67V
Last Contact OFFLINE
Uptime 5m1s

               Available       Used     Egress      Ingress
 Bandwidth        1.0 PB     1.8 TB     1.6 TB     253.0 GB (since Jan 1)
      Disk        7.7 TB     2.3 TB


Your help is really appreciated!

Hmm, it turns out this was not the real problem: I was running two instances with the same identity and settings. I have resolved that error, but the node is still not running…

What does this command show?

docker ps -a

Thank you for your message!

 storjlabs/storagenode:beta   "/entrypoint"            19 minutes ago      Up 19 minutes        >14002/tcp,>28967/tcp   storagenode

So the revocation error is gone, but the node is still not up, hence I made a new thread:
Node Offline, setup seems fine 78fcdfb4a5a1

Any thoughts @nerdatwork?
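If the container is Up but the dashboard still reports OFFLINE, the usual next step is to check that the node's external port actually accepts TCP connections. A rough sketch using bash's /dev/tcp redirection — host and port are placeholders for your own address, and the throwaway Python listener exists only so this demo has something to connect to:

```shell
host=localhost   # placeholder: your external address / DDNS name
port=28967       # placeholder: your node's external port
# Demo only: a throwaway listener so the probe has a target.
python3 -c 'import socket, time
s = socket.socket()
s.bind(("127.0.0.1", 28967))
s.listen(1)
time.sleep(2)' &
sleep 0.5
# The actual probe: try to open a TCP connection to host:port.
if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
  reach="port open"
else
  reach="port closed or filtered"
fi
echo "$reach"
wait
```

Run the probe from outside your LAN against your real address to rule out a port-forwarding or firewall problem, since a check from the same machine can succeed while the node is still unreachable from the internet.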