Storj-Exporter Name not resolved

Hello,
I am trying to get the Grafana dashboard with multiple nodes to run. One node works fine, but the others do not.
This is my current docker command:

docker run -d --restart unless-stopped --stop-timeout 300 \
    -p 28968:28967/tcp \
    -p 28968:28967/udp \
    -p 0.0.0.0:14003:14002 \
    -e WALLET="XXX" \
    -e EMAIL="XXX" \
    -e ADDRESS="XXX" \
    -e STORAGE="1.8TB" \
    --user $(id -u):$(id -g) \
    --mount type=bind,source="/root/storjid/localnode2",destination=/app/identity \
    --mount type=bind,source="/mnt/hdd2tb/storj/",destination=/app/config \
    --mount type=bind,source="/storagelogs/node2",destination=/app/logs \
    --sysctl net.ipv4.tcp_fastopen=3 \
    --name Localnode2 storjlabs/storagenode:latest

(The other nodes have pretty much the same startup)
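
As a quick sanity check (assuming curl is available on the host), you can confirm that the node's dashboard API answers on the mapped host port before pointing an exporter at it:

curl http://localhost:14003/api/sno/

If this returns JSON with the node details, the node container and its port mapping are fine, and the problem is limited to how the exporter reaches the node.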

This is my exporter startup command:

docker run -d --link=Localnode2 --name=storj-exporter-node2 -p 9652:9651 -e STORJ_HOST_ADDRESS=Localnode2 anclrii/storj-exporter:latest

I also tried to set the API port:

docker run -d --link=Localnode2 --name=storj-exporter-node2 -p 9652:9651 -e STORJ_HOST_ADDRESS=Localnode2 -e STORJ_API_PORT=14003 anclrii/storj-exporter:latest

but I only get this result:

{"log":"2024-05-08 00:17:58 [WARNING]: Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7bdfa2f6a0>: Failed to establish a new connection: [Errno -2] Name does not resolve')': /api/sno/\n","stream":"stderr","time":"2024-05-08T00:17:58.168903735Z"}

I don't know how to fix the “Name does not resolve” error. And if I set the IP directly, I get a connection refused error.
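
One way to narrow down the “Name does not resolve” part is to test name resolution from inside the exporter container itself. A minimal sketch, assuming the container names used above and the Python interpreter that ships in the exporter image:

docker exec storj-exporter-node2 python -c "import socket; print(socket.gethostbyname('Localnode2'))"

If this prints an IP address, resolution works and the port is the more likely culprit; if it fails, the two containers are probably not on the same Docker network.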

Thanks in advance,
Marvi

make it: -p 14003:14002 \

I don’t use Grafana, but maybe these ports, 9652:9651, need to be forwarded in the firewall and router, too.
And check the firewall rules.
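
If you want to rule out the local firewall, something along these lines (assuming ufw or plain iptables manages the host firewall) shows whether the relevant ports are open:

sudo ufw status verbose
sudo iptables -L -n | grep -E '9651|9652|14002|14003'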


Isn’t it the same as the above? It just binds these to all interfaces.

But everything is on the same system, so there should be no firewall issue. The connection between Prometheus and Grafana is working correctly. It’s just the connection between the exporter and the node itself that fails.
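
To isolate the exporter-to-node hop, you could try an HTTP request from inside the exporter container straight to the node container. A sketch, assuming the node's container-internal API port 14002 and the Python interpreter in the exporter image:

docker exec storj-exporter-node2 python -c "import urllib.request; print(urllib.request.urlopen('http://Localnode2:14002/api/sno/').status)"

If this prints 200, the node answers over the Docker link and the exporter should use STORJ_API_PORT=14002 (the port inside the container) rather than the host-mapped 14003.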


I can't get the second exporter to work. What am I doing wrong? Or is there an alternative?
The first exporter is working properly:

curl localhost:9651

(bunch of things above)

# TYPE storj_sat_audit gauge
storj_sat_audit{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="auditScore",url="saltlake.tardigrade.io:7777"} 1.0
storj_sat_audit{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="suspensionScore",url="saltlake.tardigrade.io:7777"} 1.0
storj_sat_audit{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="onlineScore",url="saltlake.tardigrade.io:7777"} 0.9988539238539238
storj_sat_audit{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="auditScore",url="ap1.storj.io:7777"} 1.0
storj_sat_audit{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="suspensionScore",url="ap1.storj.io:7777"} 1.0
storj_sat_audit{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="onlineScore",url="ap1.storj.io:7777"} 0.9983297520929564
storj_sat_audit{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="auditScore",url="us1.storj.io:7777"} 1.0
storj_sat_audit{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="suspensionScore",url="us1.storj.io:7777"} 1.0
storj_sat_audit{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="onlineScore",url="us1.storj.io:7777"} 0.9985329106323171
storj_sat_audit{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="auditScore",url="eu1.storj.io:7777"} 1.0
storj_sat_audit{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="suspensionScore",url="eu1.storj.io:7777"} 1.0
storj_sat_audit{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="onlineScore",url="eu1.storj.io:7777"} 0.9987423207709915
# HELP storj_sat_month_egress Storj satellite egress since current month start
# TYPE storj_sat_month_egress gauge
storj_sat_month_egress{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="repair",url="saltlake.tardigrade.io:7777"} 2.887808e+07
storj_sat_month_egress{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="audit",url="saltlake.tardigrade.io:7777"} 126976.0
storj_sat_month_egress{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="usage",url="saltlake.tardigrade.io:7777"} 4.120576e+06
storj_sat_month_egress{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="repair",url="ap1.storj.io:7777"} 6.20742912e+08
storj_sat_month_egress{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="audit",url="ap1.storj.io:7777"} 381696.0
storj_sat_month_egress{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="usage",url="ap1.storj.io:7777"} 1.65744128e+08
storj_sat_month_egress{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="repair",url="us1.storj.io:7777"} 3.330474496e+09
storj_sat_month_egress{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="audit",url="us1.storj.io:7777"} 616704.0
storj_sat_month_egress{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="usage",url="us1.storj.io:7777"} 9.064815422e+09
storj_sat_month_egress{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="repair",url="eu1.storj.io:7777"} 3.038383104e+09
storj_sat_month_egress{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="audit",url="eu1.storj.io:7777"} 1.023232e+06
storj_sat_month_egress{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="usage",url="eu1.storj.io:7777"} 5.931405056e+09
# HELP storj_sat_month_ingress Storj satellite ingress since current month start
# TYPE storj_sat_month_ingress gauge
storj_sat_month_ingress{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="repair",url="saltlake.tardigrade.io:7777"} 3.79058944e+08
storj_sat_month_ingress{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="usage",url="saltlake.tardigrade.io:7777"} 3.7855232e+07
storj_sat_month_ingress{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="repair",url="ap1.storj.io:7777"} 4.16308992e+08
storj_sat_month_ingress{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="usage",url="ap1.storj.io:7777"} 1.796445696e+09
storj_sat_month_ingress{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="repair",url="us1.storj.io:7777"} 1.640035072e+09
storj_sat_month_ingress{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="usage",url="us1.storj.io:7777"} 5.807339731e+010
storj_sat_month_ingress{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="repair",url="eu1.storj.io:7777"} 1.492502272e+09
storj_sat_month_ingress{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="usage",url="eu1.storj.io:7777"} 8.339954432e+09
# HELP storj_sat_day_storage Storj satellite data stored on disk since current day start
# TYPE storj_sat_day_storage gauge
storj_sat_day_storage{satellite="1wFTAgs9DP5RSnCqKV1eLf6N9wtk4EAtmN5DpSxcs8EjT69tGE",type="atRestTotal",url="saltlake.tardigrade.io:7777"} 3.624677448776676e+011
storj_sat_day_storage{satellite="121RTSDpyNZVcEU84Ticf2L1ntiuUimbWgfATz21tuvgk3vzoA6",type="atRestTotal",url="ap1.storj.io:7777"} 4.068121807937546e+011
storj_sat_day_storage{satellite="12EayRS2V1kEsWESU9QMRseFhdxYxKicsiFmxrsLZHeLUtdps3S",type="atRestTotal",url="us1.storj.io:7777"} 5.24769372628044e+012
storj_sat_day_storage{satellite="12L9ZFwhzVpuEKMUNUqkaTLGzwY9G24tbiigLiXpmZWKwmcNDDs",type="atRestTotal",url="eu1.storj.io:7777"} 8.988038061184225e+012

But the second one refuses to work:

curl localhost:9652

# HELP python_gc_objects_collected_total Objects collected during gc
# TYPE python_gc_objects_collected_total counter
python_gc_objects_collected_total{generation="0"} 567.0
python_gc_objects_collected_total{generation="1"} 263.0
python_gc_objects_collected_total{generation="2"} 0.0
# HELP python_gc_objects_uncollectable_total Uncollectable object found during GC
# TYPE python_gc_objects_uncollectable_total counter
python_gc_objects_uncollectable_total{generation="0"} 0.0
python_gc_objects_uncollectable_total{generation="1"} 0.0
python_gc_objects_uncollectable_total{generation="2"} 0.0
# HELP python_gc_collections_total Number of times this generation was collected
# TYPE python_gc_collections_total counter
python_gc_collections_total{generation="0"} 54.0
python_gc_collections_total{generation="1"} 4.0
python_gc_collections_total{generation="2"} 0.0
# HELP python_info Python platform information
# TYPE python_info gauge
python_info{implementation="CPython",major="3",minor="8",patchlevel="5",version="3.8.5"} 1.0
# HELP process_virtual_memory_bytes Virtual memory size in bytes.
# TYPE process_virtual_memory_bytes gauge
process_virtual_memory_bytes 2.6353664e+07
# HELP process_resident_memory_bytes Resident memory size in bytes.
# TYPE process_resident_memory_bytes gauge
process_resident_memory_bytes 2.1110784e+07
# HELP process_start_time_seconds Start time of the process since unix epoch in seconds.
# TYPE process_start_time_seconds gauge
process_start_time_seconds 1.71528244215e+09
# HELP process_cpu_seconds_total Total user and system CPU time spent in seconds.
# TYPE process_cpu_seconds_total counter
process_cpu_seconds_total 0.18000000000000002
# HELP process_open_fds Number of open file descriptors.
# TYPE process_open_fds gauge
process_open_fds 6.0
# HELP process_max_fds Maximum number of open file descriptors.
# TYPE process_max_fds gauge
process_max_fds 1.048576e+06
# HELP storj_node_info Storj node info
# TYPE storj_node_info gauge
# HELP storj_total_diskspace Storj total diskspace metrics
# TYPE storj_total_diskspace gauge
# HELP storj_total_bandwidth Storj total bandwidth metrics
# TYPE storj_total_bandwidth gauge
# HELP storj_payout_currentMonth Storj estimated payouts for current month
# TYPE storj_payout_currentMonth gauge
# HELP storj_sat_summary Storj satellite summary metrics
# TYPE storj_sat_summary gauge
# HELP storj_sat_audit Storj satellite audit metrics
# TYPE storj_sat_audit gauge
# HELP storj_sat_month_egress Storj satellite egress since current month start
# TYPE storj_sat_month_egress gauge
# HELP storj_sat_month_ingress Storj satellite ingress since current month start
# TYPE storj_sat_month_ingress gauge
# HELP storj_sat_day_storage Storj satellite data stored on disk since current day start
# TYPE storj_sat_day_storage gauge
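
When the storj_* metrics appear only as empty HELP/TYPE stubs like this, the exporter process is running but not getting any data from the node API; its own log usually says why:

docker logs --tail 20 storj-exporter-node2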

Got it solved. I changed the Storj host to the external IP and changed the Storj port to the correct one:

docker run -d --link=Localnode2 --name=storj-exporter-node2 -p 9652:9651 -e STORJ_HOST_ADDRESS=192.168.8.51 -e STORJ_API_PORT=14003 anclrii/storj-exporter:latest

I don't know why it didn't work before. (Maybe the system restart helped.)
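
As an alternative to hard-coding the LAN IP, a user-defined Docker network lets the exporter reach the node by container name and container-internal port, so the setup does not depend on the host address (and avoids the legacy --link flag). A sketch using the same container names as above:

docker rm -f storj-exporter-node2
docker network create storj-net
docker network connect storj-net Localnode2
docker run -d --network storj-net --name=storj-exporter-node2 \
    -p 9652:9651 \
    -e STORJ_HOST_ADDRESS=Localnode2 \
    -e STORJ_API_PORT=14002 \
    anclrii/storj-exporter:latest

On a user-defined network, Docker's embedded DNS resolves Localnode2 automatically, and 14002 is the API port inside the node container, so no extra host port mapping is needed for the exporter to reach it.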
