Saltlake Traffic

So I am getting roughly 500 GB of ingress a day from Saltlake. I don’t mind the traffic, but this is test data, right? Not real customer data?

No data coming from Saltlake is customer data. As far as I know, the data coming from this satellite is either test data or capacity reservation.
In any case, you are paid for any data stored on and egressed from Saltlake just as if it were from customers; it’s just that how long this data lasts depends solely on the parameters defined by the Storj team. It may last for a month, or it may expire shortly after.


Write “i!” if you GE’d Saltlake because you didn’t like “just a stupid test satellite!” Lol

Saltlake has ~600 fewer active nodes than EU1 and ~900 fewer than US1, so I don’t think there will be many takers on that :rofl:

Thanks for the info.
Yeah, it’s fine, I really don’t mind the data; I was just wondering, since it’s quite a lot of data.

Recently there has been a push for performance testing, so that is the reason we are currently seeing a high amount of data coming from Saltlake (Updates on Test Data).


I’ve taken on 3 TB today. Hoping my ISP doesn’t flag me. Can I slow this thing down?

I think it is not performance testing; it is checking the real network capacity (what is actually available, not just what the storagenode software reports as available).

That doesn’t make sense.
Try to fill up the whole network (at great expense to them)? Also, the capacity is dynamic: if we all have full nodes, we deploy more.
Nah, I think it really is good old “let’s see how we can make this network sing” testing :slight_smile:


Saltlake pushed 1 TB today… That’s some really heavy traffic…

I have a few nodes in the Sia project. There everything is clear: the committed space is really available to customers, and an SNO cannot commit more than it actually has.

Storj is different: only a variable/config option advertises the “available” space to the satellites. Under the current model, nobody knows the real storage capacity for sure. Maybe the “new big customers” asked for that information before committing. (See the sketch below for what I mean.)

Maybe I’m wrong, but I’m drawing conclusions from what I see now.
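
To illustrate the point (this is just a toy sketch in Python, not Storj’s actual code, and every name in it is made up): the satellite can only add up whatever free space each node self-reports from its operator-chosen allocation setting; it never sees the physical disks behind those numbers until data is actually uploaded.

```python
# Toy sketch (hypothetical, not Storj source): "network capacity" as a satellite
# sees it is just the sum of self-reported allocations from node check-ins.
from dataclasses import dataclass


@dataclass
class NodeCheckIn:
    node_id: str
    reported_free_bytes: int  # comes from the operator's allocation setting, not a disk measurement


def reported_network_capacity(checkins: list[NodeCheckIn]) -> int:
    # The satellite can only trust these self-reported figures; verifying them
    # would require actually filling the nodes with data.
    return sum(c.reported_free_bytes for c in checkins)


print(reported_network_capacity([
    NodeCheckIn("node-a", 2 * 10**12),  # claims 2 TB free
    NodeCheckIn("node-b", 8 * 10**12),  # claims 8 TB free, whether or not the disk exists
]))
```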


That could be the case, but not on mine. I have a bunch of small nodes, one HDD per node; they would reach the limit, and then the node would have issues and possibly be disqualified.

Anyway, I guess I have to add another node drive this week.

I would hang fire unless you have unused HDDs lying around. Remember, this is test data. Most of it will be gone “soon” and if the mystery contract doesn’t happen it won’t be replaced in a hurry.


I got 3.5 TB from Saltlake today on both my nodes. The nodes are only about 4 months old. That doubled my used space, lol.


Only if you allocated space right up to the rim without any reserve.


I have spare drives, and yes, it does seem like the test data went “kaput” and now I have 2.5 TB free.

Still, no mystery contract in sight?

Nevertheless, I might just use the tokens I have to back up a few gigs of my own stuff.

Nah, the “mystery client” ended up going to Storj Select due to regulatory requirements (there’s some information on here), so we’re back to almost square one (the stress testing was really useful to highlight bottlenecks and suggest optimisations to nodes and code).

For SNOs, hope springs eternal! :smile:


There are other, no less mysterious clients in the sales pipeline, with no smaller capacity requirements and very similar behavior.


I hope the Storj sales team checked back on the requirements of these customers so that they don’t change their minds about which network to use at the very last minute.

That Vivint surprise last time was more than annoying and a terrible experience.


I hope this time we can land them. You guys should get more customers from the p*rn industry. Lol.
