Question: Storj node power consumption

Hello,
I have a question about CPU power consumption on a Storj node:
does the same Storj node running on an i7 CPU (65 W TDP) or an i5 CPU (35 W TDP) from the same CPU family have the same power consumption?
thx


So this means that, e.g., an i5-6600T (35 W TDP) and an i7-6700 (65 W TDP) have the same power usage to run a Storj node?


Yes and no. Counting only the additional power consumption from the node, it will be similar. But the i7 will almost certainly draw more power at idle. If you’re running a CPU just for Storj, you should always go with the lowest-power CPU possible. Any modern i5 or i7 is already quite overkill.

I’m running multiple nodes in Docker on a Celeron right now, and the CPU is always 90-100% loaded, so I was thinking of upgrading, but I have to choose between an i5 and an i7.

The same processor architecture and process node result in the same power consumption at the same clock rates. Low-TDP Intel processors are not magic; they are just capped at a lower maximum power and behave differently in clock scaling/boosting, base clock values, etc.

Storagenode consumes extremely little cpu under normal operation, so the cpu will be practically idle.

All modern CPUs are very good at clock scaling, up to and including shutting down specific cores completely under low load. Low-TDP CPUs tend to consume slightly less if they have a lower base clock, but the difference is insignificant.

In other words, I would not worry about CPU power consumption; other factors far outweigh the difference:

  • constant chipset power consumption (up to 10 W)
  • which peripherals can and cannot be powered off on the MLB (can add up to another 10-15 W)
  • RAM type and clock rate

If you want the most power-efficient approach, consider ARM SoC-based systems.
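To put rough numbers on why the platform matters more than the CPU, here is a quick sketch. All the wattage figures and the electricity price are illustrative assumptions, not measurements:

```python
# Rough estimate of the yearly cost of a continuous idle power draw.
# All wattages and the electricity price below are assumptions for illustration.

def yearly_cost(watts, price_per_kwh=0.30):
    """Cost of drawing `watts` continuously for one year."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * price_per_kwh

# Hypothetical idle deltas: CPU choice vs. chipset + peripherals
cpu_delta = 3          # W, assumed i7-vs-i5 difference at idle (small)
platform_delta = 20    # W, assumed chipset + peripherals that can't power down

print(f"CPU difference:      ${yearly_cost(cpu_delta):6.2f}/year")
print(f"Platform difference: ${yearly_cost(platform_delta):6.2f}/year")
```

Even with generous assumptions, the CPU choice is a few dollars a year, while the always-on platform overhead dominates.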


ummm one node per core? as required by ToS?

Yep. Today’s Celerons are two cores.

I would choose the one with more (or a fitting number of) cores and try to undervolt it, also disabling turbo if possible.
Maybe a UPS and enabling the write cache of the drives may lower CPU load.

Real numbers below (from my node):

A Ryzen 3700X with 4x 8 TB HDDs and 2x NVMe drives
draws about 80 W on the watt-meter.

Uptime: 26 days, 19:49:28
Load average: 0.16 0.15 0.10
Avg: 0.0% sy: 0.1% ni: 0.0% hi: 0.0% si: 0.0% st: 0.0% gu: 0.2% wa: 0.1%

so mostly idle.

Looks pretty similar to mine (X9 Supermicro server with a Xeon E3-1230 v2, a separate LSI controller, a backplane, 8 7200 RPM HDDs, 4 SATA SSDs, and 3 stock, very loud fans). The server does a lot more than just storagenode, so look at the minimum consumption curve; storagenode is always running, so that’s the baseline.

Oh, another unobvious thing: the power supply. If you have a cheap unbranded power supply, it can easily be 60% efficient or worse. Getting an appropriately sized 80 Plus Platinum or Titanium supply (used ones are quite cheap) can easily improve efficiency by 10-30%. At 100 W, that is 10-30 W not being wasted.

I bought a used 500 W 80 Plus Titanium supply for $20 on eBay, and it broke even in 4.5 months.

The subtlety here is to choose the power rating correctly: the vendors guarantee that efficiency only at specific load levels, e.g. 20%, 50%, and 80% (see the datasheet). Pick the supply rating so that your average load is close to those declared values.
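The efficiency math above can be sketched in a few lines. The efficiency figures, load, and electricity price here are assumptions for illustration, not the actual numbers from the posts:

```python
# Back-of-the-envelope PSU upgrade breakeven. The efficiency figures,
# DC load, supply price, and electricity price are all assumptions.

def wall_draw(dc_load_w, efficiency):
    """AC power drawn from the wall for a given DC load."""
    return dc_load_w / efficiency

load = 100    # W of actual DC load, assumed
price = 0.30  # $/kWh, assumed

old = wall_draw(load, 0.60)   # assumed cheap unbranded supply
new = wall_draw(load, 0.94)   # assumed Titanium supply near its sweet spot

saved_w = old - new
saved_per_month = saved_w * 24 * 30 / 1000 * price  # $/month
print(f"Saved at the wall: {saved_w:.1f} W")
print(f"Breakeven on a $20 used supply: {20 / saved_per_month:.1f} months")
```

Note the nonlinearity: because wall draw is load divided by efficiency, going from 60% to 94% saves far more than the 34-point difference suggests.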

Added bonus: these higher-end supplies are built much better and are more likely to survive power events, improving overall reliability and longevity; they’re better for the environment and lighter on a UPS thanks to power factor correction, yada yada. There are no downsides.


Some things that can help you.

Get a watt meter; they are cheap, and you can check everything you need, even how manufacturers of all kinds of electrical equipment, like kettles, lie straight to your face.

In addition to that, I discovered, for example, that my super nice monitor still uses about 20 W for whatever reason after entering power-save mode when you turn off the PC, unless you open the menu and select ‘power off’. Then it finally switches its power supply into an about-1 W mode.
The UPS: the next 20 W of wasted power. And so on. And you can find that after 1 year of being plugged in, your USB phone charger will use less power than my PC does while I write this post :slight_smile:


This is very true. The Kill A Watt type of devices are cheap and useful.

My active speakers (some Logitech 5.1 set, asleep) burn 25 W, the UPS about 10 W, etc.; it all adds up very fast indeed. The whole house consumes 250 W when “everything is off”.

I’ve put the most severe offenders on HomeKit-controlled outlets (which also track and report power usage); now, with simple automation, they stay off when not in use. I also replaced a PoE network switch (40 W idle!) with a more efficient one.

It’s worth optimizing stuff that is plugged in 24/7.

What brand? Perhaps it’s all that RGB lighting nonsense and the lasers that project a logo on the ceiling (looking at you, ASUS)?

Both of my displays fortunately consume 500 mW each when asleep. They are different NEC models.

Interestingly, the old square 5 W Apple USB chargers consume an obnoxious amount of power while doing nothing. The newer ones: zero.

My AOC monitor has a setting called deep sleep or something; maybe yours has something similar and you need to turn it on. Check its menu; sometimes useful settings that should be ON by default aren’t.
With Apple you pay for good looks, not good engineering :wink:.

Zero power consumption at idle is “pretty damn good” in my book. Anything better would violate the laws of conservation of energy.

I completely disagree with your statement on many levels btw, but let’s stay on topic.

Not necessarily. They could implement energy generation based on solar panels, or convert the surrounding electromagnetic radiation back to electricity (it’s a small amount, but with a supercapacitor it could be useful).

250W? Wow.

The monitor is a 32" LG; it has USB-C and a big power supply, so you can connect a laptop over USB-C and get a video signal + power through it. That is probably why it has “special” power modes.

No RGB nonsense :slight_smile: IMO, while RGB is useful for keyboards (I’ve had the Logitech 915, both full size and TKL, for over 2 years I think; can recommend them) and tolerable for a mouse, other devices, like headphones with RGB that you can’t even see on your head, are just… for gamers under 15.

On the other hand, to be honest, almost all of my home lights are RGB with Philips Hue :stuck_out_tongue: You know, there are times when you are too lazy to turn them on manually, and that warm color in the evening is great.

Although the best performer is this old MS wireless keyboard+mouse combo; I don’t remember changing the batteries in it in like 10 years…

I am running one Storj node, Nextcloud, and a Tezos bakery on an Intel N100. It is running like a charm, the power consumption is less than 10 watts, and it is solar powered.

Nah, LG should stick to making components; they were never great at end-user products, in my opinion. That second NEC also has a USB-C input with PD for up to 65 W, and it’s still half a watt at idle.

I disagree ;))) One is not supposed to look at the keyboard when typing, let alone endure Las Vegas in peripheral vision. It’s a pure gimmick! I used to use mechanical keyboards and had quite a collection, including various split ones. In recent years everything became RGB, and I lost interest too; now I’m using an old slim tenkeyless Magic Keyboard. The battery lasts months, no gimmicks.

Full size is very bad for ergonomics: users tend to also have a mouse to the right of the keyboard, but center themselves for typing, and as a result stretch their arms asymmetrically, leading to back issues later in life.

Yes! As soon as adaptive lighting was introduced to home kit, I went all in on Hue. No lamp left unhued :)! Light strips everywhere, hidden, blasting to the ceiling.

The only problem is Siri sometimes pretending not to understand you. Hey siri, lights off in the kitchen. Hey siri! Lights off in the kitchen! HEY SIRI!!! lights off!! In the kitchen!!! Dangit!

Or this idiocy: hey Siri, turn the lights on in the kitchen, and set the lights to low in the hallway. Then Siri tells you that you can only submit one request at a time. So she understood that there were two requests; why not just… execute them?! Why play these games /rant.