Docker vs Linux installation

Hi, Docker on my Windows machine eats up all 6 GB of memory, and the page file grows insanely to 14 GB! Unbelievable! The setup runs perfectly otherwise; the only problem is memory. I’m considering switching from the Windows GUI + Docker to Linux. Since I’ve been hosting for less than a month, I can do a fresh installation. However, I’d like to know what to expect first. Can any of you here hosting on Linux (no Docker) tell me about your experience? Any issues with memory, disk, or network? Thanks in advance.

It seems you’re using WSL 2 to run Docker Desktop, and WSL 2 has a known issue with eating RAM:
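A common workaround is to cap how much memory the WSL 2 VM (which backs Docker Desktop) is allowed to claim, via a `.wslconfig` file in your Windows user profile. The 2 GB values below are illustrative; pick limits that fit your machine:

```ini
; %UserProfile%\.wslconfig  (on the Windows host, applies to all WSL 2 VMs)
[wsl2]
memory=2GB   ; cap the WSL 2 VM, and thus Docker Desktop, at 2 GB of RAM
swap=2GB     ; limit the swap file instead of letting it grow unbounded
```

After saving the file, run `wsl --shutdown` from a Windows terminal so the limits take effect on the next start.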


There probably won’t be many people running without Docker on Linux, as that isn’t yet a fully supported setup. What I can say is that running with Docker on Linux uses essentially the same resources as running without it, since Docker on Linux doesn’t require any virtualization the way it does on Windows.

For now I would recommend using docker on Linux. I’ve been running like that for years now and never had any RAM issues.


Is there any particular reason you don’t want to run Storj with Docker on Linux? As far as I know, running a storage node with Docker on Windows is not recommended. On my Synology it only uses 800 MB of RAM.
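For reference, running a node with Docker on Linux comes down to one `docker run` invocation. This is a sketch from memory of the Storj setup docs, not a copy of them; the wallet, email, address, storage size, and the two host paths are placeholders you’d replace with your own, and you should check the current official docs for the exact flag list before using it:

```shell
# Sketch of a Storj storage node container on Linux (placeholder values throughout).
docker run -d --restart unless-stopped --stop-timeout 300 \
    -p 28967:28967/tcp -p 28967:28967/udp \
    -p 127.0.0.1:14002:14002 \
    -e WALLET="0x..." \
    -e EMAIL="you@example.com" \
    -e ADDRESS="your.ddns.example.com:28967" \
    -e STORAGE="2TB" \
    --mount type=bind,source=/mnt/storj/identity,destination=/app/identity \
    --mount type=bind,source=/mnt/storj/data,destination=/app/config \
    --name storagenode storjlabs/storagenode:latest
```

For multiple nodes on one machine you’d repeat this with a different `--name`, different host ports, and separate identity/data directories per node.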

I run a few nodes directly on Linux (no Docker) and it’s both stable and easy to install.
If I’m only running one node per machine, that’s my default setup. If you want to run multiple nodes per machine, then Docker is still far easier.

There is actually something wrong with how Docker on Synology reports RAM usage. I am running 3 nodes on my Synology, and it claims total RAM usage across the relevant containers is about 3.6 GB. However, my total system RAM usage is 2.72 GB at the moment, so that can’t possibly be true. If you click through to the details, you’ll find the actual numbers under the Process tab. My biggest node is currently using only 40.7 MB for the node process and 9.5 MB for the dashboard I keep open in tmux.
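If you want to cross-check what the Synology UI shows, and assuming you have SSH access to the NAS, `docker stats` reports per-container memory straight from the Docker daemon rather than through the DSM interface:

```shell
# One-shot snapshot (no live streaming) of per-container memory usage.
docker stats --no-stream --format "table {{.Name}}\t{{.MemUsage}}"
```

This prints one row per running container with its current memory use and limit, which you can compare against the totals the Synology GUI claims.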

RAM usage really is negligible as long as your HDDs can keep up. It can go up during busy times, but this is a node storing 15 TB of data and it still only uses 40 MB.