Hello again.
My nodes are configured with 18 GB of RAM.
I have one node that is using 95% of its RAM during normal operation, and it appears to lock up every 20 minutes or so, or at least that is the pattern I see. Using the top command, I can see that the cardano-node process is the one using the 95%. My second relay node only uses about 70%. I have rebooted the problem node several times, but the RAM usage spikes back to 95%. Any ideas on where I can look for the issue?
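For reference, this is roughly how I am checking it (the process shows up as cardano-node on my box):

ps -C cardano-node -o pid,rss,%mem,cmd   # resident memory of the node process
free -h                                   # overall RAM and swap on the machine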
Hi,
What version of cardano-node?
cardano-node version
1.35.3
I wonder if I should just rebuild it. What has me puzzled is that it's the same configuration as my other relay.
Type sudo systemctl | grep cardano (or cnode) and check what services you run on both nodes, then compare the differences.
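For example, something like this on each node (the unit names may differ on your setup):

sudo systemctl list-units --type=service | grep -Ei 'cardano|cnode'
systemctl cat cardano-node   # assuming the unit is named cardano-node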
It’s the same on both, and the file is configured identically.
Do you run any RTS flags in your start script?
/usr/local/bin/cardano-node run +RTS ... -RTS --topology
Remove them and restart the node.
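Something like this, assuming your service is called cardano-node:

sudo systemctl daemon-reload          # needed if you edited the unit file itself
sudo systemctl restart cardano-node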
Did you make any changes to config.json lately?
wget -N https://book.world.dev.cardano.org/environments/mainnet/config.json
Get the new file and restart the node.
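If you want to see what actually changed before restarting, a quick diff works (the live config path below is just an example; use wherever your node reads its config from):

wget -O /tmp/config.json https://book.world.dev.cardano.org/environments/mainnet/config.json
diff /tmp/config.json /opt/cardano/cnode/files/config.json   # example path for the live config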
These are the parameters in the startup script on both nodes:
/usr/local/bin/cardano-node run +RTS -N -A16m -qg -qb -RTS --topology ${TOPOLOGY} --database-path ${DB_PATH} --socket-path ${SOCKET_PATH} --host-addr ${HOSTADDR} --port ${PORT} --config ${CONFIG}
I did download the new config, but after a few minutes the memory spiked back up. I have not removed the settings in the startup script yet. Do I remove the text before --topology?
Usually, RTS flags are meant to optimize the node, but maybe not in your case. Try removing the parameters below and restarting:
+RTS -N -A16m -qg -qb -RTS
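Those are GHC runtime options, roughly: -N uses all cores, -A16m sets a 16 MB allocation area per core, and -qg / -qb disable parallel garbage collection and its load balancing. Without them, the start line you posted above would simply become:

/usr/local/bin/cardano-node run --topology ${TOPOLOGY} --database-path ${DB_PATH} --socket-path ${SOCKET_PATH} --host-addr ${HOSTADDR} --port ${PORT} --config ${CONFIG}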
Also check the log for errors; if you find any, share a screenshot.
journalctl -e -f -u cardano-node
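To pull just the errors from a recent window, something like:

journalctl -u cardano-node --since "1 hour ago" --no-pager | grep -i error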
Is the node a relay or a block producer?
It’s a relay node. The memory spiked, but it has not locked the node up. If it does, I will try your recommendations and report back. Thank you.
I did not figure out the issue. I went ahead and upgraded to 1.35.4, and it’s no longer an issue.