gLiveView showing 0 peers in, but 24 peers out on relay node

I have 3 problems:

  1. On the relay, gLiveView is reporting only outgoing traffic, with just 1 incoming connection (from my BP node). Consequently I'm getting 0 Processed Tx.

[screenshot: gLiveView on the relay]

  2. On my BP node, gLiveView is saying it's a relay node, when it's really the BP node.

[screenshot: gLiveView on the BP node]

  3. On my BP node, the screenshot above shows 27 out and 1 in, but when I press 'p' to view the peers, it shows the correct 1 out and 1 in to my relay. Why is that?

[screenshot: gLiveView peer view on the BP node]

Any suggestions for these problems (ordered by priority)? Thanks :slight_smile:

P.S. The first time I ran gLiveView on my relay it was working fine, and transactions were being processed. But then I restarted the Docker container for that relay node and now I'm no longer getting transactions :confused:

  1. On the relay, gLiveView is reporting only outgoing traffic, with just 1 incoming connection (from my BP node). Consequently I'm getting 0 Processed Tx.

Check the topology updater logs; it must run once per hour and you should see "glad you are staying with us" in the logs.
Also check that the relay's port is open to accept connections from other public nodes.
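
A quick sanity check for both points might look like this (the log path, hostname, and port below are assumptions based on a CoinCashew-style setup; adjust them to your own paths and relay port):

# confirm topologyUpdater.sh is scheduled hourly on the relay
crontab -l | grep topologyUpdater

# look for the success message in the updater's last result
# (log location is an assumption; adjust to wherever your script writes its output)
grep -i "glad" /root/cardano-my-node/logs/topologyUpdater_lastresult.json

# from a machine outside your network, verify the relay port accepts connections
# (replace relay.example.com and 6000 with your relay's public address and port)
nc -zv relay.example.com 6000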

  2. On my BP node, gLiveView is saying it's a relay node, when it's really the BP node.

Check the env file, the POOL_NAME="" line.
It should be uncommented, and between the quotes you must write the pool folder name (case sensitive).
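
For example, the relevant line in the env file would look something like this (the value shown is only a placeholder; use your own pool folder name):

# uncommented, with the pool folder name between the quotes (case sensitive)
POOL_NAME="MYPOOL"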

  3. On my BP node, the screenshot above shows 27 out and 1 in, but when I press 'p' to view the peers, it shows the correct 1 out and 1 in to my relay. Why is that?

Check again after a node restart.

P.S. The first time I ran gLiveView on my relay it was working fine, and transactions were being processed. But then I restarted the Docker container for that relay node and now I'm no longer getting transactions :confused:

That's because you don't have any incoming peers anymore.

@Alexd1985 Thanks for the reply!

  1. Problem has been solved.
    Cause of issue: the crontab entry for topologyUpdater.sh was not running. After fixing the cron job and letting it sit overnight, I got 4 incoming nodes and transactions are now being processed! (See the example crontab entry after this list.)

  2. Error still present.
    In the env file, I set POOL_NAME="cardano-my-node" since that's the folder in which gLiveView and startBlockProducer1.sh live, but gLiveView still thinks the BP node is a relay :confused:
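
For reference, a minimal hourly crontab entry for the updater might look like this (the script path and the minute value are assumptions; use your own script location):

33 * * * * /root/cardano-my-node/topologyUpdater.sh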

Hmmm, are you using CNTools? I think the start script was not updated with the file paths required for the block producer.

No, I don't think I'm using CNTools. I've been following the CoinCashew instructions, which I don't believe use CNTools? I'm not too sure.

Ok, then show me the script used to start the node

BP Node:

#!/bin/bash
DIRECTORY=/root/cardano-my-node
PORT=xxxx
HOSTADDR=0.0.0.0
TOPOLOGY=${DIRECTORY}/mainnet-topology-home.json
DB_PATH=${DIRECTORY}/db
SOCKET_PATH=${DIRECTORY}/db/socket
CONFIG=${DIRECTORY}/mainnet-config.json
KES=${DIRECTORY}/kes.skey
VRF=${DIRECTORY}/vrf.skey
CERT=${DIRECTORY}/node.cert
/usr/local/bin/cardano-node run --topology ${TOPOLOGY} --database-path ${DB_PATH} --socket-path ${SOCKET_PATH} --host-addr ${HOSTADDR} --port ${PORT} --config ${CONFIG}

Should be

cat > $NODE_HOME/startBlockProducingNode.sh << EOF
#!/bin/bash
DIRECTORY=$NODE_HOME
PORT=6000
HOSTADDR=0.0.0.0
TOPOLOGY=\${DIRECTORY}/${NODE_CONFIG}-topology.json
DB_PATH=\${DIRECTORY}/db
SOCKET_PATH=\${DIRECTORY}/db/socket
CONFIG=\${DIRECTORY}/${NODE_CONFIG}-config.json
KES=\${DIRECTORY}/kes.skey
VRF=\${DIRECTORY}/vrf.skey
CERT=\${DIRECTORY}/node.cert
cardano-node run --topology \${TOPOLOGY} --database-path \${DB_PATH} --socket-path \${SOCKET_PATH} --host-addr \${HOSTADDR} --port \${PORT} --config \${CONFIG} --shelley-kes-key \${KES} --shelley-vrf-key \${VRF} --shelley-operational-certificate \${CERT}
EOF
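
After regenerating the script, it still needs to be executable and the node restarted so it picks up the new flags. A minimal sketch, assuming a systemd service named cardano-node as in the CoinCashew guide (in a Docker setup, restart the container instead):

chmod +x $NODE_HOME/startBlockProducingNode.sh
sudo systemctl restart cardano-node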


Ah that fixed my issue! Thanks!

(I was missing --shelley-kes-key ${KES} --shelley-vrf-key ${VRF} --shelley-operational-certificate ${CERT} in the cardano-node run command)

(Still getting the out / in discrepancy though, but I suppose that's an issue for the gLiveView maintainers themselves)
