Everything working cleanly on Unraid except shutdown of container #500

Open
SCUR0 opened this issue Mar 10, 2025 · 1 comment

Comments


SCUR0 commented Mar 10, 2025

Does anyone else have problems with the librenms container not shutting down?
I've gotten everything working correctly, and I'm even using the LibreNMS dispatcher. Polling, validation, everything is green and working, except when I try to stop the container.
The logs show the container receiving the shutdown request, but it never actually shuts down.
This only happens with librenms/librenms:latest; no other container in the stack has this issue.
The exit code is always 137.
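(For context, not from the original report: exit code 137 is 128 + 9, meaning the container's main process was killed with SIGKILL, which is what the Docker daemon sends when a container does not exit within the stop timeout. A minimal shell sketch showing where the 137 comes from:)

```shell
#!/bin/sh
# A process killed with SIGKILL reports exit status 128 + 9 = 137,
# the same status Docker records when `docker stop` times out.
sleep 60 &
pid=$!
kill -9 "$pid"      # simulate the daemon's SIGKILL escalation
wait "$pid"         # the shell reports 128 + signal number
echo "exit code: $?"   # prints: exit code: 137
```

So 137 here does not mean the container crashed; it means it never finished its own shutdown and was forcibly killed.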

2025/03/10 09:44:17 [notice] 626#626: signal 15 (SIGTERM) received from 617, exiting
[10-Mar-2025 09:44:17] NOTICE: Terminating ...
2025/03/10 09:44:17 [notice] 657#657: exiting
2025/03/10 09:44:17 [notice] 658#658: exiting
2025/03/10 09:44:17 [notice] 659#659: exiting
2025/03/10 09:44:17 [notice] 725#725: exiting
2025/03/10 09:44:17 [notice] 676#676: exiting
2025/03/10 09:44:17 [notice] 662#662: exiting
2025/03/10 09:44:17 [notice] 814#814: exiting
2025/03/10 09:44:17 [notice] 664#664: exiting
2025/03/10 09:44:17 [notice] 700#700: exiting
2025/03/10 09:44:17 [notice] 757#757: exiting
2025/03/10 09:44:17 [notice] 787#787: exiting
2025/03/10 09:44:17 [notice] 844#844: exiting
2025/03/10 09:44:17 [notice] 890#890: exiting
2025/03/10 09:44:17 [notice] 919#919: exiting
2025/03/10 09:44:17 [notice] 953#953: exiting
2025/03/10 09:44:17 [notice] 987#987: exiting
2025/03/10 09:44:17 [notice] 657#657: exit
2025/03/10 09:44:17 [notice] 658#658: exit
[10-Mar-2025 09:44:17] NOTICE: exiting, bye-bye!
2025/03/10 09:44:17 [notice] 814#814: exit
2025/03/10 09:44:17 [notice] 787#787: exit
2025/03/10 09:44:17 [notice] 659#659: exit
2025/03/10 09:44:17 [notice] 725#725: exit
2025/03/10 09:44:17 [notice] 676#676: exit
2025/03/10 09:44:17 [notice] 953#953: exit
2025/03/10 09:44:17 [notice] 664#664: exit
2025/03/10 09:44:17 [notice] 700#700: exit
2025/03/10 09:44:17 [notice] 919#919: exit
2025/03/10 09:44:17 [notice] 844#844: exit
2025/03/10 09:44:17 [notice] 757#757: exit
2025/03/10 09:44:17 [notice] 662#662: exit
2025/03/10 09:44:17 [notice] 890#890: exit
2025/03/10 09:44:17 [notice] 987#987: exit
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 664
2025/03/10 09:44:17 [notice] 626#626: worker process 664 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 757 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 787 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 919 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 919
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 814
2025/03/10 09:44:17 [notice] 626#626: worker process 814 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 657 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 657
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 658
2025/03/10 09:44:17 [notice] 626#626: worker process 658 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 725
2025/03/10 09:44:17 [notice] 626#626: worker process 725 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 953 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 953
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 700
2025/03/10 09:44:17 [notice] 626#626: worker process 700 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: worker process 662 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 662
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 844
2025/03/10 09:44:17 [notice] 626#626: worker process 844 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 890
2025/03/10 09:44:17 [notice] 626#626: worker process 890 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 987
2025/03/10 09:44:17 [notice] 626#626: worker process 987 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 676
2025/03/10 09:44:17 [notice] 626#626: worker process 676 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: signal 29 (SIGIO) received
2025/03/10 09:44:17 [notice] 626#626: signal 17 (SIGCHLD) received from 659
2025/03/10 09:44:17 [notice] 626#626: worker process 659 exited with code 0
2025/03/10 09:44:17 [notice] 626#626: exit
crond: USER librenms pid 1314 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1317 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1320 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1323 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1326 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1329 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1332 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1335 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1338 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1341 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1344 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1347 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1350 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1353 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
crond: USER librenms pid 1356 cmd php /opt/librenms/artisan schedule:run --no-ansi --no-interaction > /dev/null 2>&1
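(Observation on the log above, not from the original report: the crond lines show cron still spawning `artisan schedule:run` children after nginx and php-fpm have exited, so PID 1 may still be reaping processes when Docker's default 10 s grace period expires and it escalates to SIGKILL. One thing worth trying, a sketch rather than a confirmed fix, with the service name assumed for illustration, is lengthening the grace period:)

```yaml
# docker-compose.yml fragment (hypothetical service name)
services:
  librenms:
    image: librenms/librenms:latest
    # Default grace period is 10s; give s6 longer to stop crond,
    # php-fpm, and nginx before Docker escalates to SIGKILL.
    stop_grace_period: 60s
```

The equivalent for plain `docker run` is `--stop-timeout 60`, and an already-running container can be stopped with `docker stop -t 60 <name>`.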

SCUR0 commented Mar 11, 2025

Client:
 Version:    27.0.3
 Context:    default
 Debug Mode: false
 Plugins:
  buildx: Docker Buildx (Docker Inc.)
    Version:  v0.15.1
    Path:     /usr/libexec/docker/cli-plugins/docker-buildx
  compose: Docker Compose (Docker Inc.)
    Version:  v2.29.2
    Path:     /usr/local/lib/docker/cli-plugins/docker-compose

Server:
 Containers: 14
  Running: 9
  Paused: 0
  Stopped: 5
 Images: 14
 Server Version: 27.0.3
 Storage Driver: btrfs
  Btrfs: 
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Cgroup Version: 2
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 nvidia runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: ae71819c4f5e67bb4d5ae76a6b735f29cc25774e
 runc version: v1.1.13-0-g58aa920
 init version: de40ad0
 Security Options:
  seccomp
   Profile: builtin
  cgroupns
 Kernel Version: 6.6.78-Unraid
 Operating System: Unraid OS 7.0 x86_64
 OSType: linux
 Architecture: x86_64
 CPUs: 16
 Total Memory: 62.74GiB
 Name: jsmith-unraid
 ID: 57d42c43-6169-47f7-b54e-62714df46d95
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false
 Product License: Community Engine

WARNING: No swap limit support
"Config": {
            "Hostname": "a4874f707397",
            "Domainname": "",
            "User": "",
            "AttachStdin": false,
            "AttachStdout": false,
            "AttachStderr": false,
            "ExposedPorts": {
                "162/tcp": {},
                "162/udp": {},
                "514/tcp": {},
                "514/udp": {},
                "8000/tcp": {}
            },
            "Tty": false,
            "OpenStdin": false,
            "StdinOnce": false,
            "Env": [
                "TZ=America/Los_Angeles",
                "HOST_HOSTNAME=jsmith-unraid",
                "LISTEN_IPV6=true",
                "DB_PASSWORD=cwxwSVj9T9Rb",
                "LIBRENMS_SNMP_COMMUNITY=public",
                "PGID=100",
                "HOST_CONTAINERNAME=jsmith-uct-nms",
                "UPLOAD_MAX_SIZE=16M",
                "OPCACHE_MEM_SIZE=128",
                "DB_PORT=3306",
                "DB_USER=librenms",
                "DB_HOST=librenms-db",
                "DB_TIMEOUT=60",
                "LIBRENMS_BASE_URL=/",
                "TCP_PORT_8000=8000",
                "MEMORY_LIMIT=256M",
                "REAL_IP_FROM=192.168.0.0/16",
                "REAL_IP_HEADER=X-Forwarded-For",
                "LOG_IP_VAR=remote_addr",
                "PUID=99",
                "HOST_OS=Unraid",
                "UDP_PORT_514=514",
                "DB_NAME=librenms",
                "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                "S6_BEHAVIOUR_IF_STAGE2_FAILS=2",
                "LIBRENMS_PATH=/opt/librenms",
                "LIBRENMS_DOCKER=1"
            ],
            "Cmd": null,
            "Image": "librenms/librenms:latest",
            "Volumes": {
                "/data": {}
            },
            "WorkingDir": "/opt/librenms",
            "Entrypoint": [
                "/init"
            ],
            "OnBuild": null,
            "Labels": {
                "net.unraid.docker.icon": "https://raw.githubusercontent.com/A75G/docker-templates/master/templates/icons/librenms.png",
                "net.unraid.docker.managed": "dockerman",
                "net.unraid.docker.webui": "http://[IP]:[PORT:8000]/",
                "org.opencontainers.image.created": "2025-02-23T13:01:24.254Z",
                "org.opencontainers.image.description": "Fully featured network monitoring system",
                "org.opencontainers.image.licenses": "MIT",
                "org.opencontainers.image.revision": "4c3b0172b0545461bbc4aec8dcf28879ace1cbff",
                "org.opencontainers.image.source": "https://github.com/librenms/docker",
                "org.opencontainers.image.title": "LibreNMS",
                "org.opencontainers.image.url": "https://github.com/librenms/docker",
                "org.opencontainers.image.vendor": "LibreNMS",
                "org.opencontainers.image.version": "25.2.0"
            }
        },

1 participant