ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-BsF
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_check_cron.yml *************************************************
1 plays in /tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml

PLAY [Ensure that the cron is set up] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Saturday 25 October 2025  07:22:09 -0400 (0:00:00.019)       0:00:00.019 ******
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Saturday 25 October 2025  07:22:11 -0400 (0:00:02.250)       0:00:02.269 ******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14
Saturday 25 October 2025  07:22:11 -0400 (0:00:00.459)       0:00:02.729 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__aide_is_ostree": false
    },
    "changed": false
}

TASK [Install crontabs] ********************************************************
task path: /tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Saturday 25 October 2025  07:22:11 -0400 (0:00:00.022)       0:00:02.751 ******
fatal: [managed-node1]: FAILED!
=> {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

PLAY RECAP *********************************************************************
managed-node1              : ok=3    changed=0    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-25T11:22:24.332271+00:00Z",
        "host": "managed-node1",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-10-25T11:22:11.991870+00:00Z",
        "task_name": "Install crontabs",
        "task_path": "/tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Saturday 25 October 2025  07:22:24 -0400 (0:00:12.342)       0:00:15.093 ******
===============================================================================
Install crontabs ------------------------------------------------------- 12.34s
/tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Gathering Facts --------------------------------------------------------- 2.25s
/tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Check if system is ostree ----------------------------------------------- 0.46s
/tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Set flag to indicate system is ostree ----------------------------------- 0.02s
/tmp/collections-BsF/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14

Oct 25 07:22:08 managed-node1 sshd-session[7356]: Accepted publickey for root from 10.31.45.108 port 39834 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 07:22:08 managed-node1 systemd-logind[607]: New session 5 of user root.
░░ Subject: A new session 5 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 5 has been created for the user root.
░░
░░ The leading process of the session is 7356.
Oct 25 07:22:08 managed-node1 systemd[1]: Started Session 5 of User root.
░░ Subject: A start job for unit session-5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-5.scope has finished successfully.
░░
░░ The job identifier is 900.
Oct 25 07:22:08 managed-node1 sshd-session[7356]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 07:22:08 managed-node1 sshd-session[7359]: Received disconnect from 10.31.45.108 port 39834:11: disconnected by user
Oct 25 07:22:08 managed-node1 sshd-session[7359]: Disconnected from user root 10.31.45.108 port 39834
Oct 25 07:22:08 managed-node1 sshd-session[7356]: pam_unix(sshd:session): session closed for user root
Oct 25 07:22:08 managed-node1 systemd[1]: session-5.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-5.scope has successfully entered the 'dead' state.
Oct 25 07:22:08 managed-node1 systemd-logind[607]: Session 5 logged out. Waiting for processes to exit.
Oct 25 07:22:08 managed-node1 systemd-logind[607]: Removed session 5.
░░ Subject: Session 5 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 5 has been terminated.
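The `SYSTEM ROLES ERRORS BEGIN v1` / `SYSTEM ROLES ERRORS END v1` markers in the play output above delimit a machine-readable JSON error summary. As a minimal sketch of how a CI consumer might pull that block out of captured output (the marker strings come from this log; the extraction helper itself is an assumption, not part of the official tooling):

```python
import json

# Marker strings exactly as they appear in the log output above.
BEGIN = "SYSTEM ROLES ERRORS BEGIN v1"
END = "SYSTEM ROLES ERRORS END v1"


def extract_role_errors(log_text: str) -> list:
    """Return the JSON error list found between the markers, or [] if absent."""
    start = log_text.find(BEGIN)
    end = log_text.find(END)
    if start == -1 or end == -1 or end <= start:
        return []
    payload = log_text[start + len(BEGIN):end]
    return json.loads(payload)


# Hypothetical sample mirroring the structure of this run's error summary.
sample = f"""
PLAY RECAP ...
{BEGIN}
[{{"host": "managed-node1", "rc": 1, "task_name": "Install crontabs"}}]
{END}
"""
errors = extract_role_errors(sample)
print(errors[0]["task_name"])  # → Install crontabs
```

A real consumer would feed it the full captured stdout of the `ansible-playbook` run rather than this embedded sample.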
Oct 25 07:22:08 managed-node1 sshd-session[7383]: Accepted publickey for root from 10.31.45.108 port 39838 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 07:22:08 managed-node1 systemd-logind[607]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 6 has been created for the user root.
░░
░░ The leading process of the session is 7383.
Oct 25 07:22:08 managed-node1 systemd[1]: Started Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-6.scope has finished successfully.
░░
░░ The job identifier is 969.
Oct 25 07:22:08 managed-node1 sshd-session[7383]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 07:22:08 managed-node1 sshd-session[7386]: Received disconnect from 10.31.45.108 port 39838:11: disconnected by user
Oct 25 07:22:08 managed-node1 sshd-session[7386]: Disconnected from user root 10.31.45.108 port 39838
Oct 25 07:22:08 managed-node1 sshd-session[7383]: pam_unix(sshd:session): session closed for user root
Oct 25 07:22:08 managed-node1 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-6.scope has successfully entered the 'dead' state.
Oct 25 07:22:08 managed-node1 systemd-logind[607]: Session 6 logged out. Waiting for processes to exit.
Oct 25 07:22:08 managed-node1 systemd-logind[607]: Removed session 6.
░░ Subject: Session 6 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 6 has been terminated.
Oct 25 07:22:09 managed-node1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit systemd-hostnamed.service has successfully entered the 'dead' state.
Oct 25 07:22:09 managed-node1 sshd-session[7414]: Accepted publickey for root from 10.31.45.108 port 39842 ssh2: ECDSA SHA256:u+sTNkN0tbmV24WDqK9xAcjRFAPOwy223tH2A/R0vIo
Oct 25 07:22:09 managed-node1 systemd-logind[607]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 7 has been created for the user root.
░░
░░ The leading process of the session is 7414.
Oct 25 07:22:09 managed-node1 systemd[1]: Started Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-7.scope has finished successfully.
░░
░░ The job identifier is 1038.
Oct 25 07:22:09 managed-node1 sshd-session[7414]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 07:22:11 managed-node1 python3.9[7591]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 25 07:22:11 managed-node1 python3.9[7766]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 25 07:22:12 managed-node1 python3.9[7915]: ansible-ansible.legacy.dnf Invoked with name=['crontabs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 25 07:22:24 managed-node1 sshd-session[7997]: Accepted publickey for root from 10.31.45.108 port 47076 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 07:22:24 managed-node1 systemd-logind[607]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 8 has been created for the user root.
░░
░░ The leading process of the session is 7997.
Oct 25 07:22:24 managed-node1 systemd[1]: Started Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-8.scope has finished successfully.
░░
░░ The job identifier is 1108.
Oct 25 07:22:24 managed-node1 sshd-session[7997]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 25 07:22:24 managed-node1 sshd-session[8000]: Received disconnect from 10.31.45.108 port 47076:11: disconnected by user
Oct 25 07:22:24 managed-node1 sshd-session[8000]: Disconnected from user root 10.31.45.108 port 47076
Oct 25 07:22:24 managed-node1 sshd-session[7997]: pam_unix(sshd:session): session closed for user root
Oct 25 07:22:24 managed-node1 systemd[1]: session-8.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-8.scope has successfully entered the 'dead' state.
Oct 25 07:22:24 managed-node1 systemd-logind[607]: Session 8 logged out. Waiting for processes to exit.
Oct 25 07:22:24 managed-node1 systemd-logind[607]: Removed session 8.
░░ Subject: Session 8 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 8 has been terminated.
Oct 25 07:22:24 managed-node1 sshd-session[8025]: Accepted publickey for root from 10.31.45.108 port 47092 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 25 07:22:24 managed-node1 systemd-logind[607]: New session 9 of user root.
░░ Subject: A new session 9 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 9 has been created for the user root.
░░
░░ The leading process of the session is 8025.
Oct 25 07:22:24 managed-node1 systemd[1]: Started Session 9 of User root.
░░ Subject: A start job for unit session-9.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-9.scope has finished successfully.
░░
░░ The job identifier is 1177.
Oct 25 07:22:24 managed-node1 sshd-session[8025]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
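Diagnostic note: the single failure in this run is infrastructure, not test logic. The dnf module invocation shown in the journal could not fetch metadata for the 'highavailability' repository ("Cannot download repodata/repomd.xml: All mirrors were tried"), so `Install crontabs` failed before any AIDE/cron assertions ran. If that repository is not actually required by this test environment (an assumption about the environment, not something this log establishes), one common mitigation is to mark the repo skippable in its repo file, for example:

```ini
# Hypothetical repo file under /etc/yum.repos.d/ (the real file name and
# section depend on the distribution); skip_if_unavailable is a standard
# dnf per-repo option that lets transactions proceed when metadata
# for this repo cannot be downloaded.
[highavailability]
skip_if_unavailable=1
```

Alternatively, the mirror outage may be transient, in which case rerunning the test is enough.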