ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_check_cron.yml *************************************************
1 plays in /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml

PLAY [Ensure that the cron is set up] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Tuesday 13 January 2026 18:32:46 -0500 (0:00:00.017) 0:00:00.017 *******
ok: [managed-node1]
META: ran handlers

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Tuesday 13 January 2026 18:32:47 -0500 (0:00:00.998) 0:00:01.016 *******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14
Tuesday 13 January 2026 18:32:47 -0500 (0:00:00.437) 0:00:01.453 *******
ok: [managed-node1] => {
    "ansible_facts": {
        "__aide_is_ostree": false
    },
    "changed": false
}

TASK [Install crontabs] ********************************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Tuesday 13 January 2026 18:32:47 -0500 (0:00:00.035) 0:00:01.488 *******
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Create tempfile for crontab backup] **************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:24
Tuesday 13 January 2026 18:33:01 -0500 (0:00:14.232) 0:00:15.721 *******
changed: [managed-node1] => {
    "changed": true,
    "gid": 0,
    "group": "root",
    "mode": "0600",
    "owner": "root",
    "path": "/tmp/aide_tkawgne1_crontab",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}

TASK [Backup crontab] **********************************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:30
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.420) 0:00:16.141 *******
changed: [managed-node1] => {
    "changed": true,
    "checksum": "09767e814c22a93daba80e274dcbe00e0e7a8b99",
    "dest": "/tmp/aide_tkawgne1_crontab",
    "gid": 0,
    "group": "root",
    "md5sum": "c39252b11aad842fcb75e05c6a27eef8",
    "mode": "0644",
    "owner": "root",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 451,
    "src": "/etc/crontab",
    "state": "file",
    "uid": 0
}

TASK [Run the role and set up cron] ********************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:39
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.452) 0:00:16.594 *******

TASK [fedora.linux_system_roles.aide : Set platform/version specific variables] ***
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:3
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.031) 0:00:16.625 *******
included: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.aide : Ensure ansible_facts used by role] ******
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:2
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.019) 0:00:16.645 *******
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Check if system is ostree] **************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:10
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.034) 0:00:16.679 *******
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:15
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.033) 0:00:16.713 *******
skipping: [managed-node1] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Set platform/version specific variables] ***
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:19
Tuesday 13 January 2026 18:33:02 -0500 (0:00:00.037) 0:00:16.751 *******
skipping: [managed-node1] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_8.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_8.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Debug0] *********************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6
Tuesday 13 January 2026 18:33:03 -0500 (0:00:00.051) 0:00:16.802 *******
fatal: [managed-node1]: FAILED! => {}

MSG:

The task includes an option with an undefined variable. The error was: 'ansible_distribution' is undefined

The error appears to be in '/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml': line 6, column 3, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:


- name: Debug0
 ^ here

TASK [Restore crontab] *********************************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:87
Tuesday 13 January 2026 18:33:03 -0500 (0:00:00.031) 0:00:16.834 *******
ok: [managed-node1] => {
    "changed": false,
    "checksum": "09767e814c22a93daba80e274dcbe00e0e7a8b99",
    "dest": "/etc/crontab",
    "gid": 0,
    "group": "root",
    "md5sum": "c39252b11aad842fcb75e05c6a27eef8",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:system_cron_spool_t:s0",
    "size": 451,
    "src": "/tmp/aide_tkawgne1_crontab",
    "state": "file",
    "uid": 0
}

TASK [Delete tempfile] *********************************************************
task path: /tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:94
Tuesday 13 January 2026 18:33:03 -0500 (0:00:00.351) 0:00:17.186 *******
changed: [managed-node1] => {
    "changed": true,
    "path": "/tmp/aide_tkawgne1_crontab",
    "state": "absent"
}

PLAY RECAP *********************************************************************
managed-node1 : ok=9 changed=3 unreachable=0 failed=1 skipped=4 rescued=0 ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.9.27",
        "end_time": "2026-01-13T23:33:03.032628+00:00Z",
        "host": "managed-node1",
        "message": "The task includes an option with an undefined variable. The error was: 'ansible_distribution' is undefined\n\nThe error appears to be in '/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml': line 6, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Debug0\n ^ here\n",
        "start_time": "2026-01-13T23:33:03.003576+00:00Z",
        "task_name": "Debug0",
        "task_path": "/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Tuesday 13 January 2026 18:33:03 -0500 (0:00:00.450) 0:00:17.637 *******
===============================================================================
Install crontabs ------------------------------------------------------- 14.23s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Backup crontab ---------------------------------------------------------- 0.45s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:30
Delete tempfile --------------------------------------------------------- 0.45s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:94
Check if system is ostree ----------------------------------------------- 0.44s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Create tempfile for crontab backup -------------------------------------- 0.42s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:24
Restore crontab --------------------------------------------------------- 0.35s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:87
fedora.linux_system_roles.aide : Set platform/version specific variables --- 0.05s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:19
fedora.linux_system_roles.aide : Set flag to indicate system is ostree --- 0.04s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:15
Set flag to indicate system is ostree ----------------------------------- 0.04s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14
fedora.linux_system_roles.aide : Ensure ansible_facts used by role ------ 0.03s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:2
fedora.linux_system_roles.aide : Check if system is ostree -------------- 0.03s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:10
fedora.linux_system_roles.aide : Debug0 --------------------------------- 0.03s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6
Run the role and set up cron -------------------------------------------- 0.03s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:39
fedora.linux_system_roles.aide : Set platform/version specific variables --- 0.02s
/tmp/collections-Q2o/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:3
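Note on the failure above: facts were gathered at the start of the play, yet the role's Debug0 task still hit an undefined 'ansible_distribution', and every conditional task in set_vars.yml, including "Ensure ansible_facts used by role", was skipped. The two tasks below are a hypothetical sketch, not the role's actual code (main.yml and set_vars.yml are not shown in this log): the first illustrates the kind of reference that produces this error, the second a guard in the spirit of the skipped task's name that re-gathers the minimal fact subset only when the distribution fact is missing.

# Hypothetical stand-in for roles/aide/tasks/main.yml:6; the real task body is not in this log.
- name: Debug0
  debug:
    msg: "{{ ansible_distribution }}"

# Illustrative guard matching the "Ensure ansible_facts used by role" task name;
# it re-runs fact gathering only if the required fact is absent from ansible_facts.
- name: Ensure ansible_facts used by role
  setup:
    gather_subset: min
  when: "'distribution' not in ansible_facts"

With such a guard running, ansible_distribution and the related platform facts used to pick RedHat.yml, CentOS.yml, and CentOS_8.yml would be defined before main.yml:6 executes.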
-- Logs begin at Tue 2026-01-13 18:14:12 EST, end at Tue 2026-01-13 18:33:04 EST. --
Jan 13 18:32:45 managed-node1 sshd[7011]: Accepted publickey for root from 10.31.42.96 port 58112 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:32:45 managed-node1 systemd[1]: Started Session 5 of user root.
-- Subject: Unit session-5.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-5.scope has finished starting up.
--
-- The start-up result is done.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: New session 5 of user root.
-- Subject: A new session 5 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 5 has been created for the user root.
--
-- The leading process of the session is 7011.
Jan 13 18:32:45 managed-node1 sshd[7011]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 13 18:32:45 managed-node1 sshd[7014]: Received disconnect from 10.31.42.96 port 58112:11: disconnected by user
Jan 13 18:32:45 managed-node1 sshd[7014]: Disconnected from user root 10.31.42.96 port 58112
Jan 13 18:32:45 managed-node1 sshd[7011]: pam_unix(sshd:session): session closed for user root
Jan 13 18:32:45 managed-node1 systemd[1]: session-5.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-5.scope has successfully entered the 'dead' state.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: Session 5 logged out. Waiting for processes to exit.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: Removed session 5.
-- Subject: Session 5 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 5 has been terminated.
Jan 13 18:32:45 managed-node1 sshd[7034]: Accepted publickey for root from 10.31.42.96 port 58122 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:32:45 managed-node1 systemd[1]: Started Session 6 of user root.
-- Subject: Unit session-6.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-6.scope has finished starting up.
--
-- The start-up result is done.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: New session 6 of user root.
-- Subject: A new session 6 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 6 has been created for the user root.
--
-- The leading process of the session is 7034.
Jan 13 18:32:45 managed-node1 sshd[7034]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 13 18:32:45 managed-node1 sshd[7037]: Received disconnect from 10.31.42.96 port 58122:11: disconnected by user
Jan 13 18:32:45 managed-node1 sshd[7037]: Disconnected from user root 10.31.42.96 port 58122
Jan 13 18:32:45 managed-node1 sshd[7034]: pam_unix(sshd:session): session closed for user root
Jan 13 18:32:45 managed-node1 systemd[1]: session-6.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-6.scope has successfully entered the 'dead' state.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: Session 6 logged out. Waiting for processes to exit.
Jan 13 18:32:45 managed-node1 systemd-logind[592]: Removed session 6.
-- Subject: Session 6 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 6 has been terminated.
Jan 13 18:32:46 managed-node1 sshd[7059]: Accepted publickey for root from 10.31.42.96 port 58128 ssh2: ECDSA SHA256:aw3EcqUeAtrbvxBCnHFMIOVSKUr5hYSFhJl4vD7IJig
Jan 13 18:32:46 managed-node1 systemd-logind[592]: New session 7 of user root.
-- Subject: A new session 7 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 7 has been created for the user root.
--
-- The leading process of the session is 7059.
Jan 13 18:32:46 managed-node1 systemd[1]: Started Session 7 of user root.
-- Subject: Unit session-7.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-7.scope has finished starting up.
--
-- The start-up result is done.
Jan 13 18:32:46 managed-node1 sshd[7059]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 13 18:32:46 managed-node1 platform-python[7204]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Jan 13 18:32:47 managed-node1 platform-python[7352]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 13 18:32:48 managed-node1 platform-python[7475]: ansible-dnf Invoked with name=['crontabs'] state=present allow_downgrade=False autoremove=False bugfix=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True lock_timeout=30 conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 13 18:33:02 managed-node1 platform-python[7646]: ansible-tempfile Invoked with prefix=aide_ suffix=_crontab state=file path=None
Jan 13 18:33:02 managed-node1 platform-python[7769]: ansible-copy Invoked with src=/etc/crontab dest=/tmp/aide_tkawgne1_crontab remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Jan 13 18:33:03 managed-node1 platform-python[7894]: ansible-copy Invoked with src=/tmp/aide_tkawgne1_crontab dest=/etc/crontab remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None regexp=None delimiter=None
Jan 13 18:33:03 managed-node1 platform-python[8017]: ansible-file Invoked with path=/tmp/aide_tkawgne1_crontab state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None content=NOT_LOGGING_PARAMETER backup=None remote_src=None regexp=None delimiter=None
Jan 13 18:33:03 managed-node1 sshd[8038]: Accepted publickey for root from 10.31.42.96 port 54596 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:33:03 managed-node1 systemd[1]: Started Session 8 of user root.
-- Subject: Unit session-8.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-8.scope has finished starting up.
--
-- The start-up result is done.
Jan 13 18:33:03 managed-node1 systemd-logind[592]: New session 8 of user root.
-- Subject: A new session 8 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 8 has been created for the user root.
--
-- The leading process of the session is 8038.
Jan 13 18:33:03 managed-node1 sshd[8038]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 13 18:33:04 managed-node1 sshd[8041]: Received disconnect from 10.31.42.96 port 54596:11: disconnected by user
Jan 13 18:33:04 managed-node1 sshd[8041]: Disconnected from user root 10.31.42.96 port 54596
Jan 13 18:33:04 managed-node1 sshd[8038]: pam_unix(sshd:session): session closed for user root
Jan 13 18:33:04 managed-node1 systemd[1]: session-8.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-8.scope has successfully entered the 'dead' state.
Jan 13 18:33:04 managed-node1 systemd-logind[592]: Session 8 logged out. Waiting for processes to exit.
Jan 13 18:33:04 managed-node1 systemd-logind[592]: Removed session 8.
-- Subject: Session 8 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 8 has been terminated.
Jan 13 18:33:04 managed-node1 sshd[8062]: Accepted publickey for root from 10.31.42.96 port 54600 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:33:04 managed-node1 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Jan 13 18:33:04 managed-node1 systemd-logind[592]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 8062.
Jan 13 18:33:04 managed-node1 sshd[8062]: pam_unix(sshd:session): session opened for user root by (uid=0)