ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-3bU
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.12 (main, Jan 8 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_check_cron.yml *************************************************
1 plays in /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml

PLAY [Ensure that the cron is set up] ******************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Tuesday 13 January 2026  18:16:23 -0500 (0:00:00.026)       0:00:00.026 *******
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Tuesday 13 January 2026  18:16:26 -0500 (0:00:02.650)       0:00:02.676 *******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14
Tuesday 13 January 2026  18:16:26 -0500 (0:00:00.558)       0:00:03.235 *******
ok: [managed-node1] => {
    "ansible_facts": {
        "__aide_is_ostree": false
    },
    "changed": false
}

TASK [Install crontabs] ********************************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Tuesday 13 January 2026  18:16:26 -0500 (0:00:00.031)       0:00:03.267 *******
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [Create tempfile for crontab backup] **************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:24
Tuesday 13 January 2026  18:16:38 -0500 (0:00:11.226)       0:00:14.494 *******
changed: [managed-node1] => {
    "changed": true,
    "gid": 0,
    "group": "root",
    "mode": "0600",
    "owner": "root",
    "path": "/tmp/aide_pju4rrq2_crontab",
    "secontext": "unconfined_u:object_r:user_tmp_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}

TASK [Backup crontab] **********************************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:30
Tuesday 13 January 2026  18:16:38 -0500 (0:00:00.502)       0:00:14.996 *******
changed: [managed-node1] => {
    "changed": true,
    "checksum": "09767e814c22a93daba80e274dcbe00e0e7a8b99",
    "dest": "/tmp/aide_pju4rrq2_crontab",
    "gid": 0,
    "group": "root",
    "md5sum": "c39252b11aad842fcb75e05c6a27eef8",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:system_cron_spool_t:s0",
    "size": 451,
    "src": "/etc/crontab",
    "state": "file",
    "uid": 0
}

TASK [Run the role and set up cron] ********************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:39
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.524)       0:00:15.520 *******
included: fedora.linux_system_roles.aide for managed-node1

TASK [fedora.linux_system_roles.aide : Set platform/version specific variables] ***
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:3
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.056)       0:00:15.577 *******
included: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.aide : Ensure ansible_facts used by role] ******
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:2
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.029)       0:00:15.606 *******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__aide_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Check if system is ostree] **************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:10
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.060)       0:00:15.666 *******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __aide_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:15
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.024)       0:00:15.691 *******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __aide_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.aide : Set platform/version specific variables] ***
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:19
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.026)       0:00:15.717 *******
skipping: [managed-node1] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}

MSG:

All items skipped

TASK [fedora.linux_system_roles.aide : Debug0] *********************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.051)       0:00:15.769 *******
fatal: [managed-node1]: FAILED! => {}

MSG:

The task includes an option with an undefined variable.. 'ansible_distribution' is undefined

The error appears to be in '/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml': line 6, column 3, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

- name: Debug0
  ^ here

TASK [Restore crontab] *********************************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:87
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.021)       0:00:15.790 *******
ok: [managed-node1] => {
    "changed": false,
    "checksum": "09767e814c22a93daba80e274dcbe00e0e7a8b99",
    "dest": "/etc/crontab",
    "gid": 0,
    "group": "root",
    "md5sum": "c39252b11aad842fcb75e05c6a27eef8",
    "mode": "0644",
    "owner": "root",
    "secontext": "system_u:object_r:system_cron_spool_t:s0",
    "size": 451,
    "src": "/tmp/aide_pju4rrq2_crontab",
    "state": "file",
    "uid": 0
}

TASK [Delete tempfile] *********************************************************
task path: /tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:94
Tuesday 13 January 2026  18:16:39 -0500 (0:00:00.381)       0:00:16.172 *******
changed: [managed-node1] => {
    "changed": true,
    "path": "/tmp/aide_pju4rrq2_crontab",
    "state": "absent"
}

PLAY RECAP *********************************************************************
managed-node1              : ok=10   changed=3    unreachable=0    failed=1    skipped=4    rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  {
    "ansible_version": "2.17.14",
    "end_time": "2026-01-13T23:16:39.298453+00:00Z",
    "host": "managed-node1",
    "message": "The task includes an option with an undefined variable.. 'ansible_distribution' is undefined\n\nThe error appears to be in '/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml': line 6, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Debug0\n  ^ here\n",
    "start_time": "2026-01-13T23:16:39.283961+00:00Z",
    "task_name": "Debug0",
    "task_path": "/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6"
  }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Tuesday 13 January 2026  18:16:40 -0500 (0:00:00.514)       0:00:16.686 *******
===============================================================================
Install crontabs ------------------------------------------------------- 11.23s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:18
Gathering Facts --------------------------------------------------------- 2.65s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:3
Check if system is ostree ----------------------------------------------- 0.56s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:9
Backup crontab ---------------------------------------------------------- 0.52s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:30
Delete tempfile --------------------------------------------------------- 0.51s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:94
Create tempfile for crontab backup -------------------------------------- 0.50s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:24
Restore crontab --------------------------------------------------------- 0.38s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:87
fedora.linux_system_roles.aide : Ensure ansible_facts used by role ------ 0.06s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:2
Run the role and set up cron -------------------------------------------- 0.06s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:39
fedora.linux_system_roles.aide : Set platform/version specific variables --- 0.05s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:19
Set flag to indicate system is ostree ----------------------------------- 0.03s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/tests/aide/tests_check_cron.yml:14
fedora.linux_system_roles.aide : Set platform/version specific variables --- 0.03s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:3
fedora.linux_system_roles.aide : Set flag to indicate system is ostree --- 0.03s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:15
fedora.linux_system_roles.aide : Check if system is ostree -------------- 0.02s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/set_vars.yml:10
fedora.linux_system_roles.aide : Debug0 --------------------------------- 0.02s
/tmp/collections-3bU/ansible_collections/fedora/linux_system_roles/roles/aide/tasks/main.yml:6
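The fatal task above fails because 'ansible_distribution' is undefined at the moment Debug0 renders one of its options, even though the play gathered facts at the start. As a minimal sketch of one way to make such a task tolerant of a missing fact (the actual body of roles/aide/tasks/main.yml:6 is not shown in this log, so the task content below is assumed for illustration), the reference could fall back to a default, or the needed fact could be gathered explicitly first:

    # Hypothetical rewrite of the failing task -- not the role's actual code.
    # default() keeps the template renderable when the fact is absent.
    - name: Debug0
      ansible.builtin.debug:
        msg: "Distribution: {{ ansible_distribution | default('unknown') }}"

    # Alternatively (also a sketch), gather just the distribution facts
    # before any task that dereferences ansible_distribution:
    - name: Gather distribution facts
      ansible.builtin.setup:
        gather_subset:
          - distribution

Either variant would let the play proceed past main.yml:6; which one is correct depends on why the fact was missing in the first place, which this log alone does not reveal.

The journal from managed-node1 for the run follows.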
Jan 13 18:16:22 managed-node1 sshd-session[7130]: Accepted publickey for root from 10.31.11.249 port 49256 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:16:22 managed-node1 systemd-logind[592]: New session 5 of user root.
░░ Subject: A new session 5 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 5 has been created for the user root.
░░
░░ The leading process of the session is 7130.
Jan 13 18:16:22 managed-node1 systemd[1]: Started Session 5 of User root.
░░ Subject: A start job for unit session-5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-5.scope has finished successfully.
░░
░░ The job identifier is 907.
Jan 13 18:16:22 managed-node1 sshd-session[7130]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 13 18:16:22 managed-node1 sshd-session[7133]: Received disconnect from 10.31.11.249 port 49256:11: disconnected by user
Jan 13 18:16:22 managed-node1 sshd-session[7133]: Disconnected from user root 10.31.11.249 port 49256
Jan 13 18:16:22 managed-node1 sshd-session[7130]: pam_unix(sshd:session): session closed for user root
Jan 13 18:16:22 managed-node1 systemd[1]: session-5.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-5.scope has successfully entered the 'dead' state.
Jan 13 18:16:22 managed-node1 systemd-logind[592]: Session 5 logged out. Waiting for processes to exit.
Jan 13 18:16:22 managed-node1 systemd-logind[592]: Removed session 5.
░░ Subject: Session 5 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 5 has been terminated.
Jan 13 18:16:22 managed-node1 sshd-session[7157]: Accepted publickey for root from 10.31.11.249 port 49264 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:16:22 managed-node1 systemd-logind[592]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 6 has been created for the user root.
░░
░░ The leading process of the session is 7157.
Jan 13 18:16:22 managed-node1 systemd[1]: Started Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-6.scope has finished successfully.
░░
░░ The job identifier is 976.
Jan 13 18:16:22 managed-node1 sshd-session[7157]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 13 18:16:22 managed-node1 sshd-session[7160]: Received disconnect from 10.31.11.249 port 49264:11: disconnected by user
Jan 13 18:16:22 managed-node1 sshd-session[7160]: Disconnected from user root 10.31.11.249 port 49264
Jan 13 18:16:22 managed-node1 sshd-session[7157]: pam_unix(sshd:session): session closed for user root
Jan 13 18:16:22 managed-node1 systemd-logind[592]: Session 6 logged out. Waiting for processes to exit.
Jan 13 18:16:22 managed-node1 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-6.scope has successfully entered the 'dead' state.
Jan 13 18:16:22 managed-node1 systemd-logind[592]: Removed session 6.
░░ Subject: Session 6 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 6 has been terminated.
Jan 13 18:16:23 managed-node1 sshd-session[7188]: Accepted publickey for root from 10.31.11.249 port 49274 ssh2: ECDSA SHA256:WpDW4nzJZkd96L5Z3w4nPuZVFab1oYYoU9eOWr8eouU
Jan 13 18:16:23 managed-node1 systemd-logind[592]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 7 has been created for the user root.
░░
░░ The leading process of the session is 7188.
Jan 13 18:16:23 managed-node1 systemd[1]: Started Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-7.scope has finished successfully.
░░
░░ The job identifier is 1045.
Jan 13 18:16:23 managed-node1 sshd-session[7188]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 13 18:16:25 managed-node1 python3.9[7365]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 13 18:16:26 managed-node1 python3.9[7540]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 13 18:16:27 managed-node1 python3.9[7689]: ansible-ansible.legacy.dnf Invoked with name=['crontabs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 13 18:16:38 managed-node1 python3.9[7873]: ansible-tempfile Invoked with prefix=aide_ suffix=_crontab state=file path=None
Jan 13 18:16:38 managed-node1 python3.9[8022]: ansible-ansible.legacy.copy Invoked with src=/etc/crontab dest=/tmp/aide_pju4rrq2_crontab remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 13 18:16:39 managed-node1 python3.9[8171]: ansible-ansible.legacy.copy Invoked with src=/tmp/aide_pju4rrq2_crontab dest=/etc/crontab remote_src=True mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 13 18:16:40 managed-node1 python3.9[8320]: ansible-file Invoked with path=/tmp/aide_pju4rrq2_crontab state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 13 18:16:40 managed-node1 sshd-session[8345]: Accepted publickey for root from 10.31.11.249 port 46384 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:16:40 managed-node1 systemd-logind[592]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 8 has been created for the user root.
░░
░░ The leading process of the session is 8345.
Jan 13 18:16:40 managed-node1 systemd[1]: Started Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-8.scope has finished successfully.
░░
░░ The job identifier is 1115.
Jan 13 18:16:40 managed-node1 sshd-session[8345]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 13 18:16:40 managed-node1 sshd-session[8348]: Received disconnect from 10.31.11.249 port 46384:11: disconnected by user
Jan 13 18:16:40 managed-node1 sshd-session[8348]: Disconnected from user root 10.31.11.249 port 46384
Jan 13 18:16:40 managed-node1 sshd-session[8345]: pam_unix(sshd:session): session closed for user root
Jan 13 18:16:40 managed-node1 systemd[1]: session-8.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-8.scope has successfully entered the 'dead' state.
Jan 13 18:16:40 managed-node1 systemd-logind[592]: Session 8 logged out. Waiting for processes to exit.
Jan 13 18:16:40 managed-node1 systemd-logind[592]: Removed session 8.
░░ Subject: Session 8 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 8 has been terminated.
Jan 13 18:16:40 managed-node1 sshd-session[8373]: Accepted publickey for root from 10.31.11.249 port 46388 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 13 18:16:40 managed-node1 systemd-logind[592]: New session 9 of user root.
░░ Subject: A new session 9 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 9 has been created for the user root.
░░
░░ The leading process of the session is 8373.
Jan 13 18:16:40 managed-node1 systemd[1]: Started Session 9 of User root.
░░ Subject: A start job for unit session-9.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-9.scope has finished successfully.
░░
░░ The job identifier is 1184.
Jan 13 18:16:40 managed-node1 sshd-session[8373]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
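One further detail from the playbook output above: the role's "Ensure ansible_facts used by role" task was skipped because its condition `__aide_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated false, meaning every required fact key already appeared present at that point. Only the condition is visible in this log, so the task body below is assumed; it is a sketch of the guard pattern that condition implies, gathering facts only when some required fact is missing:

    # Sketch of the fact-gathering guard suggested by the skip condition above
    # (gather_subset value assumed; the role's actual task is not shown here).
    - name: Ensure ansible_facts used by role
      ansible.builtin.setup:
        gather_subset: min
      when: __aide_required_facts | difference(ansible_facts.keys() | list) | length > 0

That this guard was skipped while 'ansible_distribution' was still undefined at Debug0 suggests the fact either was not in __aide_required_facts or was lost between the two tasks; confirming which requires the role source, not just this log.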