ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/tests/logging/tests_default.yml

PLAY [Ensure that the role runs with default parameters] ***********************
META: ran handlers

TASK [Default run (NOOP)] ******************************************************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/tests/logging/tests_default.yml:9
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.035) 0:00:00.035 *******

TASK [fedora.linux_system_roles.logging : Set global variables] ****************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:2
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.077) 0:00:00.113 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.logging : Run systemctl] ***********************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml:7
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.029) 0:00:00.142 *******
ok: [managed-node1] => {"ansible_facts": {"discovered_interpreter_python": "/usr/libexec/platform-python"}, "changed": false, "cmd": ["systemctl", "is-system-running"], "delta": "0:00:00.008382", "end": "2026-01-06 18:42:07.749966", "failed_when_result": false, "rc": 0, "start": "2026-01-06 18:42:07.741584"}

STDOUT:

running

TASK [fedora.linux_system_roles.logging : Require installed systemd] ***********
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml:14
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.594) 0:00:00.736 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Set flag to indicate that systemd runtime operations are available] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml:19
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.047) 0:00:00.783 *******
ok: [managed-node1] => {"ansible_facts": {"__logging_is_booted": true}, "changed": false}

TASK [fedora.linux_system_roles.logging : Set files output if files output is not defined and logging_inputs is not empty] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:10
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.063) 0:00:00.847 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Set rsyslog_outputs] *****************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:17
Tuesday 06 January 2026 18:42:07 -0500 (0:00:00.060) 0:00:00.907 *******
ok: [managed-node1] => {"ansible_facts": {"rsyslog_outputs": []}, "changed": false}

TASK [fedora.linux_system_roles.logging : Set rsyslog_inputs] ******************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:21
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.047) 0:00:00.955 *******
ok: [managed-node1] => {"ansible_facts": {"rsyslog_inputs": []}, "changed": false}

TASK [fedora.linux_system_roles.logging : Use of rsyslog_custom_config_files is deprecated] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:25
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.079) 0:00:01.035 *******
skipping: [managed-node1] => {}

TASK [fedora.linux_system_roles.logging : Use of type custom is deprecated] ****
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:32
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.052) 0:00:01.088 *******
skipping: [managed-node1] => {}

TASK [fedora.linux_system_roles.logging : Check logging_inputs item in logging_flows.inputs] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:45
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.045) 0:00:01.133 *******

TASK [fedora.linux_system_roles.logging : Gather ports specified in the logging_inputs and outputs vars] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:55
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.047) 0:00:01.181 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml for managed-node1

TASK [fedora.linux_system_roles.logging : Initialize ports variables] **********
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:3
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.049) 0:00:01.230 *******
ok: [managed-node1] => {"ansible_facts": {"logging_tcp_ports": [], "logging_tls_tcp_ports": [], "logging_tls_udp_ports": [], "logging_udp_ports": []}, "changed": false}

TASK [fedora.linux_system_roles.logging : Parameter 'port' values] *************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:15
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.076) 0:00:01.307 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Update port values from outputs] *****
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:35
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.057) 0:00:01.364 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Update port values from inputs] ******
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:55
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.055) 0:00:01.419 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Manage firewall on the gathered ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:58
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.058) 0:00:01.478 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml for managed-node1

TASK [fedora.linux_system_roles.logging : Initialize logging_firewall_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:7
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.077) 0:00:01.556 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Add tcp ports to logging_firewall_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:11
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.062) 0:00:01.618 *******

TASK [fedora.linux_system_roles.logging : Add udp ports to logging_firewall_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:17
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.051) 0:00:01.670 *******

TASK [Manage firewall for specified ports] *************************************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:23
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.054) 0:00:01.724 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Manage selinux on the gathered ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:61
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.086) 0:00:01.811 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml for managed-node1

TASK [fedora.linux_system_roles.logging : Initialize logging_selinux_ports] ****
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:7
Tuesday 06 January 2026 18:42:08 -0500 (0:00:00.082) 0:00:01.893 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Add non tls tcp ports to logging_selinux_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:11
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.063) 0:00:01.957 *******

TASK [fedora.linux_system_roles.logging : Add tls tcp ports to logging_selinux_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:16
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.051) 0:00:02.009 *******

TASK [fedora.linux_system_roles.logging : Add non tls udp ports to logging_selinux_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:21
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.049) 0:00:02.058 *******

TASK [fedora.linux_system_roles.logging : Add tls udp ports to logging_selinux_ports] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:26
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.043) 0:00:02.101 *******

TASK [Manage selinux for specified ports] **************************************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:31
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.037) 0:00:02.139 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Generate certificates] ***************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:65
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.055) 0:00:02.195 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/certificate.yml for managed-node1

TASK [fedora.linux_system_roles.logging : Certificates are only supported in a booted system] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/certificate.yml:2
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.082) 0:00:02.278 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Generate certificates] ***************************************************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/certificate.yml:9
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.053) 0:00:02.331 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Re-read facts after adding custom fact] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:71
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.057) 0:00:02.388 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Create rsyslog debug dir] ************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:75
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.057) 0:00:02.445 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Delete debug file] *******************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:81
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.046) 0:00:02.492 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Create rsyslog debug file] ***********
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:86
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.048) 0:00:02.541 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Use a debug var to avoid an empty dict in with_dict] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:93
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.054) 0:00:02.595 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [fedora.linux_system_roles.logging : Populate rsyslog debug file] *********
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:97
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.075) 0:00:02.671 *******
skipping: [managed-node1] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Include Rsyslog role] ****************************************************
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:108
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.089) 0:00:02.761 *******

TASK [fedora.linux_system_roles.private_logging_subrole_rsyslog : Set platform/version specific variables] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/private_logging_subrole_rsyslog/tasks/main.yml:4
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.046) 0:00:02.807 *******
included: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/private_logging_subrole_rsyslog/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.private_logging_subrole_rsyslog : Ensure ansible_facts used by role] ***
task path: /tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/private_logging_subrole_rsyslog/tasks/set_vars.yml:4
Tuesday 06 January 2026 18:42:09 -0500 (0:00:00.019) 0:00:02.827 *******
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: Bad subset 'domain' given to Ansible. gather_subset options allowed: all, all_ipv4_addresses, all_ipv6_addresses, apparmor, architecture, caps, chroot, cmdline, date_time, default_ipv4, default_ipv6, devices, distribution, distribution_major_version, distribution_release, distribution_version, dns, effective_group_ids, effective_user_id, env, facter, fibre_channel_wwn, fips, hardware, interfaces, is_chroot, iscsi, kernel, kernel_version, local, lsb, machine, machine_id, mounts, network, nvme, ohai, os_family, pkg_mgr, platform, processor, processor_cores, processor_count, python, python_version, real_user_id, selinux, service_mgr, ssh_host_key_dsa_public, ssh_host_key_ecdsa_public, ssh_host_key_ed25519_public, ssh_host_key_rsa_public, ssh_host_pub_keys, ssh_pub_keys, system, system_capabilities, system_capabilities_enforced, user, user_dir, user_gecos, user_gid, user_id, user_shell, user_uid, virtual, virtualization_role, virtualization_type
fatal: [managed-node1]: FAILED! => {"changed": false, "rc": 1}

MSG:

MODULE FAILURE
See stdout/stderr for the exact error

MODULE_STDOUT:

Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1767742929.9572241-21534-75946433917045/AnsiballZ_setup.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1767742929.9572241-21534-75946433917045/AnsiballZ_setup.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1767742929.9572241-21534-75946433917045/AnsiballZ_setup.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.system.setup', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.6/runpy.py", line 205, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.6/runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "/usr/lib64/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmp/ansible_setup_payload_xqhsu12m/ansible_setup_payload.zip/ansible/modules/system/setup.py", line 186, in <module>
  File "/tmp/ansible_setup_payload_xqhsu12m/ansible_setup_payload.zip/ansible/modules/system/setup.py", line 178, in main
  File "/tmp/ansible_setup_payload_xqhsu12m/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py", line 124, in get_ansible_collector
  File "/tmp/ansible_setup_payload_xqhsu12m/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py", line 388, in collector_classes_from_gather_subset
  File "/tmp/ansible_setup_payload_xqhsu12m/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py", line 186, in get_collector_names
TypeError: Bad subset 'domain' given to Ansible. gather_subset options allowed: all, all_ipv4_addresses, all_ipv6_addresses, apparmor, architecture, caps, chroot, cmdline, date_time, default_ipv4, default_ipv6, devices, distribution, distribution_major_version, distribution_release, distribution_version, dns, effective_group_ids, effective_user_id, env, facter, fibre_channel_wwn, fips, hardware, interfaces, is_chroot, iscsi, kernel, kernel_version, local, lsb, machine, machine_id, mounts, network, nvme, ohai, os_family, pkg_mgr, platform, processor, processor_cores, processor_count, python, python_version, real_user_id, selinux, service_mgr, ssh_host_key_dsa_public, ssh_host_key_ecdsa_public, ssh_host_key_ed25519_public, ssh_host_key_rsa_public, ssh_host_pub_keys, ssh_pub_keys, system, system_capabilities, system_capabilities_enforced, user, user_dir, user_gecos, user_gid, user_id, user_shell, user_uid, virtual, virtualization_role, virtualization_type

MODULE_STDERR:

Shared connection to 10.31.9.129 closed.
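Editor's note: the failure above is caused by the gather_subset values passed to the setup module by the "Ensure ansible_facts used by role" task. The journal entry at 18:42:10 further down records the exact invocation: gather_subset=['!all', '!min', 'default_ipv4', 'distribution', 'distribution_major_version', 'distribution_version', 'domain', 'fqdn', 'hostname', 'os_family', 'pkg_mgr'], and the Ansible 2.9.27 setup module rejects 'domain' (and would likewise reject 'fqdn' and 'hostname') because those names are not in its allowed-subset list quoted in the traceback. The sketch below is illustrative only and is not taken from the role's source; it reproduces the failing call and shows one possible workaround (falling back to full fact gathering, which is valid on 2.9).

---
# Minimal reproduction sketch (assumption: run against the same inventory host).
# The first task mirrors the module invocation recorded in the journal and fails
# on Ansible 2.9 with "Bad subset 'domain'"; the second is one workaround.
- name: Reproduce and work around the gather_subset failure
  hosts: managed-node1
  gather_facts: false
  tasks:
    - name: Same subsets the role requested (fails on Ansible 2.9)
      setup:
        gather_subset:
          - "!all"
          - "!min"
          - default_ipv4
          - distribution
          - distribution_major_version
          - distribution_version
          - domain      # rejected by the 2.9 setup module
          - fqdn        # rejected by the 2.9 setup module
          - hostname    # rejected by the 2.9 setup module
          - os_family
          - pkg_mgr
      ignore_errors: true

    - name: Workaround sketch - gather everything instead of per-fact subsets
      setup:
        gather_subset:
          - all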
PLAY RECAP *********************************************************************
managed-node1 : ok=11 changed=0 unreachable=0 failed=1 skipped=26 rescued=0 ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  {
    "ansible_version": "2.9.27",
    "end_time": "2026-01-06T23:42:10.629348+00:00Z",
    "host": "managed-node1",
    "message": "MODULE FAILURE\nSee stdout/stderr for the exact error",
    "rc": 1,
    "start_time": "2026-01-06T23:42:09.892058+00:00Z",
    "task_name": "Ensure ansible_facts used by role",
    "task_path": "/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/private_logging_subrole_rsyslog/tasks/set_vars.yml:4"
  }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Tuesday 06 January 2026 18:42:10 -0500 (0:00:00.740) 0:00:03.567 *******
===============================================================================
fedora.linux_system_roles.private_logging_subrole_rsyslog : Ensure ansible_facts used by role --- 0.74s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/private_logging_subrole_rsyslog/tasks/set_vars.yml:4
fedora.linux_system_roles.logging : Run systemctl ----------------------- 0.59s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml:7
fedora.linux_system_roles.logging : Populate rsyslog debug file --------- 0.09s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:97
Manage firewall for specified ports ------------------------------------- 0.09s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:23
fedora.linux_system_roles.logging : Manage selinux on the gathered ports --- 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:61
fedora.linux_system_roles.logging : Generate certificates --------------- 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:65
fedora.linux_system_roles.logging : Set rsyslog_inputs ------------------ 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:21
Default run (NOOP) ------------------------------------------------------ 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/tests/logging/tests_default.yml:9
fedora.linux_system_roles.logging : Manage firewall on the gathered ports --- 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:58
fedora.linux_system_roles.logging : Initialize ports variables ---------- 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:3
fedora.linux_system_roles.logging : Use a debug var to avoid an empty dict in with_dict --- 0.08s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:93
fedora.linux_system_roles.logging : Set flag to indicate that systemd runtime operations are available --- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/set_vars.yml:19
fedora.linux_system_roles.logging : Initialize logging_selinux_ports ---- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:7
fedora.linux_system_roles.logging : Initialize logging_firewall_ports --- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/firewall.yml:7
fedora.linux_system_roles.logging : Set files output if files output is not defined and logging_inputs is not empty --- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:10
fedora.linux_system_roles.logging : Update port values from inputs ------ 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:55
fedora.linux_system_roles.logging : Parameter 'port' values ------------- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/gather_ports.yml:15
fedora.linux_system_roles.logging : Re-read facts after adding custom fact --- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/main.yml:71
Generate certificates --------------------------------------------------- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/certificate.yml:9
Manage selinux for specified ports -------------------------------------- 0.06s
/tmp/collections-fJJ/ansible_collections/fedora/linux_system_roles/roles/logging/tasks/selinux.yml:31

-- Logs begin at Tue 2026-01-06 18:33:31 EST, end at Tue 2026-01-06 18:42:11 EST. --
Jan 06 18:42:06 managed-node1 rsyslogd[36945]: parameters for built-in module builtin:omfile already set - ignored [v8.2102.0-15.el8 try https://www.rsyslog.com/e/2220 ]
Jan 06 18:42:06 managed-node1 rsyslogd[36945]: [origin software="rsyslogd" swVersion="8.2102.0-15.el8" x-pid="36945" x-info="https://www.rsyslog.com"] start
Jan 06 18:42:06 managed-node1 systemd[1]: Started System Logging Service.
-- Subject: Unit rsyslog.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit rsyslog.service has finished starting up.
--
-- The start-up result is done.
Jan 06 18:42:06 managed-node1 rsyslogd[36945]: imjournal: journal files changed, reloading... [v8.2102.0-15.el8 try https://www.rsyslog.com/e/0 ]
Jan 06 18:42:06 managed-node1 sshd[36968]: Accepted publickey for root from 10.31.8.132 port 34640 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 06 18:42:06 managed-node1 systemd-logind[602]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 36968.
Jan 06 18:42:06 managed-node1 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Jan 06 18:42:06 managed-node1 sshd[36968]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 06 18:42:06 managed-node1 sshd[36971]: Received disconnect from 10.31.8.132 port 34640:11: disconnected by user
Jan 06 18:42:06 managed-node1 sshd[36971]: Disconnected from user root 10.31.8.132 port 34640
Jan 06 18:42:06 managed-node1 sshd[36968]: pam_unix(sshd:session): session closed for user root
Jan 06 18:42:06 managed-node1 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Jan 06 18:42:06 managed-node1 systemd-logind[602]: Session 9 logged out. Waiting for processes to exit.
Jan 06 18:42:06 managed-node1 systemd-logind[602]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Jan 06 18:42:07 managed-node1 platform-python[37134]: ansible-command Invoked with _raw_params=systemctl is-system-running warn=True _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 06 18:42:10 managed-node1 platform-python[37258]: ansible-setup Invoked with gather_subset=['!all', '!min', 'default_ipv4', 'distribution', 'distribution_major_version', 'distribution_version', 'domain', 'fqdn', 'hostname', 'os_family', 'pkg_mgr'] gather_timeout=10 filter=* fact_path=/etc/ansible/facts.d
Jan 06 18:42:10 managed-node1 sshd[37281]: Accepted publickey for root from 10.31.8.132 port 53396 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 06 18:42:10 managed-node1 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Jan 06 18:42:10 managed-node1 systemd-logind[602]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 37281.
Jan 06 18:42:10 managed-node1 sshd[37281]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jan 06 18:42:11 managed-node1 sshd[37284]: Received disconnect from 10.31.8.132 port 53396:11: disconnected by user
Jan 06 18:42:11 managed-node1 sshd[37284]: Disconnected from user root 10.31.8.132 port 53396
Jan 06 18:42:11 managed-node1 sshd[37281]: pam_unix(sshd:session): session closed for user root
Jan 06 18:42:11 managed-node1 systemd[1]: session-10.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-10.scope has successfully entered the 'dead' state.
Jan 06 18:42:11 managed-node1 systemd-logind[602]: Session 10 logged out. Waiting for processes to exit.
Jan 06 18:42:11 managed-node1 systemd-logind[602]: Removed session 10.
-- Subject: Session 10 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 10 has been terminated.
Jan 06 18:42:11 managed-node1 sshd[37305]: Accepted publickey for root from 10.31.8.132 port 53412 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 06 18:42:11 managed-node1 systemd[1]: Started Session 11 of user root.
-- Subject: Unit session-11.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-11.scope has finished starting up.
--
-- The start-up result is done.
Jan 06 18:42:11 managed-node1 systemd-logind[602]: New session 11 of user root.
-- Subject: A new session 11 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 11 has been created for the user root.
--
-- The leading process of the session is 37305.
Jan 06 18:42:11 managed-node1 sshd[37305]: pam_unix(sshd:session): session opened for user root by (uid=0)
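
Editor's note: the journal above confirms that only the fact-gathering module run failed; the managed node stayed reachable, with SSH sessions 10 and 11 opening and closing normally around the failing ansible-setup call. On mixed controller versions, one defensive pattern is to try a narrow gather_subset first and fall back to full gathering if the target's setup module rejects a subset name. The sketch below illustrates that pattern under that assumption; it is not the logging role's actual implementation, and the variable name __required_facts_subsets is hypothetical.

---
# Hypothetical fallback pattern: try the narrow subset, rescue with full
# fact gathering when an older setup module rejects a requested name.
- name: Gather required facts with a fallback
  hosts: managed-node1
  gather_facts: false
  vars:
    __required_facts_subsets:   # hypothetical variable; loosely mirrors the invocation logged above
      - "!all"
      - "!min"
      - distribution
      - distribution_major_version
      - distribution_version
      - os_family
      - pkg_mgr
  tasks:
    - name: Ensure facts used later are present
      block:
        - name: Try the narrow subset first
          setup:
            gather_subset: "{{ __required_facts_subsets }}"
      rescue:
        - name: Fall back to full fact gathering on older Ansible
          setup:
            gather_subset:
              - all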