ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-IiF
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
2 plays in /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/tests/podman/tests_default.yml

PLAY [all] *********************************************************************

TASK [Include vault variables] *************************************************
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/tests/podman/tests_default.yml:5
Saturday 11 October 2025  12:28:42 -0400 (0:00:00.029)       0:00:00.029 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_test_password": {
            "__ansible_vault": "$ANSIBLE_VAULT;1.1;AES256\n35383939616163653333633431363463313831383037386236646138333162396161356130303461\n3932623930643263313563336163316337643562333936360a363538636631313039343233383732\n38666530383538656639363465313230343533386130303833336434303438333161656262346562\n3362626538613031640a663330613638366132356534363534353239616666653466353961323533\n6565\n"
        },
        "mysql_container_root_password": {
            "__ansible_vault": "$ANSIBLE_VAULT;1.1;AES256\n61333932373230333539663035366431326163363166363036323963623131363530326231303634\n6635326161643165363366323062333334363730376631660a393566366139353861656364656661\n38653463363837336639363032646433666361646535366137303464623261313663643336306465\n6264663730656337310a343962353137386238383064646533366433333437303566656433386233\n34343235326665646661623131643335313236313131353661386338343366316261643634653633\n3832313034366536616531323963333234326461353130303532\n"
        }
    },
    "ansible_included_var_files": [
        "/tmp/podman-Zbw/tests/vars/vault-variables.yml"
    ],
    "changed": false
}
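The two __ansible_vault values above are inline vault-encrypted strings loaded from /tmp/podman-Zbw/tests/vars/vault-variables.yml. As a rough sketch of how such a vars file entry is produced (the plaintext value and the vault password file name below are assumptions, not taken from this run):

# Generate an inline-vaulted value and paste it into vault-variables.yml:
#   ansible-vault encrypt_string --vault-password-file vault_pwd \
#     'example-password' --name '__podman_test_password'
# The resulting vars file entry has this shape (ciphertext shortened here):
__podman_test_password: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  ...ciphertext lines as emitted by ansible-vault...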
PLAY [Ensure that the role runs with default parameters] ***********************

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3
Saturday 11 October 2025  12:28:42 -0400 (0:00:00.023)       0:00:00.053 ******
included: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.podman : Ensure ansible_facts used by role] ****
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
Saturday 11 October 2025  12:28:42 -0400 (0:00:00.020)       0:00:00.073 ******
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [fedora.linux_system_roles.podman : Check if system is ostree] ************
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
Saturday 11 October 2025  12:28:42 -0400 (0:00:00.763)       0:00:00.837 ******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.podman : Set flag to indicate system is ostree] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Saturday 11 October 2025  12:28:43 -0400 (0:00:00.432)       0:00:01.270 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Check if transactional-update exists in /sbin] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:23
Saturday 11 October 2025  12:28:43 -0400 (0:00:00.026)       0:00:01.296 ******
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.podman : Set flag if transactional-update exists] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:28
Saturday 11 October 2025  12:28:43 -0400 (0:00:00.341)       0:00:01.638 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_is_transactional": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.podman : Set platform/version specific variables] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:32
Saturday 11 October 2025  12:28:43 -0400 (0:00:00.028)       0:00:01.666 ******
[WARNING]: TASK: fedora.linux_system_roles.podman : Set platform/version
specific variables: The loop variable '__vars_file' is already in use. You
should set the `loop_var` value in the `loop_control` option for the task to
something else to avoid variable collisions and unexpected behavior.
skipping: [managed-node1] => (item=RedHat.yml)  => {
    "__vars_file": "RedHat.yml",
    "ansible_loop_var": "__vars_file",
    "changed": false,
    "false_condition": "__vars_file is file",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml)  => {
    "__vars_file": "CentOS.yml",
    "ansible_loop_var": "__vars_file",
    "changed": false,
    "false_condition": "__vars_file is file",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml)  => {
    "__vars_file": "CentOS_9.yml",
    "ansible_loop_var": "__vars_file",
    "changed": false,
    "false_condition": "__vars_file is file",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS_9.yml)  => {
    "__vars_file": "CentOS_9.yml",
    "ansible_loop_var": "__vars_file",
    "changed": false,
    "false_condition": "__vars_file is file",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => {
    "changed": false
}

MSG:

All items skipped
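The loop-variable warning above is advisory: the task iterates with __vars_file while an outer scope already defines a variable of that name. A minimal sketch of the fix the warning suggests, assuming a renamed loop variable (__podman_vars_file) and an assumed candidate-file list; the role's actual task may differ:

- name: Set platform/version specific variables
  ansible.builtin.include_vars: "{{ __podman_vars_file }}"
  loop: "{{ __podman_candidate_vars_files }}"  # assumed list: RedHat.yml, CentOS.yml, CentOS_9.yml, ...
  loop_control:
    loop_var: __podman_vars_file  # a distinct name avoids the collision the warning describes
  when: __podman_vars_file is file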
TASK [fedora.linux_system_roles.podman : Run systemctl] ************************
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:52
Saturday 11 October 2025  12:28:43 -0400 (0:00:00.045)       0:00:01.712 ******
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "systemctl",
        "is-system-running"
    ],
    "delta": "0:00:00.008838",
    "end": "2025-10-11 12:28:44.325221",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-10-11 12:28:44.316383"
}

STDOUT:

running

TASK [fedora.linux_system_roles.podman : Require installed systemd] ************
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:60
Saturday 11 October 2025  12:28:44 -0400 (0:00:00.521)       0:00:02.233 ******
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "\"No such file or directory\" in __is_system_running.msg | d(\"\")",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.podman : Set flag to indicate that systemd runtime operations are available] ***
task path: /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:65
Saturday 11 October 2025  12:28:44 -0400 (0:00:00.055)       0:00:02.288 ******
ok: [managed-node1] => {
    "ansible_facts": {
        "__podman_is_booted": true
    },
    "changed": false
}
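The three tasks above implement a common probe-and-flag pattern: run "systemctl is-system-running" without letting a non-zero result fail the play, bail out only if systemd is missing entirely, and record the outcome as a fact. A minimal sketch under those assumptions (the exact expressions in set_vars.yml may differ; the final condition is an assumption):

- name: Run systemctl
  ansible.builtin.command: systemctl is-system-running
  register: __is_system_running
  changed_when: false
  failed_when: false

- name: Require installed systemd
  ansible.builtin.fail:
    msg: systemd is required by this role
  when: '"No such file or directory" in __is_system_running.msg | d("")'

- name: Set flag to indicate that systemd runtime operations are available
  ansible.builtin.set_fact:
    __podman_is_booted: "{{ __is_system_running.stdout | d('') != 'offline' }}"  # assumed condition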
=> { "changed": false, "rc": 1, "results": [] } MSG: Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried PLAY RECAP ********************************************************************* managed-node1 : ok=10 changed=0 unreachable=0 failed=1 skipped=3 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2025-10-11T16:28:50.354881+00:00Z", "host": "managed-node1", "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "start_time": "2025-10-11T16:28:45.606618+00:00Z", "task_name": "Ensure required packages are installed", "task_path": "/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 11 October 2025 12:28:50 -0400 (0:00:04.750) 0:00:08.206 ****** =============================================================================== fedora.linux_system_roles.podman : Ensure required packages are installed --- 4.75s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14 fedora.linux_system_roles.podman : Gather the package facts ------------- 1.06s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6 fedora.linux_system_roles.podman : Ensure ansible_facts used by role ---- 0.76s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3 fedora.linux_system_roles.podman : Run systemctl ------------------------ 0.52s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:52 fedora.linux_system_roles.podman : Check if system is ostree ------------ 0.43s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11 fedora.linux_system_roles.podman : Check if transactional-update exists in /sbin --- 0.34s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:23 fedora.linux_system_roles.podman : Enable copr if requested ------------- 0.07s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10 fedora.linux_system_roles.podman : Require installed systemd ------------ 0.06s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:60 fedora.linux_system_roles.podman : Set platform/version specific variables --- 0.05s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:32 fedora.linux_system_roles.podman : Set flag to indicate that systemd runtime operations are available --- 0.03s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:65 fedora.linux_system_roles.podman : Set flag if transactional-update exists --- 0.03s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:28 fedora.linux_system_roles.podman : Set flag to indicate system is ostree --- 0.03s /tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16 Include vault variables ------------------------------------------------- 0.02s 
TASKS RECAP ********************************************************************
Saturday 11 October 2025  12:28:50 -0400 (0:00:04.750)       0:00:08.206 ******
===============================================================================
fedora.linux_system_roles.podman : Ensure required packages are installed --- 4.75s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:14
fedora.linux_system_roles.podman : Gather the package facts ------------- 1.06s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:6
fedora.linux_system_roles.podman : Ensure ansible_facts used by role ---- 0.76s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:3
fedora.linux_system_roles.podman : Run systemctl ------------------------ 0.52s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:52
fedora.linux_system_roles.podman : Check if system is ostree ------------ 0.43s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:11
fedora.linux_system_roles.podman : Check if transactional-update exists in /sbin --- 0.34s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:23
fedora.linux_system_roles.podman : Enable copr if requested ------------- 0.07s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:10
fedora.linux_system_roles.podman : Require installed systemd ------------ 0.06s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:60
fedora.linux_system_roles.podman : Set platform/version specific variables --- 0.05s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:32
fedora.linux_system_roles.podman : Set flag to indicate that systemd runtime operations are available --- 0.03s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:65
fedora.linux_system_roles.podman : Set flag if transactional-update exists --- 0.03s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:28
fedora.linux_system_roles.podman : Set flag to indicate system is ostree --- 0.03s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/set_vars.yml:16
Include vault variables ------------------------------------------------- 0.02s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/tests/podman/tests_default.yml:5
fedora.linux_system_roles.podman : Set platform/version specific variables --- 0.02s
/tmp/collections-IiF/ansible_collections/fedora/linux_system_roles/roles/podman/tasks/main.yml:3

Oct 11 12:28:41 managed-node1 python3.9[11171]: ansible-file Invoked with state=absent path=/root/.config/containers/auth.json recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 11 12:28:41 managed-node1 python3.9[11320]: ansible-file Invoked with state=absent path=/home/podman_config_files_user/.config/containers/containers.conf.d/50-systemroles.conf recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 11 12:28:41 managed-node1 sshd-session[11345]: Accepted publickey for root from 10.31.45.202 port 38440 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 12:28:41 managed-node1 systemd-logind[609]: New session 10 of user root.
░░ Subject: A new session 10 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 10 has been created for the user root.
░░
░░ The leading process of the session is 11345.
Oct 11 12:28:41 managed-node1 systemd[1]: Started Session 10 of User root.
░░ Subject: A start job for unit session-10.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-10.scope has finished successfully.
░░
░░ The job identifier is 1246.
Oct 11 12:28:41 managed-node1 sshd-session[11345]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 12:28:41 managed-node1 sshd-session[11348]: Received disconnect from 10.31.45.202 port 38440:11: disconnected by user
Oct 11 12:28:41 managed-node1 sshd-session[11348]: Disconnected from user root 10.31.45.202 port 38440
Oct 11 12:28:41 managed-node1 sshd-session[11345]: pam_unix(sshd:session): session closed for user root
Oct 11 12:28:41 managed-node1 systemd[1]: session-10.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-10.scope has successfully entered the 'dead' state.
Oct 11 12:28:41 managed-node1 systemd-logind[609]: Session 10 logged out. Waiting for processes to exit.
Oct 11 12:28:41 managed-node1 systemd-logind[609]: Removed session 10.
░░ Subject: Session 10 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 10 has been terminated.
Oct 11 12:28:41 managed-node1 sshd-session[11373]: Accepted publickey for root from 10.31.45.202 port 38456 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 12:28:41 managed-node1 systemd-logind[609]: New session 11 of user root.
░░ Subject: A new session 11 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 11 has been created for the user root.
░░
░░ The leading process of the session is 11373.
Oct 11 12:28:41 managed-node1 systemd[1]: Started Session 11 of User root.
░░ Subject: A start job for unit session-11.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-11.scope has finished successfully.
░░
░░ The job identifier is 1315.
Oct 11 12:28:41 managed-node1 sshd-session[11373]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 12:28:41 managed-node1 sshd-session[11376]: Received disconnect from 10.31.45.202 port 38456:11: disconnected by user
Oct 11 12:28:41 managed-node1 sshd-session[11376]: Disconnected from user root 10.31.45.202 port 38456
Oct 11 12:28:41 managed-node1 sshd-session[11373]: pam_unix(sshd:session): session closed for user root
Oct 11 12:28:41 managed-node1 systemd[1]: session-11.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-11.scope has successfully entered the 'dead' state.
Oct 11 12:28:41 managed-node1 systemd-logind[609]: Session 11 logged out. Waiting for processes to exit.
Oct 11 12:28:41 managed-node1 systemd-logind[609]: Removed session 11.
░░ Subject: Session 11 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 11 has been terminated.
Oct 11 12:28:42 managed-node1 chronyd[614]: Selected source 64.44.115.65 (2.centos.pool.ntp.org)
Oct 11 12:28:42 managed-node1 python3.9[11574]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 11 12:28:43 managed-node1 python3.9[11725]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 11 12:28:43 managed-node1 python3.9[11874]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 11 12:28:44 managed-node1 python3.9[12023]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 11 12:28:45 managed-node1 sudo[12322]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylevedwzyszwovhhbqxpstbvyengzxql ; /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1760200125.7119613-9642-186348144072586/AnsiballZ_setup.py'
Oct 11 12:28:45 managed-node1 sudo[12322]: pam_unix(sudo:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 12:28:46 managed-node1 python3.9[12324]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 11 12:28:46 managed-node1 sudo[12322]: pam_unix(sudo:session): session closed for user root
Oct 11 12:28:46 managed-node1 sudo[12401]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctfyqtlghvuyciznfwglgnmvknlpwcgz ; /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1760200125.7119613-9642-186348144072586/AnsiballZ_dnf.py'
Oct 11 12:28:46 managed-node1 sudo[12401]: pam_unix(sudo:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 12:28:46 managed-node1 python3.9[12403]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 11 12:28:50 managed-node1 sudo[12401]: pam_unix(sudo:session): session closed for user root
Oct 11 12:28:50 managed-node1 sshd-session[12461]: Accepted publickey for root from 10.31.45.202 port 37654 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 12:28:50 managed-node1 systemd-logind[609]: New session 12 of user root.
░░ Subject: A new session 12 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 12 has been created for the user root.
░░
░░ The leading process of the session is 12461.
Oct 11 12:28:50 managed-node1 systemd[1]: Started Session 12 of User root.
░░ Subject: A start job for unit session-12.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-12.scope has finished successfully.
░░
░░ The job identifier is 1384.
Oct 11 12:28:50 managed-node1 sshd-session[12461]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 12:28:50 managed-node1 sshd-session[12464]: Received disconnect from 10.31.45.202 port 37654:11: disconnected by user
Oct 11 12:28:50 managed-node1 sshd-session[12464]: Disconnected from user root 10.31.45.202 port 37654
Oct 11 12:28:50 managed-node1 sshd-session[12461]: pam_unix(sshd:session): session closed for user root
Oct 11 12:28:50 managed-node1 systemd-logind[609]: Session 12 logged out. Waiting for processes to exit.
Oct 11 12:28:50 managed-node1 systemd[1]: session-12.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-12.scope has successfully entered the 'dead' state.
Oct 11 12:28:50 managed-node1 systemd-logind[609]: Removed session 12.
░░ Subject: Session 12 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 12 has been terminated.
Oct 11 12:28:50 managed-node1 sshd-session[12489]: Accepted publickey for root from 10.31.45.202 port 37666 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 12:28:50 managed-node1 systemd-logind[609]: New session 13 of user root.
░░ Subject: A new session 13 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 13 has been created for the user root.
░░
░░ The leading process of the session is 12489.
Oct 11 12:28:50 managed-node1 systemd[1]: Started Session 13 of User root.
░░ Subject: A start job for unit session-13.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-13.scope has finished successfully.
░░
░░ The job identifier is 1453.
Oct 11 12:28:50 managed-node1 sshd-session[12489]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
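The journal above records the failing step verbatim: ansible-ansible.legacy.dnf invoked with name=['podman'] state=present, which is what surfaced the repodata error. To confirm the mirror problem independently of the role, one could refresh metadata for just that repo on managed-node1 (a diagnostic sketch; the play wrapper is an assumption, the repo id comes from the error message):

- hosts: managed-node1
  gather_facts: false
  tasks:
    - name: Reproduce the metadata failure for the highavailability repo
      ansible.builtin.command: dnf makecache --repo highavailability
      changed_when: false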