ansible-playbook [core 2.16.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-7QI
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.1 (main, Feb 21 2024, 14:18:26) [GCC 8.5.0 20210514 (Red Hat 8.5.0-21)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_swap.yml *******************************************************
1 plays in /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml

PLAY [Test management of swap] *************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:2
Tuesday 22 July 2025 08:34:26 -0400 (0:00:00.034) 0:00:00.034 **********
ok: [managed-node13]

TASK [Include role to ensure packages are installed] ***************************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:10
Tuesday 22 July 2025 08:34:27 -0400 (0:00:01.627) 0:00:01.662 **********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path:
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Tuesday 22 July 2025 08:34:28 -0400 (0:00:00.172) 0:00:01.834 **********
included: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Tuesday 22 July 2025 08:34:28 -0400 (0:00:00.031) 0:00:01.865 **********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Tuesday 22 July 2025 08:34:28 -0400 (0:00:00.086) 0:00:01.952 **********
skipping: [managed-node13] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node13] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node13] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed":
false,
    "item": "CentOS_8.yml"
}
ok: [managed-node13] => (item=CentOS_8.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_8.yml"
}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Tuesday 22 July 2025 08:34:28 -0400 (0:00:00.209) 0:00:02.161 **********
ok: [managed-node13] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Tuesday 22 July 2025 08:34:29 -0400 (0:00:01.211) 0:00:03.372 **********
ok: [managed-node13] => {
    "ansible_facts": {
        "__storage_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Tuesday 22 July 2025 08:34:29 -0400 (0:00:00.068) 0:00:03.441 **********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Tuesday 22 July 2025 08:34:29 -0400 (0:00:00.022) 0:00:03.464 **********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Tuesday 22 July 2025 08:34:29 -0400 (0:00:00.031) 0:00:03.495 **********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Tuesday 22 July 2025 08:34:29 -0400 (0:00:00.086) 0:00:03.582 **********
ok: [managed-node13] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Tuesday 22 July 2025 08:34:33 -0400 (0:00:03.329) 0:00:06.911 **********
ok: [managed-node13] => {
    "storage_pools | d([])": []
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Tuesday 22 July 2025 08:34:33 -0400 (0:00:00.094) 0:00:07.005 **********
ok: [managed-node13] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Tuesday 22 July 2025 08:34:33 -0400 (0:00:00.079) 0:00:07.084 **********
ok: [managed-node13] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts":
[],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Tuesday 22 July 2025 08:34:34 -0400 (0:00:00.908) 0:00:07.993 **********
included: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Tuesday 22 July 2025 08:34:34 -0400 (0:00:00.131) 0:00:08.124 **********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Tuesday 22 July 2025 08:34:34 -0400 (0:00:00.122) 0:00:08.247 **********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Tuesday 22 July 2025 08:34:34 -0400 (0:00:00.091) 0:00:08.338 **********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Tuesday 22 July 2025 08:34:34 -0400 (0:00:00.150) 0:00:08.489 **********
ok: [managed-node13] => {
    "changed": false,
    "rc": 0,
    "results": []
}
MSG: Nothing
to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Tuesday 22 July 2025 08:34:37 -0400 (0:00:03.154) 0:00:11.644 **********
ok: [managed-node13] => {
    "ansible_facts": {
        "services": {
            "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "NetworkManager.service": {"name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled"},
            "auditd.service": {"name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static"},
            "auto-cpufreq.service": {"name": "auto-cpufreq.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "autovt@.service": {"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
            "blivet.service": {"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static"},
            "blk-availability.service": {"name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "chrony-dnssrv@.service": {"name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "chrony-wait.service": {"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "chronyd.service": {"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "cloud-config.service": {"name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-final.service": {"name":
"cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "cloud-init-local.service": {"name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "cloud-init.service": {"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "console-getty.service": {"name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "container-getty@.service": {"name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "cpupower.service": {"name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "crond.service": {"name": "crond.service", "source": "systemd", "state": "running", "status": "enabled"},
            "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static"},
            "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static"},
            "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static"},
            "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static"},
            "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "dbus.service": {"name": "dbus.service", "source": "systemd", "state": "running", "status": "static"},
            "debug-shell.service": {"name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "display-manager.service": {"name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "dm-event.service": {"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dnf-makecache.service": {"name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
            "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "dracut-cmdline.service": {"name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-initqueue.service": {"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-mount.service": {"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static"},
            "dracut-shutdown.service": {"name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static"},
            "ebtables.service": {"name": "ebtables.service",
"source": "systemd", "state": "inactive", "status": "disabled"},
            "emergency.service": {"name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static"},
            "fcoe.service": {"name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "firewalld.service": {"name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "fstrim.service": {"name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static"},
            "getty@.service": {"name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled"},
            "getty@tty1.service": {"name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active"},
            "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static"},
            "gssproxy.service": {"name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled"},
            "halt-local.service": {"name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static"},
            "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "import-state.service": {"name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "initrd-cleanup.service": {"name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-switch-root.service": {"name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static"},
            "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static"},
            "iprdump.service": {"name": "iprdump.service", "source": "systemd", "state": "inactive",
"status": "disabled"},
            "iprinit.service": {"name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "iprupdate.service": {"name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "irqbalance.service": {"name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled"},
            "iscsi-shutdown.service": {"name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "iscsi.service": {"name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "iscsid.service": {"name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "kdump.service": {"name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static"},
            "kvm_stat.service": {"name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "ldconfig.service": {"name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static"},
            "loadmodules.service": {"name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "lvm2-activation-early.service": {"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "lvm2-activation.service": {"name": "lvm2-activation.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static"},
            "lvm2-monitor.service": {"name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "lvm2-pvscan@.service": {"name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "man-db-cache-update.service": {"name":
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static"},
            "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdcheck_continue.service": {"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdcheck_start.service": {"name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdmon@.service": {"name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "mdmonitor.service": {"name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "messagebus.service": {"name": "messagebus.service", "source": "systemd", "state": "active", "status": "static"},
            "microcode.service": {"name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "multipathd.service": {"name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "ndctl-monitor.service": {"name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "network.service": {"name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "nfs-blkmap.service": {"name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "nfs-convert.service": {"name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "nfs-idmapd.service": {"name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfs-mountd.service": {"name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfs-server.service": {"name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled"},
            "nfs-utils.service": {"name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nfsdcld.service": {"name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static"},
            "nftables.service": {"name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "nis-domainname.service": {"name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "ntpd.service": {"name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "ntpdate.service": {"name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "oddjobd.service": {"name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "plymouth-halt.service": {"name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-kexec.service": {"name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static"},
            "plymouth-quit.service": {"name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static"},
            "plymouth-read-write.service": {"name": "plymouth-read-write.service", "source":
"systemd", "state": "stopped", "status": "static"},
            "plymouth-reboot.service": {"name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-start.service": {"name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static"},
            "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static"},
            "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static"},
            "polkit.service": {"name": "polkit.service", "source": "systemd", "state": "running", "status": "static"},
            "power-profiles-daemon.service": {"name": "power-profiles-daemon.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled"},
            "quotaon.service": {"name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static"},
            "rbdmap.service": {"name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "rc-local.service": {"name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rdisc.service": {"name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled"},
            "rescue.service": {"name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static"},
            "restraintd.service": {"name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rngd.service": {"name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rpc-gssd.service": {"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status":
"static"},
            "rpc-statd.service": {"name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static"},
            "rpcbind.service": {"name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled"},
            "rsyslog.service": {"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled"},
            "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static"},
            "serial-getty@.service": {"name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled"},
            "snapd.seeded.service": {"name": "snapd.seeded.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "sntp.service": {"name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "sshd-keygen.service": {"name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "sshd-keygen@.service": {"name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled"},
            "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive"},
            "sshd.service": {"name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled"},
            "sshd@.service": {"name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "sssd-autofs.service": {"name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-kcm.service": {"name":
"sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect"},
            "sssd-nss.service": {"name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-pac.service": {"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-pam.service": {"name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-ssh.service": {"name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd-sudo.service": {"name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect"},
            "sssd.service": {"name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "stratis-fstab-setup@.service": {"name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "stratisd-min-postinitrd.service": {"name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static"},
            "stratisd.service": {"name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "syslog.service": {"name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled"},
            "system-update-cleanup.service": {"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-backlight@.service": {"name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status":
"static"},
            "systemd-binfmt.service": {"name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-coredump@.service": {"name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-exit.service": {"name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-firstboot.service": {"name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-fsck@.service": {"name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-halt.service": {"name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static"},
            "systemd-hibernate.service": {"name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-initctl.service": {"name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status":
"static"},
            "systemd-journald.service": {"name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static"},
            "systemd-kexec.service": {"name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-localed.service": {"name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-logind.service": {"name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static"},
            "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-modules-load.service": {"name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found"},
            "systemd-portabled.service": {"name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-poweroff.service": {"name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-pstore.service": {"name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled"},
            "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-random-seed.service": {"name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-reboot.service": {"name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static"},
            "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static"},
            "systemd-resolved.service": {"name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status":
"disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tlp.service": { "name": "tlp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Tuesday 22 July 2025 08:34:39 -0400 (0:00:01.885) 0:00:13.529 ********** ok: [managed-node13] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Tuesday 22 July 2025 08:34:40 -0400 (0:00:00.226) 0:00:13.756 ********** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Tuesday 22 July 2025 08:34:40 -0400 (0:00:00.088) 0:00:13.844 ********** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Tuesday 22 July 2025 08:34:40 -0400 (0:00:00.855) 0:00:14.700 ********** skipping: [managed-node13] => { "changed": 
false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Tuesday 22 July 2025 08:34:41 -0400 (0:00:00.117) 0:00:14.817 ********** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1753187123.4354699, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4644c54c9838cbc512975c626cc4f46832c5b181", "ctime": 1716969427.210901, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 134, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1716969427.210901, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1321, "uid": 0, "version": "1528175122", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Tuesday 22 July 2025 08:34:41 -0400 (0:00:00.766) 0:00:15.584 ********** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Tuesday 22 July 2025 08:34:41 -0400 (0:00:00.080) 0:00:15.664 ********** skipping: [managed-node13] => { "changed": false, 
"skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.050) 0:00:15.714 ********** ok: [managed-node13] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.055) 0:00:15.769 ********** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.038) 0:00:15.808 ********** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.032) 0:00:15.840 ********** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.061) 0:00:15.902 ********** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output['mounts'] | 
length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.061) 0:00:15.964 ********** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.071) 0:00:16.036 ********** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.074) 0:00:16.110 ********** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.072) 0:00:16.182 ********** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1753187351.2378895, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, 
"issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.414) 0:00:16.596 ********** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Tuesday 22 July 2025 08:34:42 -0400 (0:00:00.017) 0:00:16.614 ********** ok: [managed-node13] TASK [Mark tasks to be skipped] ************************************************ task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:14 Tuesday 22 July 2025 08:34:43 -0400 (0:00:00.817) 0:00:17.432 ********** ok: [managed-node13] => { "ansible_facts": { "storage_skip_checks": [ "blivet_available", "packages_installed", "service_facts" ] }, "changed": false } TASK [Get unused disks for swap] *********************************************** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:22 Tuesday 22 July 2025 08:34:43 -0400 (0:00:00.051) 0:00:17.484 ********** included: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node13 TASK [Ensure test packages] **************************************************** task path: 
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Tuesday 22 July 2025 08:34:43 -0400 (0:00:00.028) 0:00:17.513 ********** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Tuesday 22 July 2025 08:34:46 -0400 (0:00:02.834) 0:00:20.347 ********** ok: [managed-node13] => { "changed": false, "disks": "Unable to find unused disk", "info": [ "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Tuesday 22 July 2025 08:34:47 -0400 (0:00:00.547) 0:00:20.895 ********** ok: [managed-node13] => { "changed": false, "cmd": "set -x\nexec 1>&2\nlsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC\njournalctl -ex\n", "delta": "0:00:00.022368", "end": "2025-07-22 08:34:47.590039", "rc": 0, "start": "2025-07-22 08:34:47.567671" } STDERR: + exec + lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC NAME="/dev/xvda" TYPE="disk" SIZE="268435456000" FSTYPE="" LOG-SEC="512" NAME="/dev/xvda1" TYPE="part" SIZE="268434390528" FSTYPE="xfs" LOG-SEC="512" + journalctl -ex -- Logs begin at Tue 2025-07-22 08:25:04 EDT, end at Tue 2025-07-22 08:34:47 EDT. 
-- Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 671 Jan 15 2024 usr/lib/systemd/system/systemd-fsck@.service Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 588 Jan 15 2024 usr/lib/systemd/system/systemd-halt.service Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 647 Jan 15 2024 usr/lib/systemd/system/systemd-journald-audit.socket Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1130 Jun 22 2018 usr/lib/systemd/system/systemd-journald-dev-log.socket Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1537 Jan 15 2024 usr/lib/systemd/system/systemd-journald.service Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 882 Jun 22 2018 usr/lib/systemd/system/systemd-journald.socket Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 601 Jan 15 2024 usr/lib/systemd/system/systemd-kexec.service Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1011 Jan 15 2024 usr/lib/systemd/system/systemd-modules-load.service Jul 22 08:29:40 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 556 Jan 15 2024 usr/lib/systemd/system/systemd-poweroff.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 800 Jan 15 2024 usr/lib/systemd/system/systemd-random-seed.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 551 Jan 15 2024 usr/lib/systemd/system/systemd-reboot.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 697 Jan 15 2024 usr/lib/systemd/system/systemd-sysctl.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 771 Jan 15 2024 usr/lib/systemd/system/systemd-tmpfiles-setup-dev.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 751 Jan 15 2024 usr/lib/systemd/system/systemd-tmpfiles-setup.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 867 Jan 15 2024 
usr/lib/systemd/system/systemd-udev-settle.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 771 Jan 15 2024 usr/lib/systemd/system/systemd-udev-trigger.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 635 Jun 22 2018 usr/lib/systemd/system/systemd-udevd-control.socket Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 610 Jun 22 2018 usr/lib/systemd/system/systemd-udevd-kernel.socket Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1070 Jan 15 2024 usr/lib/systemd/system/systemd-udevd.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 622 Jan 15 2024 usr/lib/systemd/system/systemd-vconsole-setup.service Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 445 Jun 22 2018 usr/lib/systemd/system/timers.target Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 457 Jun 22 2018 usr/lib/systemd/system/umount.target Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1609456 Jan 15 2024 usr/lib/systemd/systemd Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 12232 Jan 15 2024 usr/lib/systemd/systemd-cgroups-agent Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 62608 Jan 15 2024 usr/lib/systemd/systemd-coredump Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 25296 Jan 15 2024 usr/lib/systemd/systemd-fsck Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 158096 Jan 15 2024 usr/lib/systemd/systemd-journald Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 20880 Jan 15 2024 usr/lib/systemd/systemd-modules-load Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 12224 Jan 15 2024 usr/lib/systemd/systemd-reply-password Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 62624 Jan 15 2024 usr/lib/systemd/systemd-shutdown Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 21240 
Jan 15 2024 usr/lib/systemd/systemd-sysctl Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 418272 Jan 15 2024 usr/lib/systemd/systemd-udevd Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 20952 Jan 15 2024 usr/lib/systemd/systemd-vconsole-setup Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/lib/tmpfiles.d Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 135 Oct 8 2018 usr/lib/tmpfiles.d/dracut-tmpfiles.conf Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1676 Jan 15 2024 usr/lib/tmpfiles.d/systemd.conf Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 3 root root 0 Jan 15 2024 usr/lib/udev Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 41704 Jan 15 2024 usr/lib/udev/ata_id Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 33288 Jan 15 2024 usr/lib/udev/cdrom_id Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/lib/udev/rules.d Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1834 Jan 15 2024 usr/lib/udev/rules.d/40-redhat.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 3750 Jan 15 2024 usr/lib/udev/rules.d/50-udev-default.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 626 Jan 15 2024 usr/lib/udev/rules.d/60-block.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 6528 Jun 22 2018 usr/lib/udev/rules.d/60-persistent-storage.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 2671 Jun 22 2018 usr/lib/udev/rules.d/70-uaccess.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 995 May 11 2019 usr/lib/udev/rules.d/71-biosdevname.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 2758 Jan 15 2024 usr/lib/udev/rules.d/71-seat.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 636 Jan 15 2024 
usr/lib/udev/rules.d/73-seat-late.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 452 Jun 22 2018 usr/lib/udev/rules.d/75-net-description.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 615 Jun 22 2018 usr/lib/udev/rules.d/80-drivers.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 292 Jun 22 2018 usr/lib/udev/rules.d/80-net-setup-link.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 2013 Oct 11 2022 usr/lib/udev/rules.d/85-nm-unmanaged.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 510 Jan 15 2024 usr/lib/udev/rules.d/90-vconsole.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 4367 Jan 15 2024 usr/lib/udev/rules.d/99-systemd.rules Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 54976 Jan 15 2024 usr/lib/udev/scsi_id Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 5 root root 0 Jan 15 2024 usr/lib64 Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 3 root root 0 Jan 15 2024 usr/lib64/NetworkManager Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/lib64/NetworkManager/1.40.16-15.el8 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 49264 Jan 15 2024 usr/lib64/NetworkManager/1.40.16-15.el8/libnm-device-plugin-team.so Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 306184 Jan 15 2024 usr/lib64/NetworkManager/1.40.16-15.el8/libnm-settings-plugin-ifcfg-rh.so Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/lib64/bind9-export Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 25 Jan 15 2024 usr/lib64/bind9-export/libdns-export.so.1115 -> libdns-export.so.1115.0.3 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 2369968 Jan 15 2024 usr/lib64/bind9-export/libdns-export.so.1115.0.3 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root 
root 24 Jan 15 2024 usr/lib64/bind9-export/libirs-export.so.161 -> libirs-export.so.161.0.1 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 49072 Jan 15 2024 usr/lib64/bind9-export/libirs-export.so.161.0.1 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 25 Jan 15 2024 usr/lib64/bind9-export/libisc-export.so.1107 -> libisc-export.so.1107.0.7 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 471832 Jan 15 2024 usr/lib64/bind9-export/libisc-export.so.1107.0.7 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 27 Jan 15 2024 usr/lib64/bind9-export/libisccfg-export.so.163 -> libisccfg-export.so.163.0.8 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 186520 Jan 15 2024 usr/lib64/bind9-export/libisccfg-export.so.163.0.8 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 226472 Jan 15 2024 usr/lib64/ld-2.28.so Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 10 Jan 15 2024 usr/lib64/ld-linux-x86-64.so.2 -> ld-2.28.so Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libacl.so.1 -> libacl.so.1.1.2253 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 37232 Oct 6 2023 usr/lib64/libacl.so.1.1.2253 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libattr.so.1 -> libattr.so.1.1.2448 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 24616 May 10 2019 usr/lib64/libattr.so.1.1.2448 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libaudit.so.1 -> libaudit.so.1.0.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 135344 Nov 6 2023 usr/lib64/libaudit.so.1.0.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libblkid.so.1 -> libblkid.so.1.1.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 343144 Jan 15 2024 
usr/lib64/libblkid.so.1.1.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libbpf.so.0 -> libbpf.so.0.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 307768 Jul 1 2022 usr/lib64/libbpf.so.0.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 24 Jan 15 2024 usr/lib64/libbrotlicommon.so.1 -> libbrotlicommon.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 134944 Jan 12 2021 usr/lib64/libbrotlicommon.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libbrotlidec.so.1 -> libbrotlidec.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 53192 Jan 12 2021 usr/lib64/libbrotlidec.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libbz2.so.1 -> libbz2.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 72816 May 10 2019 usr/lib64/libbz2.so.1.0.6 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 2164600 Jan 15 2024 usr/lib64/libc-2.28.so Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 253 Jan 15 2024 usr/lib64/libc.so Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 12 Jan 15 2024 usr/lib64/libc.so.6 -> libc-2.28.so Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libcap-ng.so.0 -> libcap-ng.so.0.0.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 24368 Jun 7 2021 usr/lib64/libcap-ng.so.0.0.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libcap.so.2 -> libcap.so.2.48 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 33208 Jul 12 2023 usr/lib64/libcap.so.2.48 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libcom_err.so -> libcom_err.so.2.1 Jul 22 08:29:41 managed-node13 
dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libcom_err.so.2 -> libcom_err.so.2.1 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 16168 May 22 2022 usr/lib64/libcom_err.so.2.1 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libcrypt.so -> libcrypt.so.1.1.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libcrypt.so.1 -> libcrypt.so.1.1.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 135616 May 5 2021 usr/lib64/libcrypt.so.1.1.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libcrypto.so -> libcrypto.so.1.1.1k Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libcrypto.so.1.1 -> libcrypto.so.1.1.1k Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 3087096 Nov 30 2023 usr/lib64/libcrypto.so.1.1.1k Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 23 Jan 15 2024 usr/lib64/libcryptsetup.so.12 -> libcryptsetup.so.12.6.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 525864 Jul 11 2023 usr/lib64/libcryptsetup.so.12.6.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libcurl.so.4 -> libcurl.so.4.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 596088 Dec 11 2023 usr/lib64/libcurl.so.4.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libdaemon.so.0 -> libdaemon.so.0.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 28672 May 11 2019 usr/lib64/libdaemon.so.0.5.0 Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libdbus-1.so.3 -> libdbus-1.so.3.19.7 Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 351032 Jun 19 2023 usr/lib64/libdbus-1.so.3.19.7 Jul 22 08:29:41 managed-node13 dracut[8701]: 
-r-xr-xr-x 1 root root 370608 Jan 15 2024 usr/lib64/libdevmapper.so.1.02
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 19104 Jan 15 2024 usr/lib64/libdl-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 13 Jan 15 2024 usr/lib64/libdl.so -> libdl-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 13 Jan 15 2024 usr/lib64/libdl.so.2 -> libdl-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 678192 Dec 12 2023 usr/lib64/libdw-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libdw.so -> libdw-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libdw.so.1 -> libdw-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 108368 Dec 12 2023 usr/lib64/libelf-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libelf.so -> libelf-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libelf.so.1 -> libelf-0.190.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libffi.so.6 -> libffi.so.6.0.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 36600 Dec 6 2022 usr/lib64/libffi.so.6.0.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 11752 Jan 15 2024 usr/lib64/libfreebl3.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 84 Jan 15 2024 usr/lib64/libfreeblpriv3.chk
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 824008 Jan 15 2024 usr/lib64/libfreeblpriv3.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 99632 Jan 15 2024 usr/lib64/libgcc_s-8-20210514.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 24 Jan 15 2024 usr/lib64/libgcc_s.so.1 -> libgcc_s-8-20210514.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libgcrypt.so.20 -> libgcrypt.so.20.2.5
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1186624 Jun 27 2022 usr/lib64/libgcrypt.so.20.2.5
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 22 Jan 15 2024 usr/lib64/libgio-2.0.so.0 -> libgio-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1769696 Jan 15 2024 usr/lib64/libgio-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 23 Jan 15 2024 usr/lib64/libglib-2.0.so.0 -> libglib-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1171328 Jan 15 2024 usr/lib64/libglib-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 26 Jan 15 2024 usr/lib64/libgmodule-2.0.so.0 -> libgmodule-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 16064 Jan 15 2024 usr/lib64/libgmodule-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libgmp.so.10 -> libgmp.so.10.3.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 628752 Jan 15 2024 usr/lib64/libgmp.so.10.3.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/lib64/libgnutls.so.30 -> libgnutls.so.30.28.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 2051280 Jan 15 2024 usr/lib64/libgnutls.so.30.28.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 26 Jan 15 2024 usr/lib64/libgobject-2.0.so.0 -> libgobject-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 346760 Jan 15 2024 usr/lib64/libgobject-2.0.so.0.5600.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 22 Jan 15 2024 usr/lib64/libgpg-error.so.0 -> libgpg-error.so.0.24.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 138368 May 10 2019 usr/lib64/libgpg-error.so.0.24.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libgssapi_krb5.so -> libgssapi_krb5.so.2.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libgssapi_krb5.so.2 -> libgssapi_krb5.so.2.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 355432 Jan 15 2024 usr/lib64/libgssapi_krb5.so.2.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libhogweed.so.4 -> libhogweed.so.4.5
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 197048 Jul 15 2021 usr/lib64/libhogweed.so.4.5
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libidn2.so.0 -> libidn2.so.0.3.6
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 123248 Nov 8 2019 usr/lib64/libidn2.so.0.3.6
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/lib64/libjansson.so.4 -> libjansson.so.4.14.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 58432 Dec 2 2021 usr/lib64/libjansson.so.4.14.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libjson-c.so.4 -> libjson-c.so.4.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 66456 Nov 11 2021 usr/lib64/libjson-c.so.4.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libk5crypto.so -> libk5crypto.so.3.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libk5crypto.so.3 -> libk5crypto.so.3.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 95792 Jan 15 2024 usr/lib64/libk5crypto.so.3.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libkeyutils.so -> libkeyutils.so.1.6
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libkeyutils.so.1 -> libkeyutils.so.1.6
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 16192 Jun 19 2021 usr/lib64/libkeyutils.so.1.6
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libkmod.so.2 -> libkmod.so.2.3.3
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 99832 Oct 24 2023 usr/lib64/libkmod.so.2.3.3
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libkrb5.so -> libkrb5.so.3.3
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libkrb5.so.3 -> libkrb5.so.3.3
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 975040 Jan 15 2024 usr/lib64/libkrb5.so.3.3
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libkrb5support.so -> libkrb5support.so.0.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libkrb5support.so.0 -> libkrb5support.so.0.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 70928 Jan 15 2024 usr/lib64/libkrb5support.so.0.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/liblber-2.4.so.2 -> liblber-2.4.so.2.10.9
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 66808 Aug 10 2021 usr/lib64/liblber-2.4.so.2.10.9
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libldap-2.4.so.2 -> libldap-2.4.so.2.10.9
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 328600 Aug 10 2021 usr/lib64/libldap-2.4.so.2.10.9
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/liblz4.so.1 -> liblz4.so.1.8.3
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 119336 Jun 29 2021 usr/lib64/liblz4.so.1.8.3
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/liblzma.so -> liblzma.so.5.2.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/liblzma.so.5 -> liblzma.so.5.2.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 161568 Jun 27 2022 usr/lib64/liblzma.so.5.2.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1597840 Jan 15 2024 usr/lib64/libm-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 141 Jan 15 2024 usr/lib64/libm.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 12 Jan 15 2024 usr/lib64/libm.so.6 -> libm-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libmnl.so.0 -> libmnl.so.0.2.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 24688 May 11 2019 usr/lib64/libmnl.so.0.2.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libmount.so.1 -> libmount.so.1.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 370952 Jan 15 2024 usr/lib64/libmount.so.1.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libncurses.so.6 -> libncurses.so.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 179816 Aug 15 2023 usr/lib64/libncurses.so.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libndp.so.0 -> libndp.so.0.1.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 28696 May 2 2021 usr/lib64/libndp.so.0.1.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libnettle.so.6 -> libnettle.so.6.5
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 238640 Jul 15 2021 usr/lib64/libnettle.so.6.5
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libnghttp2.so.14 -> libnghttp2.so.14.17.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 166616 Jan 15 2024 usr/lib64/libnghttp2.so.14.17.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libnl-3.so.200 -> libnl-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 146264 Jul 7 2022 usr/lib64/libnl-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 23 Jan 15 2024 usr/lib64/libnl-cli-3.so.200 -> libnl-cli-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 49152 Jul 7 2022 usr/lib64/libnl-cli-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 24 Jan 15 2024 usr/lib64/libnl-genl-3.so.200 -> libnl-genl-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 29744 Jul 7 2022 usr/lib64/libnl-genl-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 22 Jan 15 2024 usr/lib64/libnl-nf-3.so.200 -> libnl-nf-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 115184 Jul 7 2022 usr/lib64/libnl-nf-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 25 Jan 15 2024 usr/lib64/libnl-route-3.so.200 -> libnl-route-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 600504 Jul 7 2022 usr/lib64/libnl-route-3.so.200.26.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 28792 Jan 15 2024 usr/lib64/libnss_dns-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libnss_dns.so.2 -> libnss_dns-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 53888 Jan 15 2024 usr/lib64/libnss_files-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/lib64/libnss_files.so.2 -> libnss_files-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libomapi.so.0 -> libomapi.so.0.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 136368 Jan 15 2024 usr/lib64/libomapi.so.0.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libp11-kit.so.0 -> libp11-kit.so.0.3.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1258912 Nov 29 2023 usr/lib64/libp11-kit.so.0.3.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libpam.so.0 -> libpam.so.0.84.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 65960 Jan 15 2024 usr/lib64/libpam.so.0.84.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libpci.so.3 -> libpci.so.3.7.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 58968 Sep 23 2022 usr/lib64/libpci.so.3.7.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libpcre.so.1 -> libpcre.so.1.2.10
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 464256 Jun 7 2021 usr/lib64/libpcre.so.1.2.10
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libpcre2-8.so -> libpcre2-8.so.0.7.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libpcre2-8.so.0 -> libpcre2-8.so.0.7.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 542840 Jun 27 2022 usr/lib64/libpcre2-8.so.0.7.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 27 Jan 15 2024 usr/lib64/libply-splash-core.so.5 -> libply-splash-core.so.5.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 143576 Feb 25 2022 usr/lib64/libply-splash-core.so.5.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libply.so.5 -> libply.so.5.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 115456 Feb 25 2022 usr/lib64/libply.so.5.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libprocps.so.7 -> libprocps.so.7.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 82848 Aug 15 2023 usr/lib64/libprocps.so.7.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libpsl.so.5 -> libpsl.so.5.3.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 69672 Jun 16 2020 usr/lib64/libpsl.so.5.3.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 149960 Jan 15 2024 usr/lib64/libpthread-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libpthread.so -> libpthread-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libpthread.so.0 -> libpthread-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 18 Jan 15 2024 usr/lib64/libreadline.so.7 -> libreadline.so.7.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 324792 May 10 2019 usr/lib64/libreadline.so.7.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 92328 Jan 15 2024 usr/lib64/libresolv-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libresolv.so -> libresolv-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libresolv.so.2 -> libresolv-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 42720 Jan 15 2024 usr/lib64/librt-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 13 Jan 15 2024 usr/lib64/librt.so -> librt-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 13 Jan 15 2024 usr/lib64/librt.so.1 -> librt-2.28.so
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libsasl2.so.3 -> libsasl2.so.3.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 124904 Feb 24 2022 usr/lib64/libsasl2.so.3.0.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 19 Jan 15 2024 usr/lib64/libseccomp.so.2 -> libseccomp.so.2.5.2
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 128328 Nov 11 2021 usr/lib64/libseccomp.so.2.5.2
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libselinux.so -> libselinux.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 171480 Dec 13 2022 usr/lib64/libselinux.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 13 Jan 15 2024 usr/lib64/libsepol.so -> libsepol.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 734512 Aug 24 2021 usr/lib64/libsepol.so.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libsmartcols.so.1 -> libsmartcols.so.1.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 221096 Jan 15 2024 usr/lib64/libsmartcols.so.1.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libssh.so.4 -> libssh.so.4.8.7
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 468264 Jan 15 2024 usr/lib64/libssh.so.4.8.7
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libssl.so -> libssl.so.1.1.1k
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libssl.so.1.1 -> libssl.so.1.1.1k
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 615200 Nov 30 2023 usr/lib64/libssl.so.1.1.1k
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/lib64/libsystemd.so.0 -> libsystemd.so.0.23.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1386936 Jan 15 2024 usr/lib64/libsystemd.so.0.23.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libtasn1.so.6 -> libtasn1.so.6.5.5
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 78344 Jan 17 2023 usr/lib64/libtasn1.so.6.5.5
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libteam.so.5 -> libteam.so.5.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 57448 Dec 8 2022 usr/lib64/libteam.so.5.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/lib64/libteamdctl.so.0 -> libteamdctl.so.0.1.5
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 24064 Dec 8 2022 usr/lib64/libteamdctl.so.0.1.5
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 15 Jan 15 2024 usr/lib64/libtinfo.so.6 -> libtinfo.so.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 187088 Aug 15 2023 usr/lib64/libtinfo.so.6.1
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 17 Jan 15 2024 usr/lib64/libudev.so.1 -> libudev.so.1.6.11
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 661792 Jan 15 2024 usr/lib64/libudev.so.1.6.11
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 21 Jan 15 2024 usr/lib64/libunistring.so.2 -> libunistring.so.2.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1580256 May 10 2019 usr/lib64/libunistring.so.2.1.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libuuid.so.1 -> libuuid.so.1.3.0
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 32864 Jan 15 2024 usr/lib64/libuuid.so.1.3.0
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libxml2.so.2 -> libxml2.so.2.9.7
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1502896 Sep 20 2023 usr/lib64/libxml2.so.2.9.7
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libz.so -> libz.so.1.2.11
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/lib64/libz.so.1 -> libz.so.1.2.11
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 99112 May 17 2023 usr/lib64/libz.so.1.2.11
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libzstd.so -> libzstd.so.1.4.4
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/lib64/libzstd.so.1 -> libzstd.so.1.4.4
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 678016 Jun 17 2020 usr/lib64/libzstd.so.1.4.4
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/lib64/plymouth
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 16312 Feb 25 2022 usr/lib64/plymouth/details.so
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 20496 Feb 25 2022 usr/lib64/plymouth/text.so
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/libexec
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 16568 Jan 15 2024 usr/libexec/nm-dhcp-helper
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 757736 Jan 15 2024 usr/libexec/nm-initrd-generator
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/sbin
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 3578904 Jan 15 2024 usr/sbin/NetworkManager
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 42192 May 11 2019 usr/sbin/biosdevname
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 100352 Jan 15 2024 usr/sbin/blkid
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 41920 Jan 18 2023 usr/sbin/chroot
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/depmod -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 464032 Jan 15 2024 usr/sbin/dhclient
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 54096 Jan 15 2024 usr/sbin/fsck
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1968 Jun 8 2023 usr/sbin/fsck.xfs
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/sbin/halt -> ../bin/systemctl
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 22 Jan 15 2024 usr/sbin/init -> ../lib/systemd/systemd
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1163 Oct 8 2018 usr/sbin/initqueue
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/insmod -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 193 Oct 8 2018 usr/sbin/insmodpost.sh
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 709320 Sep 25 2023 usr/sbin/ip
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 199104 Jan 15 2024 usr/sbin/kexec
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 492 Oct 8 2018 usr/sbin/loginit
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 92104 Jan 15 2024 usr/sbin/losetup
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/lsmod -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/modinfo -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/modprobe -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 2677 Jan 15 2024 usr/sbin/netroot
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 12088 Jan 15 2024 usr/sbin/nologin
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 144656 Feb 25 2022 usr/sbin/plymouthd
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/sbin/poweroff -> ../bin/systemctl
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 1346 Oct 8 2018 usr/sbin/rdsosreport
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 16 Jan 15 2024 usr/sbin/reboot -> ../bin/systemctl
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 usr/sbin/rmmod -> ../bin/kmod
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 119360 Mar 7 2023 usr/sbin/rngd
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 20600 Jan 15 2024 usr/sbin/swapoff
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 14 Jan 15 2024 usr/sbin/udevadm -> ../bin/udevadm
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 778024 Jun 8 2023 usr/sbin/xfs_db
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 782 Jun 8 2023 usr/sbin/xfs_metadump
Jul 22 08:29:41 managed-node13 dracut[8701]: -rwxr-xr-x 1 root root 731720 Jun 8 2023 usr/sbin/xfs_repair
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 4 root root 0 Jan 15 2024 usr/share
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 25 Jan 15 2024 usr/share/consolefonts -> /usr/lib/kbd/consolefonts
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 25 Jan 15 2024 usr/share/consoletrans -> /usr/lib/kbd/consoletrans
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/share/keymaps -> /usr/lib/kbd/keymaps
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 3 root root 0 Jan 15 2024 usr/share/plymouth
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 134 Feb 25 2022 usr/share/plymouth/plymouthd.defaults
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 4 root root 0 Jan 15 2024 usr/share/plymouth/themes
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/share/plymouth/themes/details
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 84 May 30 2020 usr/share/plymouth/themes/details/details.plymouth
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/share/plymouth/themes/text
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 94 May 30 2020 usr/share/plymouth/themes/text/text.plymouth
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 4 root root 0 Jan 15 2024 usr/share/terminfo
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/share/terminfo/l
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1822 Aug 15 2023 usr/share/terminfo/l/linux
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 usr/share/terminfo/v
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1190 Aug 15 2023 usr/share/terminfo/v/vt100
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1184 Aug 15 2023 usr/share/terminfo/v/vt102
Jul 22 08:29:41 managed-node13 dracut[8701]: -rw-r--r-- 1 root root 1377 Aug 15 2023 usr/share/terminfo/v/vt220
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 20 Jan 15 2024 usr/share/unimaps -> /usr/lib/kbd/unimaps
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 3 root root 0 Jan 15 2024 var
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 11 Jan 15 2024 var/lock -> ../run/lock
Jul 22 08:29:41 managed-node13 dracut[8701]: lrwxrwxrwx 1 root root 6 Jan 15 2024 var/run -> ../run
Jul 22 08:29:41 managed-node13 dracut[8701]: drwxr-xr-x 2 root root 0 Jan 15 2024 var/tmp
Jul 22 08:29:41 managed-node13 dracut[8701]: ========================================================================
Jul 22 08:29:41 managed-node13 dracut[8701]: *** Creating initramfs image file '/boot/initramfs-4.18.0-553.5.1.el8.x86_64.tmp' done ***
Jul 22 08:29:44 managed-node13 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
-- Subject: Unit run-rcbd9c0f7a3c4415f98c1584aa6bd3b21.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit run-rcbd9c0f7a3c4415f98c1584aa6bd3b21.service has finished starting up.
--
-- The start-up result is done.
Jul 22 08:29:44 managed-node13 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 22 08:29:44 managed-node13 systemd[1]: Starting man-db-cache-update.service...
-- Subject: Unit man-db-cache-update.service has begun start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has begun starting up.
Jul 22 08:29:44 managed-node13 systemd[1]: Reloading.
Jul 22 08:29:45 managed-node13 sudo[8299]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:45 managed-node13 systemd[1]: man-db-cache-update.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit man-db-cache-update.service has successfully entered the 'dead' state.
Jul 22 08:29:45 managed-node13 systemd[1]: Started man-db-cache-update.service.
-- Subject: Unit man-db-cache-update.service has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit man-db-cache-update.service has finished starting up.
--
-- The start-up result is done.
Jul 22 08:29:45 managed-node13 systemd[1]: run-rcbd9c0f7a3c4415f98c1584aa6bd3b21.service: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit run-rcbd9c0f7a3c4415f98c1584aa6bd3b21.service has successfully entered the 'dead' state.
Jul 22 08:29:45 managed-node13 sudo[17860]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjbuvyiemaamhuqfabvqejolgntefnwd ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187385.3229856-12503-47620421384671/AnsiballZ_blivet.py'
Jul 22 08:29:45 managed-node13 sudo[17860]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:45 managed-node13 kernel: device-mapper: uevent: version 1.0.3
Jul 22 08:29:45 managed-node13 kernel: device-mapper: ioctl: 4.46.0-ioctl (2022-02-22) initialised: dm-devel@redhat.com
Jul 22 08:29:45 managed-node13 systemd-udevd[536]: Network interface NamePolicy= disabled on kernel command line, ignoring.
Jul 22 08:29:45 managed-node13 platform-python[17863]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={}
Jul 22 08:29:45 managed-node13 sudo[17860]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:46 managed-node13 sudo[17991]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhgulnazlhsavazvahufozibuwfreefe ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187386.3355696-12576-92934675331276/AnsiballZ_dnf.py'
Jul 22 08:29:46 managed-node13 sudo[17991]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:46 managed-node13 platform-python[17994]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:29:49 managed-node13 sudo[17991]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:49 managed-node13 sudo[18118]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcsxultamliewhmrncarprcjwhybznbt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187389.2734473-12801-240246976119524/AnsiballZ_service_facts.py'
Jul 22 08:29:49 managed-node13 sudo[18118]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:49 managed-node13 platform-python[18121]: ansible-service_facts Invoked
Jul 22 08:29:50 managed-node13 sudo[18118]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:51 managed-node13 sudo[18341]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idowmvygusywacqbcbhsqxorjdrtmjhm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187391.1946256-13186-35859909147124/AnsiballZ_blivet.py'
Jul 22 08:29:51 managed-node13 sudo[18341]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:51 managed-node13 platform-python[18344]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={}
Jul 22 08:29:51 managed-node13 sudo[18341]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:51 managed-node13 sudo[18469]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvczjelkyicgtqvvxpevgctdctliaqnj ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187391.791596-13215-187738751137933/AnsiballZ_stat.py'
Jul 22 08:29:52 managed-node13 sudo[18469]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:52 managed-node13 platform-python[18472]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:29:52 managed-node13 sudo[18469]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:53 managed-node13 sudo[18597]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkerrtnpjghblkeitdrcazzrdzpcqmd ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187393.607815-13346-225819057124392/AnsiballZ_stat.py'
Jul 22 08:29:53 managed-node13 sudo[18597]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:54 managed-node13 platform-python[18600]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:29:54 managed-node13 sudo[18597]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:54 managed-node13 sudo[18725]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftebryusdcjtmxovlrxvdevpxllxyjno ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187394.2184603-13487-215749378243055/AnsiballZ_setup.py'
Jul 22 08:29:54 managed-node13 sudo[18725]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:54 managed-node13 platform-python[18728]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 22 08:29:55 managed-node13 sudo[18725]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:55 managed-node13 sudo[18883]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdajpvjrdrdypcqhqgoiagojxenwxxnw ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187395.4582226-13651-29524365039954/AnsiballZ_dnf.py'
Jul 22 08:29:55 managed-node13 sudo[18883]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:55 managed-node13 platform-python[18886]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:29:58 managed-node13 sudo[18883]: pam_unix(sudo:session): session closed for user root
Jul 22 08:29:59 managed-node13 sudo[19010]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfihcfqqpznydukpixzdjukownppsrmn ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187398.5851023-14046-142705942879029/AnsiballZ_find_unused_disk.py'
Jul 22 08:29:59 managed-node13 sudo[19010]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:29:59 managed-node13 platform-python[19013]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=5g max_return=1 with_interface=scsi max_size=0 match_sector_size=False
Jul 22 08:29:59 managed-node13 sudo[19010]: pam_unix(sudo:session): session closed for user root
Jul 22 08:30:00 managed-node13 sudo[19138]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cumnmtcfvenbjunwlyhtpnvvkdiysjcm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187399.7067552-14138-44116524758704/AnsiballZ_command.py'
Jul 22 08:30:00 managed-node13 sudo[19138]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:30:00 managed-node13 platform-python[19141]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 22 08:30:00 managed-node13 sudo[19138]: pam_unix(sudo:session): session closed for user root
Jul 22 08:30:01 managed-node13 sshd[19164]: Accepted publickey for root from 10.31.14.27 port 48486 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:30:01 managed-node13 systemd-logind[603]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 19164.
Jul 22 08:30:01 managed-node13 sshd[19164]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:30:01 managed-node13 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:30:01 managed-node13 sshd[19167]: Received disconnect from 10.31.14.27 port 48486:11: disconnected by user
Jul 22 08:30:01 managed-node13 sshd[19167]: Disconnected from user root 10.31.14.27 port 48486
Jul 22 08:30:01 managed-node13 sshd[19164]: pam_unix(sshd:session): session closed for user root
Jul 22 08:30:01 managed-node13 systemd[1]: session-9.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-9.scope has successfully entered the 'dead' state.
Jul 22 08:30:01 managed-node13 systemd-logind[603]: Session 9 logged out. Waiting for processes to exit.
Jul 22 08:30:01 managed-node13 systemd-logind[603]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Jul 22 08:30:02 managed-node13 sshd[19188]: Accepted publickey for root from 10.31.14.27 port 48502 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:30:02 managed-node13 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:30:02 managed-node13 systemd-logind[603]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 19188.
Jul 22 08:30:02 managed-node13 sshd[19188]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:30:02 managed-node13 sshd[19191]: Received disconnect from 10.31.14.27 port 48502:11: disconnected by user
Jul 22 08:30:02 managed-node13 sshd[19191]: Disconnected from user root 10.31.14.27 port 48502
Jul 22 08:30:02 managed-node13 sshd[19188]: pam_unix(sshd:session): session closed for user root
Jul 22 08:30:02 managed-node13 systemd[1]: session-10.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-10.scope has successfully entered the 'dead' state.
Jul 22 08:30:02 managed-node13 systemd-logind[603]: Session 10 logged out. Waiting for processes to exit.
Jul 22 08:30:02 managed-node13 systemd-logind[603]: Removed session 10.
-- Subject: Session 10 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 10 has been terminated.
Jul 22 08:30:05 managed-node13 sudo[19353]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrmezzwedzgsfdiolhvnbctcrljppcoo ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187404.3768682-14707-256653514911801/AnsiballZ_setup.py' Jul 22 08:30:05 managed-node13 sudo[19353]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:06 managed-node13 platform-python[19356]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:30:06 managed-node13 sudo[19353]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:07 managed-node13 sudo[19511]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcayocoxvvzeydjzlevvozzbrpsidobb ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187407.2083507-14966-49331310925671/AnsiballZ_stat.py' Jul 22 08:30:07 managed-node13 sudo[19511]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:07 managed-node13 platform-python[19514]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:30:07 managed-node13 sudo[19511]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:09 managed-node13 sudo[19637]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfssfcftecpisgpznswnuyvyhzxogxsv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187408.547416-15086-83328120492029/AnsiballZ_dnf.py' Jul 22 08:30:09 managed-node13 sudo[19637]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:09 managed-node13 platform-python[19640]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 
'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:30:11 managed-node13 sudo[19637]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:12 managed-node13 sudo[19764]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buqrmelrlovehggmpavihzacquxqsmic ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187412.21809-15494-46820343654212/AnsiballZ_blivet.py' Jul 22 08:30:12 managed-node13 sudo[19764]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:13 managed-node13 platform-python[19767]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 
'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:30:13 managed-node13 sudo[19764]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:14 managed-node13 sudo[19892]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqfzdmvnjgjoirmsshchfvwcbdyzfjxp ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187413.9659958-15757-153361732274919/AnsiballZ_dnf.py' Jul 22 08:30:14 managed-node13 sudo[19892]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:14 managed-node13 platform-python[19895]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:30:17 managed-node13 sudo[19892]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:18 managed-node13 sudo[20019]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lybunkypologbttcdqusfwisrdypwokw ; /usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1753187417.502047-16264-180087303657703/AnsiballZ_service_facts.py' Jul 22 08:30:18 managed-node13 sudo[20019]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:18 managed-node13 platform-python[20022]: ansible-service_facts Invoked Jul 22 08:30:19 managed-node13 sudo[20019]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:21 managed-node13 sudo[20242]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuqccvacnpapktqjgglxjvaobodrunab ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187421.20751-16584-229632267526152/AnsiballZ_blivet.py' Jul 22 08:30:21 managed-node13 sudo[20242]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:22 managed-node13 platform-python[20245]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 
'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:30:22 managed-node13 sudo[20242]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:23 managed-node13 sudo[20370]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlmgoimvolmrhnsmzknedygqxrpvvjah ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187422.747427-16781-35266649351749/AnsiballZ_stat.py' Jul 22 08:30:23 managed-node13 sudo[20370]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:23 managed-node13 platform-python[20373]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:30:23 managed-node13 sudo[20370]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:26 managed-node13 sudo[20498]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyziffsxtcomlgljvpijzjoouaxyyshe ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187426.2432525-17240-127607745926315/AnsiballZ_stat.py' Jul 22 08:30:26 managed-node13 sudo[20498]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:26 managed-node13 platform-python[20501]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:30:26 managed-node13 sudo[20498]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:27 managed-node13 sudo[20626]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwbuidcjkewkikcshtbertqfdczmnabg ; /usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1753187427.1854866-17377-230497932109019/AnsiballZ_setup.py' Jul 22 08:30:27 managed-node13 sudo[20626]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:27 managed-node13 platform-python[20629]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:30:28 managed-node13 sudo[20626]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:29 managed-node13 sudo[20784]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egdrqgqtvksksoyjyyujhuaztjouoeyt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187428.8205612-17536-247430579717553/AnsiballZ_dnf.py' Jul 22 08:30:29 managed-node13 sudo[20784]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:29 managed-node13 platform-python[20787]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:30:32 managed-node13 sudo[20784]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:33 managed-node13 sudo[20911]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goembezcalfyrzpaptzfttykvdidbujs ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187432.482636-17966-243154987433844/AnsiballZ_find_unused_disk.py' Jul 22 08:30:33 managed-node13 sudo[20911]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 
08:30:33 managed-node13 platform-python[20914]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=10g max_return=1 max_size=0 match_sector_size=False with_interface=None Jul 22 08:30:33 managed-node13 sudo[20911]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:35 managed-node13 sudo[21039]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irocuziogystgsvjevovbdyfwipofagv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187434.3195336-18179-210033904615901/AnsiballZ_command.py' Jul 22 08:30:35 managed-node13 sudo[21039]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:30:36 managed-node13 platform-python[21042]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 22 08:30:36 managed-node13 sudo[21039]: pam_unix(sudo:session): session closed for user root Jul 22 08:30:37 managed-node13 sshd[21065]: Accepted publickey for root from 10.31.14.27 port 51042 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:30:37 managed-node13 systemd-logind[603]: New session 11 of user root. -- Subject: A new session 11 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 11 has been created for the user root. -- -- The leading process of the session is 21065. Jul 22 08:30:37 managed-node13 systemd[1]: Started Session 11 of user root. -- Subject: Unit session-11.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-11.scope has finished starting up. 
-- -- The start-up result is done. Jul 22 08:30:37 managed-node13 sshd[21065]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:30:37 managed-node13 sshd[21068]: Received disconnect from 10.31.14.27 port 51042:11: disconnected by user Jul 22 08:30:37 managed-node13 sshd[21068]: Disconnected from user root 10.31.14.27 port 51042 Jul 22 08:30:37 managed-node13 sshd[21065]: pam_unix(sshd:session): session closed for user root Jul 22 08:30:37 managed-node13 systemd[1]: session-11.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-11.scope has successfully entered the 'dead' state. Jul 22 08:30:37 managed-node13 systemd-logind[603]: Session 11 logged out. Waiting for processes to exit. Jul 22 08:30:37 managed-node13 systemd-logind[603]: Removed session 11. -- Subject: Session 11 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 11 has been terminated. Jul 22 08:30:38 managed-node13 sshd[21089]: Accepted publickey for root from 10.31.14.27 port 51056 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:30:38 managed-node13 systemd-logind[603]: New session 12 of user root. -- Subject: A new session 12 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 12 has been created for the user root. -- -- The leading process of the session is 21089. Jul 22 08:30:38 managed-node13 systemd[1]: Started Session 12 of user root. -- Subject: Unit session-12.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-12.scope has finished starting up. -- -- The start-up result is done. 
Jul 22 08:30:38 managed-node13 sshd[21089]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:30:38 managed-node13 sshd[21092]: Received disconnect from 10.31.14.27 port 51056:11: disconnected by user
Jul 22 08:30:38 managed-node13 sshd[21092]: Disconnected from user root 10.31.14.27 port 51056
Jul 22 08:30:38 managed-node13 sshd[21089]: pam_unix(sshd:session): session closed for user root
Jul 22 08:30:38 managed-node13 systemd[1]: session-12.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-12.scope has successfully entered the 'dead' state.
Jul 22 08:30:38 managed-node13 systemd-logind[603]: Session 12 logged out. Waiting for processes to exit.
Jul 22 08:30:38 managed-node13 systemd-logind[603]: Removed session 12.
-- Subject: Session 12 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 12 has been terminated.
Jul 22 08:30:50 managed-node13 platform-python[21255]: ansible-setup Invoked with gather_subset=['!all', '!min', 'architecture', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:30:51 managed-node13 platform-python[21383]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:30:53 managed-node13 platform-python[21506]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Jul 22 08:30:54 managed-node13 platform-python[21574]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:30:58 managed-node13 platform-python[21698]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:31:00 managed-node13 platform-python[21823]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Jul 22 08:31:00 managed-node13 platform-python[21891]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:31:04 managed-node13 platform-python[22015]: ansible-service_facts Invoked Jul 22 08:31:07 managed-node13 platform-python[22235]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None 
disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=True uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:31:08 managed-node13 platform-python[22360]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:10 managed-node13 platform-python[22485]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:11 managed-node13 platform-python[22610]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:12 managed-node13 sshd[22663]: Accepted publickey for root from 10.31.14.27 port 44192 ssh2: RSA 
SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:31:12 managed-node13 systemd-logind[603]: New session 13 of user root. -- Subject: A new session 13 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 13 has been created for the user root. -- -- The leading process of the session is 22663. Jul 22 08:31:12 managed-node13 systemd[1]: Started Session 13 of user root. -- Subject: Unit session-13.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-13.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:31:12 managed-node13 sshd[22663]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:31:12 managed-node13 sshd[22666]: Received disconnect from 10.31.14.27 port 44192:11: disconnected by user Jul 22 08:31:12 managed-node13 sshd[22666]: Disconnected from user root 10.31.14.27 port 44192 Jul 22 08:31:12 managed-node13 sshd[22663]: pam_unix(sshd:session): session closed for user root Jul 22 08:31:12 managed-node13 systemd[1]: session-13.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-13.scope has successfully entered the 'dead' state. Jul 22 08:31:12 managed-node13 systemd-logind[603]: Session 13 logged out. Waiting for processes to exit. Jul 22 08:31:12 managed-node13 systemd-logind[603]: Removed session 13. -- Subject: Session 13 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 13 has been terminated. 
Jul 22 08:31:21 managed-node13 sshd[22688]: Accepted publickey for root from 10.31.14.27 port 55372 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:31:21 managed-node13 systemd[1]: Started Session 14 of user root.
-- Subject: Unit session-14.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-14.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:31:21 managed-node13 systemd-logind[603]: New session 14 of user root.
-- Subject: A new session 14 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 14 has been created for the user root.
--
-- The leading process of the session is 22688.
Jul 22 08:31:21 managed-node13 sshd[22688]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:31:21 managed-node13 sshd[22691]: Received disconnect from 10.31.14.27 port 55372:11: disconnected by user
Jul 22 08:31:21 managed-node13 sshd[22691]: Disconnected from user root 10.31.14.27 port 55372
Jul 22 08:31:21 managed-node13 sshd[22688]: pam_unix(sshd:session): session closed for user root
Jul 22 08:31:21 managed-node13 systemd[1]: session-14.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-14.scope has successfully entered the 'dead' state.
Jul 22 08:31:21 managed-node13 systemd-logind[603]: Session 14 logged out. Waiting for processes to exit.
Jul 22 08:31:21 managed-node13 systemd-logind[603]: Removed session 14.
-- Subject: Session 14 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 14 has been terminated.
Jul 22 08:31:23 managed-node13 platform-python[22853]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:24 managed-node13 sudo[23008]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljshfjhskfatokitvkzcvwhatqlwhhm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187484.012517-25143-247960988522272/AnsiballZ_setup.py' Jul 22 08:31:24 managed-node13 sudo[23008]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:24 managed-node13 platform-python[23011]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:24 managed-node13 sudo[23008]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:26 managed-node13 sudo[23166]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txvdguoefwrngntxcnfxjesrpuelviux ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187485.6566684-25248-133860606162870/AnsiballZ_stat.py' Jul 22 08:31:26 managed-node13 sudo[23166]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:26 managed-node13 platform-python[23169]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:26 managed-node13 sudo[23166]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:27 managed-node13 sudo[23292]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzejmijsmqatcdefbfmquaksazbsbgfx ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187486.798662-25361-73741044081586/AnsiballZ_dnf.py' Jul 22 08:31:27 managed-node13 sudo[23292]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:27 managed-node13 platform-python[23295]: 
ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:31:30 managed-node13 sudo[23292]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:31 managed-node13 sudo[23419]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyrjmonhdqvltcatcdmftgnqvybumwxe ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187491.011933-25909-208204228576952/AnsiballZ_blivet.py' Jul 22 08:31:31 managed-node13 sudo[23419]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:32 managed-node13 platform-python[23422]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 
'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:31:32 managed-node13 sudo[23419]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:32 managed-node13 sudo[23547]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ummpzenmliycmrxbjuntzogozoyeboxa ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187492.6928117-26316-86916207768327/AnsiballZ_dnf.py' Jul 22 08:31:32 managed-node13 sudo[23547]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:33 managed-node13 platform-python[23550]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:31:35 managed-node13 sudo[23547]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:36 managed-node13 sudo[23674]: root : TTY=pts/0 ; 
PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmcafxqvlwlprpvsjpdrlunvsdskqfqi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187495.8359253-26761-274232880296188/AnsiballZ_service_facts.py' Jul 22 08:31:36 managed-node13 sudo[23674]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:36 managed-node13 platform-python[23677]: ansible-service_facts Invoked Jul 22 08:31:37 managed-node13 sudo[23674]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:38 managed-node13 sudo[23897]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzopuawtwzpjkpdfavtntbzjloxycfqq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187498.2394497-27118-132713064649605/AnsiballZ_blivet.py' Jul 22 08:31:38 managed-node13 sudo[23897]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:38 managed-node13 platform-python[23900]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:31:38 managed-node13 sudo[23897]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:39 managed-node13 sudo[24025]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akbuihkiemjksqivypoqjtovwotdbgaq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187499.062069-27242-271510085436812/AnsiballZ_stat.py' Jul 22 08:31:39 managed-node13 sudo[24025]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:39 managed-node13 platform-python[24028]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:39 managed-node13 sudo[24025]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:40 managed-node13 sudo[24153]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmfrdogjucpzlkjvnntnfrdcmrhrvxsa ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187500.0673966-27435-57300055977871/AnsiballZ_stat.py' Jul 22 08:31:40 managed-node13 sudo[24153]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:40 managed-node13 platform-python[24156]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:40 managed-node13 sudo[24153]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:41 managed-node13 sudo[24281]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo 
BECOME-SUCCESS-shucktzbbxagihbnnoyigyabfpmninvx ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187501.011757-27511-273312361905585/AnsiballZ_setup.py' Jul 22 08:31:41 managed-node13 sudo[24281]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:41 managed-node13 platform-python[24284]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:42 managed-node13 sudo[24281]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:43 managed-node13 sudo[24439]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilioipjcoiizzqpvpvniaunqzlgmaqin ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187502.9115794-27704-110740655467427/AnsiballZ_dnf.py' Jul 22 08:31:43 managed-node13 sudo[24439]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:43 managed-node13 platform-python[24442]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:31:45 managed-node13 sudo[24439]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:46 managed-node13 sudo[24566]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrgupztsfxihlzmqxbamupuhwjobwbp ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187505.9343972-28231-130447824279708/AnsiballZ_find_unused_disk.py' Jul 22 08:31:46 managed-node13 sudo[24566]: 
pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:46 managed-node13 platform-python[24569]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with max_return=2 with_interface=scsi min_size=0 max_size=0 match_sector_size=False Jul 22 08:31:46 managed-node13 sudo[24566]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:47 managed-node13 sudo[24694]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyrttkioycrmyqpkctubyxxycvtdurbz ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187507.2275624-28328-11837960772161/AnsiballZ_command.py' Jul 22 08:31:47 managed-node13 sudo[24694]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:48 managed-node13 platform-python[24697]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 22 08:31:48 managed-node13 sudo[24694]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:49 managed-node13 sshd[24720]: Accepted publickey for root from 10.31.14.27 port 46916 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:31:49 managed-node13 systemd-logind[603]: New session 15 of user root. -- Subject: A new session 15 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 15 has been created for the user root. -- -- The leading process of the session is 24720. Jul 22 08:31:49 managed-node13 systemd[1]: Started Session 15 of user root. 
-- Subject: Unit session-15.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-15.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:31:49 managed-node13 sshd[24720]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:31:49 managed-node13 sshd[24723]: Received disconnect from 10.31.14.27 port 46916:11: disconnected by user Jul 22 08:31:49 managed-node13 sshd[24723]: Disconnected from user root 10.31.14.27 port 46916 Jul 22 08:31:49 managed-node13 sshd[24720]: pam_unix(sshd:session): session closed for user root Jul 22 08:31:49 managed-node13 systemd[1]: session-15.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-15.scope has successfully entered the 'dead' state. Jul 22 08:31:49 managed-node13 systemd-logind[603]: Session 15 logged out. Waiting for processes to exit. Jul 22 08:31:49 managed-node13 systemd-logind[603]: Removed session 15. -- Subject: Session 15 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 15 has been terminated. Jul 22 08:31:49 managed-node13 sshd[24744]: Accepted publickey for root from 10.31.14.27 port 46924 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:31:49 managed-node13 systemd[1]: Started Session 16 of user root. -- Subject: Unit session-16.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-16.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:31:49 managed-node13 systemd-logind[603]: New session 16 of user root. 
-- Subject: A new session 16 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 16 has been created for the user root. -- -- The leading process of the session is 24744. Jul 22 08:31:49 managed-node13 sshd[24744]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:31:49 managed-node13 sshd[24747]: Received disconnect from 10.31.14.27 port 46924:11: disconnected by user Jul 22 08:31:49 managed-node13 sshd[24747]: Disconnected from user root 10.31.14.27 port 46924 Jul 22 08:31:49 managed-node13 sshd[24744]: pam_unix(sshd:session): session closed for user root Jul 22 08:31:49 managed-node13 systemd[1]: session-16.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-16.scope has successfully entered the 'dead' state. Jul 22 08:31:49 managed-node13 systemd-logind[603]: Session 16 logged out. Waiting for processes to exit. Jul 22 08:31:49 managed-node13 systemd-logind[603]: Removed session 16. -- Subject: Session 16 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 16 has been terminated. 
Jul 22 08:31:53 managed-node13 platform-python[24909]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:54 managed-node13 sudo[25064]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rukouoircdfjlodsycjuisfefowilgxl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187513.9285758-29166-214359704967667/AnsiballZ_setup.py' Jul 22 08:31:54 managed-node13 sudo[25064]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:54 managed-node13 platform-python[25067]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:31:54 managed-node13 sudo[25064]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:56 managed-node13 sudo[25222]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qenupysmzrvyxdvlinfzevhijixhetdf ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187516.2294338-29320-159446414947897/AnsiballZ_stat.py' Jul 22 08:31:56 managed-node13 sudo[25222]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:57 managed-node13 platform-python[25225]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:31:57 managed-node13 sudo[25222]: pam_unix(sudo:session): session closed for user root Jul 22 08:31:58 managed-node13 sudo[25348]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpfpckctbadhtxnvmslcrtrkcklhqfr ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187517.821753-29535-134281162240011/AnsiballZ_dnf.py' Jul 22 08:31:58 managed-node13 sudo[25348]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:31:58 managed-node13 platform-python[25351]: 
ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:01 managed-node13 sudo[25348]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:02 managed-node13 sudo[25475]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdmnhavlsznmyknfkfbuozkfblxamony ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187521.471522-30188-235684611714846/AnsiballZ_blivet.py' Jul 22 08:32:02 managed-node13 sudo[25475]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:02 managed-node13 platform-python[25478]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 
'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:32:02 managed-node13 sudo[25475]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:02 managed-node13 sudo[25603]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdxztcytjknijamyxwtyxtnivinnrlrx ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187522.8585398-30380-199778552110094/AnsiballZ_dnf.py' Jul 22 08:32:03 managed-node13 sudo[25603]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:03 managed-node13 platform-python[25606]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:05 managed-node13 sudo[25603]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:06 managed-node13 sudo[25730]: root : TTY=pts/0 ; 
PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvrkkumrpbzetcphtefxdwzymyorvpmv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187525.8076258-30675-144650422952019/AnsiballZ_service_facts.py' Jul 22 08:32:06 managed-node13 sudo[25730]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:06 managed-node13 platform-python[25733]: ansible-service_facts Invoked Jul 22 08:32:07 managed-node13 sudo[25730]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:09 managed-node13 sudo[25953]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxifwtfoqbnogibrevyiwjamvqpccwhe ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187528.6708763-31027-128048869902559/AnsiballZ_blivet.py' Jul 22 08:32:09 managed-node13 sudo[25953]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:09 managed-node13 platform-python[25956]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=True uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:32:09 managed-node13 sudo[25953]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:10 managed-node13 sudo[26081]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfrxkhlkfkjesccmhjbzrfdprectnjyr ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187529.868107-31143-256638486141115/AnsiballZ_stat.py' Jul 22 08:32:10 managed-node13 sudo[26081]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:10 managed-node13 platform-python[26084]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:32:10 managed-node13 sudo[26081]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:12 managed-node13 sudo[26209]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgcgjrnznsohlfspruqmjbrgdtjjfmbe ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187531.78667-31357-11943230107503/AnsiballZ_stat.py' Jul 22 08:32:12 managed-node13 sudo[26209]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:12 managed-node13 platform-python[26212]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:32:12 managed-node13 sudo[26209]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:12 managed-node13 sudo[26337]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo 
BECOME-SUCCESS-zbyswtjpauvhritrurjtacfdqtruzrza ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187532.5434258-31464-179110712715474/AnsiballZ_setup.py' Jul 22 08:32:12 managed-node13 sudo[26337]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:12 managed-node13 platform-python[26340]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:32:13 managed-node13 sudo[26337]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:13 managed-node13 sudo[26495]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezuyizfrrdqarywirwnubbgqatdkswgy ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187533.784869-31709-82214895630340/AnsiballZ_dnf.py' Jul 22 08:32:13 managed-node13 sudo[26495]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:14 managed-node13 platform-python[26498]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:16 managed-node13 sudo[26495]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:17 managed-node13 sudo[26622]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orhiegqoclmmezivgkobnxptutvoxfsm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187536.7427425-32100-254990396616390/AnsiballZ_find_unused_disk.py' Jul 22 08:32:17 managed-node13 sudo[26622]: 
pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:17 managed-node13 platform-python[26625]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=5g max_return=1 with_interface=scsi max_size=0 match_sector_size=False Jul 22 08:32:18 managed-node13 sudo[26622]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:19 managed-node13 sudo[26750]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeghmmckhbgoqvdtrfgqhjxlyeyqlzzh ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187538.9444413-32339-191035508161213/AnsiballZ_command.py' Jul 22 08:32:19 managed-node13 sudo[26750]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:19 managed-node13 platform-python[26753]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 22 08:32:19 managed-node13 sudo[26750]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:20 managed-node13 sshd[26776]: Accepted publickey for root from 10.31.14.27 port 51784 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:20 managed-node13 systemd-logind[603]: New session 17 of user root. -- Subject: A new session 17 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 17 has been created for the user root. -- -- The leading process of the session is 26776. Jul 22 08:32:20 managed-node13 systemd[1]: Started Session 17 of user root. 
-- Subject: Unit session-17.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-17.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:20 managed-node13 sshd[26776]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:20 managed-node13 sshd[26779]: Received disconnect from 10.31.14.27 port 51784:11: disconnected by user Jul 22 08:32:20 managed-node13 sshd[26779]: Disconnected from user root 10.31.14.27 port 51784 Jul 22 08:32:20 managed-node13 sshd[26776]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:20 managed-node13 systemd[1]: session-17.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-17.scope has successfully entered the 'dead' state. Jul 22 08:32:20 managed-node13 systemd-logind[603]: Session 17 logged out. Waiting for processes to exit. Jul 22 08:32:20 managed-node13 systemd-logind[603]: Removed session 17. -- Subject: Session 17 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 17 has been terminated. Jul 22 08:32:20 managed-node13 sshd[26800]: Accepted publickey for root from 10.31.14.27 port 51792 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:20 managed-node13 systemd[1]: Started Session 18 of user root. -- Subject: Unit session-18.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-18.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:20 managed-node13 systemd-logind[603]: New session 18 of user root. 
-- Subject: A new session 18 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 18 has been created for the user root. -- -- The leading process of the session is 26800. Jul 22 08:32:20 managed-node13 sshd[26800]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:20 managed-node13 sshd[26803]: Received disconnect from 10.31.14.27 port 51792:11: disconnected by user Jul 22 08:32:20 managed-node13 sshd[26803]: Disconnected from user root 10.31.14.27 port 51792 Jul 22 08:32:20 managed-node13 sshd[26800]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:20 managed-node13 systemd[1]: session-18.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-18.scope has successfully entered the 'dead' state. Jul 22 08:32:20 managed-node13 systemd-logind[603]: Session 18 logged out. Waiting for processes to exit. Jul 22 08:32:20 managed-node13 systemd-logind[603]: Removed session 18. -- Subject: Session 18 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 18 has been terminated. Jul 22 08:32:23 managed-node13 sshd[26824]: Accepted publickey for root from 10.31.14.27 port 51794 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:23 managed-node13 systemd[1]: Started Session 19 of user root. -- Subject: Unit session-19.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-19.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:23 managed-node13 systemd-logind[603]: New session 19 of user root. 
-- Subject: A new session 19 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 19 has been created for the user root. -- -- The leading process of the session is 26824. Jul 22 08:32:23 managed-node13 sshd[26824]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:23 managed-node13 sshd[26827]: Received disconnect from 10.31.14.27 port 51794:11: disconnected by user Jul 22 08:32:23 managed-node13 sshd[26827]: Disconnected from user root 10.31.14.27 port 51794 Jul 22 08:32:23 managed-node13 sshd[26824]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:23 managed-node13 systemd[1]: session-19.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-19.scope has successfully entered the 'dead' state. Jul 22 08:32:23 managed-node13 systemd-logind[603]: Session 19 logged out. Waiting for processes to exit. Jul 22 08:32:23 managed-node13 systemd-logind[603]: Removed session 19. -- Subject: Session 19 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 19 has been terminated. 
Jul 22 08:32:28 managed-node13 sudo[26989]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzackmwqgwtrajeicsfcpjofpajkrav ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187547.2515187-33404-78624766987723/AnsiballZ_setup.py' Jul 22 08:32:28 managed-node13 sudo[26989]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:28 managed-node13 platform-python[26992]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:32:28 managed-node13 sudo[26989]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:30 managed-node13 sudo[27147]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vycqxivmmrnbeirvwhsksrlooqcfmfww ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187549.860531-33730-235411971855754/AnsiballZ_stat.py' Jul 22 08:32:30 managed-node13 sudo[27147]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:30 managed-node13 platform-python[27150]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:32:30 managed-node13 sudo[27147]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:31 managed-node13 sudo[27273]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idptazyndpjbkedngndmxmjqcerkqnuy ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187550.8127637-33886-213062716856312/AnsiballZ_dnf.py' Jul 22 08:32:31 managed-node13 sudo[27273]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:31 managed-node13 platform-python[27276]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 
'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:34 managed-node13 sudo[27273]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:35 managed-node13 sudo[27400]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhhhbogtxuhzzivygxtyynuhxobviggt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187554.5900586-34360-270220222355455/AnsiballZ_blivet.py' Jul 22 08:32:35 managed-node13 sudo[27400]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:35 managed-node13 platform-python[27403]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 
'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:32:35 managed-node13 sudo[27400]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:36 managed-node13 sudo[27528]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-losaimxfkbgrfuzfytvilxpwpomfdnlq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187556.0911622-34607-22239793363661/AnsiballZ_dnf.py' Jul 22 08:32:36 managed-node13 sudo[27528]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:36 managed-node13 platform-python[27531]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:39 managed-node13 sudo[27528]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:39 managed-node13 sudo[27655]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smywejwsjoildlszhwwetmgtgvmlqxal ; /usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1753187559.3330934-34989-186681876224397/AnsiballZ_service_facts.py' Jul 22 08:32:39 managed-node13 sudo[27655]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:40 managed-node13 platform-python[27658]: ansible-service_facts Invoked Jul 22 08:32:41 managed-node13 sudo[27655]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:41 managed-node13 sudo[27878]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqpnpepddbesxqxoccfinwdwsepxyesf ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187561.6129732-35436-206522650101702/AnsiballZ_blivet.py' Jul 22 08:32:41 managed-node13 sudo[27878]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:42 managed-node13 platform-python[27881]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 
'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=True uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:32:42 managed-node13 sudo[27878]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:42 managed-node13 sudo[28006]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrmwqekeyyrvqnprlubryrdpdpyeajhg ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187562.4189396-35583-36474317039796/AnsiballZ_stat.py' Jul 22 08:32:42 managed-node13 sudo[28006]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:42 managed-node13 platform-python[28009]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:32:43 managed-node13 sudo[28006]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:44 managed-node13 sudo[28134]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylropldfytmzlgcrmjafnrhijarktbyy ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187564.4642582-35727-16779439354643/AnsiballZ_stat.py' Jul 22 08:32:44 managed-node13 sudo[28134]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:44 managed-node13 platform-python[28137]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:32:44 managed-node13 sudo[28134]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:45 managed-node13 sudo[28262]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-larmndxkhtlbussrjqirkrbjzsjcqqms ; /usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1753187565.2369792-35847-142815819300892/AnsiballZ_setup.py' Jul 22 08:32:45 managed-node13 sudo[28262]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:45 managed-node13 platform-python[28265]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:32:46 managed-node13 sudo[28262]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:46 managed-node13 sudo[28420]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jitpqafbfpftbcpyfbjqvwpleymlfpkv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187566.5157568-36053-75903522006307/AnsiballZ_dnf.py' Jul 22 08:32:46 managed-node13 sudo[28420]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:32:46 managed-node13 platform-python[28423]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:32:49 managed-node13 sudo[28420]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:50 managed-node13 sudo[28547]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldvpdqdjchpzvcbkzmnrbucihehskfsw ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187569.6513417-36436-246956917654427/AnsiballZ_find_unused_disk.py' Jul 22 08:32:50 managed-node13 sudo[28547]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 
08:32:50 managed-node13 platform-python[28550]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=5g max_return=2 match_sector_size=True max_size=0 with_interface=None Jul 22 08:32:50 managed-node13 sudo[28547]: pam_unix(sudo:session): session closed for user root Jul 22 08:32:51 managed-node13 sshd[28573]: Accepted publickey for root from 10.31.14.27 port 55514 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:51 managed-node13 systemd-logind[603]: New session 20 of user root. -- Subject: A new session 20 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 20 has been created for the user root. -- -- The leading process of the session is 28573. Jul 22 08:32:51 managed-node13 systemd[1]: Started Session 20 of user root. -- Subject: Unit session-20.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-20.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:51 managed-node13 sshd[28573]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:51 managed-node13 sshd[28576]: Received disconnect from 10.31.14.27 port 55514:11: disconnected by user Jul 22 08:32:51 managed-node13 sshd[28576]: Disconnected from user root 10.31.14.27 port 55514 Jul 22 08:32:51 managed-node13 sshd[28573]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:51 managed-node13 systemd[1]: session-20.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-20.scope has successfully entered the 'dead' state. Jul 22 08:32:51 managed-node13 systemd-logind[603]: Session 20 logged out. Waiting for processes to exit. Jul 22 08:32:51 managed-node13 systemd-logind[603]: Removed session 20. 
-- Subject: Session 20 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 20 has been terminated. Jul 22 08:32:51 managed-node13 sshd[28597]: Accepted publickey for root from 10.31.14.27 port 55516 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:51 managed-node13 systemd[1]: Started Session 21 of user root. -- Subject: Unit session-21.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-21.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:51 managed-node13 systemd-logind[603]: New session 21 of user root. -- Subject: A new session 21 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 21 has been created for the user root. -- -- The leading process of the session is 28597. Jul 22 08:32:51 managed-node13 sshd[28597]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:51 managed-node13 sshd[28600]: Received disconnect from 10.31.14.27 port 55516:11: disconnected by user Jul 22 08:32:51 managed-node13 sshd[28600]: Disconnected from user root 10.31.14.27 port 55516 Jul 22 08:32:51 managed-node13 sshd[28597]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:51 managed-node13 systemd[1]: session-21.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-21.scope has successfully entered the 'dead' state. Jul 22 08:32:51 managed-node13 systemd-logind[603]: Session 21 logged out. Waiting for processes to exit. Jul 22 08:32:51 managed-node13 systemd-logind[603]: Removed session 21. 
-- Subject: Session 21 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 21 has been terminated. Jul 22 08:32:57 managed-node13 sshd[28621]: Accepted publickey for root from 10.31.14.27 port 54134 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:32:57 managed-node13 systemd[1]: Started Session 22 of user root. -- Subject: Unit session-22.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-22.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:32:57 managed-node13 systemd-logind[603]: New session 22 of user root. -- Subject: A new session 22 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 22 has been created for the user root. -- -- The leading process of the session is 28621. Jul 22 08:32:57 managed-node13 sshd[28621]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:32:57 managed-node13 sshd[28624]: Received disconnect from 10.31.14.27 port 54134:11: disconnected by user Jul 22 08:32:57 managed-node13 sshd[28624]: Disconnected from user root 10.31.14.27 port 54134 Jul 22 08:32:57 managed-node13 sshd[28621]: pam_unix(sshd:session): session closed for user root Jul 22 08:32:57 managed-node13 systemd[1]: session-22.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-22.scope has successfully entered the 'dead' state. Jul 22 08:32:57 managed-node13 systemd-logind[603]: Session 22 logged out. Waiting for processes to exit. Jul 22 08:32:57 managed-node13 systemd-logind[603]: Removed session 22. 
-- Subject: Session 22 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 22 has been terminated. Jul 22 08:32:59 managed-node13 platform-python[28786]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:00 managed-node13 sudo[28941]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvrxwzmlnhdlqfyxqzhkfcvefrvqavnj ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187580.479519-37990-251176859656817/AnsiballZ_setup.py' Jul 22 08:33:00 managed-node13 sudo[28941]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:01 managed-node13 platform-python[28944]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:01 managed-node13 sudo[28941]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:02 managed-node13 sudo[29099]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zltwwyikkybhlpufwtmjphdxtuixdsmi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187581.8254185-38160-164079132376070/AnsiballZ_stat.py' Jul 22 08:33:02 managed-node13 sudo[29099]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:02 managed-node13 platform-python[29102]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:02 managed-node13 sudo[29099]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:03 managed-node13 sudo[29225]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpsefmtuzqrdyaottcwglkqoaofyvovv ; /usr/libexec/platform-python 
/root/.ansible/tmp/ansible-tmp-1753187583.322672-38265-180533253493515/AnsiballZ_dnf.py' Jul 22 08:33:03 managed-node13 sudo[29225]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:04 managed-node13 platform-python[29228]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:06 managed-node13 sudo[29225]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:07 managed-node13 sudo[29352]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgdnuwgafdwltupgjqfwwbxiaiynqbzy ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187586.8403955-38743-91643855173105/AnsiballZ_blivet.py' Jul 22 08:33:07 managed-node13 sudo[29352]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:07 managed-node13 platform-python[29355]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': 
None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:33:07 managed-node13 sudo[29352]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:08 managed-node13 sudo[29480]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpvoimsnjimlduqoatcqsfewuayxxgbv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187588.429606-38915-270168476966438/AnsiballZ_dnf.py' Jul 22 08:33:08 managed-node13 sudo[29480]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:08 managed-node13 platform-python[29483]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False 
use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:11 managed-node13 sudo[29480]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:12 managed-node13 sudo[29607]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpgeyhugbjyolzpulqjclkvbjnazzwam ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187591.4078498-39694-29053202693478/AnsiballZ_service_facts.py' Jul 22 08:33:12 managed-node13 sudo[29607]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:12 managed-node13 platform-python[29610]: ansible-service_facts Invoked Jul 22 08:33:13 managed-node13 sudo[29607]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:14 managed-node13 sudo[29830]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqhaclatwgwlfkhxzxbylixqvdxjjnye ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187593.9436913-39959-259090167469384/AnsiballZ_blivet.py' Jul 22 08:33:14 managed-node13 sudo[29830]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:14 managed-node13 platform-python[29833]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:33:14 managed-node13 sudo[29830]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:15 managed-node13 sudo[29958]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znhidzqrtlelqzyqornomrlxxnjugrga ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187594.947014-40096-120381175542856/AnsiballZ_stat.py' Jul 22 08:33:15 managed-node13 sudo[29958]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:15 managed-node13 platform-python[29961]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:15 managed-node13 sudo[29958]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:16 managed-node13 sudo[30086]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyclgezxdgoaztsrowninxtobvqrwemi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187596.2758489-40247-136868370506015/AnsiballZ_stat.py' Jul 22 08:33:16 managed-node13 sudo[30086]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:16 managed-node13 platform-python[30089]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True 
checksum_algorithm=sha1 Jul 22 08:33:16 managed-node13 sudo[30086]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:17 managed-node13 sudo[30214]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtoheohykzvkzvsqmocbglrmocghpwzl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187596.9236765-40335-66278103846438/AnsiballZ_setup.py' Jul 22 08:33:17 managed-node13 sudo[30214]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:17 managed-node13 platform-python[30217]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:17 managed-node13 sudo[30214]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:18 managed-node13 sudo[30372]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edosrkpplgvqmexntmdicpktyipymnnm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187598.120971-40499-175333921528256/AnsiballZ_dnf.py' Jul 22 08:33:18 managed-node13 sudo[30372]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:18 managed-node13 platform-python[30375]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:21 managed-node13 sudo[30372]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:21 managed-node13 sudo[30499]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 
'echo BECOME-SUCCESS-psasiupwcpruviubtzslslthlyqqxrud ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187601.288337-41001-55964910187838/AnsiballZ_find_unused_disk.py' Jul 22 08:33:21 managed-node13 sudo[30499]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:22 managed-node13 platform-python[30502]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=10g max_return=1 with_interface=scsi max_size=0 match_sector_size=False Jul 22 08:33:22 managed-node13 sudo[30499]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:22 managed-node13 sudo[30627]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swoimbqdukkegesxumombgofwlgicbim ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187602.245601-41092-246706563530912/AnsiballZ_command.py' Jul 22 08:33:22 managed-node13 sudo[30627]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:22 managed-node13 platform-python[30630]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 22 08:33:22 managed-node13 sudo[30627]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:24 managed-node13 sudo[30755]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubqbhtyowsbsyuibdzlbzizsxcrvgswt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187604.3472393-41311-105183099479403/AnsiballZ_blivet.py' Jul 22 08:33:24 managed-node13 sudo[30755]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:24 managed-node13 platform-python[30758]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None 
disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:33:25 managed-node13 sudo[30755]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:25 managed-node13 sudo[30883]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkhpnufmluqfkjtducdhxichbplmmnng ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187605.2746775-41388-249097324450156/AnsiballZ_stat.py' Jul 22 08:33:25 managed-node13 sudo[30883]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:25 managed-node13 platform-python[30886]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True 
checksum_algorithm=sha1 Jul 22 08:33:25 managed-node13 sudo[30883]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:26 managed-node13 sudo[31011]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlszeniflbcprecesxbmdkejnvjukknz ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187606.4025223-41523-145212727075576/AnsiballZ_stat.py' Jul 22 08:33:26 managed-node13 sudo[31011]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:26 managed-node13 platform-python[31014]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:26 managed-node13 sudo[31011]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:27 managed-node13 sudo[31139]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djyunhtdbnoitjopzfxpfihfopctkfia ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187607.2217891-41574-33539208306345/AnsiballZ_setup.py' Jul 22 08:33:27 managed-node13 sudo[31139]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:27 managed-node13 platform-python[31142]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:28 managed-node13 sudo[31139]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:28 managed-node13 sshd[31195]: Accepted publickey for root from 10.31.14.27 port 40858 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:33:28 managed-node13 systemd-logind[603]: New session 23 of user root. -- Subject: A new session 23 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 23 has been created for the user root. 
-- -- The leading process of the session is 31195. Jul 22 08:33:28 managed-node13 systemd[1]: Started Session 23 of user root. -- Subject: Unit session-23.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-23.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:33:28 managed-node13 sshd[31195]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:33:28 managed-node13 sshd[31198]: Received disconnect from 10.31.14.27 port 40858:11: disconnected by user Jul 22 08:33:28 managed-node13 sshd[31198]: Disconnected from user root 10.31.14.27 port 40858 Jul 22 08:33:28 managed-node13 sshd[31195]: pam_unix(sshd:session): session closed for user root Jul 22 08:33:28 managed-node13 systemd[1]: session-23.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-23.scope has successfully entered the 'dead' state. Jul 22 08:33:28 managed-node13 systemd-logind[603]: Session 23 logged out. Waiting for processes to exit. Jul 22 08:33:28 managed-node13 systemd-logind[603]: Removed session 23. -- Subject: Session 23 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 23 has been terminated. Jul 22 08:33:29 managed-node13 sshd[31219]: Accepted publickey for root from 10.31.14.27 port 40874 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:33:29 managed-node13 systemd[1]: Started Session 24 of user root. -- Subject: Unit session-24.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-24.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:33:29 managed-node13 systemd-logind[603]: New session 24 of user root. 
-- Subject: A new session 24 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 24 has been created for the user root. -- -- The leading process of the session is 31219. Jul 22 08:33:29 managed-node13 sshd[31219]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:33:29 managed-node13 sshd[31222]: Received disconnect from 10.31.14.27 port 40874:11: disconnected by user Jul 22 08:33:29 managed-node13 sshd[31222]: Disconnected from user root 10.31.14.27 port 40874 Jul 22 08:33:29 managed-node13 sshd[31219]: pam_unix(sshd:session): session closed for user root Jul 22 08:33:29 managed-node13 systemd[1]: session-24.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-24.scope has successfully entered the 'dead' state. Jul 22 08:33:29 managed-node13 systemd-logind[603]: Session 24 logged out. Waiting for processes to exit. Jul 22 08:33:29 managed-node13 systemd-logind[603]: Removed session 24. -- Subject: Session 24 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 24 has been terminated. 
Jul 22 08:33:32 managed-node13 platform-python[31384]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:32 managed-node13 sudo[31539]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qplwhjkteovagyxblzjxvcfgufqjrdnl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187612.7380188-42413-108005077009526/AnsiballZ_setup.py' Jul 22 08:33:32 managed-node13 sudo[31539]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:33 managed-node13 platform-python[31542]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:33 managed-node13 sudo[31539]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:34 managed-node13 sudo[31697]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fntbiapyylyfbwrymfwalkmdxrvfylbq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187613.9986644-42470-16344947869516/AnsiballZ_stat.py' Jul 22 08:33:34 managed-node13 sudo[31697]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:34 managed-node13 platform-python[31700]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:34 managed-node13 sudo[31697]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:36 managed-node13 sudo[31823]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-firekppluwpfobmltfubtynrtcapwzpq ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187615.211526-42578-262162539860121/AnsiballZ_dnf.py' Jul 22 08:33:36 managed-node13 sudo[31823]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:36 managed-node13 platform-python[31826]: 
ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:38 managed-node13 sudo[31823]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:39 managed-node13 sudo[31950]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwxkvjphtcfcptnpdsnlbkzovavbzjvl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187619.2496326-43186-220283035709263/AnsiballZ_blivet.py' Jul 22 08:33:39 managed-node13 sudo[31950]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:39 managed-node13 platform-python[31953]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 
'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={} Jul 22 08:33:39 managed-node13 sudo[31950]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:41 managed-node13 sudo[32078]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alyvfjkkndnwfcgegsiysotldppzwzvn ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187620.6418667-43330-277892729594696/AnsiballZ_dnf.py' Jul 22 08:33:41 managed-node13 sudo[32078]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:41 managed-node13 platform-python[32081]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:43 managed-node13 sudo[32078]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:44 managed-node13 sudo[32205]: root : TTY=pts/0 ; 
PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khjvtegqsrivvxsffxzbznubeazkluyi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187623.8506474-43767-129087718226580/AnsiballZ_service_facts.py' Jul 22 08:33:44 managed-node13 sudo[32205]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:44 managed-node13 platform-python[32208]: ansible-service_facts Invoked Jul 22 08:33:45 managed-node13 sudo[32205]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:46 managed-node13 sudo[32428]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pefwxctbyfkcdlcequnltigmxnnzrkit ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187626.3420506-44179-280178474917143/AnsiballZ_blivet.py' Jul 22 08:33:46 managed-node13 sudo[32428]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:46 managed-node13 platform-python[32431]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={} Jul 22 08:33:46 managed-node13 sudo[32428]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:47 managed-node13 sudo[32556]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqghctxohkqxykzqnykvbenpjddctisa ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187627.0318687-44206-204818748341942/AnsiballZ_stat.py' Jul 22 08:33:47 managed-node13 sudo[32556]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:47 managed-node13 platform-python[32559]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:47 managed-node13 sudo[32556]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:48 managed-node13 sudo[32684]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lulvysznahaieakavnnerwmzcrtffozx ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187628.2646399-44315-155457919486634/AnsiballZ_stat.py' Jul 22 08:33:48 managed-node13 sudo[32684]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:48 managed-node13 platform-python[32687]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:33:48 managed-node13 sudo[32684]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:49 managed-node13 sudo[32812]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo 
BECOME-SUCCESS-yebokfibhoznlxhcqctozefeawwzvlbn ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187629.1066737-44382-163850161619512/AnsiballZ_setup.py' Jul 22 08:33:49 managed-node13 sudo[32812]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:49 managed-node13 platform-python[32815]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:33:50 managed-node13 sudo[32812]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:51 managed-node13 sudo[32970]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znywskiivjukhxfjnwfvfnttqzqnfnzu ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187630.8098383-44523-212220283275139/AnsiballZ_dnf.py' Jul 22 08:33:51 managed-node13 sudo[32970]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:51 managed-node13 platform-python[32973]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Jul 22 08:33:53 managed-node13 sudo[32970]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:54 managed-node13 sudo[33097]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seiklaprtpwllzvmmgggdquwsstyyfot ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187633.9206483-45137-149998531319315/AnsiballZ_find_unused_disk.py' Jul 22 08:33:54 managed-node13 sudo[33097]: 
pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:54 managed-node13 platform-python[33100]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with max_return=3 with_interface=scsi min_size=0 max_size=0 match_sector_size=False Jul 22 08:33:54 managed-node13 sudo[33097]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:55 managed-node13 sudo[33225]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pygjekkxvfctsxdceghbrdprqnmbjcvv ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187634.6079128-45173-174171663065572/AnsiballZ_command.py' Jul 22 08:33:55 managed-node13 sudo[33225]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:33:55 managed-node13 platform-python[33228]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Jul 22 08:33:55 managed-node13 sudo[33225]: pam_unix(sudo:session): session closed for user root Jul 22 08:33:56 managed-node13 sshd[33251]: Accepted publickey for root from 10.31.14.27 port 48444 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:33:56 managed-node13 systemd-logind[603]: New session 25 of user root. -- Subject: A new session 25 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 25 has been created for the user root. -- -- The leading process of the session is 33251. Jul 22 08:33:56 managed-node13 systemd[1]: Started Session 25 of user root. 
-- Subject: Unit session-25.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-25.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:33:56 managed-node13 sshd[33251]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:33:56 managed-node13 sshd[33254]: Received disconnect from 10.31.14.27 port 48444:11: disconnected by user Jul 22 08:33:56 managed-node13 sshd[33254]: Disconnected from user root 10.31.14.27 port 48444 Jul 22 08:33:56 managed-node13 sshd[33251]: pam_unix(sshd:session): session closed for user root Jul 22 08:33:56 managed-node13 systemd[1]: session-25.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-25.scope has successfully entered the 'dead' state. Jul 22 08:33:56 managed-node13 systemd-logind[603]: Session 25 logged out. Waiting for processes to exit. Jul 22 08:33:56 managed-node13 systemd-logind[603]: Removed session 25. -- Subject: Session 25 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 25 has been terminated. Jul 22 08:33:56 managed-node13 sshd[33275]: Accepted publickey for root from 10.31.14.27 port 48458 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Jul 22 08:33:56 managed-node13 systemd[1]: Started Session 26 of user root. -- Subject: Unit session-26.scope has finished start-up -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- Unit session-26.scope has finished starting up. -- -- The start-up result is done. Jul 22 08:33:56 managed-node13 systemd-logind[603]: New session 26 of user root. 
-- Subject: A new session 26 has been created for user root -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A new session with the ID 26 has been created for the user root. -- -- The leading process of the session is 33275. Jul 22 08:33:56 managed-node13 sshd[33275]: pam_unix(sshd:session): session opened for user root by (uid=0) Jul 22 08:33:56 managed-node13 sshd[33278]: Received disconnect from 10.31.14.27 port 48458:11: disconnected by user Jul 22 08:33:56 managed-node13 sshd[33278]: Disconnected from user root 10.31.14.27 port 48458 Jul 22 08:33:56 managed-node13 sshd[33275]: pam_unix(sshd:session): session closed for user root Jul 22 08:33:56 managed-node13 systemd[1]: session-26.scope: Succeeded. -- Subject: Unit succeeded -- Defined-By: systemd -- Support: https://access.redhat.com/support -- -- The unit session-26.scope has successfully entered the 'dead' state. Jul 22 08:33:56 managed-node13 systemd-logind[603]: Session 26 logged out. Waiting for processes to exit. Jul 22 08:33:56 managed-node13 systemd-logind[603]: Removed session 26. -- Subject: Session 26 has been terminated -- Defined-By: systemd -- Support: https://access.redhat.com/support -- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat -- -- A session with the ID 26 has been terminated. 
Jul 22 08:33:58 managed-node13 platform-python[33440]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:34:00 managed-node13 sudo[33595]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhecjiwpkgvknvimvfcluhgcrgpglake ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187639.6674237-45935-141504525886221/AnsiballZ_setup.py' Jul 22 08:34:00 managed-node13 sudo[33595]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:34:00 managed-node13 platform-python[33598]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Jul 22 08:34:00 managed-node13 sudo[33595]: pam_unix(sudo:session): session closed for user root Jul 22 08:34:01 managed-node13 sudo[33753]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmurnkehskxbkxgqnwssoeljvqddhbjk ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187641.3781402-46070-78933310074097/AnsiballZ_stat.py' Jul 22 08:34:01 managed-node13 sudo[33753]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:34:02 managed-node13 platform-python[33756]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Jul 22 08:34:02 managed-node13 sudo[33753]: pam_unix(sudo:session): session closed for user root Jul 22 08:34:03 managed-node13 sudo[33879]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjhkegayhhwahbievbvvhhucsersiugl ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187642.7525434-46191-189262242890111/AnsiballZ_dnf.py' Jul 22 08:34:03 managed-node13 sudo[33879]: pam_unix(sudo:session): session opened for user root by root(uid=0) Jul 22 08:34:03 managed-node13 platform-python[33882]: 
ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:06 managed-node13 sudo[33879]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:07 managed-node13 sudo[34006]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyullbplrfvywdyngrrjivphvglzhjr ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187646.609046-46818-6767529426082/AnsiballZ_blivet.py'
Jul 22 08:34:07 managed-node13 sudo[34006]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:07 managed-node13 platform-python[34009]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={}
Jul 22 08:34:07 managed-node13 sudo[34006]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:08 managed-node13 sudo[34134]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwneuljdgxqkbslnowzzfvvfopsdbjun ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187648.278414-46969-220581657988435/AnsiballZ_dnf.py'
Jul 22 08:34:08 managed-node13 sudo[34134]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:08 managed-node13 platform-python[34137]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:11 managed-node13 sudo[34134]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:12 managed-node13 sudo[34261]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ielzavyzqptiognslzakcdwatpwcffwi ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187651.516518-47377-17963786994972/AnsiballZ_service_facts.py'
Jul 22 08:34:12 managed-node13 sudo[34261]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:12 managed-node13 platform-python[34264]: ansible-service_facts Invoked
Jul 22 08:34:13 managed-node13 sudo[34261]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:13 managed-node13 sudo[34484]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzasdhthpbrkziiwsefcpdfrorqttjpt ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187653.7645433-47787-166869301835858/AnsiballZ_blivet.py'
Jul 22 08:34:13 managed-node13 sudo[34484]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:14 managed-node13 platform-python[34487]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=True disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={}
Jul 22 08:34:14 managed-node13 sudo[34484]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:14 managed-node13 sudo[34612]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gieikxnzkeuyhxpxwojpyssgscpyfeil ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187654.4574425-47823-197092900642978/AnsiballZ_stat.py'
Jul 22 08:34:14 managed-node13 sudo[34612]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:14 managed-node13 platform-python[34615]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:34:14 managed-node13 sudo[34612]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:16 managed-node13 sudo[34740]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lymnikkbruomfdodjwnmqdecxphahagg ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187656.489363-47997-36511447840656/AnsiballZ_stat.py'
Jul 22 08:34:16 managed-node13 sudo[34740]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:16 managed-node13 platform-python[34743]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:34:16 managed-node13 sudo[34740]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:17 managed-node13 sudo[34868]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiowqetkxjaiuwrguxietjaylswwgkou ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187656.959401-48119-149700814480606/AnsiballZ_setup.py'
Jul 22 08:34:17 managed-node13 sudo[34868]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:17 managed-node13 platform-python[34871]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 22 08:34:17 managed-node13 sudo[34868]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:18 managed-node13 sudo[35026]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-simzaipwwogxkxtnizopbqbgtliqyvol ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187658.3260727-48272-33009636018605/AnsiballZ_dnf.py'
Jul 22 08:34:18 managed-node13 sudo[35026]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:18 managed-node13 platform-python[35029]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:21 managed-node13 sudo[35026]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:22 managed-node13 sudo[35153]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeeitblfvfcodtnmabowfandskvxblcd ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187661.38523-48804-142672544963265/AnsiballZ_find_unused_disk.py'
Jul 22 08:34:22 managed-node13 sudo[35153]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:22 managed-node13 platform-python[35156]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with max_return=1 with_interface=scsi min_size=0 max_size=0 match_sector_size=False
Jul 22 08:34:22 managed-node13 sudo[35153]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:23 managed-node13 sudo[35281]: root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjxnhlguecpcymiybnljrtvxrqarmrbm ; /usr/libexec/platform-python /root/.ansible/tmp/ansible-tmp-1753187662.4649854-48860-28804098602969/AnsiballZ_command.py'
Jul 22 08:34:23 managed-node13 sudo[35281]: pam_unix(sudo:session): session opened for user root by root(uid=0)
Jul 22 08:34:23 managed-node13 platform-python[35284]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jul 22 08:34:23 managed-node13 sudo[35281]: pam_unix(sudo:session): session closed for user root
Jul 22 08:34:24 managed-node13 sshd[35307]: Accepted publickey for root from 10.31.14.27 port 38768 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:34:24 managed-node13 systemd-logind[603]: New session 27 of user root.
-- Subject: A new session 27 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 27 has been created for the user root.
--
-- The leading process of the session is 35307.
Jul 22 08:34:24 managed-node13 systemd[1]: Started Session 27 of user root.
-- Subject: Unit session-27.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-27.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:34:24 managed-node13 sshd[35307]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:34:24 managed-node13 sshd[35310]: Received disconnect from 10.31.14.27 port 38768:11: disconnected by user
Jul 22 08:34:24 managed-node13 sshd[35310]: Disconnected from user root 10.31.14.27 port 38768
Jul 22 08:34:24 managed-node13 sshd[35307]: pam_unix(sshd:session): session closed for user root
Jul 22 08:34:24 managed-node13 systemd[1]: session-27.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-27.scope has successfully entered the 'dead' state.
Jul 22 08:34:24 managed-node13 systemd-logind[603]: Session 27 logged out. Waiting for processes to exit.
Jul 22 08:34:24 managed-node13 systemd-logind[603]: Removed session 27.
-- Subject: Session 27 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 27 has been terminated.
Jul 22 08:34:24 managed-node13 sshd[35331]: Accepted publickey for root from 10.31.14.27 port 38778 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:34:24 managed-node13 systemd[1]: Started Session 28 of user root.
-- Subject: Unit session-28.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-28.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:34:24 managed-node13 systemd-logind[603]: New session 28 of user root.
-- Subject: A new session 28 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 28 has been created for the user root.
--
-- The leading process of the session is 35331.
Jul 22 08:34:24 managed-node13 sshd[35331]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:34:24 managed-node13 sshd[35334]: Received disconnect from 10.31.14.27 port 38778:11: disconnected by user
Jul 22 08:34:24 managed-node13 sshd[35334]: Disconnected from user root 10.31.14.27 port 38778
Jul 22 08:34:24 managed-node13 sshd[35331]: pam_unix(sshd:session): session closed for user root
Jul 22 08:34:24 managed-node13 systemd[1]: session-28.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-28.scope has successfully entered the 'dead' state.
Jul 22 08:34:24 managed-node13 systemd-logind[603]: Session 28 logged out. Waiting for processes to exit.
Jul 22 08:34:24 managed-node13 systemd-logind[603]: Removed session 28.
-- Subject: Session 28 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 28 has been terminated.
Jul 22 08:34:27 managed-node13 platform-python[35496]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 22 08:34:29 managed-node13 platform-python[35651]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:34:30 managed-node13 platform-python[35774]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:34 managed-node13 platform-python[35898]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} packages_only=True uses_kmod_kvdo=True safe_mode=True diskvolume_mkfs_option_map={}
Jul 22 08:34:35 managed-node13 platform-python[36023]: ansible-ansible.legacy.dnf Invoked with name=['kpartx'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:38 managed-node13 platform-python[36147]: ansible-service_facts Invoked
Jul 22 08:34:40 managed-node13 platform-python[36367]: ansible-fedora.linux_system_roles.blivet Invoked with pools=[] volumes=[] use_partitions=None disklabel_type=None pool_defaults={'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False} volume_defaults={'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []} safe_mode=False uses_kmod_kvdo=True packages_only=False diskvolume_mkfs_option_map={}
Jul 22 08:34:41 managed-node13 platform-python[36492]: ansible-stat Invoked with path=/etc/fstab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:34:42 managed-node13 platform-python[36617]: ansible-stat Invoked with path=/etc/crypttab follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jul 22 08:34:43 managed-node13 platform-python[36742]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jul 22 08:34:44 managed-node13 platform-python[36897]: ansible-ansible.legacy.dnf Invoked with name=['util-linux'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jul 22 08:34:47 managed-node13 platform-python[37021]: ansible-fedora.linux_system_roles.find_unused_disk Invoked with min_size=5g max_size=0 max_return=1 match_sector_size=False with_interface=None
Jul 22 08:34:47 managed-node13 platform-python[37146]: ansible-ansible.legacy.command Invoked with _raw_params=set -x exec 1>&2 lsblk -p --pairs --bytes -o NAME,TYPE,SIZE,FSTYPE,LOG-SEC journalctl -ex _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Tuesday 22 July 2025 08:34:47 -0400 (0:00:00.494) 0:00:21.389 **********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "'Unable to find unused disk' not in unused_disks_return.disks",
    "skip_reason": "Conditional result was False"
}

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Tuesday 22 July 2025 08:34:47 -0400 (0:00:00.043) 0:00:21.432 **********
fatal: [managed-node13]: FAILED! => {
    "changed": false
}

MSG:

Unable to find enough unused disks. Exiting playbook.

PLAY RECAP *********************************************************************
managed-node13             : ok=28   changed=0    unreachable=0    failed=1    skipped=15   rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  {
    "ansible_version": "2.16.14",
    "end_time": "2025-07-22T12:34:47.820684+00:00Z",
    "host": "managed-node13",
    "message": "Unable to find enough unused disks. Exiting playbook.",
    "start_time": "2025-07-22T12:34:47.729777+00:00Z",
    "task_name": "Exit playbook when there's not enough unused disks in the system",
    "task_path": "/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34"
  }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Tuesday 22 July 2025 08:34:47 -0400 (0:00:00.103) 0:00:21.535 **********
===============================================================================
fedora.linux_system_roles.storage : Make sure blivet is available ------- 3.33s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Make sure required packages are installed --- 3.15s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Ensure test packages ---------------------------------------------------- 2.83s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
fedora.linux_system_roles.storage : Get service facts ------------------- 1.89s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Gathering Facts --------------------------------------------------------- 1.63s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:2
fedora.linux_system_roles.storage : Check if system is ostree ----------- 1.21s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
fedora.linux_system_roles.storage : Get required packages --------------- 0.91s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 0.86s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Update facts ------------------------ 0.82s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
fedora.linux_system_roles.storage : Check if /etc/fstab is present ------ 0.77s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Find unused disks in the system ----------------------------------------- 0.55s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Debug why there are no unused disks ------------------------------------- 0.49s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file --- 0.41s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
fedora.linux_system_roles.storage : Set storage_cryptsetup_services ----- 0.23s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.21s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Include role to ensure packages are installed --------------------------- 0.17s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/tests/storage/tests_swap.yml:10
fedora.linux_system_roles.storage : Enable COPRs ------------------------ 0.15s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
fedora.linux_system_roles.storage : Enable copr repositories if needed --- 0.13s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
fedora.linux_system_roles.storage : Check if the COPR support packages should be installed --- 0.12s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
fedora.linux_system_roles.storage : Workaround for udev issue on some platforms --- 0.12s
/tmp/collections-7QI/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85

-- Logs begin at Tue 2025-07-22 08:25:04 EDT, end at Tue 2025-07-22 08:34:48 EDT. --
Jul 22 08:34:48 managed-node13 sshd[37169]: Accepted publickey for root from 10.31.14.27 port 44880 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:34:48 managed-node13 systemd[1]: Started Session 29 of user root.
-- Subject: Unit session-29.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-29.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:34:48 managed-node13 systemd-logind[603]: New session 29 of user root.
-- Subject: A new session 29 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 29 has been created for the user root.
--
-- The leading process of the session is 37169.
Jul 22 08:34:48 managed-node13 sshd[37169]: pam_unix(sshd:session): session opened for user root by (uid=0)
Jul 22 08:34:48 managed-node13 sshd[37172]: Received disconnect from 10.31.14.27 port 44880:11: disconnected by user
Jul 22 08:34:48 managed-node13 sshd[37172]: Disconnected from user root 10.31.14.27 port 44880
Jul 22 08:34:48 managed-node13 sshd[37169]: pam_unix(sshd:session): session closed for user root
Jul 22 08:34:48 managed-node13 systemd[1]: session-29.scope: Succeeded.
-- Subject: Unit succeeded
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- The unit session-29.scope has successfully entered the 'dead' state.
Jul 22 08:34:48 managed-node13 systemd-logind[603]: Session 29 logged out. Waiting for processes to exit.
Jul 22 08:34:48 managed-node13 systemd-logind[603]: Removed session 29.
-- Subject: Session 29 has been terminated
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 29 has been terminated.
Jul 22 08:34:48 managed-node13 sshd[37193]: Accepted publickey for root from 10.31.14.27 port 44888 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jul 22 08:34:48 managed-node13 systemd[1]: Started Session 30 of user root.
-- Subject: Unit session-30.scope has finished start-up
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
--
-- Unit session-30.scope has finished starting up.
--
-- The start-up result is done.
Jul 22 08:34:48 managed-node13 systemd-logind[603]: New session 30 of user root.
-- Subject: A new session 30 has been created for user root
-- Defined-By: systemd
-- Support: https://access.redhat.com/support
-- Documentation: https://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 30 has been created for the user root.
--
-- The leading process of the session is 37193.
Jul 22 08:34:48 managed-node13 sshd[37193]: pam_unix(sshd:session): session opened for user root by (uid=0)