ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
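Note: the twelve "statically imported" messages above are what ansible-playbook prints once per parse-time import site. A minimal sketch of the pattern inside the test files that produces them, assuming only the helper file names visible in the paths above (the task names here are hypothetical):

    # Sketch only: import_tasks is resolved statically at parse time, so
    # ansible-playbook reports "statically imported" for each use site.
    - name: Create a test file before the storage operation (hypothetical name)
      import_tasks: create-test-file.yml

    - name: Verify the test file survived the storage operation (hypothetical name)
      import_tasks: verify-data-preservation.yml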
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Sunday 15 June 2025 07:49:50 -0400 (0:00:00.219) 0:00:00.219 ***********
ok: [managed-node14]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Sunday 15 June 2025 07:49:54 -0400 (0:00:04.182) 0:00:04.401 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Sunday 15 June 2025 07:49:54 -0400 (0:00:00.213) 0:00:04.615 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Sunday 15 June 2025 07:49:54 -0400 (0:00:00.151) 0:00:04.766 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Sunday 15 June 2025 07:49:55 -0400 (0:00:00.255) 0:00:05.021 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Sunday 15 June 2025 07:49:55 -0400 (0:00:00.269) 0:00:05.291 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Sunday 15 June 2025 07:49:55 -0400 (0:00:00.369) 0:00:05.660 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Sunday 15 June 2025 07:49:56 -0400 (0:00:00.344) 0:00:06.005 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Sunday 15 June 2025 07:49:56 -0400 (0:00:00.441) 0:00:06.446 ***********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:49:56 -0400 (0:00:00.374) 0:00:06.821 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:49:57 -0400 (0:00:00.309) 0:00:07.131 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:49:57 -0400 (0:00:00.336) 0:00:07.468 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:49:57 -0400 (0:00:00.450) 0:00:07.918 *********** ok: [managed-node14] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:50:00 -0400 (0:00:02.960) 0:00:10.878 *********** ok: [managed-node14] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:50:01 -0400 (0:00:00.336) 0:00:11.215 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of 
volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:50:01 -0400 (0:00:00.099) 0:00:11.314 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:50:01 -0400 (0:00:00.177) 0:00:11.492 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:50:02 -0400 (0:00:00.590) 0:00:12.082 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:50:06 -0400 (0:00:04.521) 0:00:16.604 *********** ok: [managed-node14] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:50:07 -0400 (0:00:00.390) 0:00:16.994 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:50:07 -0400 (0:00:00.394) 0:00:17.389 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:50:11 -0400 (0:00:03.589) 0:00:20.979 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:50:11 -0400 (0:00:00.767) 0:00:21.746 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:50:11 -0400 (0:00:00.128) 0:00:21.875 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:50:12 -0400 (0:00:00.209) 0:00:22.085 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:50:12 -0400 (0:00:00.122) 0:00:22.207 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:50:16 -0400 (0:00:04.613) 0:00:26.820 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": 
"systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:50:20 -0400 (0:00:04.017) 0:00:30.837 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:50:21 -0400 (0:00:00.436) 0:00:31.274 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:50:21 -0400 (0:00:00.103) 0:00:31.378 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 07:50:23 -0400 (0:00:01.796) 0:00:33.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 07:50:23 -0400 (0:00:00.234) 0:00:33.409 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749987916.8895779, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1749987915.1485813, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": 
true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749987915.1485813, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 07:50:25 -0400 (0:00:01.648) 0:00:35.058 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:50:25 -0400 (0:00:00.249) 0:00:35.307 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 07:50:25 -0400 (0:00:00.215) 0:00:35.523 *********** ok: [managed-node14] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 07:50:25 -0400 (0:00:00.318) 0:00:35.841 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 07:50:26 -0400 (0:00:00.288) 0:00:36.130 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 07:50:26 -0400 (0:00:00.114) 0:00:36.244 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 07:50:26 -0400 (0:00:00.111) 0:00:36.356 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 07:50:26 -0400 (0:00:00.296) 0:00:36.653 *********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 07:50:26 -0400 (0:00:00.217) 0:00:36.870 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 07:50:27 -0400 (0:00:00.192) 0:00:37.062 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 07:50:27 -0400 (0:00:00.278) 0:00:37.340 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749987023.4667485, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 07:50:28 -0400 (0:00:01.346) 0:00:38.687 *********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 07:50:28 -0400 (0:00:00.154) 0:00:38.842 *********** ok: [managed-node14] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:76 Sunday 15 June 2025 07:50:30 -0400 (0:00:01.692) 0:00:40.534 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node14 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Sunday 15 June 2025 07:50:30 -0400 (0:00:00.335) 0:00:40.870 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Sunday 15 June 2025 07:50:35 -0400 (0:00:04.408) 0:00:45.279 *********** ok: [managed-node14] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" 
FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Sunday 15 June 2025 07:50:37 -0400 (0:00:01.927) 0:00:47.207 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Sunday 15 June 2025 07:50:37 -0400 (0:00:00.190) 0:00:47.397 *********** ok: [managed-node14] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Sunday 15 June 2025 07:50:37 -0400 (0:00:00.219) 0:00:47.617 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Sunday 15 June 2025 07:50:37 -0400 (0:00:00.308) 0:00:47.926 *********** ok: [managed-node14] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85 Sunday 15 June 2025 07:50:38 -0400 (0:00:00.284) 0:00:48.210 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 07:50:38 -0400 (0:00:00.398) 0:00:48.608 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 07:50:38 -0400 (0:00:00.296) 0:00:48.905 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:50:39 -0400 (0:00:00.380) 0:00:49.285 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:50:39 -0400 (0:00:00.410) 0:00:49.696 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:50:40 -0400 (0:00:00.310) 0:00:50.006 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:50:40 -0400 (0:00:00.876) 0:00:50.883 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:50:41 -0400 (0:00:00.326) 0:00:51.209 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:50:41 -0400 (0:00:00.261) 0:00:51.470 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage 
: Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 15 June 2025 07:50:41 -0400 (0:00:00.224) 0:00:51.695 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 15 June 2025 07:50:42 -0400 (0:00:00.275) 0:00:51.970 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 15 June 2025 07:50:42 -0400 (0:00:00.489) 0:00:52.460 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 15 June 2025 07:50:46 -0400 (0:00:04.325) 0:00:56.786 ***********
ok: [managed-node14] => { "storage_pools": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 15 June 2025 07:50:47 -0400 (0:00:00.216) 0:00:57.003 ***********
ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 15 June 2025 07:50:47 -0400 (0:00:00.150) 0:00:57.153 ***********
ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 15 June 2025 07:50:52 -0400 (0:00:05.377) 0:01:02.531 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 15 June 2025 07:50:52 -0400 (0:00:00.434) 0:01:02.965 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Sunday 15 June 2025 07:50:53 -0400 (0:00:00.258) 0:01:03.223 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
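Note: the storage_volumes value shown in "Show storage_volumes" above fully specifies the volume under test. A minimal sketch of a play that would pass those values to the role, reconstructed from the logged variables (storage_safe_mode_global, storage_volumes) and not taken verbatim from tests_luks2.yml:

    # Reconstructed sketch: request a LUKS2-encrypted disk volume on sda,
    # mounted at /opt/test1, with safe mode on. No encryption password or
    # key file is supplied, so the role is expected to fail here -- that is
    # exactly what the "new encrypted volume w/ no key" test case verifies.
    - hosts: managed-node14
      vars:
        storage_safe_mode: true
        storage_volumes:
          - name: foo
            type: disk
            disks: ["sda"]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
      roles:
        - fedora.linux_system_roles.storage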
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:50:53 -0400 (0:00:00.350) 0:01:03.574 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:50:53 -0400 (0:00:00.148) 0:01:03.723 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:50:58 -0400 (0:00:04.667) 0:01:08.391 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": 
"grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": 
"system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:51:01 -0400 (0:00:03.055) 0:01:11.447 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:51:01 -0400 (0:00:00.443) 0:01:11.890 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:51:02 -0400 (0:00:00.217) 0:01:12.107 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 07:51:07 -0400 (0:00:05.358) 0:01:17.465 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:51:07 -0400 (0:00:00.215) 0:01:17.681 *********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 07:51:07 -0400 (0:00:00.216) 0:01:17.897 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 07:51:08 -0400 (0:00:00.228) 0:01:18.126 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify 
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 07:51:08 -0400 (0:00:00.341) 0:01:18.468 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:101 Sunday 15 June 2025 07:51:08 -0400 (0:00:00.240) 0:01:18.708 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:51:09 -0400 (0:00:00.901) 0:01:19.610 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:51:09 -0400 (0:00:00.261) 0:01:19.872 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:51:10 -0400 (0:00:00.245) 0:01:20.117 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:51:10 -0400 (0:00:00.380) 0:01:20.498 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
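The duplicated ok result for item=CentOS_8.yml above is expected: the vars lookup loops over candidate file names built from the distribution version and major version, and on this host both candidates resolve to the same vars/CentOS_8.yml (the exact loop expression is not visible in this log). Judging from the echoed blivet_package_list, that vars file presumably contains:

    # vars/CentOS_8.yml (reconstructed from the output above)
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      # s390x needs the architecture-specific libblockdev build
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"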
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:51:10 -0400 (0:00:00.176) 0:01:20.675 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:51:11 -0400 (0:00:00.349) 0:01:21.024 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:51:11 -0400 (0:00:00.158) 0:01:21.183 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:51:11 -0400 (0:00:00.203) 0:01:21.386 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:51:11 -0400 (0:00:00.439) 0:01:21.825 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:51:16 -0400 (0:00:04.490) 0:01:26.316 *********** ok: [managed-node14] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:51:16 -0400 (0:00:00.264) 0:01:26.580 *********** ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:51:16 -0400 (0:00:00.359) 0:01:26.940 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:51:22 -0400 (0:00:05.261) 0:01:32.202
*********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:51:22 -0400 (0:00:00.265) 0:01:32.467 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:51:22 -0400 (0:00:00.110) 0:01:32.578 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:51:22 -0400 (0:00:00.178) 0:01:32.756 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:51:22 -0400 (0:00:00.081) 0:01:32.837 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:51:27 -0400 (0:00:05.116) 0:01:37.954 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": 
"sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:51:30 -0400 (0:00:02.883) 0:01:40.837 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:51:31 -0400 (0:00:00.404) 0:01:41.242 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:51:31 -0400 (0:00:00.187) 0:01:41.430 *********** changed: [managed-node14] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 07:51:45 -0400 (0:00:13.538) 0:01:54.969 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 07:51:45 -0400 (0:00:00.203) 0:01:55.173 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749987916.8895779, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1749987915.1485813, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749987915.1485813, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 07:51:46 -0400 (0:00:01.649) 0:01:56.822 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:51:49 -0400 (0:00:02.948) 0:01:59.771 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 07:51:50 -0400 (0:00:00.270) 0:02:00.041 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 07:51:50 -0400 (0:00:00.224) 0:02:00.266 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 
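The blivet_output above (one LUKS format on /dev/sda, one dm-crypt device, one xfs filesystem on top) is what the role produces for a single encrypted disk volume. Reconstructed from the logged volume facts, the driving variables look roughly like this; it is a sketch, not the literal contents of tests_luks2.yml, and the password is a placeholder since the real value is hidden by no_log:

    - name: Run the role
      hosts: managed-node14
      vars:
        storage_volumes:
          - name: foo                      # "name": "foo" in the facts above
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2  # yields the LUKS2 header verified below
            encryption_password: "<placeholder>"  # logged as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER
      roles:
        - fedora.linux_system_roles.storage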
Sunday 15 June 2025 07:51:50 -0400 (0:00:00.312) 0:02:00.579 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 07:51:51 -0400 (0:00:00.411) 0:02:00.990 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 07:51:51 -0400 (0:00:00.126) 0:02:01.117 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 07:51:54 -0400 (0:00:03.097) 0:02:04.215 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 07:51:56 -0400 (0:00:02.140) 0:02:06.356 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 07:51:56 -0400 (0:00:00.293) 0:02:06.649 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 07:51:58 -0400 (0:00:02.048) 0:02:08.698 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749987023.4667485, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 07:52:00 -0400 (0:00:01.618) 0:02:10.316 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda', 'name': 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 07:52:01 -0400 (0:00:01.634) 0:02:11.950 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:114 Sunday 15 June 2025 07:52:03 -0400 (0:00:01.920) 0:02:13.871 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 07:52:04 -0400 (0:00:00.431) 0:02:14.302 *********** skipping: [managed-node14] => {} TASK [Print out volume information] ******************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 07:52:04 -0400 (0:00:00.313) 0:02:14.616 *********** ok: [managed-node14] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 07:52:04 -0400 (0:00:00.197) 0:02:14.813 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "size": "10G", "type": "crypt", "uuid": "731a62a6-f69b-4a7f-bdc4-6a4612fde1ab" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "af16ab2a-ac54-451b-94bc-b559e3fb69bf" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Sunday 15 June 2025 07:52:07 -0400 (0:00:02.394) 0:02:17.208 ***********
ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002197", "end": "2025-06-15 07:52:08.643593", "rc": 0, "start": "2025-06-15 07:52:08.641396" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Sunday 15 June 2025 07:52:08 -0400 (0:00:01.616) 0:02:18.825 ***********
ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002202", "end": "2025-06-15 07:52:09.646753", "failed_when_result": false, "rc": 0, "start": "2025-06-15 07:52:09.644551" }
STDOUT:
luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf /dev/sda -
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Sunday 15 June 2025 07:52:09 -0400 (0:00:01.067) 0:02:19.892 ***********
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Sunday 15 June 2025 07:52:10 -0400 (0:00:00.116) 0:02:20.008 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 15 June 2025 07:52:10 -0400 (0:00:00.265) 0:02:20.273 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
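The unrendered {{ storage_test_volume_subset }} in the task banner above is expected: loop variables are not yet defined when the task name is rendered. The eight per-aspect includes that follow come from looping over the _storage_volume_tests list set just above; a sketch of that include pattern (variable names taken from the log, the actual test file may differ):

    - name: Run test verify for {{ storage_test_volume_subset }}
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"  # mount, fstab, fs, device, encryption, md, size, cache
      loop_control:
        loop_var: storage_test_volume_subset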
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 15 June 2025 07:52:10 -0400 (0:00:00.152) 0:02:20.426 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Sunday 15 June 2025 07:52:11 -0400 (0:00:00.916) 0:02:21.342 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Sunday 15 June 2025 07:52:11 -0400 (0:00:00.150) 0:02:21.492 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Sunday 15 June 2025 07:52:11 -0400 (0:00:00.150) 0:02:21.643 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Sunday 15 June 2025 07:52:11 -0400 (0:00:00.128) 0:02:21.772 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Sunday 15 June 2025 07:52:11 -0400 (0:00:00.159) 0:02:21.931 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Sunday 15 June 2025 07:52:12 -0400 (0:00:00.145)
0:02:22.076 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 07:52:12 -0400 (0:00:00.172) 0:02:22.249 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 07:52:12 -0400 (0:00:00.158) 0:02:22.407 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 07:52:12 -0400 (0:00:00.151) 0:02:22.559 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 07:52:13 -0400 (0:00:00.416) 0:02:22.976 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 07:52:13 -0400 (0:00:00.278) 0:02:23.254 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 07:52:13 -0400 (0:00:00.201) 0:02:23.456 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 07:52:14 -0400 (0:00:00.637) 0:02:24.093 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 07:52:14 -0400 (0:00:00.313) 0:02:24.407 *********** ok: [managed-node14] => { 
"changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 07:52:14 -0400 (0:00:00.373) 0:02:24.781 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 07:52:15 -0400 (0:00:00.189) 0:02:24.970 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 07:52:15 -0400 (0:00:00.238) 0:02:25.209 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 07:52:15 -0400 (0:00:00.227) 0:02:25.437 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 07:52:15 -0400 (0:00:00.348) 0:02:25.786 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 07:52:16 -0400 (0:00:00.440) 0:02:26.226 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988304.5018744, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988304.5018744, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988304.5018744, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 07:52:17 -0400 (0:00:01.190) 0:02:27.417 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the 
presence/absence of the device node] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Sunday 15 June 2025 07:52:17 -0400 (0:00:00.131) 0:02:27.548 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Sunday 15 June 2025 07:52:17 -0400 (0:00:00.189) 0:02:27.737 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Sunday 15 June 2025 07:52:18 -0400 (0:00:00.264) 0:02:28.002 ***********
ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Sunday 15 June 2025 07:52:18 -0400 (0:00:00.176) 0:02:28.178 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Sunday 15 June 2025 07:52:18 -0400 (0:00:00.305) 0:02:28.484 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Sunday 15 June 2025 07:52:18 -0400 (0:00:00.256) 0:02:28.740 ***********
ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988304.6398742, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988304.6398742, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 136472, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749988304.6398742, "nlink": 1, "path": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Sunday 15 June 2025 07:52:20 -0400 (0:00:01.520) 0:02:30.261 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do lsrpackages: cryptsetup
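The LUKS verification that follows shells out to cryptsetup and asserts on its output; there is no stock module that parses LUKS headers. A minimal sketch of that collect-and-assert pattern (the register name and the assertion are illustrative, not the test's actual wording):

    - name: Collect LUKS info for this volume
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false  # read-only inspection, never reports a change

    - name: Check LUKS version
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')  # LUKS2 header expected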
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Sunday 15 June 2025 07:52:24 -0400 (0:00:04.616) 0:02:34.878 ***********
ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011273", "end": "2025-06-15 07:52:26.135988", "rc": 0, "start": "2025-06-15 07:52:26.124715" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           af16ab2a-ac54-451b-94bc-b559e3fb69bf
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     944649
        Threads:    2
        Salt:       19 7a 3f 41 29 e6 d3 38 cf 55 bf 49 02 1a 1f e0
                    80 4c b0 73 ec be 5b 57 b6 4d e8 9a 1e c9 84 44
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       53 1b d0 05 54 68 c2 0f db fb 4f bb ba 25 f5 48
                    23 fb f0 cb d1 27 87 78 95 8a db 24 24 f1 2e 9d
        Digest:     a1 0d 19 df a4 b5 ed b3 15 44 eb 11 2c 53 10 d4
                    07 44 91 78 e7 f0 bf 29 a4 30 b3 af 26 1a 6a 62
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Sunday 15 June 2025 07:52:26 -0400 (0:00:01.393) 0:02:36.271 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Sunday 15 June 2025 07:52:26 -0400 (0:00:00.185) 0:02:36.457 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Sunday 15 June 2025 07:52:26 -0400 (0:00:00.123) 0:02:36.581 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Sunday 15 June 2025 07:52:26 -0400 (0:00:00.078) 0:02:36.659 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Sunday 15 June 2025 07:52:26 -0400 (0:00:00.170) 0:02:36.829 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.209) 0:02:37.039 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
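The key-size and cipher checks skip because encryption_key_size and encryption_cipher were left null in the volume definition (see the volume facts earlier), so the assertions have nothing to compare against; blivet's defaults produced the aes-xts-plain64 / 512-bit configuration shown in the dump. A sketch of pinning them explicitly so those checks become meaningful (same hedges as the earlier sketch; password is a placeholder):

    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        encryption_cipher: aes-xts-plain64  # would match "Cipher: aes-xts-plain64" in the dump
        encryption_key_size: 512            # bits; would match "Cipher key: 512 bits"
        encryption_password: "<placeholder>"  # real value hidden by no_log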
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.139) 0:02:37.178 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.162) 0:02:37.341 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.226) 0:02:37.567 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.106) 0:02:37.674 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.117) 0:02:37.791 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Sunday 15 June 2025 07:52:27 -0400 (0:00:00.058) 0:02:37.850 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Sunday 15 June 2025 07:52:28 -0400 (0:00:00.130) 0:02:37.980 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Sunday 15 June 2025 07:52:28 -0400 (0:00:00.211) 0:02:38.192 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Sunday 15 June 2025 07:52:28 -0400 (0:00:00.167) 0:02:38.360 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex]
************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 07:52:28 -0400 (0:00:00.277) 0:02:38.637 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 07:52:28 -0400 (0:00:00.308) 0:02:38.946 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 07:52:29 -0400 (0:00:00.179) 0:02:39.125 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 07:52:29 -0400 (0:00:00.144) 0:02:39.270 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 07:52:29 -0400 (0:00:00.212) 0:02:39.482 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 07:52:29 -0400 (0:00:00.219) 0:02:39.702 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 07:52:29 -0400 (0:00:00.185) 0:02:39.887 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 07:52:30 -0400 (0:00:00.092) 0:02:39.979 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 07:52:30 -0400 (0:00:00.173) 0:02:40.153 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 
07:52:30 -0400 (0:00:00.215) 0:02:40.369 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 07:52:30 -0400 (0:00:00.235) 0:02:40.604 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 07:52:30 -0400 (0:00:00.187) 0:02:40.792 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.279) 0:02:41.071 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.189) 0:02:41.260 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.230) 0:02:41.490 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.184) 0:02:41.675 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.104) 0:02:41.780 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.078) 0:02:41.858 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 07:52:31 -0400 (0:00:00.100) 0:02:41.959 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.217) 0:02:42.177 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.123) 0:02:42.300 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.104) 0:02:42.405 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.144) 0:02:42.549 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.122) 0:02:42.672 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 07:52:32 -0400 (0:00:00.274) 0:02:42.947 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 07:52:33 -0400 (0:00:00.198) 0:02:43.146 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 07:52:33 -0400 (0:00:00.252) 0:02:43.398 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 07:52:33 -0400 (0:00:00.240) 0:02:43.639 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 07:52:33 -0400 (0:00:00.215) 0:02:43.855 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task 
path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 07:52:34 -0400 (0:00:00.257) 0:02:44.113 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 07:52:34 -0400 (0:00:00.314) 0:02:44.427 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 07:52:34 -0400 (0:00:00.177) 0:02:44.605 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 07:52:34 -0400 (0:00:00.246) 0:02:44.851 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 07:52:35 -0400 (0:00:00.258) 0:02:45.109 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 07:52:35 -0400 (0:00:00.377) 0:02:45.487 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 07:52:35 -0400 (0:00:00.271) 0:02:45.759 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.247) 0:02:46.006 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.158) 0:02:46.164 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.184) 0:02:46.348 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.200) 0:02:46.549 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.186) 0:02:46.735 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 07:52:36 -0400 (0:00:00.184) 0:02:46.920 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 07:52:37 -0400 (0:00:00.111) 0:02:47.031 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 07:52:37 -0400 (0:00:00.199) 0:02:47.231 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] ***********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 07:52:37 -0400 (0:00:00.153) 0:02:47.385 ***********
changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Test for correct handling of safe_mode] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:120 Sunday 15 June 2025 07:52:40 -0400 (0:00:03.412) 0:02:50.798 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14
TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 07:52:41 -0400 (0:00:00.450) 0:02:51.249 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
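
The next task includes the storage role with the saved global values and expects it to fail. A fatal task normally ends the play, so a test include like verify-role-failed.yml has to trap the error; a minimal sketch of the usual block/rescue pattern (the task names are taken from this log, but the exact contents of verify-role-failed.yml are an assumption):

  - name: Verify role raises correct error
    block:
      - name: Run the role with the stored settings   # hypothetical wrapper task
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_pools: "{{ storage_pools_global }}"
          storage_volumes: "{{ storage_volumes_global }}"
          storage_safe_mode: "{{ storage_safe_mode_global }}"
      - name: Unreachable if the role failed as intended
        fail:
          msg: role did not fail
    rescue:
      - name: Check that we failed in the role
        assert:
          that:
            - ansible_failed_result is defined   # assumption: the real check inspects the error text

Because the failure is caught in the rescue section, the "Check that we failed in the role" and "Verify the blivet output and error message are correct" tasks further down can still run and report "All assertions passed".
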
TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 07:52:41 -0400 (0:00:00.205) 0:02:51.454 ***********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:52:41 -0400 (0:00:00.277) 0:02:51.731 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:52:42 -0400 (0:00:00.402) 0:02:52.134 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:52:42 -0400 (0:00:00.346) 0:02:52.480 ***********
skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:52:43 -0400 (0:00:00.659) 0:02:53.140 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:52:43 -0400 (0:00:00.426) 0:02:53.567 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
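
The same CentOS_8.yml vars file is loaded twice above because the role attempts one vars file per fact combination, and on this host the distribution major version and the full distribution version both render as "8". A sketch of the standard linux_system_roles lookup loop in set_vars.yml (the helper variable name is an assumption; the mechanism matches the loop items shown in this log):

  - name: Set platform/version specific variables
    include_vars: "{{ __vars_file }}"
    loop:
      - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml - skipped, no such file
      - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml - skipped, no such file
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # CentOS_8.yml
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # CentOS_8.yml again
    vars:
      __vars_file: "{{ role_path }}/vars/{{ item }}"
    when: __vars_file is file
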
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:52:43 -0400 (0:00:00.278) 0:02:53.845 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:52:44 -0400 (0:00:00.297) 0:02:54.143 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:52:44 -0400 (0:00:00.142) 0:02:54.285 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:52:44 -0400 (0:00:00.653) 0:02:54.939 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:52:49 -0400 (0:00:04.932) 0:02:59.872 ***********
ok: [managed-node14] => { "storage_pools": [] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:52:50 -0400 (0:00:00.276) 0:03:00.148 ***********
ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:52:50 -0400 (0:00:00.264) 0:03:00.413 ***********
ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:52:55 -0400 (0:00:05.002) 0:03:05.415 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:52:55 -0400 (0:00:00.377) 0:03:05.793
*********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:52:56 -0400 (0:00:00.288) 0:03:06.081 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:52:56 -0400 (0:00:00.218) 0:03:06.300 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:52:56 -0400 (0:00:00.160) 0:03:06.461 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:53:01 -0400 (0:00:04.506) 0:03:10.967 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": 
"stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { 
"name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:53:03 -0400 (0:00:02.845) 0:03:13.813 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:53:04 -0400 (0:00:00.747) 0:03:14.561 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:53:04 -0400 (0:00:00.154) 0:03:14.715 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 07:53:10 -0400 (0:00:05.269) 0:03:19.985 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false }
MSG: {'msg': "cannot remove existing formatting on device 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
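
This failure is the test's expected outcome: the disk currently carries LUKS2 formatting, the requested volume state sets encryption: false, and the module ran with safe_mode: True (see module_args above), so blivet refuses to remove the existing formatting. Reconstructed from those module_args, the invocation amounts to roughly the following (the volume values come from this log; the play framing is an assumption):

  - hosts: managed-node14
    vars:
      storage_safe_mode: true    # refuse destructive changes such as stripping LUKS
    tasks:
      - name: Ask the role to drop encryption on an encrypted disk
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_volumes:
            - name: foo
              type: disk
              disks: [sda]
              mount_point: /opt/test1
              encryption: false               # device is currently LUKS2-formatted
              encryption_luks_version: luks2
              encryption_password: yabbadabbadoo

Actually removing the encryption layer requires storage_safe_mode: false, which the "Remove the encryption layer" step below presumably sets before invoking the role again.
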
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:53:10 -0400 (0:00:00.267) 0:03:20.253 ***********
TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 07:53:10 -0400 (0:00:00.119) 0:03:20.372 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 07:53:10 -0400 (0:00:00.322) 0:03:20.695 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 07:53:10 -0400 (0:00:00.243) 0:03:20.938 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 15 June 2025 07:53:11 -0400 (0:00:00.164) 0:03:21.103 ***********
ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988360.2648032, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988360.2648032, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749988360.2648032, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1539500192", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 15 June 2025 07:53:12 -0400 (0:00:01.356) 0:03:22.460 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Remove the encryption layer] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:141 Sunday 15 June 2025 07:53:12 -0400 (0:00:00.300) 0:03:22.760 ***********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:53:13 -0400 (0:00:00.803) 0:03:23.564 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:53:14 -0400 (0:00:00.513) 0:03:24.077 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:53:14 -0400 (0:00:00.227) 0:03:24.305 ***********
skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item":
"CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:53:14 -0400 (0:00:00.497) 0:03:24.802 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:53:15 -0400 (0:00:00.312) 0:03:25.115 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:53:15 -0400 (0:00:00.283) 0:03:25.399 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:53:15 -0400 (0:00:00.244) 0:03:25.644 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:53:15 -0400 (0:00:00.229) 0:03:25.874 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:53:16 -0400 (0:00:00.610) 0:03:26.484 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid 
libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:53:21 -0400 (0:00:04.508) 0:03:30.993 *********** ok: [managed-node14] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:53:21 -0400 (0:00:00.259) 0:03:31.252 *********** ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:53:21 -0400 (0:00:00.225) 0:03:31.478 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:53:26 -0400 (0:00:04.787) 0:03:36.266 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:53:26 -0400 (0:00:00.342) 0:03:36.609 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:53:26 -0400 (0:00:00.158) 0:03:36.767 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:53:27 -0400 (0:00:00.203) 0:03:36.970 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:53:27 -0400 (0:00:00.367) 0:03:37.338 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:53:31 -0400 (0:00:04.271) 0:03:41.609 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", 
"status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { 
"name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": 
"selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { 
"name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:53:34 -0400 (0:00:03.015) 0:03:44.625 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:53:35 -0400 (0:00:00.391) 0:03:45.016 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:53:35 -0400 (0:00:00.133) 0:03:45.150 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 07:53:40 -0400 (0:00:05.437) 0:03:50.587 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 07:53:40 -0400 (0:00:00.282) 0:03:50.870 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988316.083854, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a31b942b1619e34790d23756ade0c26fc96583dc", "ctime": 1749988316.0798538, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988316.0798538, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 07:53:42 -0400 (0:00:01.540) 0:03:52.410 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:53:43 -0400 (0:00:01.448) 0:03:53.858 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 07:53:44 -0400 (0:00:00.194) 0:03:54.052 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", 
"_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 07:53:44 -0400 (0:00:00.287) 0:03:54.339 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 07:53:44 -0400 (0:00:00.264) 0:03:54.604 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 07:53:44 -0400 (0:00:00.218) 0:03:54.823 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "state": "absent" }, "name": 
"/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 07:53:46 -0400 (0:00:01.264) 0:03:56.087 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 07:53:47 -0400 (0:00:01.504) 0:03:57.591 *********** changed: [managed-node14] => (item={'src': 'UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 07:53:49 -0400 (0:00:01.460) 0:03:59.052 *********** skipping: [managed-node14] => (item={'src': 'UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 07:53:49 -0400 (0:00:00.194) 0:03:59.246 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 07:53:50 -0400 (0:00:01.407) 0:04:00.653 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988329.6458297, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "839d6c20939a2285df80c9936194380ee0de055e", "ctime": 1749988321.688844, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 136315080, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749988321.6868439, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, 
"rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2046582927", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 07:53:51 -0400 (0:00:00.692) 0:04:01.345 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda', 'name': 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 07:53:52 -0400 (0:00:01.381) 0:04:02.727 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Sunday 15 June 2025 07:53:54 -0400 (0:00:02.123) 0:04:04.851 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 07:53:55 -0400 (0:00:00.353) 0:04:05.205 *********** skipping: [managed-node14] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 07:53:55 -0400 (0:00:00.186) 0:04:05.392 *********** ok: [managed-node14] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 07:53:55 -0400 (0:00:00.460) 0:04:05.852 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f7109c6b-4db3-4ea9-abe2-e4a0f17a8522" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 07:53:57 -0400 (0:00:01.580) 0:04:07.433 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002265", "end": "2025-06-15 07:53:58.401772", "rc": 0, "start": "2025-06-15 07:53:58.399507" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 07:53:58 -0400 (0:00:01.214) 0:04:08.647 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002651", "end": "2025-06-15 07:53:59.724045", "failed_when_result": false, "rc": 0, "start": "2025-06-15 07:53:59.721394" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 07:54:00 -0400 (0:00:01.362) 0:04:10.010 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 07:54:00 -0400 (0:00:00.183) 0:04:10.193 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 07:54:00 -0400 (0:00:00.539) 0:04:10.733 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 07:54:01 -0400 (0:00:00.365) 0:04:11.099 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 07:54:02 -0400 (0:00:01.292) 0:04:12.391 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 07:54:02 -0400 (0:00:00.333) 0:04:12.725 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 07:54:03 -0400 (0:00:00.380) 0:04:13.106 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 07:54:03 -0400 (0:00:00.284) 0:04:13.390 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 07:54:03 -0400 (0:00:00.243) 0:04:13.634 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 07:54:04 -0400 (0:00:00.396) 0:04:14.031 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 07:54:04 -0400 (0:00:00.300) 0:04:14.331 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 07:54:04 -0400 (0:00:00.200) 0:04:14.532 *********** skipping: [managed-node14] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 07:54:04 -0400 (0:00:00.217) 0:04:14.749 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 07:54:04 -0400 (0:00:00.201) 0:04:14.951 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 07:54:05 -0400 (0:00:00.250) 0:04:15.202 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 07:54:05 -0400 (0:00:00.190) 0:04:15.392 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 07:54:06 -0400 (0:00:00.580) 0:04:15.973 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 07:54:06 -0400 (0:00:00.306) 0:04:16.280 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 07:54:06 -0400 (0:00:00.273) 0:04:16.553 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 07:54:06 -0400 (0:00:00.145) 0:04:16.699 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 07:54:07 -0400 (0:00:00.299) 0:04:16.998 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 07:54:07 -0400 (0:00:00.143) 0:04:17.142 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 07:54:07 -0400 (0:00:00.268) 0:04:17.410 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 07:54:07 -0400 (0:00:00.267) 0:04:17.678 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988420.2117398, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988420.2117398, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988420.2117398, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 07:54:09 -0400 (0:00:01.483) 0:04:19.162 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 07:54:09 -0400 (0:00:00.308) 0:04:19.471 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 07:54:09 -0400 (0:00:00.264) 0:04:19.735 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task 
path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 07:54:10 -0400 (0:00:00.259) 0:04:19.994 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 07:54:10 -0400 (0:00:00.258) 0:04:20.253 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 07:54:10 -0400 (0:00:00.263) 0:04:20.516 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 07:54:10 -0400 (0:00:00.211) 0:04:20.728 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 07:54:10 -0400 (0:00:00.129) 0:04:20.858 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 07:54:15 -0400 (0:00:04.815) 0:04:25.673 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 07:54:16 -0400 (0:00:00.480) 0:04:26.154 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 07:54:16 -0400 (0:00:00.136) 0:04:26.291 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 07:54:16 -0400 (0:00:00.358) 0:04:26.650 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 07:54:16 -0400 (0:00:00.271) 
0:04:26.921 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 07:54:17 -0400 (0:00:00.340) 0:04:27.261 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 07:54:17 -0400 (0:00:00.261) 0:04:27.523 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 07:54:17 -0400 (0:00:00.244) 0:04:27.768 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 07:54:18 -0400 (0:00:00.245) 0:04:28.013 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 07:54:18 -0400 (0:00:00.223) 0:04:28.237 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 07:54:18 -0400 (0:00:00.280) 0:04:28.517 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 07:54:18 -0400 (0:00:00.408) 0:04:28.926 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 07:54:19 -0400 (0:00:00.434) 0:04:29.360 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 07:54:19 -0400 (0:00:00.236) 0:04:29.597 *********** ok: [managed-node14] 
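At this point the test expects no /etc/crypttab entries for the volume (_storage_test_expected_crypttab_entries is "0"), and the "Check for /etc/crypttab entry" assertion above passes on that basis. A rough equivalent of that check, assuming a grep-based count rather than whatever the test file does internally (the match pattern and task wiring here are illustrative only):

- name: Count crypttab entries for the volume
  command: grep -c '^luks-' /etc/crypttab   # hypothetical match pattern
  register: crypttab_count
  changed_when: false
  failed_when: false                        # grep exits 1 when nothing matches

- name: Check for /etc/crypttab entry
  assert:
    that:
      - (crypttab_count.stdout | default('0')) | int == 0
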
=> { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 07:54:19 -0400 (0:00:00.193) 0:04:29.790 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 07:54:20 -0400 (0:00:00.250) 0:04:30.041 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 07:54:20 -0400 (0:00:00.197) 0:04:30.238 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 07:54:20 -0400 (0:00:00.314) 0:04:30.553 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 07:54:20 -0400 (0:00:00.306) 0:04:30.860 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 07:54:21 -0400 (0:00:00.285) 0:04:31.145 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 07:54:21 -0400 (0:00:00.226) 0:04:31.372 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 07:54:21 -0400 (0:00:00.241) 0:04:31.613 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 07:54:21 -0400 (0:00:00.279) 0:04:31.892 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check 
RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 07:54:22 -0400 (0:00:00.235) 0:04:32.128 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 07:54:22 -0400 (0:00:00.308) 0:04:32.437 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 07:54:22 -0400 (0:00:00.283) 0:04:32.721 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 07:54:22 -0400 (0:00:00.212) 0:04:32.933 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 07:54:23 -0400 (0:00:00.284) 0:04:33.217 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 07:54:23 -0400 (0:00:00.282) 0:04:33.500 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 07:54:23 -0400 (0:00:00.237) 0:04:33.738 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 07:54:24 -0400 (0:00:00.334) 0:04:34.072 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 07:54:24 -0400 (0:00:00.335) 0:04:34.408 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 07:54:24 -0400 (0:00:00.304) 0:04:34.712 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 07:54:24 -0400 (0:00:00.247) 0:04:34.960 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 07:54:25 -0400 (0:00:00.290) 0:04:35.250 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 07:54:25 -0400 (0:00:00.190) 0:04:35.441 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 07:54:25 -0400 (0:00:00.257) 0:04:35.698 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 07:54:25 -0400 (0:00:00.214) 0:04:35.913 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 07:54:26 -0400 (0:00:00.261) 0:04:36.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 07:54:26 -0400 (0:00:00.249) 0:04:36.424 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 07:54:26 -0400 (0:00:00.327) 0:04:36.752 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 07:54:27 -0400 (0:00:00.228) 0:04:36.981 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 07:54:27 -0400 (0:00:00.210) 0:04:37.191 *********** skipping: 
[managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 07:54:27 -0400 (0:00:00.248) 0:04:37.440 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 07:54:27 -0400 (0:00:00.290) 0:04:37.731 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 07:54:28 -0400 (0:00:00.292) 0:04:38.024 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 07:54:28 -0400 (0:00:00.205) 0:04:38.230 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 07:54:28 -0400 (0:00:00.319) 0:04:38.549 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 07:54:28 -0400 (0:00:00.163) 0:04:38.712 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 07:54:29 -0400 (0:00:00.256) 0:04:38.969 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 07:54:29 -0400 (0:00:00.255) 0:04:39.224 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 07:54:29 -0400 (0:00:00.286) 0:04:39.511 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] 
***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 07:54:29 -0400 (0:00:00.240) 0:04:39.751 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 07:54:30 -0400 (0:00:00.312) 0:04:40.064 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 07:54:30 -0400 (0:00:00.279) 0:04:40.343 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 07:54:30 -0400 (0:00:00.233) 0:04:40.577 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 07:54:30 -0400 (0:00:00.308) 0:04:40.885 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 07:54:31 -0400 (0:00:00.318) 0:04:41.204 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 07:54:31 -0400 (0:00:00.296) 0:04:41.500 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 07:54:31 -0400 (0:00:00.197) 0:04:41.697 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 07:54:31 -0400 (0:00:00.164) 0:04:41.862 *********** changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct 
handling of safe_mode] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:161 Sunday 15 June 2025 07:54:33 -0400 (0:00:01.457) 0:04:43.319 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 07:54:33 -0400 (0:00:00.524) 0:04:43.844 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 07:54:34 -0400 (0:00:00.253) 0:04:44.097 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:54:34 -0400 (0:00:00.698) 0:04:44.796 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:54:35 -0400 (0:00:00.380) 0:04:45.177 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:54:35 -0400 (0:00:00.283) 0:04:45.460 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, 
"item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:54:36 -0400 (0:00:00.649) 0:04:46.109 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:54:36 -0400 (0:00:00.319) 0:04:46.429 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:54:36 -0400 (0:00:00.268) 0:04:46.697 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:54:36 -0400 (0:00:00.241) 0:04:46.939 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:54:37 -0400 (0:00:00.202) 0:04:47.141 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:54:37 -0400 (0:00:00.522) 0:04:47.664 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:54:41 -0400 (0:00:04.276) 0:04:51.940 *********** ok: [managed-node14] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:54:42 -0400 (0:00:00.293) 0:04:52.234 *********** ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:54:42 -0400 (0:00:00.398) 0:04:52.632 *********** ok: [managed-node14] 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:54:47 -0400 (0:00:05.296) 0:04:57.929 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:54:48 -0400 (0:00:00.448) 0:04:58.377 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:54:48 -0400 (0:00:00.171) 0:04:58.549 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:54:48 -0400 (0:00:00.288) 0:04:58.837 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:54:49 -0400 (0:00:00.279) 0:04:59.116 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:54:53 -0400 (0:00:04.641) 0:05:03.758 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": 
"sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service": { "name": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service": { "name": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:54:56 -0400 (0:00:02.755) 0:05:06.513 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ 
"systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:54:56 -0400 (0:00:00.392) 0:05:06.906 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2daf16ab2a\x2dac54\x2d451b\x2d94bc\x2db559e3fb69bf.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "name": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-sda.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": 
"null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 07:53:50 EDT", "StateChangeTimestampMonotonic": "1734327627", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dac54\x2d451b\x2d94bc\x2db559e3fb69bf.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "name": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", 
"IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", 
"StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:55:00 -0400 (0:00:03.676) 0:05:10.583 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 07:55:06 -0400 (0:00:05.399) 0:05:15.982 *********** fatal: [managed-node14]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 
'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:55:06 -0400 (0:00:00.260) 0:05:16.243 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2daf16ab2a\x2dac54\x2d451b\x2d94bc\x2db559e3fb69bf.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "name": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": 
"18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2daf16ab2a\\x2dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": 
"30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dac54\x2d451b\x2d94bc\x2db559e3fb69bf.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "name": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dac54\\x2d451b\\x2d94bc\\x2db559e3fb69bf.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] 
**************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 07:55:09 -0400 (0:00:00.213) 0:05:19.532 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 07:55:09 -0400 (0:00:00.215) 0:05:19.746 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 07:55:09 -0400 (0:00:00.138) 0:05:19.961 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 15 June 2025 07:55:10 -0400 (0:00:00.138) 0:05:20.100 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988473.063638, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988473.063638, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749988473.063638, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3167317782", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 15 June 2025 07:55:11 -0400 (0:00:01.408) 0:05:21.508 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
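The failure above is the outcome this test case is looking for. The module arguments show 'safe_mode': True, and in safe mode the blivet module refuses to wipe the existing xfs signature on sda in order to put a LUKS2 container under the mount point, so nothing was touched: /opt/test1/quux still exists with "size": 0 and checksum da39a3ee5e6b4b0d3255bfef95601890afd80709, the SHA-1 of empty content. The run that follows performs the destructive variant; a minimal sketch of such an invocation, assuming the storage_safe_mode variable documented for fedora.linux_system_roles.storage (default true) and the volume definition echoed later in this log:

- hosts: managed-node14
  tasks:
    - name: Re-create the disk volume with LUKS2 encryption on sda
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false    # permit destroying the existing xfs formatting
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo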
TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:182 Sunday 15 June 2025 07:55:11 -0400 (0:00:00.206) 0:05:21.715 ***********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:55:12 -0400 (0:00:00.931) 0:05:22.647 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:55:13 -0400 (0:00:00.481) 0:05:23.128 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:55:13 -0400 (0:00:00.253) 0:05:23.382 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:55:13 -0400 (0:00:00.472) 0:05:23.854 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:55:14 -0400 (0:00:00.171) 0:05:24.026 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:55:14 -0400 (0:00:00.244) 0:05:24.270 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:55:14 -0400 (0:00:00.162) 0:05:24.433 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:55:14 -0400 (0:00:00.202) 0:05:24.636 *********** included:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:55:15 -0400 (0:00:00.577) 0:05:25.213 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:55:19 -0400 (0:00:04.488) 0:05:29.702 *********** ok: [managed-node14] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:55:19 -0400 (0:00:00.250) 0:05:29.952 *********** ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:55:20 -0400 (0:00:00.286) 0:05:30.239 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:55:25 -0400 (0:00:04.932) 0:05:35.172 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:55:25 -0400 (0:00:00.235) 0:05:35.417 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:55:25 -0400 (0:00:00.189) 0:05:35.606 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:55:25 -0400 (0:00:00.229) 0:05:35.835 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:55:26 -0400 
(0:00:00.161) 0:05:35.997 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:55:30 -0400 (0:00:04.277) 0:05:40.274 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": 
"systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { 
"name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:55:33 -0400 (0:00:03.320) 0:05:43.595 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:55:34 -0400 (0:00:00.375) 0:05:43.970 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:55:34 -0400 (0:00:00.185) 0:05:44.156 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 07:55:47 -0400 (0:00:13.416) 0:05:57.572 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 07:55:47 -0400 (0:00:00.257) 0:05:57.829 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988428.7767234, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "cab872f3f4842ec4757ef6388e9dfee214c1fa22", "ctime": 1749988428.7737234, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988428.7737234, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 07:55:49 -0400 (0:00:01.287) 0:05:59.116 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:55:50 -0400 (0:00:01.682) 0:06:00.798 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 07:55:50 -0400 (0:00:00.132) 0:06:00.931 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": 
"luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 07:55:51 -0400 (0:00:00.138) 0:06:01.069 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 07:55:51 -0400 (0:00:00.133) 0:06:01.203 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 07:55:51 -0400 (0:00:00.237) 0:06:01.440 *********** changed: [managed-node14] => (item={'src': 'UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f7109c6b-4db3-4ea9-abe2-e4a0f17a8522" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 07:55:53 -0400 (0:00:01.629) 0:06:03.070 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 07:55:54 -0400 (0:00:01.642) 0:06:04.712 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 07:55:56 -0400 (0:00:01.269) 0:06:05.982 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 07:55:56 -0400 (0:00:00.216) 0:06:06.198 *********** ok: [managed-node14] => { "changed": false, "name": null, 
"status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 07:55:58 -0400 (0:00:01.895) 0:06:08.093 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988439.7227023, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988432.4687161, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 255852739, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1749988432.4677162, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4223165577", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 07:55:59 -0400 (0:00:01.542) 0:06:09.636 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda', 'name': 'luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 07:56:01 -0400 (0:00:01.560) 0:06:11.197 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:196 Sunday 15 June 2025 07:56:03 -0400 (0:00:01.862) 0:06:13.059 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 07:56:03 -0400 (0:00:00.550) 0:06:13.610 *********** skipping: [managed-node14] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 07:56:03 -0400 (0:00:00.264) 0:06:13.874 *********** ok: [managed-node14] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 07:56:04 -0400 (0:00:00.199) 0:06:14.074 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "size": "10G", "type": "crypt", "uuid": "d1d7fc62-4176-49ae-8cd2-91c6f67627aa" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eaaa2312-e5d4-4e1d-bde8-152f64419ced" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 07:56:05 -0400 (0:00:01.007) 0:06:15.082 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002625", "end": "2025-06-15 07:56:06.129144", "rc": 0, "start": "2025-06-15 07:56:06.126519" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. 
# # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 07:56:06 -0400 (0:00:01.298) 0:06:16.380 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002890", "end": "2025-06-15 07:56:07.253182", "failed_when_result": false, "rc": 0, "start": "2025-06-15 07:56:07.250292" } STDOUT: luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 07:56:07 -0400 (0:00:01.048) 0:06:17.429 *********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 07:56:07 -0400 (0:00:00.167) 0:06:17.596 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 07:56:07 -0400 (0:00:00.248) 0:06:17.845 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 07:56:08 -0400 (0:00:00.135) 0:06:17.980 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 07:56:09 -0400 (0:00:01.045) 0:06:19.026 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 07:56:09 -0400 (0:00:00.229) 0:06:19.256 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 07:56:09 -0400 (0:00:00.185) 0:06:19.441 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 07:56:09 -0400 (0:00:00.314) 0:06:19.755 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 07:56:09 -0400 (0:00:00.174) 0:06:19.930 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 07:56:10 -0400 (0:00:00.262) 0:06:20.193 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 07:56:10 -0400 (0:00:00.213) 0:06:20.406 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 07:56:10 -0400 (0:00:00.322) 0:06:20.728 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 07:56:10 -0400 (0:00:00.219) 0:06:20.948 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 07:56:11 -0400 (0:00:00.208) 0:06:21.156 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 07:56:11 -0400 (0:00:00.133) 0:06:21.289 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 07:56:11 -0400 (0:00:00.136) 0:06:21.425 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 07:56:11 -0400 (0:00:00.320) 0:06:21.746 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 07:56:12 -0400 (0:00:00.270) 0:06:22.017 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 07:56:12 -0400 (0:00:00.290) 0:06:22.307 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 07:56:12 -0400 (0:00:00.156) 0:06:22.464 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 07:56:12 -0400 (0:00:00.230) 0:06:22.695 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 07:56:13 -0400 (0:00:00.296) 0:06:22.991 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 07:56:13 -0400 (0:00:00.304) 0:06:23.295 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 07:56:13 -0400 (0:00:00.380) 0:06:23.675 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988547.0724952, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988547.0724952, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988547.0724952, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 07:56:14 -0400 (0:00:01.209) 0:06:24.885 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 07:56:15 -0400 (0:00:00.234) 0:06:25.120 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 07:56:15 -0400 (0:00:00.290) 0:06:25.410 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 07:56:15 -0400 (0:00:00.220) 0:06:25.631 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 07:56:15 -0400 (0:00:00.289) 0:06:25.920 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 07:56:16 -0400 (0:00:00.187) 0:06:26.108 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 07:56:16 -0400 (0:00:00.152) 0:06:26.261 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988547.207495, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988547.207495, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 162915, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749988547.207495, "nlink": 1, "path": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 07:56:17 -0400 (0:00:01.324) 0:06:27.585 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 07:56:21 -0400 (0:00:04.125) 0:06:31.710 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010138", "end": "2025-06-15 07:56:22.722161", "rc": 0, "start": "2025-06-15 07:56:22.712023" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: eaaa2312-e5d4-4e1d-bde8-152f64419ced Label: (no label) Subsystem: (no 
subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 951672 Threads: 2 Salt: 95 54 b4 94 69 f2 99 f6 7f 5a 60 fa 9a cf 53 bb fa 2a 22 07 10 7c da e2 55 d4 2e 5f 88 a1 ae 2e AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: d2 39 ac 92 58 e5 58 c4 f2 78 30 9f 17 ef d8 96 83 5e 4d c6 f9 d0 88 42 70 30 79 13 91 00 79 eb Digest: ab 93 c0 fa 86 72 0c cb 34 50 1e b1 c5 a5 e1 5e 80 ee d7 44 13 20 de 60 0e b8 44 33 74 3a c2 94 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 07:56:22 -0400 (0:00:01.243) 0:06:32.954 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 07:56:23 -0400 (0:00:00.292) 0:06:33.246 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 07:56:23 -0400 (0:00:00.349) 0:06:33.596 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 07:56:23 -0400 (0:00:00.205) 0:06:33.801 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 07:56:24 -0400 (0:00:00.175) 0:06:33.977 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 07:56:24 -0400 (0:00:00.189) 0:06:34.167 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 07:56:24 -0400 (0:00:00.135) 0:06:34.303 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 07:56:24 -0400 
(0:00:00.225) 0:06:34.528 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 07:56:24 -0400 (0:00:00.311) 0:06:34.840 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 07:56:25 -0400 (0:00:00.300) 0:06:35.140 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 07:56:25 -0400 (0:00:00.260) 0:06:35.400 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 07:56:25 -0400 (0:00:00.317) 0:06:35.718 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 07:56:26 -0400 (0:00:00.264) 0:06:35.982 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 07:56:26 -0400 (0:00:00.191) 0:06:36.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 07:56:26 -0400 (0:00:00.371) 0:06:36.546 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 07:56:26 -0400 (0:00:00.239) 0:06:36.785 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 
07:56:27 -0400 (0:00:00.272) 0:06:37.058 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 07:56:27 -0400 (0:00:00.206) 0:06:37.265 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 07:56:27 -0400 (0:00:00.284) 0:06:37.549 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 07:56:27 -0400 (0:00:00.266) 0:06:37.815 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 07:56:28 -0400 (0:00:00.177) 0:06:37.993 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 07:56:28 -0400 (0:00:00.153) 0:06:38.146 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 07:56:28 -0400 (0:00:00.228) 0:06:38.374 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 07:56:28 -0400 (0:00:00.144) 0:06:38.519 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 07:56:28 -0400 (0:00:00.277) 0:06:38.797 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 07:56:29 -0400 (0:00:00.290) 0:06:39.088 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 07:56:29 -0400 (0:00:00.256) 0:06:39.344 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 07:56:29 -0400 (0:00:00.228) 0:06:39.573 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 07:56:29 -0400 (0:00:00.204) 0:06:39.777 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 07:56:29 -0400 (0:00:00.188) 0:06:39.965 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.108) 0:06:40.074 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.096) 0:06:40.170 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.092) 0:06:40.262 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.123) 0:06:40.386 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.195) 0:06:40.581 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 07:56:30 -0400 (0:00:00.154) 0:06:40.735 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 07:56:31 -0400 (0:00:00.235) 0:06:40.971 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 07:56:31 -0400 (0:00:00.194) 0:06:41.166 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 07:56:31 -0400 (0:00:00.174) 0:06:41.340 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 07:56:31 -0400 (0:00:00.187) 0:06:41.527 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 07:56:31 -0400 (0:00:00.238) 0:06:41.765 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.270) 0:06:42.036 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.166) 0:06:42.203 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.204) 0:06:42.408 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.206) 0:06:42.614 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.154) 0:06:42.769 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 07:56:32 -0400 (0:00:00.169) 0:06:42.938 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 07:56:33 -0400 (0:00:00.122) 0:06:43.061 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 07:56:33 -0400 (0:00:00.161) 0:06:43.222 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 07:56:33 -0400 (0:00:00.295) 0:06:43.518 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 07:56:33 -0400 (0:00:00.237) 0:06:43.755 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 07:56:34 -0400 (0:00:00.285) 0:06:44.041 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 07:56:34 -0400 (0:00:00.254) 0:06:44.295 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 07:56:34 -0400 (0:00:00.254) 0:06:44.549 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 07:56:34 -0400 (0:00:00.272) 0:06:44.822 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 07:56:35 -0400 (0:00:00.228) 0:06:45.051 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 07:56:35 -0400 (0:00:00.222) 0:06:45.273 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 07:56:35 -0400 (0:00:00.306) 0:06:45.579 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 07:56:35 -0400 (0:00:00.200) 0:06:45.780 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:203 Sunday 15 June 2025 07:56:36 -0400 (0:00:00.191) 0:06:45.972 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 07:56:36 -0400 (0:00:00.659) 0:06:46.631 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 07:56:37 -0400 (0:00:00.769) 0:06:47.401 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:56:37 -0400 (0:00:00.300) 0:06:47.702 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:56:37 -0400 (0:00:00.200) 0:06:47.902 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:56:38 -0400 (0:00:00.325) 0:06:48.228 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:56:38 -0400 (0:00:00.493) 0:06:48.722 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:56:38 -0400 (0:00:00.219) 0:06:48.942 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:56:39 -0400 (0:00:00.264) 0:06:49.206 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:56:39 -0400 (0:00:00.134) 0:06:49.341 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:56:39 -0400 (0:00:00.186) 0:06:49.527 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK 
[fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:56:39 -0400 (0:00:00.404) 0:06:49.998 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:56:44 -0400 (0:00:04.326) 0:06:54.324 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:56:44 -0400 (0:00:00.220) 0:06:54.545 *********** ok: [managed-node14] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:56:44 -0400 (0:00:00.311) 0:06:54.856 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:56:49 -0400 (0:00:04.833) 0:06:59.689 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:56:50 -0400 (0:00:00.397) 0:07:00.087 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:56:50 -0400 (0:00:00.124) 0:07:00.211 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:56:50 -0400 (0:00:00.389) 0:07:00.601 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:56:50 -0400 (0:00:00.173) 0:07:00.774 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup 
kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:56:55 -0400 (0:00:04.516) 0:07:05.291 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:56:58 -0400 (0:00:02.966) 0:07:08.257 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:56:58 -0400 (0:00:00.335) 0:07:08.593 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:56:58 -0400 (0:00:00.192) 0:07:08.785 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 07:57:04 -0400 (0:00:05.195) 0:07:13.981 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:57:04 -0400 (0:00:00.210) 0:07:14.192 *********** TASK [Check that we failed in the role] **************************************** task path: 
TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Sunday 15 June 2025 07:57:04 -0400 (0:00:00.176) 0:07:14.368 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Sunday 15 June 2025 07:57:04 -0400 (0:00:00.184) 0:07:14.553 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Sunday 15 June 2025 07:57:04 -0400 (0:00:00.333) 0:07:14.887 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:223
Sunday 15 June 2025 07:57:04 -0400 (0:00:00.062) 0:07:14.949 ***********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 15 June 2025 07:57:05 -0400 (0:00:00.498) 0:07:15.448 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 15 June 2025 07:57:05 -0400 (0:00:00.273) 0:07:15.721 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 15 June 2025 07:57:06 -0400 (0:00:00.262) 0:07:15.984 ***********
skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
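Note the last entry of blivet_package_list above: it is a Jinja expression evaluated per host, so s390x machines get libblockdev-s390 and everything else gets libblockdev. As a vars-file sketch of the same pattern (package names copied from the log; the file is the CentOS_8.yml path shown in ansible_included_var_files):

# roles/storage/vars/CentOS_8.yml (pattern as printed in the output above)
blivet_package_list:
  - python3-blivet
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - vdo
  - kmod-kvdo
  - xfsprogs
  - stratisd
  - stratis-cli
  # Resolved at runtime against the managed host's architecture fact:
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"

CentOS_8.yml is reported twice because the vars lookup likely tries both the distribution major version and the full distribution version, and on this host both resolve to the same file.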
"ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:57:06 -0400 (0:00:00.392) 0:07:16.376 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:57:06 -0400 (0:00:00.296) 0:07:16.672 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:57:06 -0400 (0:00:00.155) 0:07:16.827 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:57:06 -0400 (0:00:00.118) 0:07:16.945 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:57:07 -0400 (0:00:00.159) 0:07:17.105 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:57:07 -0400 (0:00:00.383) 0:07:17.488 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:57:11 -0400 (0:00:04.336) 0:07:21.825 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:57:12 -0400 (0:00:00.363) 0:07:22.188 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 
'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:57:12 -0400 (0:00:00.358) 0:07:22.547 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:57:17 -0400 (0:00:05.068) 0:07:27.615 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:57:18 -0400 (0:00:00.452) 0:07:28.068 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:57:18 -0400 (0:00:00.238) 0:07:28.307 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:57:18 -0400 (0:00:00.219) 0:07:28.527 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:57:18 -0400 (0:00:00.157) 0:07:28.684 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:57:23 -0400 (0:00:04.566) 0:07:33.250 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:57:26 -0400 (0:00:02.802) 0:07:36.053 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:57:26 
-0400 (0:00:00.445) 0:07:36.498 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:57:26 -0400 (0:00:00.249) 0:07:36.748 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 07:57:41 -0400 (0:00:14.347) 0:07:51.095 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 07:57:41 -0400 (0:00:00.149) 0:07:51.244 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988555.8364785, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "124e305c5cc3afea10a8db70e8188132db3682cc", "ctime": 1749988555.8334785, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988555.8334785, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 07:57:42 -0400 (0:00:01.576) 0:07:52.821 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:57:44 -0400 (0:00:01.400) 0:07:54.222 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 07:57:44 -0400 (0:00:00.149) 0:07:54.372 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": 
"absent" }, { "backing_device": "/dev/sda1", "name": "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 07:57:44 -0400 (0:00:00.253) 0:07:54.626 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", 
"_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 07:57:44 -0400 (0:00:00.200) 0:07:54.827 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 07:57:44 -0400 (0:00:00.085) 0:07:54.912 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 07:57:46 -0400 (0:00:01.581) 0:07:56.494 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 07:57:48 -0400 (0:00:01.751) 0:07:58.246 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa" } TASK 
[fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 07:57:49 -0400 (0:00:01.649) 0:07:59.896 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 07:57:50 -0400 (0:00:00.230) 0:08:00.127 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 07:57:51 -0400 (0:00:01.516) 0:08:01.643 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988567.2514565, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "05b6146c662e63d104e41e8ebe0f0d8d61438b0e", "ctime": 1749988560.9424686, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 383778967, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749988560.9414685, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1732951680", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 07:57:53 -0400 (0:00:01.464) 0:08:03.108 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda', 'name': 'luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node14] => (item={'backing_device': '/dev/sda1', 'name': 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "present" } } MSG: line added
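The two loop items above show how /etc/crypttab is reconciled with the new device stack: the entry for luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced, whose whole-disk LUKS format on /dev/sda was destroyed, is removed ("1 line(s) removed"), and an entry for the new luks-31d2f766-d0b1-458a-9b68-cf2015a578aa device backed by /dev/sda1 is added ("line added"). A crypttab entry has the form <name> <backing-device> <key-file>; a key file of "-" means no key file is stored, so the passphrase is requested when the device is unlocked. The role manages the file with its own task logic; purely as an illustration (the lineinfile-based task below is our sketch, not the role's implementation), the "present" item corresponds to something like:

    - name: Ensure crypttab entry for the new LUKS2 device (illustrative sketch)
      ansible.builtin.lineinfile:
        path: /etc/crypttab
        regexp: '^luks-31d2f766-d0b1-458a-9b68-cf2015a578aa\s'
        line: luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /dev/sda1 -
        owner: root
        mode: "0600"  # matches the stat output logged above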
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 07:57:56 -0400 (0:00:03.152) 0:08:06.261 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:241 Sunday 15 June 2025 07:57:58 -0400 (0:00:02.083) 0:08:08.344 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 07:57:59 -0400 (0:00:00.646) 0:08:08.990 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 07:57:59 -0400 (0:00:00.293) 0:08:09.284 *********** skipping: [managed-node14] => {}
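The _storage_pools_list fact printed above is the role's record of the configuration it was asked to converge. Reconstructed from those keys, the storage_pools input driving this part of the test would look roughly like the sketch below; the passphrase variable name is our placeholder, since the real value is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER. The actions list logged earlier shows how blivet satisfied this request: destroy the old whole-disk LUKS format on /dev/sda, write a disklabel, create /dev/sda1, format it as a LUKS2 container, open it as /dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa, and create the xfs filesystem inside.

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            mount_options: defaults
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "{{ luks_test_passphrase }}"  # placeholder; real value hidden by no_log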
TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 07:57:59 -0400 (0:00:00.249) 0:08:09.533 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "size": "10G", "type": "crypt", "uuid": "fe5f760d-f4a2-475e-8776-53dab903eb63" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "31d2f766-d0b1-458a-9b68-cf2015a578aa" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 07:58:00 -0400 (0:00:01.230) 0:08:10.764 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002424", "end": "2025-06-15 07:58:01.880880", "rc": 0, "start": "2025-06-15 07:58:01.878456" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file.
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 07:58:02 -0400 (0:00:01.625) 0:08:12.389 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002327", "end": "2025-06-15 07:58:03.555060", "failed_when_result": false, "rc": 0, "start": "2025-06-15 07:58:03.552733" } STDOUT: luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 07:58:03 -0400 (0:00:01.382) 0:08:13.771 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 07:58:04 -0400 (0:00:00.348) 0:08:14.120 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 07:58:04 -0400 (0:00:00.231) 0:08:14.351 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 07:58:04 -0400 (0:00:00.274) 0:08:14.626 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 07:58:04 -0400 (0:00:00.274) 0:08:14.900 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 07:58:05 -0400 (0:00:00.229) 0:08:15.316 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 07:58:05 -0400 (0:00:00.224) 0:08:15.545 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 07:58:05 -0400 (0:00:00.189) 0:08:15.770 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 07:58:05 -0400 (0:00:00.173) 0:08:15.960 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 07:58:06 -0400 (0:00:00.173) 0:08:16.133 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 07:58:06 -0400 (0:00:00.228) 0:08:16.362 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 07:58:06 -0400 (0:00:00.275) 0:08:16.637 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 07:58:06 -0400 (0:00:00.216) 0:08:16.854 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 07:58:07 -0400 (0:00:00.141) 0:08:16.995 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
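All of the member checks above are skipped because this pool has type "partition" rather than "lvm": there is no volume group, so there are no PVs whose count, type, or size could be verified. The grow-to-fill probe that follows does run; it executes a small Python check on the managed node and treats a non-zero exit code as "not supported" (here it returns rc 1, so the subsequent fill verification has nothing to do). The exact expression is not visible in this log; a plausible probe, assuming blivet exposes the support flag on its LVMPhysicalVolume format class, might look like:

    - name: Probe blivet for PV grow-to-fill support (hypothetical sketch)
      ansible.builtin.command:
        cmd: >-
          python3 -c 'from blivet.formats.lvmpv import LVMPhysicalVolume;
          import sys; sys.exit(0 if hasattr(LVMPhysicalVolume, "grow_to_fill") else 1)'
      register: storage_test_grow_support  # our variable name, for illustration only
      changed_when: false
      failed_when: false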
TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 07:58:07 -0400 (0:00:00.278) 0:08:17.274 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 07:58:08 -0400 (0:00:01.546) 0:08:18.821 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 07:58:09 -0400 (0:00:00.211) 0:08:19.032 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 07:58:09 -0400 (0:00:00.464) 0:08:19.496 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 07:58:09 -0400 (0:00:00.150) 0:08:19.647 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 07:58:09 -0400 (0:00:00.172) 0:08:19.820 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 07:58:10 -0400 (0:00:00.173) 0:08:19.993 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 07:58:10 -0400 (0:00:00.094) 0:08:20.087 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 07:58:10 -0400 (0:00:00.067) 0:08:20.155 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 07:58:10 -0400 (0:00:00.381) 0:08:20.536 *********** skipping:
[managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 07:58:10 -0400 (0:00:00.203) 0:08:20.739 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 07:58:11 -0400 (0:00:00.249) 0:08:20.989 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 07:58:11 -0400 (0:00:00.223) 0:08:21.213 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 07:58:11 -0400 (0:00:00.181) 0:08:21.394 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 07:58:11 -0400 (0:00:00.155) 0:08:21.549 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 07:58:11 -0400 (0:00:00.408) 0:08:21.957 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 07:58:12 -0400 (0:00:00.225) 0:08:22.182 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 07:58:12 -0400 (0:00:00.571) 0:08:22.754 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 07:58:13 -0400 (0:00:00.385) 0:08:23.139 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 07:58:13 -0400 (0:00:00.527) 0:08:23.667 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 07:58:13 -0400 (0:00:00.232) 0:08:23.900 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 07:58:14 -0400 (0:00:00.495) 0:08:24.395 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 07:58:14 -0400 (0:00:00.354) 0:08:24.749 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 07:58:14 -0400 (0:00:00.194) 0:08:24.944 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK 
[Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 07:58:15 -0400 (0:00:00.491) 0:08:25.435 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 07:58:15 -0400 (0:00:00.274) 0:08:25.710 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 07:58:16 -0400 (0:00:00.578) 0:08:26.288 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 07:58:16 -0400 (0:00:00.174) 0:08:26.463 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 07:58:16 -0400 (0:00:00.225) 0:08:26.688 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 07:58:16 -0400 (0:00:00.253) 0:08:26.942 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 07:58:17 -0400 (0:00:00.234) 0:08:27.176 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 07:58:17 -0400 (0:00:00.240) 0:08:27.416 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 07:58:17 -0400 (0:00:00.240) 0:08:27.656 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 07:58:17 -0400 (0:00:00.217) 0:08:27.874 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 07:58:18 -0400 (0:00:00.203) 0:08:28.078 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 07:58:18 -0400 (0:00:00.399) 0:08:28.478 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", 
"md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 07:58:18 -0400 (0:00:00.267) 0:08:28.746 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 07:58:19 -0400 (0:00:00.993) 0:08:29.739 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 07:58:20 -0400 (0:00:00.298) 0:08:30.038 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 07:58:20 -0400 (0:00:00.210) 0:08:30.249 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 07:58:20 -0400 (0:00:00.192) 0:08:30.441 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 07:58:20 -0400 (0:00:00.252) 0:08:30.694 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 07:58:20 -0400 (0:00:00.248) 0:08:30.942 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 07:58:21 -0400 (0:00:00.111) 0:08:31.054 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 07:58:21 -0400 (0:00:00.176) 0:08:31.230 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 07:58:21 -0400 (0:00:00.238) 0:08:31.468 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 07:58:21 -0400 (0:00:00.253) 0:08:31.722 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 07:58:21 -0400 (0:00:00.184) 0:08:31.906 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 07:58:22 -0400 (0:00:00.145) 0:08:32.052 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 07:58:22 -0400 (0:00:00.457) 0:08:32.509 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 07:58:22 -0400 (0:00:00.248) 0:08:32.758 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 07:58:23 -0400 (0:00:00.266) 0:08:33.024 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 07:58:23 -0400 (0:00:00.252) 0:08:33.277 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 07:58:23 -0400 (0:00:00.236) 0:08:33.513 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 07:58:23 -0400 (0:00:00.194) 0:08:33.708 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 07:58:24 -0400 (0:00:00.358) 0:08:34.066 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 07:58:24 -0400 (0:00:00.393) 0:08:34.460 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988660.6472774, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988660.6472774, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 174984, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988660.6472774, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 07:58:26 -0400 (0:00:01.567) 0:08:36.028 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 07:58:26 -0400 (0:00:00.409) 0:08:36.437 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 07:58:26 -0400 (0:00:00.506) 0:08:36.943 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 07:58:27 -0400 (0:00:00.214) 0:08:37.158 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 07:58:27 -0400 (0:00:00.214) 0:08:37.372 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 07:58:27 -0400 (0:00:00.315) 0:08:37.687 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 07:58:28 -0400 (0:00:00.359) 0:08:38.047 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988660.791277, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988660.791277, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 175507, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749988660.791277, "nlink": 1, "path": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 07:58:29 -0400 (0:00:01.633) 0:08:39.681 *********** ok: [managed-node14] => { "changed": false, "rc": 0, 
"results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 07:58:34 -0400 (0:00:04.574) 0:08:44.255 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009982", "end": "2025-06-15 07:58:35.306940", "rc": 0, "start": "2025-06-15 07:58:35.296958" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 31d2f766-d0b1-458a-9b68-cf2015a578aa Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 941171 Threads: 2 Salt: 2f d2 16 df 65 59 17 66 45 ce a0 1d 76 cf 87 13 97 59 ee 2f f9 59 85 fc c5 fa ba f5 ef 87 86 ca AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 17 3d a4 7b c5 af 47 ff 47 cc 7e 41 fd 30 ac 77 7e 19 a6 a8 08 80 4e 01 ed ad 4b 6a f9 41 30 70 Digest: bf 76 fb d7 31 fc 2a b6 6a 06 7b 54 ba d6 52 ae b2 f4 4a 93 e1 9c 06 8e e5 96 88 d4 8b da de 5b TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 07:58:35 -0400 (0:00:01.186) 0:08:45.442 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 07:58:35 -0400 (0:00:00.188) 0:08:45.630 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 07:58:35 -0400 (0:00:00.177) 0:08:45.808 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.194) 0:08:46.003 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.207) 0:08:46.211 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.204) 0:08:46.416 
*********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.173) 0:08:46.589 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.159) 0:08:46.748 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 07:58:36 -0400 (0:00:00.209) 0:08:46.958 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 07:58:37 -0400 (0:00:00.195) 0:08:47.153 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 07:58:37 -0400 (0:00:00.200) 0:08:47.353 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 07:58:37 -0400 (0:00:00.175) 0:08:47.528 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 07:58:37 -0400 (0:00:00.222) 0:08:47.751 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 07:58:37 -0400 (0:00:00.174) 0:08:47.925 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 07:58:38 -0400 
(0:00:00.199) 0:08:48.124 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 07:58:38 -0400 (0:00:00.167) 0:08:48.291 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 07:58:38 -0400 (0:00:00.199) 0:08:48.491 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 07:58:38 -0400 (0:00:00.219) 0:08:48.711 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 07:58:38 -0400 (0:00:00.212) 0:08:48.923 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 07:58:39 -0400 (0:00:00.174) 0:08:49.098 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 07:58:39 -0400 (0:00:00.267) 0:08:49.365 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 07:58:39 -0400 (0:00:00.099) 0:08:49.465 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 07:58:39 -0400 (0:00:00.211) 0:08:49.677 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 07:58:39 -0400 (0:00:00.231) 0:08:49.908 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] 
********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 07:58:40 -0400 (0:00:00.230) 0:08:50.139 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 07:58:40 -0400 (0:00:00.232) 0:08:50.371 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 07:58:40 -0400 (0:00:00.208) 0:08:50.579 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 07:58:40 -0400 (0:00:00.212) 0:08:50.792 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 07:58:41 -0400 (0:00:00.228) 0:08:51.021 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 07:58:41 -0400 (0:00:00.206) 0:08:51.228 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 07:58:41 -0400 (0:00:00.253) 0:08:51.481 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 07:58:41 -0400 (0:00:00.212) 0:08:51.694 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 07:58:41 -0400 (0:00:00.174) 0:08:51.868 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 07:58:42 -0400 (0:00:00.231) 0:08:52.100 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default 
maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 07:58:42 -0400 (0:00:00.253) 0:08:52.353 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 07:58:42 -0400 (0:00:00.179) 0:08:52.532 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 07:58:42 -0400 (0:00:00.249) 0:08:52.782 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 07:58:43 -0400 (0:00:00.214) 0:08:52.996 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 07:58:43 -0400 (0:00:00.163) 0:08:53.159 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 07:58:43 -0400 (0:00:00.213) 0:08:53.373 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 07:58:43 -0400 (0:00:00.257) 0:08:53.631 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 07:58:43 -0400 (0:00:00.233) 0:08:53.864 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 07:58:44 -0400 (0:00:00.279) 0:08:54.143 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 07:58:44 -0400 (0:00:00.229) 0:08:54.372 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 07:58:44 -0400 (0:00:00.237) 0:08:54.610 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 07:58:44 -0400 (0:00:00.251) 0:08:54.861 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 07:58:45 -0400 (0:00:00.310) 0:08:55.172 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 07:58:45 -0400 (0:00:00.181) 0:08:55.353 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 07:58:45 -0400 (0:00:00.179) 0:08:55.532 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 07:58:45 -0400 (0:00:00.208) 0:08:55.741 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 07:58:46 -0400 (0:00:00.259) 0:08:56.000 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 07:58:46 -0400 (0:00:00.322) 0:08:56.323 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 07:58:46 -0400 (0:00:00.269) 0:08:56.593 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 07:58:46 -0400 (0:00:00.250) 0:08:56.843 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 07:58:47 -0400 (0:00:00.701) 0:08:57.545 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 07:58:48 -0400 (0:00:00.421) 0:08:57.966 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 07:58:48 -0400 (0:00:00.363) 0:08:58.330 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 07:58:48 -0400 (0:00:00.183) 0:08:58.513 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 07:58:48 -0400 (0:00:00.215) 0:08:58.728 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 07:58:48 -0400 (0:00:00.111) 0:08:58.840 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 07:58:49 -0400 (0:00:00.225) 0:08:59.065 *********** changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:247 Sunday 15 June 2025 07:58:50 -0400 (0:00:01.447) 0:09:00.512 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 07:58:51 -0400 (0:00:00.686) 0:09:01.199 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 07:58:51 -0400 (0:00:00.177) 0:09:01.377 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:58:51 -0400 (0:00:00.280) 0:09:01.658 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:58:52 -0400 (0:00:00.384) 0:09:02.043 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 07:58:52 -0400 (0:00:00.310) 0:09:02.353 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 07:58:52 -0400 (0:00:00.465) 0:09:02.819 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is 
ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 07:58:53 -0400 (0:00:00.231) 0:09:03.051 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 07:58:53 -0400 (0:00:00.214) 0:09:03.265 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 07:58:53 -0400 (0:00:00.145) 0:09:03.411 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 07:58:53 -0400 (0:00:00.195) 0:09:03.607 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 07:58:54 -0400 (0:00:00.646) 0:09:04.253 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 07:58:59 -0400 (0:00:04.864) 0:09:09.118 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 07:58:59 -0400 (0:00:00.341) 0:09:09.460 *********** ok: [managed-node14] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 07:58:59 -0400 (0:00:00.341) 0:09:09.802 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 07:59:05 -0400 (0:00:05.535) 0:09:15.338 
*********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 07:59:05 -0400 (0:00:00.549) 0:09:15.887 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 07:59:06 -0400 (0:00:00.257) 0:09:16.144 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 07:59:06 -0400 (0:00:00.350) 0:09:16.494 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 07:59:06 -0400 (0:00:00.234) 0:09:16.729 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 07:59:11 -0400 (0:00:04.502) 0:09:21.232 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": 
"plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", 
"status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service": { "name": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service": { "name": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:59:14 -0400 (0:00:03.590) 0:09:24.822 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:59:15 -0400 (0:00:00.451) 0:09:25.273 *********** 
changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2deaaa2312\x2de5d4\x2d4e1d\x2dbde8\x2d152f64419ced.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "name": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eaaa2312-e5d4-4e1d-bde8-152f64419ced ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": 
"18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 07:57:51 EDT", "StateChangeTimestampMonotonic": "1975232583", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": 
"no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...de5d4\x2d4e1d\x2dbde8\x2d152f64419ced.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "name": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not 
set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:59:18 -0400 (0:00:03.517) 0:09:28.791 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 07:59:24 -0400 (0:00:05.593) 0:09:34.384 *********** fatal: [managed-node14]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 07:59:24 -0400 (0:00:00.332) 0:09:34.717 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2deaaa2312\x2de5d4\x2d4e1d\x2dbde8\x2d152f64419ced.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "name": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", 
"IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2deaaa2312\\x2de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", 
"StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...de5d4\x2d4e1d\x2dbde8\x2d152f64419ced.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "name": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de5d4\\x2d4e1d\\x2dbde8\\x2d152f64419ced.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", 
"UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 07:59:28 -0400 (0:00:03.715) 0:09:38.432 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 07:59:28 -0400 (0:00:00.321) 0:09:38.754 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 07:59:29 -0400 (0:00:00.319) 0:09:39.074 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 15 June 2025 07:59:29 -0400 (0:00:00.206) 0:09:39.280 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988730.2671442, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988730.2671442, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749988730.2671442, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2113120513", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 15 June 2025 07:59:30 -0400 (0:00:01.612) 0:09:40.893 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272 Sunday 15 June 2025 07:59:31 -0400 (0:00:00.172) 0:09:41.066 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 07:59:31 -0400 (0:00:00.891) 0:09:41.957 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 07:59:32 -0400 
TASK [Remove the encryption layer] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272
Sunday 15 June 2025 07:59:31 -0400 (0:00:00.172) 0:09:41.066 ***********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 15 June 2025 07:59:31 -0400 (0:00:00.891) 0:09:41.957 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 15 June 2025 07:59:32 -0400 (0:00:00.367) 0:09:42.325 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 15 June 2025 07:59:32 -0400 (0:00:00.228) 0:09:42.553 ***********
skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
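The last entry of blivet_package_list is stored as an unevaluated Jinja2 template: vars files are loaded verbatim, and the expression is only rendered when the list is consumed. On this node it resolves to libblockdev rather than libblockdev-s390, as the lsrpackages line further below confirms. A standalone illustration of the same conditional (the variable name here is made up for the example):

    # vars-file entry in the CentOS_8.yml style; rendered only at use time
    arch_blockdev_package: "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"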
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 15 June 2025 07:59:33 -0400 (0:00:00.483) 0:09:43.036 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 15 June 2025 07:59:33 -0400 (0:00:00.199) 0:09:43.236 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 15 June 2025 07:59:33 -0400 (0:00:00.251) 0:09:43.487 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 15 June 2025 07:59:33 -0400 (0:00:00.247) 0:09:43.734 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 15 June 2025 07:59:33 -0400 (0:00:00.219) 0:09:43.954 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 15 June 2025 07:59:35 -0400 (0:00:01.226) 0:09:45.181 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 15 June 2025 07:59:39 -0400 (0:00:04.468) 0:09:49.649 ***********
ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
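This storage_pools value is the whole input for the re-run: the same pool and volume as before, but with encryption: false, which asks the role to strip the LUKS layer while preserving the data. Reconstructed as a playbook step (the include_role wrapper is an assumption; only the variable values come from the output above):

    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo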
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 15 June 2025 07:59:39 -0400 (0:00:00.277) 0:09:49.927 ***********
ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 15 June 2025 07:59:40 -0400 (0:00:00.248) 0:09:50.175 ***********
ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 15 June 2025 07:59:45 -0400 (0:00:05.316) 0:09:55.492 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 15 June 2025 07:59:45 -0400 (0:00:00.376) 0:09:55.868 ***********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Sunday 15 June 2025 07:59:46 -0400 (0:00:00.212) 0:09:56.080 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Sunday 15 June 2025 07:59:46 -0400 (0:00:00.284) 0:09:56.365 ***********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Sunday 15 June 2025 07:59:46 -0400 (0:00:00.195) 0:09:56.561 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Sunday 15 June 2025 07:59:51 -0400 (0:00:04.485) 0:10:01.046 ***********
ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": {
"name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": 
"systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service": { "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service": { "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { 
"name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 07:59:53 -0400 (0:00:02.742) 0:10:03.789 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 07:59:54 -0400 (0:00:00.331) 0:10:04.121 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d31d2f766\x2dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", 
"CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-31d2f766-d0b1-458a-9b68-cf2015a578aa ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 07:59:18 EDT", "StateChangeTimestampMonotonic": "2062328488", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", 
"ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 07:59:57 -0400 (0:00:03.492) 0:10:07.614 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", 
"/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:00:03 -0400 (0:00:05.968) 0:10:13.583 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:00:03 -0400 (0:00:00.339) 0:10:13.922 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988669.6542602, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d7ff5a27876abd6c491c4f695cd178e182655df9", "ctime": 1749988669.6512601, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988669.6512601, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": 
"2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:00:05 -0400 (0:00:01.184) 0:10:15.107 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:00:06 -0400 (0:00:01.455) 0:10:16.562 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d31d2f766\x2dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 07:59:18 EDT", "StateChangeTimestampMonotonic": "2062328488", "StateDirectoryMode": "0755", "StatusErrno": "0", 
"StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": 
"infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", 
"UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:00:10 -0400 (0:00:04.307) 0:10:20.869 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:00:11 -0400 (0:00:00.223) 0:10:21.093 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:00:11 -0400 (0:00:00.266) 0:10:21.359 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:00:11 -0400 (0:00:00.261) 0:10:21.621 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-31d2f766-d0b1-458a-9b68-cf2015a578aa" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:00:13 -0400 (0:00:01.591) 0:10:23.212 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:00:14 -0400 (0:00:01.637) 0:10:24.850 *********** changed: [managed-node14] => (item={'src': 
'UUID=c8854512-63fe-486c-b1ad-1597284a9216', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:00:16 -0400 (0:00:01.235) 0:10:26.086 *********** skipping: [managed-node14] => (item={'src': 'UUID=c8854512-63fe-486c-b1ad-1597284a9216', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:00:16 -0400 (0:00:00.219) 0:10:26.305 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:00:17 -0400 (0:00:01.349) 0:10:27.655 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988683.5542336, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "df0c797e25a073dccd3692ccc4ebe18e17a31e8d", "ctime": 1749988676.030248, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 499122334, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749988676.029248, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3852494827", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:00:18 -0400 (0:00:01.267) 0:10:28.922 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda1', 'name': 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": 
"luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:00:20 -0400 (0:00:01.188) 0:10:30.111 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Sunday 15 June 2025 08:00:21 -0400 (0:00:01.475) 0:10:31.587 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:00:22 -0400 (0:00:00.625) 0:10:32.213 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:00:22 -0400 (0:00:00.227) 0:10:32.440 *********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:00:22 -0400 (0:00:00.241) 0:10:32.682 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "c8854512-63fe-486c-b1ad-1597284a9216" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:00:23 -0400 (0:00:01.267) 0:10:33.949 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002204", "end": "2025-06-15 08:00:24.871430", "rc": 0, "start": "2025-06-15 08:00:24.869226" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=c8854512-63fe-486c-b1ad-1597284a9216 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:00:25 -0400 (0:00:01.083) 0:10:35.033 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002428", "end": "2025-06-15 08:00:25.923840", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:00:25.921412" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:00:26 -0400 (0:00:01.110) 0:10:36.143 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:00:26 -0400 (0:00:00.342) 0:10:36.486 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:00:26 -0400 (0:00:00.129) 0:10:36.615 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:00:26 -0400 (0:00:00.238) 0:10:36.854 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:00:26 -0400 (0:00:00.056) 0:10:36.911 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:00:27 -0400 (0:00:00.181) 0:10:37.093 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:00:27 -0400 (0:00:00.066) 0:10:37.159 *********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:00:27 -0400 (0:00:00.158) 0:10:37.318 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:00:27 -0400 (0:00:00.213) 0:10:37.532 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:00:27 -0400 (0:00:00.408) 0:10:37.940 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:00:28 -0400 (0:00:00.092) 0:10:38.032 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:00:28 -0400 (0:00:00.218) 0:10:38.251 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:00:28 -0400 (0:00:00.246) 0:10:38.498 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:00:28 -0400 (0:00:00.175) 0:10:38.673 *********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 08:00:28 -0400 
(0:00:00.175) 0:10:38.848 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:00:30 -0400 (0:00:01.505) 0:10:40.354 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:00:30 -0400 (0:00:00.200) 0:10:40.555 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:00:31 -0400 (0:00:00.479) 0:10:41.034 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:00:31 -0400 (0:00:00.226) 0:10:41.261 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:00:31 -0400 (0:00:00.123) 0:10:41.384 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:00:31 -0400 (0:00:00.262) 0:10:41.647 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:00:31 -0400 (0:00:00.269) 0:10:41.916 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:00:32 -0400 (0:00:00.257) 0:10:42.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:00:32 -0400 (0:00:00.285) 0:10:42.460 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:00:32 -0400 (0:00:00.308) 0:10:42.768 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:00:32 -0400 (0:00:00.195) 0:10:42.964 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:00:33 -0400 (0:00:00.214) 0:10:43.178 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:00:33 -0400 (0:00:00.344) 0:10:43.522 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:00:33 -0400 (0:00:00.282) 0:10:43.805 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:00:34 -0400 (0:00:00.440) 0:10:44.245 *********** skipping: [managed-node14] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=c8854512-63fe-486c-b1ad-1597284a9216', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", 
"_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:00:34 -0400 (0:00:00.235) 0:10:44.481 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:00:35 -0400 (0:00:00.521) 0:10:45.002 *********** skipping: [managed-node14] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=c8854512-63fe-486c-b1ad-1597284a9216', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:00:35 -0400 (0:00:00.205) 0:10:45.208 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:00:35 -0400 (0:00:00.575) 0:10:45.784 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:00:36 -0400 (0:00:00.248) 0:10:46.032 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:00:36 -0400 (0:00:00.206) 0:10:46.238 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:00:36 -0400 (0:00:00.210) 0:10:46.449 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:00:36 -0400 (0:00:00.247) 0:10:46.696 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:00:37 -0400 (0:00:00.533) 0:10:47.230 *********** skipping: [managed-node14] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 
'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=c8854512-63fe-486c-b1ad-1597284a9216', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:00:37 -0400 (0:00:00.243) 0:10:47.473 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:00:37 -0400 (0:00:00.444) 0:10:47.918 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:00:38 -0400 (0:00:00.305) 0:10:48.223 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 08:00:38 -0400 (0:00:00.186) 0:10:48.410 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 08:00:38 -0400 (0:00:00.173) 0:10:48.583 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 08:00:38 -0400 (0:00:00.239) 0:10:48.823 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 08:00:39 -0400 (0:00:00.257) 0:10:49.081 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 08:00:39 -0400 (0:00:00.205) 0:10:49.286 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 08:00:39 -0400 (0:00:00.218) 0:10:49.505 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 08:00:39 -0400 (0:00:00.263) 0:10:49.769 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 08:00:40 -0400 (0:00:01.004) 0:10:50.773 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 08:00:41 -0400 (0:00:00.240) 0:10:51.014 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:00:42 -0400 (0:00:01.275) 0:10:52.290 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:00:42 -0400 (0:00:00.351) 0:10:52.641 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:00:42 -0400 (0:00:00.321) 0:10:52.963 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:00:43 -0400 (0:00:00.255) 0:10:53.219 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:00:43 -0400 (0:00:00.335) 0:10:53.555 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:00:43 -0400 (0:00:00.200) 0:10:53.755 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:00:44 -0400 (0:00:00.215) 0:10:53.971 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
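NOTE: The mount verification above reduces to asserting that exactly one mount of the expected device exists at the expected path. A simplified equivalent built on gathered facts is sketched below; it is an illustration under that assumption, not the test's actual task list, and the fail_msg wording is invented.

    - name: Verify the current mount state by device (simplified sketch)
      ansible.builtin.assert:
        that:
          # exactly one entry in the facts mount table pairs
          # /dev/sda1 with /opt/test1
          - >-
            ansible_facts.mounts
            | selectattr('device', 'equalto', '/dev/sda1')
            | selectattr('mount', 'equalto', '/opt/test1')
            | list | length == 1
        fail_msg: /dev/sda1 is not mounted on /opt/test1
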
****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:00:44 -0400 (0:00:00.194) 0:10:54.166 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 08:00:44 -0400 (0:00:00.264) 0:10:54.431 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:00:44 -0400 (0:00:00.129) 0:10:54.560 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:00:44 -0400 (0:00:00.207) 0:10:54.767 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:00:45 -0400 (0:00:00.208) 0:10:54.976 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=c8854512-63fe-486c-b1ad-1597284a9216 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:00:45 -0400 (0:00:00.325) 0:10:55.410 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:00:45 -0400 (0:00:00.263) 0:10:55.735 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:00:46 -0400 (0:00:00.263) 0:10:55.999 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
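The fstab checks above work from pre-computed match lists (the storage_test_fstab_* facts shown under "Set some variables for fstab checking"); each assertion then plausibly compares a match count against its expected count, roughly as follows (a sketch, not the test's verbatim source):

- name: Verify that the device identifier appears in /etc/fstab
  assert:
    that:
      - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
    msg: Device identifier not found in /etc/fstab

- name: Verify the fstab mount point
  assert:
    that:
      - storage_test_fstab_mount_point_matches | length == storage_test_fstab_expected_mount_point_matches | int
    msg: Mount point not found in /etc/fstab

TASK [Verify fingerprint] ****************************************************** task path: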
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:00:46 -0400 (0:00:00.289) 0:10:56.289 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:00:46 -0400 (0:00:00.278) 0:10:56.567 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:00:46 -0400 (0:00:00.306) 0:10:56.874 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:00:47 -0400 (0:00:00.289) 0:10:57.163 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:00:47 -0400 (0:00:00.269) 0:10:57.433 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988803.3330045, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988803.3330045, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 174984, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988803.3330045, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:00:48 -0400 (0:00:01.234) 0:10:58.667 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:00:49 -0400 (0:00:00.343) 0:10:59.010 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
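The device-node check above amounts to a stat of the expected path followed by an assertion on the result; a minimal sketch (the register name is assumed, not the test's verbatim source):

- name: See whether the device node is present
  stat:
    path: "{{ storage_test_device_path }}"
    follow: true
  register: storage_test_dev

- name: Verify the presence/absence of the device node
  assert:
    that:
      - storage_test_dev.stat.exists and storage_test_dev.stat.isblk
    msg: Expected block device {{ storage_test_device_path }} to be present

TASK [Make sure we got info about this volume] ********************************* task path: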
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:00:49 -0400 (0:00:00.334) 0:10:59.345 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:00:49 -0400 (0:00:00.169) 0:10:59.514 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:00:49 -0400 (0:00:00.234) 0:10:59.749 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:00:49 -0400 (0:00:00.149) 0:10:59.898 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:00:50 -0400 (0:00:00.202) 0:11:00.100 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:00:50 -0400 (0:00:00.126) 0:11:00.226 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:00:54 -0400 (0:00:04.626) 0:11:04.852 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.175) 0:11:05.028 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.098) 0:11:05.126 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
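The LUKS-specific checks here skip because, at this point in the test, the volume under verification is unencrypted (hence the raw-device assertion above). When they do run, collecting LUKS info plausibly comes down to a cryptsetup luksDump guarded on the volume's encryption flag; a sketch with assumed variable and register names (storage_test_volume, luks_dump):

- name: Collect LUKS info for this volume
  command: cryptsetup luksDump {{ storage_test_device_path }}
  register: luks_dump
  changed_when: false
  when: storage_test_volume.encryption | d(false)

- name: Check LUKS version
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+2')
  when:
    - storage_test_volume.encryption | d(false)
    - storage_test_volume.encryption_luks_version | d('') == 'luks2'

TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.161) 0:11:05.288 *********** skipping: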
[managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.172) 0:11:05.461 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.127) 0:11:05.588 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.180) 0:11:05.769 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:00:55 -0400 (0:00:00.154) 0:11:05.923 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:00:56 -0400 (0:00:00.199) 0:11:06.123 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:00:56 -0400 (0:00:00.258) 0:11:06.382 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:00:56 -0400 (0:00:00.204) 0:11:06.586 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:00:56 -0400 (0:00:00.256) 0:11:06.843 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:00:57 -0400 (0:00:00.198) 0:11:07.041 *********** skipping: [managed-node14] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:00:57 -0400 (0:00:00.313) 0:11:07.355 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:00:57 -0400 (0:00:00.304) 0:11:07.659 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:00:57 -0400 (0:00:00.223) 0:11:07.882 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:00:58 -0400 (0:00:00.272) 0:11:08.155 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:00:58 -0400 (0:00:00.201) 0:11:08.357 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:00:58 -0400 (0:00:00.259) 0:11:08.616 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:00:58 -0400 (0:00:00.177) 0:11:08.794 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:00:59 -0400 (0:00:00.271) 0:11:09.066 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:00:59 -0400 (0:00:00.247) 0:11:09.314 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] 
********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:00:59 -0400 (0:00:00.307) 0:11:09.621 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:00:59 -0400 (0:00:00.193) 0:11:09.814 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:01:00 -0400 (0:00:00.695) 0:11:10.509 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:01:00 -0400 (0:00:00.341) 0:11:10.851 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:01:01 -0400 (0:00:00.228) 0:11:11.079 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:01:01 -0400 (0:00:00.196) 0:11:11.276 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:01:01 -0400 (0:00:00.295) 0:11:11.571 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:01:01 -0400 (0:00:00.224) 0:11:11.796 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:01:02 -0400 (0:00:00.341) 0:11:12.137 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:01:02 -0400 (0:00:00.252) 0:11:12.390 *********** skipping: [managed-node14] => {}
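The size checks in this file parse the requested size ("4g" for this volume) into bytes before comparing it with what the device reports; a sketch of the idea using Ansible's human_to_bytes filter (the fact names and the .bytes attribute on the actual-size result are assumptions):

- name: Parse the requested size of the volume
  set_fact:
    storage_test_requested_size: "{{ storage_test_volume.size | human_to_bytes }}"
  when: storage_test_volume.size is defined

- name: Assert expected size is actual size
  assert:
    that:
      - (storage_test_actual_size.bytes | int) == (storage_test_requested_size | int)
  when: storage_test_requested_size is defined

TASK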
[Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:01:02 -0400 (0:00:00.400) 0:11:12.790 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:01:03 -0400 (0:00:00.197) 0:11:12.987 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:01:03 -0400 (0:00:00.296) 0:11:13.283 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:01:03 -0400 (0:00:00.194) 0:11:13.477 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:01:03 -0400 (0:00:00.129) 0:11:13.607 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:01:03 -0400 (0:00:00.232) 0:11:13.839 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:01:04 -0400 (0:00:00.310) 0:11:14.150 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:01:04 -0400 (0:00:00.260) 0:11:14.411 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:01:04 -0400 (0:00:00.171) 0:11:14.582 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:01:04 -0400 
(0:00:00.285) 0:11:14.867 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.122) 0:11:14.990 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.214) 0:11:15.205 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.181) 0:11:15.387 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.163) 0:11:15.551 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.134) 0:11:15.685 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 08:01:05 -0400 (0:00:00.115) 0:11:15.801 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:01:06 -0400 (0:00:00.257) 0:11:16.058 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:01:06 -0400 (0:00:00.175) 0:11:16.234 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:01:06 -0400 (0:00:00.150) 0:11:16.384 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task 
path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:01:06 -0400 (0:00:00.237) 0:11:16.622 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:01:06 -0400 (0:00:00.134) 0:11:16.756 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:01:07 -0400 (0:00:00.325) 0:11:17.082 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:01:07 -0400 (0:00:00.239) 0:11:17.322 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:01:07 -0400 (0:00:00.125) 0:11:17.447 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:01:07 -0400 (0:00:00.257) 0:11:17.705 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:01:08 -0400 (0:00:00.271) 0:11:17.976 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:01:08 -0400 (0:00:00.297) 0:11:18.274 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:01:08 -0400 (0:00:00.232) 0:11:18.506 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:01:08 -0400 (0:00:00.196) 0:11:18.703 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, 
"storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 08:01:09 -0400 (0:00:00.273) 0:11:18.976 *********** changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:296 Sunday 15 June 2025 08:01:10 -0400 (0:00:01.261) 0:11:20.237 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 08:01:11 -0400 (0:00:00.798) 0:11:21.036 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 08:01:11 -0400 (0:00:00.207) 0:11:21.243 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:01:11 -0400 (0:00:00.380) 0:11:21.623 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:01:11 -0400 (0:00:00.336) 0:11:21.960 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:01:12 -0400 (0:00:00.298) 0:11:22.259 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:01:13 -0400 (0:00:01.109) 0:11:23.369 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:01:13 -0400 (0:00:00.246) 0:11:23.615 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:01:13 -0400 (0:00:00.176) 0:11:23.791 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:01:14 -0400 (0:00:00.262) 0:11:24.054 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:01:14 -0400 (0:00:00.182) 0:11:24.237 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:01:14 -0400 (0:00:00.548) 0:11:24.785 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:01:19 -0400 (0:00:04.860) 0:11:29.646 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": 
"/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:01:19 -0400 (0:00:00.195) 0:11:29.841 *********** ok: [managed-node14] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:01:20 -0400 (0:00:00.298) 0:11:30.140 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:01:25 -0400 (0:00:05.366) 0:11:35.507 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:01:25 -0400 (0:00:00.269) 0:11:35.776 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:01:25 -0400 (0:00:00.080) 0:11:35.857 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:01:26 -0400 (0:00:00.210) 0:11:36.067 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:01:26 -0400 (0:00:00.150) 0:11:36.218 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:01:30 -0400 (0:00:04.485) 0:11:40.703 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": 
"sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service": { "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service": { "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", 
"status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:01:33 -0400 (0:00:02.868) 0:11:43.572 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:01:33 -0400 (0:00:00.338) 0:11:43.911 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d31d2f766\x2dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-31d2f766-d0b1-458a-9b68-cf2015a578aa", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-31d2f766-d0b1-458a-9b68-cf2015a578aa /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-31d2f766-d0b1-458a-9b68-cf2015a578aa ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 07:59:18 EDT", "StateChangeTimestampMonotonic": "2062328488", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": 
"success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:01:36 -0400 (0:00:03.012) 0:11:46.923 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 08:01:42 -0400 (0:00:05.531) 0:11:52.455 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:01:42 -0400 (0:00:00.103) 0:11:52.558 *********** changed: [managed-node14] => 
(item=systemd-cryptsetup@luks\x2d31d2f766\x2dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d31d2f766\\x2dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...dd0b1\x2d458a\x2d9b68\x2dcf2015a578aa.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "name": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dd0b1\\x2d458a\\x2d9b68\\x2dcf2015a578aa.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 08:01:46 -0400 (0:00:03.637) 0:11:56.195 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 08:01:46 -0400 (0:00:00.422) 0:11:56.617 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Sunday 15 June 2025 08:01:47 -0400 (0:00:00.371) 0:11:56.989 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Sunday 15 June 2025 08:01:47 -0400 (0:00:00.117) 0:11:57.107 ***********
ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988869.9218771, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988869.9218771, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749988869.9218771, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1559456819", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Sunday 15 June 2025 08:01:48 -0400 (0:00:01.659) 0:11:58.766 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
TASK [Create a key file] *******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:323
Sunday 15 June 2025 08:01:49 -0400 (0:00:00.278) 0:11:59.045 ***********
ok: [managed-node14] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testtpprf78plukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Write the key into the key file] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:330
Sunday 15 June 2025 08:01:51 -0400 (0:00:02.702) 0:12:01.747 ***********
ok: [managed-node14] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testtpprf78plukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1749988912.129629-159238-204337525099590/source", "state": "file", "uid": 0 }
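Having confirmed that the failed run left /opt/test1/quux intact, the two tasks above prepare key-file unlocking instead of a passphrase: a root-owned file with mode 0600 is created (the random suffix suggests the tempfile module), 32 bytes of key material are written into it, and the path is handed to the role as encryption_key in the next invocation (echoed by Show storage_pools below). A rough sketch of that preparation, with the key content as a hypothetical variable and the path fixed for clarity:

```yaml
# Sketch only: create a private key file and fill it with key material.
# The real test appears to use a tempfile-generated path; both the fixed
# path and `luks_key_material` are illustrative assumptions.
- name: Create a key file
  file:
    path: /tmp/storage_testtpprf78plukskey
    state: touch
    mode: "0600"
    owner: root
    group: root

- name: Write the key into the key file
  copy:
    dest: /tmp/storage_testtpprf78plukskey
    content: "{{ luks_key_material }}"
    mode: "0600"
```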
TASK [Add encryption to the volume] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:337
Sunday 15 June 2025 08:01:55 -0400 (0:00:03.464) 0:12:05.211 ***********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 15 June 2025 08:01:55 -0400 (0:00:00.343) 0:12:05.555 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 15 June 2025 08:01:55 -0400 (0:00:00.270) 0:12:05.825 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 15 June 2025 08:01:56 -0400 (0:00:00.213) 0:12:06.038 ***********
skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 15 June 2025 08:01:56 -0400 (0:00:00.573) 0:12:06.611 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 15 June 2025 08:01:56 -0400 (0:00:00.198) 0:12:06.810 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 15 June 2025 08:01:57 -0400 (0:00:00.234) 0:12:07.044 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 15 June 2025 08:01:57 -0400 (0:00:00.187) 0:12:07.231 ***********
ok: [managed-node14] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:01:57 -0400 (0:00:00.182) 0:12:07.413 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:01:57 -0400 (0:00:00.494) 0:12:07.908 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:02:02 -0400 (0:00:04.881) 0:12:12.789 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testtpprf78plukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:02:03 -0400 (0:00:00.313) 0:12:13.102 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:02:03 -0400 (0:00:00.187) 0:12:13.290 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:02:08 -0400 (0:00:04.785) 0:12:18.076 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:02:08 -0400 (0:00:00.304) 0:12:18.380 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:02:08 -0400 (0:00:00.194) 0:12:18.575 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:02:08 -0400 (0:00:00.227) 0:12:18.803 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:02:09 -0400 (0:00:00.205) 0:12:19.009 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:02:13 -0400 (0:00:04.912) 0:12:23.921 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", 
"status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": 
"systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:02:16 -0400 (0:00:02.849) 0:12:26.771 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:02:17 -0400 (0:00:00.340) 0:12:27.111 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:02:17 -0400 (0:00:00.169) 0:12:27.280 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:02:31 -0400 (0:00:13.721) 0:12:41.001 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:02:31 -0400 (0:00:00.251) 0:12:41.253 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988815.8439806, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e1558727c899fce0d160c8a7842d848fcb57a989", "ctime": 1749988815.8409805, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988815.8409805, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:02:32 -0400 (0:00:01.268) 0:12:42.521 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : 
Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:02:34 -0400 (0:00:01.589) 0:12:44.111 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:02:34 -0400 (0:00:00.164) 0:12:44.276 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } }
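
For reference, a storage_pools definition consistent with the blivet_output above would look roughly like the following sketch. It is reconstructed from the logged pool and volume facts rather than copied from tests_luks2.yml, and the key value is a placeholder for the no_log parameter:

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_key: "<value not logged>"  # placeholder; the run passed this via no_log
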
TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:02:34 -0400 (0:00:00.314) 0:12:44.591 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:02:34 -0400 (0:00:00.270) 0:12:44.861 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:02:35 -0400 (0:00:00.263) 0:12:45.125 *********** changed: [managed-node14] => (item={'src': 'UUID=c8854512-63fe-486c-b1ad-1597284a9216', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=c8854512-63fe-486c-b1ad-1597284a9216" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:02:37 -0400 (0:00:01.876) 0:12:47.002 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} }
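
The next task applies the "mounted" entry from the mounts list above. Outside the role, an equivalent standalone task built from the logged values would be roughly the following sketch (an illustration, not the role's actual task file):

    - name: Mount the LUKS-backed filesystem (illustrative sketch)
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857
        fstype: xfs
        opts: defaults
        state: mounted
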
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:02:38 -0400 (0:00:01.832) 0:12:48.834 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:02:40 -0400 (0:00:01.605) 0:12:50.440 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:02:40 -0400 (0:00:00.276) 0:12:50.717 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:02:42 -0400 (0:00:01.881) 0:12:52.598 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988825.9229612, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749988819.9229727, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 127926472, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1749988819.9219728, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2847224760", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
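
The task that follows writes one /etc/crypttab line per entry in the crypts list, using the crypttab(5) field order <name> <backing device> <key>. An equivalent standalone sketch with lineinfile is shown below; it is an illustration only (the role's own implementation differs), and "none", which makes the unit prompt for a passphrase at boot, stands in for the no_log key the run actually supplied:

    - name: Add the LUKS mapping to /etc/crypttab (illustrative sketch)
      lineinfile:
        path: /etc/crypttab
        regexp: '^luks-0a10fa1a-6547-4337-bcdf-c5411995e857 '
        line: "luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /dev/sda1 none"
        owner: root
        mode: '0600'
        create: yes
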
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:02:44 -0400 (0:00:01.522) 0:12:54.121 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda1', 'name': 'luks-0a10fa1a-6547-4337-bcdf-c5411995e857', 'password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:02:45 -0400 (0:00:01.457) 0:12:55.578 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:355 Sunday 15 June 2025 08:02:47 -0400 (0:00:02.131) 0:12:57.710 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:02:48 -0400 (0:00:00.462) 0:12:58.172 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:02:48 -0400 (0:00:00.279) 0:12:58.504 *********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:02:48 -0400 (0:00:00.279) 0:12:58.783 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "size": "10G", "type": "crypt", "uuid": "8cc91050-cfc1-4205-aafe-11e9f67da1bb" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "0a10fa1a-6547-4337-bcdf-c5411995e857" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:02:50 -0400 (0:00:01.514) 0:13:00.298 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002197", "end": "2025-06-15 08:02:51.479407", "rc": 0, "start": "2025-06-15 08:02:51.477210" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:02:51 -0400 (0:00:01.364) 0:13:01.662 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002391", "end": "2025-06-15 08:02:52.642283", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:02:52.639892" } STDOUT: luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:02:52 -0400 (0:00:01.207) 0:13:02.870 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:02:53 -0400 (0:00:00.457) 0:13:03.327 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:02:53 -0400 (0:00:00.303) 0:13:03.631 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:02:53 -0400 (0:00:00.231) 0:13:03.863 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:02:54 -0400 (0:00:00.251) 0:13:04.114 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for 
managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:02:54 -0400 (0:00:00.513) 0:13:04.627 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:02:54 -0400 (0:00:00.234) 0:13:04.861 *********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:02:55 -0400 (0:00:00.215) 0:13:05.077 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:02:55 -0400 (0:00:00.277) 0:13:05.355 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:02:55 -0400 (0:00:00.249) 0:13:05.605 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:02:55 -0400 (0:00:00.287) 0:13:05.892 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:02:56 -0400 (0:00:00.283) 0:13:06.176 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:02:56 -0400 (0:00:00.266) 0:13:06.442 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:02:56 -0400 (0:00:00.230) 0:13:06.673 *********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 
June 2025 08:02:56 -0400 (0:00:00.101) 0:13:06.774 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:02:58 -0400 (0:00:01.871) 0:13:08.645 *********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:02:58 -0400 (0:00:00.176) 0:13:08.822 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:02:59 -0400 (0:00:00.369) 0:13:09.191 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:02:59 -0400 (0:00:00.272) 0:13:09.464 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:02:59 -0400 (0:00:00.142) 0:13:09.607 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:02:59 -0400 (0:00:00.168) 0:13:09.776 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:02:59 -0400 (0:00:00.133) 0:13:09.909 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.149) 0:13:10.059 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.223) 0:13:10.282 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task 
path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.080) 0:13:10.363 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.119) 0:13:10.482 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.222) 0:13:10.704 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.130) 0:13:10.835 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:03:00 -0400 (0:00:00.109) 0:13:10.944 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:03:01 -0400 (0:00:00.376) 0:13:11.321 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", 
"storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:03:01 -0400 (0:00:00.240) 0:13:11.561 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:03:02 -0400 (0:00:00.435) 0:13:11.997 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:03:02 -0400 (0:00:00.246) 0:13:12.243 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:03:02 -0400 (0:00:00.421) 0:13:12.664 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:03:03 -0400 (0:00:00.326) 0:13:12.991 *********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:03:03 -0400 (0:00:00.121) 0:13:13.112 *********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:03:03 -0400 (0:00:00.229) 0:13:13.342 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:03:03 -0400 (0:00:00.257) 0:13:13.599 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:03:03 -0400 (0:00:00.356) 0:13:13.956 *********** skipping: [managed-node14] => (item={'encryption': True, 'encryption_cipher': None, 
'encryption_key': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:03:04 -0400 (0:00:00.287) 0:13:14.243 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:03:04 -0400 (0:00:00.470) 0:13:14.714 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:03:04 -0400 (0:00:00.190) 0:13:14.905 *********** skipping: [managed-node14] => {} 
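NOTE: every check in this Stratis verification block is gated on the pool actually being a Stratis pool, which is why each task skips with "Conditional result was False" for this LUKS-on-partition pool. A minimal sketch of the guard pattern (task, command, and variable names here are illustrative, not copied from verify-pool-stratis.yml):

    - name: Get stratis pool information
      command: stratis report                    # hypothetical probe; needs stratis-cli
      register: storage_test_stratis_report
      when: storage_test_pool.type == "stratis"  # assumed guard; false here, hence the skip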
TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Sunday 15 June 2025 08:03:05 -0400 (0:00:00.163) 0:13:15.068 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the pools were created] **************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Sunday 15 June 2025 08:03:05 -0400 (0:00:00.208) 0:13:15.276 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Sunday 15 June 2025 08:03:05 -0400 (0:00:00.232) 0:13:15.509 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Sunday 15 June 2025 08:03:05 -0400 (0:00:00.253) 0:13:15.762 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Sunday 15 June 2025 08:03:06 -0400 (0:00:00.251) 0:13:16.013 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102
Sunday 15 June 2025 08:03:06 -0400 (0:00:00.164) 0:13:16.178 ***********
ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Sunday 15 June 2025 08:03:06 -0400 (0:00:00.139) 0:13:16.317 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 15 June 2025 08:03:06 -0400 (0:00:00.312) 0:13:16.630 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 15 June 2025 08:03:06 -0400 (0:00:00.172) 0:13:16.803 ***********
included:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:03:08 -0400 (0:00:01.481) 0:13:18.285 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:03:08 -0400 (0:00:00.297) 0:13:18.582 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:03:08 -0400 (0:00:00.203) 0:13:18.786 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:03:09 -0400 (0:00:00.315) 0:13:19.102 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:03:09 -0400 (0:00:00.269) 0:13:19.371 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:03:09 -0400 (0:00:00.267) 0:13:19.638 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Sunday 15 June 2025 08:03:09 -0400 (0:00:00.272) 0:13:19.910 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Sunday 15 June 2025 08:03:10 -0400 (0:00:00.275) 0:13:20.186 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Sunday 15 June 2025 08:03:10 -0400 (0:00:00.156) 0:13:20.342 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Sunday 15 June 2025 08:03:10 -0400 (0:00:00.211) 0:13:20.554 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Sunday 15 June 2025 08:03:10 -0400 (0:00:00.218) 0:13:20.772 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Sunday 15 June 2025 08:03:11 -0400 (0:00:00.218) 0:13:20.991 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Sunday 15 June 2025 08:03:11 -0400 (0:00:00.517) 0:13:21.508 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Sunday 15 June 2025 08:03:11 -0400 (0:00:00.291) 0:13:21.800 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed
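NOTE: taken together, the three fstab match lists above describe a single /etc/fstab line; reconstructed from the matched fragments it reads as follows (the trailing "0 0" reflects the volume's mount_check and mount_passno values of 0):

    /dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /opt/test1 xfs defaults 0 0

TASK [Verify mount_options] ****************************************************
task path: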
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:03:12 -0400 (0:00:00.235) 0:13:22.036 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:03:12 -0400 (0:00:00.172) 0:13:22.208 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:03:12 -0400 (0:00:00.170) 0:13:22.378 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:03:12 -0400 (0:00:00.154) 0:13:22.533 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:03:12 -0400 (0:00:00.274) 0:13:22.808 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:03:13 -0400 (0:00:00.371) 0:13:23.180 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988950.5177236, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988950.5177236, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 174984, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749988950.5177236, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:03:14 -0400 (0:00:01.486) 0:13:24.667 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:03:14 -0400 (0:00:00.287) 0:13:24.954 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:03:15 -0400 (0:00:00.316) 0:13:25.271 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:03:15 -0400 (0:00:00.311) 0:13:25.582 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:03:15 -0400 (0:00:00.226) 0:13:25.809 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:03:16 -0400 (0:00:00.370) 0:13:26.179 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:03:16 -0400 (0:00:00.306) 0:13:26.486 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988950.6597233, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749988950.6597233, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 207410, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749988950.6597233, "nlink": 1, "path": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:03:18 -0400 (0:00:01.534) 0:13:28.021 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:03:23 -0400 (0:00:05.304) 0:13:33.325 *********** ok: 
[managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009664", "end": "2025-06-15 08:03:24.404566", "rc": 0, "start": "2025-06-15 08:03:24.394902" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 0a10fa1a-6547-4337-bcdf-c5411995e857 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 948582 Threads: 2 Salt: 34 f5 1c 13 a6 d0 36 6c 0a cb d4 98 c3 05 1c 7f cc 7c 66 49 11 ad 91 a0 22 1c b5 3a ab 9f c1 8a AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: bd 82 0e 32 f9 eb ca d5 c9 84 f1 de a9 30 f6 35 d6 7c b8 85 e6 63 c5 48 7e 9d 02 cf 49 58 ef 72 Digest: 63 b8 67 04 fd 73 9f a9 7a 37 22 4b ec ab c8 99 dc 2c 64 8c 80 84 81 bd f2 30 7a 4b 44 87 60 3c TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:03:24 -0400 (0:00:01.381) 0:13:34.707 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:03:24 -0400 (0:00:00.254) 0:13:34.962 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:03:25 -0400 (0:00:00.279) 0:13:35.241 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:03:25 -0400 (0:00:00.319) 0:13:35.560 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:03:25 -0400 (0:00:00.259) 0:13:35.819 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:03:26 -0400 (0:00:00.367) 0:13:36.187 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:03:26 -0400 
(0:00:00.281) 0:13:36.469 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:03:26 -0400 (0:00:00.306) 0:13:36.775 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:03:27 -0400 (0:00:00.393) 0:13:37.169 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:03:27 -0400 (0:00:00.398) 0:13:37.567 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:03:27 -0400 (0:00:00.239) 0:13:37.807 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:03:28 -0400 (0:00:00.322) 0:13:38.129 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:03:28 -0400 (0:00:00.357) 0:13:38.487 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:03:28 -0400 (0:00:00.218) 0:13:38.706 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:03:29 -0400 (0:00:00.772) 0:13:39.478 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:03:29 -0400 (0:00:00.241) 0:13:39.719 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:03:29 -0400 (0:00:00.175) 0:13:39.895 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:03:30 -0400 (0:00:00.193) 0:13:40.088 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:03:30 -0400 (0:00:00.262) 0:13:40.350 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:03:30 -0400 (0:00:00.123) 0:13:40.474 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:03:30 -0400 (0:00:00.173) 0:13:40.648 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:03:30 -0400 (0:00:00.285) 0:13:40.933 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:03:31 -0400 (0:00:00.225) 0:13:41.159 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:03:31 -0400 (0:00:00.307) 0:13:41.466 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:03:31 -0400 (0:00:00.208) 0:13:41.674 *********** skipping: 
[managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:03:32 -0400 (0:00:00.320) 0:13:41.995 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:03:32 -0400 (0:00:00.297) 0:13:42.292 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:03:32 -0400 (0:00:00.372) 0:13:42.665 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:03:33 -0400 (0:00:00.315) 0:13:42.980 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:03:33 -0400 (0:00:00.393) 0:13:43.374 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:03:33 -0400 (0:00:00.286) 0:13:43.660 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:03:34 -0400 (0:00:00.330) 0:13:43.990 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:03:34 -0400 (0:00:00.239) 0:13:44.230 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:03:34 -0400 (0:00:00.425) 0:13:44.656 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:03:34 -0400 (0:00:00.190) 
0:13:44.846 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:03:35 -0400 (0:00:00.388) 0:13:45.235 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:03:35 -0400 (0:00:00.288) 0:13:45.523 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:03:35 -0400 (0:00:00.321) 0:13:45.845 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:03:36 -0400 (0:00:00.256) 0:13:46.101 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:03:36 -0400 (0:00:00.189) 0:13:46.291 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:03:36 -0400 (0:00:00.168) 0:13:46.460 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:03:36 -0400 (0:00:00.195) 0:13:46.655 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:03:36 -0400 (0:00:00.302) 0:13:46.958 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:03:37 -0400 (0:00:00.326) 0:13:47.284 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:03:37 -0400 
(0:00:00.346) 0:13:47.631 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:03:37 -0400 (0:00:00.330) 0:13:47.961 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 08:03:38 -0400 (0:00:00.275) 0:13:48.237 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:03:38 -0400 (0:00:00.220) 0:13:48.458 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:03:38 -0400 (0:00:00.280) 0:13:48.738 *********** ok: [managed-node14] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:03:39 -0400 (0:00:00.261) 0:13:48.999 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:03:39 -0400 (0:00:00.351) 0:13:49.351 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:03:39 -0400 (0:00:00.314) 0:13:49.666 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:03:40 -0400 (0:00:00.333) 0:13:50.000 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:03:40 -0400 (0:00:00.315) 0:13:50.315 *********** skipping: [managed-node14] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:03:40 -0400 (0:00:00.320) 0:13:50.636 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:03:40 -0400 (0:00:00.262) 0:13:50.898 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:03:41 -0400 (0:00:00.282) 0:13:51.180 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:03:41 -0400 (0:00:00.253) 0:13:51.434 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:03:41 -0400 (0:00:00.319) 0:13:51.754 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:03:42 -0400 (0:00:00.239) 0:13:51.993 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:358 Sunday 15 June 2025 08:03:42 -0400 (0:00:00.197) 0:13:52.190 *********** ok: [managed-node14] => { "changed": false, "path": "/tmp/storage_testtpprf78plukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:368 Sunday 15 June 2025 08:03:43 -0400 (0:00:01.654) 0:13:53.845 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 08:03:44 -0400 (0:00:00.362) 0:13:54.207 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 08:03:44 -0400 (0:00:00.326) 0:13:54.533 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:03:44 -0400 (0:00:00.343) 0:13:54.877 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:03:45 -0400 (0:00:00.403) 0:13:55.280 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:03:46 -0400 (0:00:00.948) 0:13:56.229 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:03:46 -0400 (0:00:00.515) 0:13:56.744 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:03:47 -0400 (0:00:00.256) 0:13:57.001 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 15 June 2025 08:03:47 -0400 (0:00:00.335) 0:13:57.336 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 15 June 2025 08:03:47 -0400 (0:00:00.345) 0:13:57.682 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 15 June 2025 08:03:47 -0400 (0:00:00.248) 0:13:57.930 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 15 June 2025 08:03:48 -0400 (0:00:00.549) 0:13:58.480 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 15 June 2025 08:03:53 -0400 (0:00:05.017) 0:14:03.497 ***********
ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 15 June 2025 08:03:53 -0400 (0:00:00.241) 0:14:03.739 ***********
ok: [managed-node14] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 15 June 2025 08:03:54 -0400 (0:00:00.262) 0:14:04.001 ***********
ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 15 June 2025 08:03:59 -0400 (0:00:05.162) 0:14:09.164 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 15 June 2025 08:03:59 -0400 (0:00:00.264) 0:14:09.429 ***********
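NOTE: the storage_pools value printed above is the input for this failure case ("new encrypted volume w/ no key"): encryption is requested with LUKS2, but no encryption_password or encryption_key is supplied, so the role is expected to raise an error instead of creating the volume. A rough reconstruction of the invocation (a sketch under those assumptions, not the literal test code):

    - name: Run the storage role, expecting it to fail for lack of a LUKS key
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true        # matches storage_safe_mode_global recorded above
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # deliberately no encryption_password / encryption_key -- the error under test

TASK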
[fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:03:59 -0400 (0:00:00.136) 0:14:09.566 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:03:59 -0400 (0:00:00.199) 0:14:09.765 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:04:00 -0400 (0:00:00.205) 0:14:09.971 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:04:04 -0400 (0:00:04.543) 0:14:14.514 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": 
"fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": 
"stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { 
"name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:04:07 -0400 (0:00:03.003) 0:14:17.517 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:04:07 -0400 (0:00:00.320) 0:14:17.838 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:04:08 -0400 (0:00:00.220) 0:14:18.058 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 08:04:13 -0400 (0:00:05.324) 0:14:23.383 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:04:13 -0400 (0:00:00.220) 0:14:23.603 *********** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 08:04:13 -0400 (0:00:00.240) 0:14:23.844 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 08:04:14 -0400 (0:00:00.218) 0:14:24.062 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 08:04:14 -0400 (0:00:00.220) 0:14:24.282 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:387 Sunday 15 June 2025 08:04:14 -0400 (0:00:00.230) 0:14:24.513 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:04:15 -0400 (0:00:00.470) 0:14:24.984 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:04:15 -0400 (0:00:00.492) 0:14:25.477 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:04:15 -0400 (0:00:00.408) 0:14:25.886 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:04:16 -0400 (0:00:00.382) 0:14:26.269 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:04:16 -0400 (0:00:00.156) 0:14:26.426 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:04:16 -0400 (0:00:00.336) 0:14:26.762 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:04:16 -0400 (0:00:00.177) 0:14:26.939 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:04:17 -0400 (0:00:00.143) 0:14:27.083 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:04:17 -0400 (0:00:00.425) 0:14:27.509 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:04:22 -0400 (0:00:04.778) 0:14:32.287 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:04:22 -0400 (0:00:00.442) 0:14:32.730 *********** ok: [managed-node14] => { "storage_volumes": 
"VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:04:23 -0400 (0:00:00.240) 0:14:32.971 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:04:28 -0400 (0:00:05.088) 0:14:38.060 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:04:28 -0400 (0:00:00.291) 0:14:38.351 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:04:28 -0400 (0:00:00.242) 0:14:38.594 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:04:28 -0400 (0:00:00.277) 0:14:38.871 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:04:29 -0400 (0:00:00.187) 0:14:39.058 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:04:33 -0400 (0:00:04.757) 0:14:43.816 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": 
"sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:04:36 -0400 (0:00:02.884) 0:14:46.701 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:04:37 -0400 (0:00:00.537) 0:14:47.238 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:04:37 -0400 (0:00:00.189) 0:14:47.427 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:04:51 -0400 (0:00:14.409) 0:15:01.837 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:04:52 -0400 (0:00:00.201) 0:15:02.038 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988960.154705, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "687370a87c4b57439ee7ea440d17db3976b41a99", "ctime": 1749988960.1517053, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749988960.1517053, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:04:53 -0400 (0:00:01.610) 0:15:03.649 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:04:55 -0400 (0:00:01.610) 0:15:05.259 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:04:55 -0400 (0:00:00.254) 0:15:05.514 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": 
"/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:04:55 -0400 (0:00:00.252) 0:15:05.767 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, 
"encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:04:56 -0400 (0:00:00.362) 0:15:06.129 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:04:56 -0400 (0:00:00.348) 0:15:06.478 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0a10fa1a-6547-4337-bcdf-c5411995e857" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:04:58 -0400 (0:00:01.990) 0:15:08.468 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:05:00 -0400 (0:00:02.109) 0:15:10.578 *********** changed: [managed-node14] => (item={'src': 
'/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:05:02 -0400 (0:00:01.641) 0:15:12.219 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:05:02 -0400 (0:00:00.381) 0:15:12.601 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:05:04 -0400 (0:00:01.918) 0:15:14.520 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749988972.6406813, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2ead6d5a51a437e09cfed9f2fa97cafcde655aea", "ctime": 1749988965.3236954, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 268435652, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749988965.3236954, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "1019551290", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:05:06 -0400 (0:00:01.569) 0:15:16.089 *********** changed: [managed-node14] => (item={'backing_device': '/dev/sda1', 'name': 'luks-0a10fa1a-6547-4337-bcdf-c5411995e857', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": 
{ "backing_device": "/dev/sda1", "name": "luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node14] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:05:09 -0400 (0:00:03.316) 0:15:19.406 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:406 Sunday 15 June 2025 08:05:11 -0400 (0:00:02.248) 0:15:21.654 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:05:12 -0400 (0:00:00.332) 0:15:21.987 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:05:12 -0400 (0:00:00.235) 0:15:22.222 
*********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:05:12 -0400 (0:00:00.269) 0:15:22.491 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6cef51e7-40d1-4541-a5c4-6934aa486a81" }, "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "size": "4G", "type": "crypt", "uuid": "c2047825-24b0-4d31-8e9e-60ff51cdc32d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:05:14 -0400 (0:00:01.560) 0:15:24.052 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002313", "end": "2025-06-15 08:05:15.328129", "rc": 0, "start": "2025-06-15 08:05:15.325816" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:05:15 -0400 (0:00:01.556) 0:15:25.609 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002297", "end": "2025-06-15 08:05:16.732170", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:05:16.729873" } STDOUT: luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:05:17 -0400 (0:00:01.396) 0:15:27.005 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:05:17 -0400 (0:00:00.408) 0:15:27.414 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:05:17 -0400 (0:00:00.282) 0:15:27.696 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.027128", "end": "2025-06-15 08:05:18.982434", "rc": 0, "start": "2025-06-15 08:05:18.955306" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:05:19 -0400 (0:00:01.459) 0:15:29.156 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:05:19 -0400 (0:00:00.312) 0:15:29.469 *********** included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:05:19 -0400 (0:00:00.469) 0:15:29.938 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:05:20 -0400 (0:00:00.306) 0:15:30.244 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:05:22 -0400 (0:00:02.306) 0:15:32.551 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:05:22 -0400 (0:00:00.178) 0:15:32.729 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:05:23 -0400 (0:00:00.247) 0:15:32.977 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:05:23 -0400 (0:00:00.260) 0:15:33.237 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:05:23 -0400 (0:00:00.249) 0:15:33.487 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:05:23 -0400 (0:00:00.274) 0:15:33.761 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: 
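
The expected PV type was just pinned to "disk" because this pool sits directly on a whole disk (sda), with no partitions or MD RAID in between. Outside the test suite, the same per-PV assertion can be sketched with lsblk — the task name, register variable, and loop list here are illustrative, with /dev/sda taken from this run:

    - name: Check the type of each PV (sketch)
      command: lsblk --noheadings --nodeps -o TYPE {{ item }}
      register: pv_type
      changed_when: false
      failed_when: pv_type.stdout | trim != "disk"
      loop:
        - /dev/sda
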
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:05:23 -0400 (0:00:00.165) 0:15:33.927 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 08:05:24 -0400 (0:00:00.349) 0:15:34.277 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:05:25 -0400 (0:00:01.530) 0:15:35.807 *********** skipping: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:05:26 -0400 (0:00:00.277) 0:15:36.085 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:05:26 -0400 (0:00:00.793) 0:15:36.878 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:05:27 -0400 (0:00:00.269) 0:15:37.148 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:05:27 -0400 (0:00:00.304) 0:15:37.452 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:05:27 -0400 (0:00:00.345) 0:15:37.798 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:05:28 -0400 (0:00:00.299) 0:15:38.097 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
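
The "blivet supports PV grow to fill" step above is a capability probe: it runs a small check on the managed node, suppresses failure (failed_when_result is false), and records only the return code — rc 1 here, which is why the follow-up "PVs fill the whole devices" check is skipped. The general shape of such a probe is sketched below; the python expression and the grow_to_fill attribute lookup are hypothetical stand-ins, not the test's literal command:

    - name: Probe for blivet grow-to-fill support (sketch; check is hypothetical)
      command: python3 -c "import sys, blivet.formats.lvmpv as p; sys.exit(0 if hasattr(p.LVMPhysicalVolume, 'grow_to_fill') else 1)"
      register: grow_probe
      failed_when: false
      changed_when: false

    - name: Run the dependent check only when the probe succeeded
      debug:
        msg: "PV fill verification would run here"
      when: grow_probe.rc == 0
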
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:05:28 -0400 (0:00:00.352) 0:15:38.450 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:05:28 -0400 (0:00:00.304) 0:15:38.755 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:05:29 -0400 (0:00:00.323) 0:15:39.078 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:05:29 -0400 (0:00:00.249) 0:15:39.328 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:05:29 -0400 (0:00:00.281) 0:15:39.610 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:05:30 -0400 (0:00:00.407) 0:15:40.017 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:05:30 -0400 (0:00:00.317) 0:15:40.335 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:05:30 -0400 (0:00:00.394) 0:15:40.729 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node14 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 15 June 2025 08:05:31 -0400 (0:00:00.549) 0:15:41.279 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
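
All of the LVM RAID sub-checks in this block are skipped because the volume was created with raid_level null, and, as the skip reasons show, they only run when RAID was actually requested. When a RAID LV is under test, the central query is the LV's segment type; with stock LVM tooling that amounts to something like the sketch below (foo/test1 is reused from this run; lvs and its segtype field are real, the task framing is illustrative):

    - name: Read the LV segment type (sketch)
      command: lvs --noheadings -o segtype foo/test1
      register: lv_segtype
      changed_when: false

A raid1 volume would report segtype "raid1" here, while a plain volume like this run's test1 reports "linear".
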
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 15 June 2025 08:05:31 -0400 (0:00:00.322) 0:15:41.601 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 15 June 2025 08:05:31 -0400 (0:00:00.284) 0:15:41.885 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 15 June 2025 08:05:32 -0400 (0:00:00.180) 0:15:42.066 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 15 June 2025 08:05:32 -0400 (0:00:00.267) 0:15:42.333 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 15 June 2025 08:05:32 -0400 (0:00:00.304) 0:15:42.637 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 15 June 2025 08:05:32 -0400 (0:00:00.321) 0:15:42.958 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:05:33 -0400 (0:00:00.199) 0:15:43.158 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:05:33 -0400 (0:00:00.388) 0:15:43.546 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node14 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 15 June 2025 08:05:33 -0400 (0:00:00.417) 0:15:43.964 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
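
The thin-pool checks are skipped for the same reason: this volume has thin false, so there is no pool membership to verify. For a thin volume, membership can be confirmed by asking LVM which pool backs each LV — a sketch using the pool_lv field (the VG name foo comes from this run; task name and register variable are illustrative):

    - name: Show which thin pool backs each LV (sketch)
      command: lvs --noheadings -o lv_name,pool_lv foo
      register: thin_info
      changed_when: false

For a thin LV the pool_lv column carries its pool's name; for a plain LV it is empty.
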
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 15 June 2025 08:05:34 -0400 (0:00:00.187) 0:15:44.152 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 15 June 2025 08:05:34 -0400 (0:00:00.176) 0:15:44.328 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 15 June 2025 08:05:34 -0400 (0:00:00.181) 0:15:44.510 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:05:34 -0400 (0:00:00.143) 0:15:44.653 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:05:35 -0400 (0:00:00.454) 0:15:45.107 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:05:35 -0400 (0:00:00.184) 0:15:45.292 *********** skipping: [managed-node14] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:05:35 -0400 (0:00:00.220) 0:15:45.513 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node14 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 15 June 2025 08:05:35 -0400 (0:00:00.407) 0:15:45.921 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 15 June 2025 08:05:36 -0400 (0:00:00.226) 0:15:46.147 *********** ok: [managed-node14] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 15 June 2025 08:05:36 -0400 (0:00:00.257) 0:15:46.405 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 15 June 2025 08:05:36 -0400 (0:00:00.299) 0:15:46.704 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 15 June 2025 08:05:36 -0400 (0:00:00.223) 0:15:46.928 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 15 June 2025 08:05:37 -0400 (0:00:00.320) 0:15:47.248 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:05:37 -0400 (0:00:00.175) 0:15:47.424 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:05:37 -0400 (0:00:00.248) 0:15:47.672 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:05:38 -0400 (0:00:01.092) 0:15:48.764 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node14 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 15 June 2025 08:05:39 -0400 (0:00:00.524) 0:15:49.289 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 15 June 2025 08:05:39 -0400 (0:00:00.182) 0:15:49.472 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 15 June 2025 08:05:39 -0400 (0:00:00.384) 0:15:49.856 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 15 June 2025 08:05:40 -0400 (0:00:00.234) 0:15:50.090 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 15 June 2025 08:05:40 -0400 (0:00:00.235) 0:15:50.326 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 15 June 2025 08:05:40 -0400 (0:00:00.224) 0:15:50.551 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 15 June 2025 08:05:40 -0400 (0:00:00.232) 0:15:50.783 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:05:41 -0400 (0:00:00.227) 0:15:51.011 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:05:41 -0400 (0:00:00.697) 0:15:51.709 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:05:42 -0400 (0:00:00.277) 0:15:51.987 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 08:05:42 -0400 (0:00:00.272) 0:15:52.259 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 08:05:42 
-0400 (0:00:00.270) 0:15:52.529 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 08:05:42 -0400 (0:00:00.265) 0:15:52.795 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 08:05:43 -0400 (0:00:00.244) 0:15:53.039 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 08:05:43 -0400 (0:00:00.223) 0:15:53.263 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 08:05:43 -0400 (0:00:00.189) 0:15:53.453 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 08:05:43 -0400 (0:00:00.187) 0:15:53.641 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 08:05:44 -0400 (0:00:00.389) 0:15:54.030 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 08:05:44 -0400 (0:00:00.326) 0:15:54.357 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:05:45 -0400 (0:00:00.943) 0:15:55.300 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:05:45 -0400 (0:00:00.214) 0:15:55.515 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:05:45 -0400 (0:00:00.357) 0:15:55.872 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:05:46 -0400 (0:00:00.199) 0:15:56.072 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:05:46 -0400 (0:00:00.273) 0:15:56.345 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:05:46 -0400 (0:00:00.140) 0:15:56.485 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:05:46 -0400 (0:00:00.260) 0:15:56.745 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:05:46 -0400 (0:00:00.211) 0:15:56.957 *********** 
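
----- editor's note -----
The mount check recorded earlier in this task sequence ("Verify the current mount state by device", ending in "All assertions passed") boils down to a single assert over the gathered mount facts. A minimal sketch of such a check, assuming facts have been gathered on the target; the task name and the expected_* variables are illustrative, not the role's own code, while the mapper path and mount point are taken from this log.

- name: Verify the current mount state by device (illustrative sketch)
  vars:
    expected_device: /dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81
    expected_mount_point: /opt/test1
  assert:
    that:
      # exactly one mount entry should pair the LUKS mapper device with /opt/test1
      - ansible_facts['mounts'] | selectattr('device', 'equalto', expected_device) | selectattr('mount', 'equalto', expected_mount_point) | list | length == 1

----- end of note -----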
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 08:05:47 -0400 (0:00:00.226) 0:15:57.183 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:05:47 -0400 (0:00:00.319) 0:15:57.503 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:05:47 -0400 (0:00:00.266) 0:15:57.769 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:05:48 -0400 (0:00:00.219) 0:15:57.988 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:05:48 -0400 (0:00:00.575) 0:15:58.564 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:05:48 -0400 (0:00:00.266) 0:15:58.831 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:05:49 -0400 (0:00:00.271) 0:15:59.102 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:05:49 -0400 (0:00:00.286) 0:15:59.389 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:05:49 -0400 (0:00:00.215) 0:15:59.605 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:05:49 -0400 (0:00:00.149) 0:15:59.755 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:05:50 -0400 (0:00:00.241) 0:15:59.996 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:05:50 -0400 (0:00:00.216) 0:16:00.213 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989091.357455, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989091.357455, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 222344, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989091.357455, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:05:51 -0400 (0:00:01.441) 0:16:01.654 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:05:51 -0400 (0:00:00.268) 0:16:01.922 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:05:52 -0400 (0:00:00.232) 0:16:02.155 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
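
----- editor's note -----
The fstab checks above work by counting matching lines in /etc/fstab: the expected match counts were all set to "1", and the recorded storage_test_fstab_id_matches entry ends with a trailing space (device field followed by the mount point field). A minimal sketch of the same idea, under the assumption that grepping the file is an acceptable stand-in for the role's fact-based matching; fstab_out and the task names are hypothetical.

- name: Read /etc/fstab (illustrative sketch)
  command: cat /etc/fstab
  register: fstab_out
  changed_when: false

- name: Assert the device id and mount point each appear exactly once
  vars:
    expected_device: /dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81
  assert:
    that:
      # mirrors the expected "1" match counts set in the log above; the trailing
      # space anchors the match to the end of the fstab device field
      - fstab_out.stdout_lines | select('search', expected_device ~ ' ') | list | length == 1
      - fstab_out.stdout_lines | select('search', ' /opt/test1 ') | list | length == 1

----- end of note -----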
*************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:05:52 -0400 (0:00:00.235) 0:16:02.391 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:05:52 -0400 (0:00:00.208) 0:16:02.599 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:05:52 -0400 (0:00:00.253) 0:16:02.852 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:05:53 -0400 (0:00:00.347) 0:16:03.199 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989091.4934547, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989091.4934547, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 222454, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989091.4934547, "nlink": 1, "path": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:05:54 -0400 (0:00:01.551) 0:16:04.750 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:05:59 -0400 (0:00:04.866) 0:16:09.616 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009705", "end": "2025-06-15 08:06:01.038086", "rc": 0, "start": "2025-06-15 08:06:01.028381" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 6cef51e7-40d1-4541-a5c4-6934aa486a81 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 934306 Threads: 2 Salt: 
87 e5 c3 6a f7 12 58 3a 5c 79 f2 7b 13 f2 13 ae 48 41 d1 53 f1 02 10 e1 6d c1 43 76 e5 01 b2 dd AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: 02 a0 88 1a 9b f7 bb ed 3b cf fa 03 74 8b 5e a1 80 a2 8a 37 51 a4 73 9d 27 93 5a 56 e2 8c 51 b1 Digest: 24 44 0e a7 ed d1 c5 81 3c 37 36 ce 05 0a 07 08 aa 80 1d 79 54 ee 2d 69 39 6f 38 6c 5e bd a4 53 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:06:01 -0400 (0:00:01.699) 0:16:11.316 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:06:01 -0400 (0:00:00.400) 0:16:11.716 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:06:02 -0400 (0:00:00.469) 0:16:12.186 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:06:02 -0400 (0:00:00.287) 0:16:12.474 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:06:02 -0400 (0:00:00.359) 0:16:12.834 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:06:03 -0400 (0:00:00.303) 0:16:13.137 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:06:03 -0400 (0:00:00.497) 0:16:13.634 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:06:03 -0400 (0:00:00.314) 0:16:13.949 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] 
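
----- editor's note -----
The "Check LUKS version" / "Check LUKS key size" / "Check LUKS cipher" assertions that follow can all be satisfied from the luksDump output collected above (LUKS2 header, aes-xts-plain64, 512-bit key). A minimal sketch of the same checks; the cryptsetup command and flags are the ones the log itself ran, while luks_dump and the task names are illustrative. The crypttab entry recorded in the test variables follows the standard field order of /etc/crypttab: mapped name, backing device, key file, where "-" means no key file is configured.

- name: Collect LUKS info for this volume (illustrative sketch)
  command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump
  changed_when: false

- name: Check LUKS version, cipher and key size
  assert:
    that:
      # values match the header dumped above: LUKS2, aes-xts-plain64, 512-bit key
      - luks_dump.stdout is search('Version:\s+2')
      - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')
      - luks_dump.stdout is search('Key:\s+512 bits')

----- end of note -----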
******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:06:04 -0400 (0:00:00.436) 0:16:14.385 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:06:04 -0400 (0:00:00.300) 0:16:14.686 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:06:04 -0400 (0:00:00.269) 0:16:14.956 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:06:05 -0400 (0:00:00.426) 0:16:15.382 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:06:05 -0400 (0:00:00.334) 0:16:15.717 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:06:05 -0400 (0:00:00.223) 0:16:15.941 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:06:06 -0400 (0:00:00.144) 0:16:16.085 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:06:06 -0400 (0:00:00.280) 0:16:16.365 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:06:06 -0400 (0:00:00.249) 0:16:16.614 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:06:06 -0400 (0:00:00.220) 0:16:16.835 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.183) 0:16:17.018 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.158) 0:16:17.177 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.159) 0:16:17.336 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.132) 0:16:17.469 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.215) 0:16:17.684 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:06:07 -0400 (0:00:00.185) 0:16:17.869 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:06:11 -0400 (0:00:03.124) 0:16:20.994 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:06:12 -0400 (0:00:01.635) 0:16:22.629 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:06:13 -0400 
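
----- editor's note -----
The size verification above reduces to converting the requested "4g" into bytes and comparing it with what blivet reports. A minimal sketch using Ansible's human_to_bytes filter, which interprets "4g" as binary gibibytes (4 * 1024^3 = 4294967296 bytes, matching the parsed sizes in this log); storage_test_actual_size is the fact shown further on, and the task names are illustrative.

- name: Establish base value for expected size (illustrative sketch)
  set_fact:
    storage_test_expected_size: "{{ '4g' | human_to_bytes }}"

- name: Assert expected size is actual size
  assert:
    that:
      # both sides are plain byte counts, so an integer compare is enough
      - storage_test_actual_size.bytes | int == storage_test_expected_size | int

----- end of note -----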
(0:00:00.382) 0:16:23.012 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:06:13 -0400 (0:00:00.272) 0:16:23.284 *********** ok: [managed-node14] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:06:14 -0400 (0:00:01.662) 0:16:24.947 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:06:15 -0400 (0:00:00.290) 0:16:25.238 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:06:15 -0400 (0:00:00.341) 0:16:25.580 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:06:15 -0400 (0:00:00.338) 0:16:25.919 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:06:16 -0400 (0:00:00.315) 0:16:26.234 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:06:16 -0400 (0:00:00.306) 0:16:26.541 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:06:16 -0400 (0:00:00.228) 0:16:26.769 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:06:17 -0400 (0:00:00.298) 0:16:27.068 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:06:17 -0400 
(0:00:00.247) 0:16:27.315 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:06:17 -0400 (0:00:00.249) 0:16:27.565 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:06:17 -0400 (0:00:00.293) 0:16:27.859 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:06:18 -0400 (0:00:00.268) 0:16:28.127 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:06:18 -0400 (0:00:00.162) 0:16:28.290 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:06:18 -0400 (0:00:00.203) 0:16:28.494 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:06:18 -0400 (0:00:00.325) 0:16:28.819 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:06:19 -0400 (0:00:00.324) 0:16:29.143 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:06:19 -0400 (0:00:00.814) 0:16:29.958 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:06:20 -0400 (0:00:00.233) 0:16:30.192 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 
08:06:20 -0400 (0:00:00.318) 0:16:30.510 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:06:20 -0400 (0:00:00.262) 0:16:30.773 *********** ok: [managed-node14] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:06:21 -0400 (0:00:00.339) 0:16:31.112 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:06:21 -0400 (0:00:00.377) 0:16:31.490 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:06:21 -0400 (0:00:00.386) 0:16:31.877 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023997", "end": "2025-06-15 08:06:23.544464", "rc": 0, "start": "2025-06-15 08:06:23.520467" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:06:23 -0400 (0:00:02.042) 0:16:33.919 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:06:24 -0400 (0:00:00.244) 0:16:34.164 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:06:24 -0400 (0:00:00.414) 0:16:34.578 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:06:24 -0400 (0:00:00.219) 0:16:34.797 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
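
----- editor's note -----
The cache check above relies on lvs --nameprefixes emitting shell-style KEY=value pairs (LVM2_LV_NAME, LVM2_SEGTYPE, and so on), which can be tested directly against the command output. A minimal sketch restricted to the segment type; the lvs flags come from the command the log actually ran, while lv_info and the task names are illustrative.

- name: Get LV segment type (illustrative sketch)
  command: lvs --noheadings --nameprefixes --unquoted -o name,segtype foo/test1
  register: lv_info
  changed_when: false

- name: Check segment type
  assert:
    that:
      # a cached LV would report LVM2_SEGTYPE=cache; this volume is plain linear
      - "'LVM2_SEGTYPE=linear' in lv_info.stdout"

----- end of note -----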
************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:06:24 -0400 (0:00:00.157) 0:16:34.955 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:06:25 -0400 (0:00:00.282) 0:16:35.238 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:06:25 -0400 (0:00:00.194) 0:16:35.432 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:06:25 -0400 (0:00:00.359) 0:16:35.791 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:06:25 -0400 (0:00:00.126) 0:16:35.918 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:409 Sunday 15 June 2025 08:06:26 -0400 (0:00:00.213) 0:16:36.131 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:06:26 -0400 (0:00:00.618) 0:16:36.749 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:06:27 -0400 (0:00:00.412) 0:16:37.162 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:06:27 -0400 (0:00:00.283) 0:16:37.445 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:06:27 -0400 (0:00:00.384) 0:16:37.829 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:06:28 -0400 (0:00:00.325) 0:16:38.154 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:06:28 -0400 (0:00:00.237) 0:16:38.392 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:06:28 -0400 (0:00:00.174) 0:16:38.566 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:06:28 -0400 (0:00:00.329) 0:16:38.896 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:06:29 -0400 (0:00:00.688) 0:16:39.585 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:06:34 -0400 (0:00:04.657) 0:16:44.243 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:06:34 -0400 (0:00:00.231) 0:16:44.474 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:06:34 -0400 (0:00:00.202) 0:16:44.677 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:06:40 -0400 (0:00:05.460) 0:16:50.137 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:06:40 -0400 (0:00:00.488) 0:16:50.625 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:06:40 -0400 (0:00:00.188) 0:16:50.814 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:06:41 -0400 (0:00:00.286) 0:16:51.100 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:06:41 -0400 (0:00:00.210) 0:16:51.311 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:06:46 -0400 (0:00:04.889) 0:16:56.201 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service": { "name": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "source": "systemd", "state": "inactive", "status": "generated" }, 
"systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service": { "name": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:06:50 -0400 (0:00:04.168) 0:17:00.369 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:06:50 -0400 (0:00:00.463) 0:17:00.833 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d0a10fa1a\x2d6547\x2d4337\x2dbcdf\x2dc5411995e857.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "name": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-sda1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-0a10fa1a-6547-4337-bcdf-c5411995e857", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0a10fa1a-6547-4337-bcdf-c5411995e857 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0a10fa1a-6547-4337-bcdf-c5411995e857 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:05:04 EDT", "StateChangeTimestampMonotonic": "2407946346", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d6547\x2d4337\x2dbcdf\x2dc5411995e857.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "name": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", 
"OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:06:54 -0400 (0:00:03.665) 0:17:04.499 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:06:59 -0400 (0:00:05.438) 0:17:09.937 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:07:00 -0400 (0:00:00.104) 0:17:10.041 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989101.9544349, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "abdb16d652768a091774dd564181819b365d8733", "ctime": 1749989101.9514349, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749989101.9514349, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:07:01 -0400 (0:00:01.629) 0:17:11.671 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:07:01 -0400 (0:00:00.240) 0:17:11.912 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d0a10fa1a\x2d6547\x2d4337\x2dbcdf\x2dc5411995e857.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "name": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service is 
masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d0a10fa1a\\x2d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d6547\x2d4337\x2dbcdf\x2dc5411995e857.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "name": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", 
"CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", 
"NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6547\\x2d4337\\x2dbcdf\\x2dc5411995e857.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:07:05 -0400 (0:00:03.463) 0:17:15.375 *********** ok: [managed-node14] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", 
"_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:07:05 -0400 (0:00:00.162) 0:17:15.538 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:07:05 -0400 (0:00:00.199) 0:17:15.737 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove 
obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:07:06 -0400 (0:00:00.321) 0:17:16.058 ***********
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:07:06 -0400 (0:00:00.264) 0:17:16.323 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:07:08 -0400 (0:00:01.851) 0:17:18.174 *********** ok: [managed-node14] => (item={'src': '/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" }
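Each mount_info item maps directly onto the mount module's parameters, and because the fstab entry and the active mount already match, the task reports changed: false. The equivalent standalone call would look roughly like this, with values copied from the item above:

    - name: Converge the fstab entry and the mount point (sketch)
      mount:
        src: /dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:07:09 -0400 (0:00:01.637) 0:17:19.812 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "mounted" }, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:07:10 -0400 (0:00:00.350) 0:17:20.163 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:07:12 -0400 (0:00:01.933) 0:17:22.096 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989116.7304065, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b44e0c13378e4547cef1a46e344db727b891e173", "ctime": 1749989109.164421, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070468, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock":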
false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749989109.1634212, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2163277715", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:07:13 -0400 (0:00:01.614) 0:17:23.711 *********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:07:13 -0400 (0:00:00.193) 0:17:23.905 *********** ok: [managed-node14] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:423 Sunday 15 June 2025 08:07:15 -0400 (0:00:01.656) 0:17:25.562 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:430 Sunday 15 June 2025 08:07:15 -0400 (0:00:00.396) 0:17:25.959 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:07:16 -0400 (0:00:00.495) 0:17:26.454 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, 
"size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:07:16 -0400 (0:00:00.236) 0:17:26.691 *********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:07:16 -0400 (0:00:00.052) 0:17:26.744 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6cef51e7-40d1-4541-a5c4-6934aa486a81" }, "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "size": "4G", "type": "crypt", "uuid": "c2047825-24b0-4d31-8e9e-60ff51cdc32d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:07:18 -0400 (0:00:01.515) 0:17:28.259 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002357", "end": "2025-06-15 08:07:19.946070", "rc": 0, "start": "2025-06-15 08:07:19.943713" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:07:20 -0400 (0:00:01.924) 0:17:30.184 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002522", "end": "2025-06-15 08:07:21.323153", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:07:21.320631" } STDOUT:
luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:07:21 -0400 (0:00:01.432) 0:17:31.616 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:07:22 -0400 (0:00:00.488) 0:17:32.104 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:07:22 -0400 (0:00:00.255) 0:17:32.360 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022266", "end": "2025-06-15 08:07:23.762436", "rc": 0, "start": "2025-06-15 08:07:23.740170" } STDOUT:
0
TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:07:24 -0400 (0:00:01.720) 0:17:34.081 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
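[Editor's note] The /etc/fstab and /etc/crypttab entries read back above pair up: fstab mounts the decrypted mapper device /dev/mapper/luks-<UUID> at /opt/test1, while the crypttab line maps that luks-<UUID> name back to its backing LV /dev/mapper/foo-test1, with "-" in the key-file field (unlock by passphrase, no key file). As a rough illustration of the kind of check the later "Validate the format of the crypttab entry" task performs, here is a minimal sketch; it is not the role's actual task, and the variable names (st_luks_name, st_crypttab_line) are invented for the example:

- name: Validate the crypttab entry format (illustrative sketch)
  vars:
    st_luks_name: luks-6cef51e7-40d1-4541-a5c4-6934aa486a81  # from the STDOUT above
    st_crypttab_line: "{{ st_luks_name }} /dev/mapper/foo-test1 -"
  assert:
    that:
      - st_crypttab_line.split() | length == 3       # <name> <backing device> <key file>
      - st_crypttab_line.split()[0] == st_luks_name  # name matches the fstab mapper device
      - st_crypttab_line.split()[2] == '-'           # '-' means no key file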
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:07:24 -0400 (0:00:00.421) 0:17:34.503 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:07:24 -0400 (0:00:00.401) 0:17:34.904 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:07:25 -0400 (0:00:00.392) 0:17:35.297 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:07:26 -0400 (0:00:01.516) 0:17:36.814 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:07:27 -0400 (0:00:00.247) 0:17:37.061 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:07:27 -0400 (0:00:00.197) 0:17:37.258 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:07:27 -0400 (0:00:00.322) 0:17:37.581 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:07:27 -0400 (0:00:00.332) 0:17:37.914 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:07:28 -0400 (0:00:00.347) 0:17:38.261 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:07:28 -0400 (0:00:00.304) 0:17:38.565 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 08:07:28 -0400 (0:00:00.342) 0:17:38.907 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:07:30 -0400 (0:00:01.687) 0:17:40.595 *********** skipping: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:07:30 -0400 (0:00:00.316) 0:17:40.912 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:07:31 -0400 (0:00:00.634) 0:17:41.546 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:07:31 -0400 (0:00:00.327) 0:17:41.873 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:07:32 -0400 (0:00:00.227) 0:17:42.101 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:07:32 -0400 (0:00:00.209) 0:17:42.311 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:07:32 -0400 (0:00:00.191) 0:17:42.503 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:07:32 -0400 (0:00:00.235) 0:17:42.739 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:07:33 -0400 (0:00:00.335) 0:17:43.074 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:07:33 -0400 (0:00:00.260) 0:17:43.335 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:07:33 -0400 (0:00:00.271) 0:17:43.606 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:07:33 -0400 (0:00:00.207) 0:17:43.814 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:07:34 -0400 (0:00:00.519) 0:17:44.334 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:07:34 -0400 (0:00:00.303) 0:17:44.637 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:07:35 -0400 (0:00:00.500) 0:17:45.139 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node14 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 15 June 2025 08:07:35 -0400 (0:00:00.411) 0:17:45.550 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 15 June 2025 08:07:35 -0400 (0:00:00.253) 0:17:45.804 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 15 June 2025 08:07:36 -0400 (0:00:00.369) 0:17:46.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 15 June 2025 08:07:36 -0400 (0:00:00.209) 0:17:46.383 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 15 June 2025 08:07:36 -0400 (0:00:00.204) 0:17:46.587 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 15 June 2025 08:07:36 -0400 (0:00:00.327) 0:17:46.915 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 15 June 2025 08:07:37 -0400 (0:00:00.221) 0:17:47.136 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:07:37 -0400 (0:00:00.302) 0:17:47.438 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:07:37 -0400 (0:00:00.457) 0:17:47.896 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node14 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 15 June 2025 08:07:38 -0400 (0:00:01.027) 0:17:48.924 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 15 June 2025 08:07:39 -0400 (0:00:00.317) 0:17:49.241 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 15 June 2025 08:07:39 -0400 (0:00:00.238) 0:17:49.480 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 15 June 2025 08:07:39 -0400 (0:00:00.235) 0:17:49.715 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:07:40 -0400 (0:00:00.277) 0:17:49.993 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:07:40 -0400 (0:00:00.451) 0:17:50.445 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:07:40 -0400 (0:00:00.344) 0:17:50.789 *********** skipping: [managed-node14] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:07:41 -0400 (0:00:00.316) 0:17:51.106 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node14 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 15 June 2025 08:07:41 -0400 (0:00:00.420) 0:17:51.526 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 15 June 2025 08:07:41 -0400 (0:00:00.320) 0:17:51.847 *********** ok: [managed-node14] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 15 June 2025 08:07:42 -0400 (0:00:00.286) 0:17:52.134 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 15 June 2025 08:07:42 -0400 (0:00:00.238) 0:17:52.372 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 15 June 2025 08:07:42 -0400 (0:00:00.313) 0:17:52.686 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 15 June 2025 08:07:43 -0400 (0:00:00.333) 0:17:53.019 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:07:43 -0400 (0:00:00.306) 0:17:53.326 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:07:43 -0400 (0:00:00.362) 0:17:53.689 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:07:44 -0400 (0:00:00.802) 0:17:54.492 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node14 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 15 June 2025 08:07:44 -0400 (0:00:00.360) 0:17:54.853 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 15 June 2025 08:07:45 -0400 (0:00:00.357) 0:17:55.211 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 15 June 2025 08:07:45 -0400 (0:00:00.298) 0:17:55.509 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 15 June 2025 08:07:45 -0400 (0:00:00.197) 0:17:55.707 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 15 June 2025 08:07:45 -0400 (0:00:00.229) 0:17:55.936 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 15 June 2025 08:07:46 -0400 (0:00:00.287) 0:17:56.224 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 15 June 2025 08:07:46 -0400 (0:00:00.296) 0:17:56.520 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:07:46 -0400 (0:00:00.191) 0:17:56.712 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:07:47 -0400 (0:00:00.715) 0:17:57.427 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:07:47 -0400 (0:00:00.303) 0:17:57.730 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 08:07:48 -0400 (0:00:00.443) 0:17:58.174 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 08:07:48 
-0400 (0:00:00.278) 0:17:58.453 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 08:07:48 -0400 (0:00:00.320) 0:17:58.773 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 08:07:49 -0400 (0:00:00.251) 0:17:59.025 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 08:07:49 -0400 (0:00:00.333) 0:17:59.359 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 08:07:49 -0400 (0:00:00.242) 0:17:59.601 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 08:07:49 -0400 (0:00:00.258) 0:17:59.859 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 08:07:50 -0400 (0:00:00.521) 0:18:00.380 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 08:07:50 -0400 (0:00:00.262) 0:18:00.643 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:07:52 -0400 (0:00:01.572) 0:18:02.215 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:07:52 -0400 (0:00:00.315) 0:18:02.531 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:07:52 -0400 (0:00:00.230) 0:18:02.761 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:07:53 -0400 (0:00:00.225) 0:18:02.987 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:07:53 -0400 (0:00:00.174) 0:18:03.162 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:07:53 -0400 (0:00:00.293) 0:18:03.455 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:07:53 -0400 (0:00:00.319) 0:18:03.775 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:07:54 -0400 (0:00:00.342) 0:18:04.117 *********** 
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 08:07:54 -0400 (0:00:00.334) 0:18:04.451 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:07:54 -0400 (0:00:00.223) 0:18:04.675 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:07:55 -0400 (0:00:00.399) 0:18:05.074 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:07:55 -0400 (0:00:00.228) 0:18:05.303 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:07:56 -0400 (0:00:00.673) 0:18:05.976 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:07:56 -0400 (0:00:00.268) 0:18:06.245 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:07:56 -0400 (0:00:00.210) 0:18:06.455 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:07:56 -0400 (0:00:00.275) 0:18:06.731 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:07:57 -0400 (0:00:00.314) 0:18:07.046 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:07:57 -0400 (0:00:00.241) 0:18:07.287 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:07:57 -0400 (0:00:00.445) 0:18:07.733 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:07:58 -0400 (0:00:00.305) 0:18:08.038 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989161.0323207, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989091.357455, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 222344, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989091.357455, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:07:59 -0400 (0:00:01.706) 0:18:09.745 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:08:00 -0400 (0:00:00.364) 0:18:10.109 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:08:00 -0400 (0:00:00.344) 0:18:10.453 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:08:00 -0400 (0:00:00.236) 0:18:10.690 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:08:00 -0400 (0:00:00.262) 0:18:10.953 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:08:01 -0400 (0:00:00.293) 0:18:11.247 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:08:01 -0400 (0:00:00.331) 0:18:11.578 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989219.7362068, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989091.4934547, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 222454, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989091.4934547, "nlink": 1, "path": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:08:03 -0400 (0:00:01.932) 0:18:13.510 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:08:08 -0400 (0:00:04.631) 0:18:18.142 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009526", "end": "2025-06-15 08:08:09.378868", "rc": 0, "start": "2025-06-15 08:08:09.369342" } STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           6cef51e7-40d1-4541-a5c4-6934aa486a81
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     934306
        Threads:    2
        Salt:       87 e5 c3 6a f7 12 58 3a 5c 79 f2 7b 13 f2 13 ae
                    48 41 d1 53 f1 02 10 e1 6d c1 43 76 e5 01 b2 dd
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120249
        Salt:       02 a0 88 1a 9b f7 bb ed 3b cf fa 03 74 8b 5e a1
                    80 a2 8a 37 51 a4 73 9d 27 93 5a 56 e2 8c 51 b1
        Digest:     24 44 0e a7 ed d1 c5 81 3c 37 36 ce 05 0a 07 08
                    aa 80 1d 79 54 ee 2d 69 39 6f 38 6c 5e bd a4 53
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:08:09 -0400 (0:00:01.495) 0:18:19.638 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:08:09 -0400 (0:00:00.306) 0:18:19.945 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:08:10 -0400 (0:00:00.313) 0:18:20.259 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:08:10 -0400 (0:00:00.346) 0:18:20.605 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:08:10 -0400 (0:00:00.279) 0:18:20.884 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:08:11 -0400 (0:00:00.412) 0:18:21.297 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:08:11 -0400 (0:00:00.288) 0:18:21.586 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:08:11 -0400 (0:00:00.313) 0:18:21.900 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
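[Editor's note] The luksDump output above is exactly what the surrounding assertions consume: "Version: 2" backs the "Check LUKS version" task (the volume requested encryption_luks_version "luks2"), the single luks2 keyslot holds a 512-bit aes-xts-plain64 key, and key derivation uses argon2i. A minimal sketch of how such a version check can be expressed, illustrative only and not the role's actual task; "luks_dump" stands in for the registered result of the "Collect LUKS info for this volume" command task:

- name: Check LUKS version (illustrative sketch)
  assert:
    that:
      # LUKS2 headers report a "Version: 2" line in luksDump output
      - luks_dump.stdout is search('Version:\s+2')
    msg: Expected a LUKS2 header on /dev/mapper/foo-test1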
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:08:12 -0400 (0:00:00.355) 0:18:22.255 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:08:12 -0400 (0:00:00.317) 0:18:22.572 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:08:12 -0400 (0:00:00.364) 0:18:22.936 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:08:13 -0400 (0:00:00.282) 0:18:23.219 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:08:13 -0400 (0:00:00.452) 0:18:23.671 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:08:13 -0400 (0:00:00.243) 0:18:23.915 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:08:14 -0400 (0:00:00.275) 0:18:24.191 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:08:14 -0400 (0:00:00.267) 0:18:24.459 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:08:14 -0400 (0:00:00.215) 0:18:24.674 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:08:15 -0400 (0:00:00.763) 0:18:25.438 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:08:15 -0400 (0:00:00.311) 0:18:25.750 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:08:15 -0400 (0:00:00.173) 0:18:25.923 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:08:16 -0400 (0:00:00.276) 0:18:26.200 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:08:16 -0400 (0:00:00.258) 0:18:26.458 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:08:16 -0400 (0:00:00.230) 0:18:26.688 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:08:16 -0400 (0:00:00.178) 0:18:26.867 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:08:18 -0400 (0:00:01.725) 0:18:28.592 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:08:20 -0400 (0:00:01.849) 0:18:30.442 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }
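[Editor's note] The two parse tasks above agree because the requested size "4g" is read as 4 GiB, i.e. 4 x 1024^3 = 4294967296 bytes, and the LV was created at exactly that size; that byte count, stored as the expected-size fact here, is what the later "Assert expected size is actual size" task compares against the parsed LV size. A one-line sketch of the same conversion (illustrative only; Ansible's human_to_bytes filter treats a bare "g" suffix as binary gibibytes):

- name: Convert the requested "4g" to bytes (illustrative sketch)
  debug:
    msg: "{{ '4g' | human_to_bytes }}"  # -> 4294967296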
TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:08:20 -0400 (0:00:00.399) 0:18:30.841 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:08:21 -0400 (0:00:00.324) 0:18:31.166 *********** ok: [managed-node14] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:08:22 -0400 (0:00:01.696) 0:18:32.862 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:08:23 -0400 (0:00:00.280) 0:18:33.143 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:08:23 -0400 (0:00:00.287) 0:18:33.430 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:08:23 -0400 (0:00:00.212) 0:18:33.643 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:08:23 -0400 (0:00:00.239) 0:18:33.883 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:08:24 -0400 (0:00:00.252) 0:18:34.135 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:08:24 -0400 (0:00:00.287) 0:18:34.423 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:08:24 -0400 (0:00:00.268) 0:18:34.692 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:08:25 -0400
(0:00:00.456) 0:18:35.149 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:08:25 -0400 (0:00:00.217) 0:18:35.366 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:08:25 -0400 (0:00:00.210) 0:18:35.577 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:08:25 -0400 (0:00:00.228) 0:18:35.805 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:08:26 -0400 (0:00:00.237) 0:18:36.043 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:08:26 -0400 (0:00:00.308) 0:18:36.351 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:08:26 -0400 (0:00:00.332) 0:18:36.684 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:08:26 -0400 (0:00:00.256) 0:18:36.940 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:08:27 -0400 (0:00:00.242) 0:18:37.182 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:08:27 -0400 (0:00:00.316) 0:18:37.498 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 
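[Note: the "Apply upper size limit" / "Apply lower size limit" pair above clamps the computed thin-pool space between a ceiling and a floor before it is converted to a Size. A sketch of that clamp with hypothetical variable names (max_thin_pool_size, upper_limit, lower_limit); Jinja's min/max list filters do the bounding:

- name: Apply upper size limit to max usable thin pool space (sketch)
  set_fact:
    max_thin_pool_size: "{{ [max_thin_pool_size | int, upper_limit | int] | min }}"
- name: Apply lower size limit to max usable thin pool space (sketch)
  set_fact:
    max_thin_pool_size: "{{ [max_thin_pool_size | int, lower_limit | int] | max }}"
]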
08:08:27 -0400 (0:00:00.158) 0:18:37.656 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:08:27 -0400 (0:00:00.199) 0:18:37.856 *********** ok: [managed-node14] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:08:28 -0400 (0:00:00.268) 0:18:38.125 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:08:28 -0400 (0:00:00.278) 0:18:38.403 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:08:28 -0400 (0:00:00.349) 0:18:38.753 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023146", "end": "2025-06-15 08:08:30.015513", "rc": 0, "start": "2025-06-15 08:08:29.992367" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:08:30 -0400 (0:00:01.511) 0:18:40.264 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:08:30 -0400 (0:00:00.297) 0:18:40.562 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:08:31 -0400 (0:00:00.424) 0:18:40.987 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:08:31 -0400 (0:00:00.186) 0:18:41.173 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] 
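[Note: the "Get information about the LV" task above shells out to lvs with --nameprefixes, so stdout comes back as LVM2_* key=value pairs; "Set LV segment type" then lifts LVM2_SEGTYPE=linear into storage_test_lv_segtype. A sketch of that pair; the lvs_out register name is an assumption:

- name: Get information about the LV
  command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out  # register name assumed for illustration
  changed_when: false
- name: Set LV segment type
  set_fact:
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"  # -> ['linear']
]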
************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:08:31 -0400 (0:00:00.252) 0:18:41.426 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:08:31 -0400 (0:00:00.261) 0:18:41.687 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:08:31 -0400 (0:00:00.273) 0:18:41.960 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:08:32 -0400 (0:00:00.200) 0:18:42.161 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:08:32 -0400 (0:00:00.235) 0:18:42.397 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 08:08:32 -0400 (0:00:00.231) 0:18:42.628 *********** changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:436 Sunday 15 June 2025 08:08:34 -0400 (0:00:01.599) 0:18:44.227 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 08:08:34 -0400 (0:00:00.349) 0:18:44.577 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 08:08:34 -0400 (0:00:00.262) 0:18:44.840 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
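[Note: "Test for correct handling of safe_mode" is a negative test: verify-role-failed.yml stores copies of the storage_* globals (with storage_safe_mode_global left at true), runs the role, and passes only if the role fails with the right error message. The helper file itself is not reproduced in this log; such checks are commonly written as a block/rescue pair, so the following is a sketch under that assumption, not the actual verify-role-failed.yml:

- name: Verify role raises correct error (sketch)
  block:
    - name: Run the role and expect it to fail
      include_role:
        name: fedora.linux_system_roles.storage
    - name: Fail the test if the role succeeded
      fail:
        msg: role completed although safe_mode should have blocked it
  rescue:
    - name: Assert the expected safe-mode error was raised
      assert:
        that:
          - "'cannot remove existing formatting' in ansible_failed_result.msg"
]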
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:08:35 -0400 (0:00:00.307) 0:18:45.147 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:08:35 -0400 (0:00:00.712) 0:18:45.860 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:08:36 -0400 (0:00:00.235) 0:18:46.095 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:08:36 -0400 (0:00:00.527) 0:18:46.623 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:08:36 -0400 (0:00:00.227) 0:18:46.850 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:08:37 -0400 (0:00:00.313) 0:18:47.163 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage 
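[Note: the last entry of blivet_package_list in CentOS_8.yml above is an inline Jinja conditional that swaps in libblockdev-s390 on s390x hosts and plain libblockdev everywhere else; the template is only rendered when the list is consumed. A sketch of consuming it (the role's own install task is named "Make sure blivet is available"; the exact module call here is an assumption):

- name: Make sure blivet is available (sketch)
  package:
    name: "{{ blivet_package_list }}"  # the inline conditional resolves per-host at this point
    state: present
]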
: Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:08:37 -0400 (0:00:00.230) 0:18:47.394 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:08:37 -0400 (0:00:00.190) 0:18:47.584 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:08:38 -0400 (0:00:00.441) 0:18:48.025 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:08:42 -0400 (0:00:04.505) 0:18:52.531 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:08:42 -0400 (0:00:00.222) 0:18:52.754 *********** ok: [managed-node14] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:08:43 -0400 (0:00:00.251) 0:18:53.005 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:08:48 -0400 (0:00:05.473) 0:18:58.479 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:08:48 -0400 (0:00:00.360) 0:18:58.840 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:08:49 -0400 (0:00:00.315) 0:18:59.155 *********** skipping: 
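[Note: the "Show storage_pools" dump above is the test's entire request: one LVM pool "foo" on sda holding a 4 GiB volume mounted at /opt/test1, with encryption: false against a volume that is currently LUKS2-encrypted, which is what safe mode later refuses to undo. Restated as play vars (the surrounding playbook scaffolding is assumed):

storage_pools:
  - name: foo
    type: lvm
    disks:
      - sda
    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: false
        encryption_luks_version: luks2
        encryption_password: yabbadabbadoo  # throwaway test password from the log
]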
[managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:08:49 -0400 (0:00:00.258) 0:18:59.413 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:08:49 -0400 (0:00:00.239) 0:18:59.653 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:08:54 -0400 (0:00:05.002) 0:19:04.656 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": 
"unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": 
{ "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:08:57 -0400 (0:00:02.761) 0:19:07.417 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:08:57 -0400 (0:00:00.312) 0:19:07.730 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", 
"AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:06:54 EDT", "StateChangeTimestampMonotonic": "2517957418", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": 
"disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": 
"67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:09:01 -0400 (0:00:03.325) 0:19:11.055 *********** fatal: [managed-node14]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 08:09:06 -0400 (0:00:05.665) 0:19:16.721 *********** fatal: [managed-node14]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:09:07 -0400 (0:00:00.263) 0:19:16.985 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", 
"JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:06:54 EDT", "StateChangeTimestampMonotonic": "2517957418", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 08:09:10 -0400 (0:00:00.325) 0:19:20.465 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 08:09:10 -0400 (0:00:00.303) 0:19:20.790 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 15 June 2025 08:09:11 -0400 (0:00:00.236) 0:19:21.094 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
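Note: the next two tasks come from verify-data-preservation.yml. Because the destructive change was refused in safe mode, the file written to the encrypted volume earlier in the test must still be present. A minimal standalone sketch of the same check, assuming the mount point and file name used by this test:

- name: Stat the preserved file
  stat:
    path: /opt/test1/quux
  register: quux_stat

- name: Assert file presence
  assert:
    that:
      - quux_stat.stat.exists

TASK [Stat the file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 15 June 2025 08:09:11 -0400 (0:00:00.236) 0:19:21.331 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989313.932024, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749989313.932024, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749989313.932024, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "4143580416", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 15 June 2025 08:09:12 -0400 (0:00:01.466) 0:19:22.797 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:460 Sunday 15 June 2025 08:09:12 -0400 (0:00:00.134) 0:19:22.932 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:09:13 -0400 (0:00:00.497) 0:19:23.429 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:09:14 -0400 (0:00:00.798) 0:19:24.227 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version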
specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:09:14 -0400 (0:00:00.271) 0:19:24.499 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:09:15 -0400 (0:00:00.574) 0:19:25.073 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:09:15 -0400 (0:00:00.336) 0:19:25.410 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:09:15 -0400 (0:00:00.279) 0:19:25.690 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:09:15 -0400 (0:00:00.202) 0:19:25.892 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:09:16 -0400 (0:00:00.235) 0:19:26.128 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:09:16 -0400 (0:00:00.620) 0:19:26.748 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:09:21 -0400 (0:00:04.728) 0:19:31.476 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:09:21 -0400 (0:00:00.376) 0:19:31.853 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:09:22 -0400 (0:00:00.194) 0:19:32.047 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:09:27 -0400 (0:00:05.777) 0:19:37.825 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:09:28 -0400 (0:00:00.351) 0:19:38.177 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:09:28 -0400 (0:00:00.144) 0:19:38.321 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:09:28 -0400 (0:00:00.126) 0:19:38.448 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:09:28 -0400 (0:00:00.115) 0:19:38.563 *********** ok: [managed-node14] => { 
"changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:09:32 -0400 (0:00:04.261) 0:19:42.825 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": 
"systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:09:35 -0400 (0:00:02.638) 0:19:45.463 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:09:35 -0400 (0:00:00.265) 0:19:45.728 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", 
"CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:06:54 EDT", "StateChangeTimestampMonotonic": "2517957418", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", 
"AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service not 
found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:09:38 -0400 (0:00:03.166) 0:19:48.895 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", 
"/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:09:44 -0400 (0:00:06.022) 0:19:54.918 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:09:45 -0400 (0:00:00.214) 0:19:55.132 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989101.9544349, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "abdb16d652768a091774dd564181819b365d8733", "ctime": 1749989101.9514349, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749989101.9514349, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": 
false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:09:46 -0400 (0:00:01.391) 0:19:56.524 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:09:48 -0400 (0:00:01.514) 0:19:58.038 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:06:54 EDT", "StateChangeTimestampMonotonic": "2517957418", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", 
"SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", 
"KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } 
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:09:51 -0400 (0:00:03.343) 0:20:01.382 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:09:51 -0400 (0:00:00.268) 0:20:01.651 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:09:51 -0400 (0:00:00.166) 0:20:01.818 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:09:52 -0400 (0:00:00.317) 0:20:02.136 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6cef51e7-40d1-4541-a5c4-6934aa486a81" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:09:53 -0400 (0:00:01.762) 0:20:03.898 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:09:55 -0400 (0:00:01.858) 0:20:05.757 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': 
None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:09:57 -0400 (0:00:01.764) 0:20:07.521 *********** skipping: [managed-node14] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:09:57 -0400 (0:00:00.243) 0:20:07.765 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:09:59 -0400 (0:00:01.929) 0:20:09.695 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989116.7304065, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b44e0c13378e4547cef1a46e344db727b891e173", "ctime": 1749989109.164421, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070468, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749989109.1634212, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2163277715", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:10:01 -0400 (0:00:01.637) 0:20:11.333 *********** changed: [managed-node14] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:10:02 -0400 (0:00:01.610) 0:20:12.943 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:477 Sunday 15 June 2025 08:10:05 -0400 (0:00:02.118) 0:20:15.062 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:10:05 -0400 (0:00:00.531) 0:20:15.593 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:10:05 -0400 (0:00:00.305) 0:20:15.899 *********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:10:06 -0400 (0:00:00.216) 0:20:16.115 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "62c35678-1dc9-47a5-b558-ab3239e80e55" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:10:07 -0400 (0:00:01.491) 0:20:17.607 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002273", "end": "2025-06-15 08:10:08.732559", "rc": 0, "start": "2025-06-15 08:10:08.730286" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:10:09 -0400 (0:00:01.424) 0:20:19.032 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002280", "end": "2025-06-15 08:10:10.400188", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:10:10.397908" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:10:10 -0400 (0:00:01.710) 0:20:20.743 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:10:11 -0400 (0:00:00.567) 0:20:21.311 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:10:11 -0400 (0:00:00.194) 0:20:21.506 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023155", "end": "2025-06-15 08:10:12.526328", "rc": 0, "start": "2025-06-15 08:10:12.503173" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:10:12 -0400 (0:00:01.198) 0:20:22.705 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:10:12 -0400 (0:00:00.244) 0:20:22.949 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:10:13 -0400 (0:00:00.564) 0:20:23.514 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:10:14 -0400 (0:00:00.454) 0:20:23.968 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:10:15 -0400 (0:00:01.572) 0:20:25.541 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:10:15 -0400 (0:00:00.258) 0:20:25.800 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:10:16 -0400 (0:00:00.345) 0:20:26.146 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:10:16 -0400 (0:00:00.421) 0:20:26.567 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:10:16 -0400 (0:00:00.137) 0:20:26.705 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:10:16 -0400 (0:00:00.124) 0:20:26.829 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:10:17 -0400 (0:00:00.167) 0:20:26.996 *********** ok: [managed-node14] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 08:10:17 -0400 (0:00:00.209) 0:20:27.206 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:10:18 -0400 (0:00:01.597) 0:20:28.803 *********** skipping: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:10:19 -0400 (0:00:00.265) 0:20:29.069 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:10:19 -0400 (0:00:00.458) 0:20:29.528 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:10:19 -0400 (0:00:00.400) 0:20:29.928 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:10:20 -0400 (0:00:00.264) 0:20:30.192 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:10:20 -0400 (0:00:00.256) 0:20:30.449 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:10:20 -0400 (0:00:00.134) 0:20:30.584 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:10:20 -0400 (0:00:00.219) 0:20:30.804 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:10:21 -0400 (0:00:00.259) 0:20:31.064 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:10:21 -0400 (0:00:00.229) 0:20:31.293 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:10:21 -0400 (0:00:00.333) 0:20:31.626 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:10:21 -0400 (0:00:00.246) 0:20:31.873 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:10:22 -0400 (0:00:00.296) 0:20:32.170 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:10:22 -0400 (0:00:00.293) 0:20:32.464 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:10:23 -0400 (0:00:00.539) 0:20:33.004 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node14 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 15 June 2025 08:10:23 -0400 (0:00:00.590) 0:20:33.594 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 15 June 2025 08:10:23 -0400 (0:00:00.169) 0:20:33.763 *********** skipping: [managed-node14] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 15 June 2025 08:10:24 -0400 (0:00:00.303) 0:20:34.067 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 15 June 2025 08:10:24 -0400 (0:00:00.280) 0:20:34.348 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 15 June 2025 08:10:24 -0400 (0:00:00.230) 0:20:34.578 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 15 June 2025 08:10:24 -0400 (0:00:00.216) 0:20:34.795 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 15 June 2025 08:10:24 -0400 (0:00:00.160) 0:20:34.955 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:10:25 -0400 (0:00:00.230) 0:20:35.185 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:10:25 -0400 (0:00:00.474) 0:20:35.659 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node14 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 15 June 2025 08:10:26 -0400 (0:00:00.423) 0:20:36.083 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 15 June 2025 08:10:26 -0400 (0:00:00.258) 0:20:36.341 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in 
thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 15 June 2025 08:10:26 -0400 (0:00:00.244) 0:20:36.585 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 15 June 2025 08:10:26 -0400 (0:00:00.235) 0:20:36.821 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:10:27 -0400 (0:00:00.171) 0:20:36.992 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:10:27 -0400 (0:00:00.450) 0:20:37.443 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:10:27 -0400 (0:00:00.364) 0:20:37.807 *********** skipping: [managed-node14] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:10:28 -0400 (0:00:00.283) 0:20:38.091 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node14 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 15 June 2025 08:10:28 -0400 (0:00:00.784) 0:20:38.876 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.217) 0:20:39.093 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.161) 0:20:39.255 *********** skipping: 
[managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.169) 0:20:39.425 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.138) 0:20:39.564 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.211) 0:20:39.775 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:10:29 -0400 (0:00:00.081) 0:20:39.856 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:10:30 -0400 (0:00:00.184) 0:20:40.041 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:10:30 -0400 (0:00:00.334) 0:20:40.375 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node14 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 15 June 2025 08:10:30 -0400 (0:00:00.201) 0:20:40.577 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 15 June 2025 08:10:30 -0400 (0:00:00.227) 0:20:40.804 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.189) 0:20:40.993 *********** skipping: [managed-node14] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.205) 0:20:41.199 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.212) 0:20:41.411 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.186) 0:20:41.598 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.179) 0:20:41.778 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:10:31 -0400 (0:00:00.170) 0:20:41.948 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:10:32 -0400 (0:00:00.474) 0:20:42.423 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:10:32 -0400 (0:00:00.234) 0:20:42.657 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.339) 0:20:42.996 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.139) 0:20:43.135 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.205) 0:20:43.341 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.159) 0:20:43.500 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.141) 0:20:43.642 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 08:10:33 -0400 (0:00:00.144) 0:20:43.786 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 08:10:34 -0400 (0:00:00.201) 0:20:43.988 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 08:10:34 -0400 (0:00:00.385) 0:20:44.373 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 08:10:34 -0400 (0:00:00.218) 0:20:44.592 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:10:35 -0400 (0:00:01.118) 0:20:45.711 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:10:35 -0400 (0:00:00.249) 0:20:45.960 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:10:36 -0400 (0:00:00.276) 0:20:46.236 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:10:36 -0400 (0:00:00.411) 0:20:46.648 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:10:36 -0400 (0:00:00.182) 0:20:46.831 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:10:37 -0400 (0:00:00.297) 0:20:47.128 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:10:37 -0400 (0:00:00.319) 0:20:47.447 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:10:37 -0400 (0:00:00.218) 0:20:47.666 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 
June 2025 08:10:37 -0400 (0:00:00.145) 0:20:47.811 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:10:38 -0400 (0:00:00.214) 0:20:48.025 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:10:38 -0400 (0:00:00.260) 0:20:48.286 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:10:38 -0400 (0:00:00.216) 0:20:48.503 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:10:39 -0400 (0:00:00.565) 0:20:49.068 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:10:39 -0400 (0:00:00.261) 0:20:49.329 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:10:39 -0400 (0:00:00.318) 0:20:49.647 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:10:40 -0400 (0:00:00.822) 0:20:50.469 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:10:40 -0400 (0:00:00.230) 0:20:50.700 *********** ok: [managed-node14] => { "ansible_facts": { 
"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:10:40 -0400 (0:00:00.179) 0:20:50.880 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:10:41 -0400 (0:00:00.317) 0:20:51.197 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:10:41 -0400 (0:00:00.246) 0:20:51.444 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989384.6158874, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989384.6158874, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 251684, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989384.6158874, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:10:43 -0400 (0:00:01.725) 0:20:53.170 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:10:43 -0400 (0:00:00.228) 0:20:53.398 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:10:43 -0400 (0:00:00.261) 0:20:53.660 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:10:43 -0400 (0:00:00.299) 0:20:53.960 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK 
[Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:10:44 -0400 (0:00:00.300) 0:20:54.260 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:10:44 -0400 (0:00:00.279) 0:20:54.540 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:10:44 -0400 (0:00:00.335) 0:20:54.875 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:10:45 -0400 (0:00:00.231) 0:20:55.106 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:10:49 -0400 (0:00:04.437) 0:20:59.544 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:10:49 -0400 (0:00:00.235) 0:20:59.780 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:10:50 -0400 (0:00:00.231) 0:21:00.011 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:10:50 -0400 (0:00:00.308) 0:21:00.320 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:10:50 -0400 (0:00:00.336) 0:21:00.656 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:10:51 -0400 (0:00:00.320) 0:21:00.976 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:10:51 -0400 (0:00:00.253) 0:21:01.230 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:10:51 -0400 (0:00:00.225) 0:21:01.456 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:10:51 -0400 (0:00:00.218) 0:21:01.674 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:10:52 -0400 (0:00:00.292) 0:21:01.966 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:10:52 -0400 (0:00:00.345) 0:21:02.312 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:10:52 -0400 (0:00:00.269) 0:21:02.582 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:10:52 -0400 (0:00:00.348) 0:21:02.930 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:10:53 -0400 (0:00:00.359) 0:21:03.290 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about 
RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:10:53 -0400 (0:00:00.206) 0:21:03.497 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:10:53 -0400 (0:00:00.159) 0:21:03.657 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:10:54 -0400 (0:00:00.433) 0:21:04.090 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:10:54 -0400 (0:00:00.176) 0:21:04.267 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:10:54 -0400 (0:00:00.380) 0:21:04.648 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:10:54 -0400 (0:00:00.287) 0:21:04.935 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:10:55 -0400 (0:00:00.269) 0:21:05.204 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:10:55 -0400 (0:00:00.302) 0:21:05.507 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:10:55 -0400 (0:00:00.285) 0:21:05.792 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 
08:10:56 -0400 (0:00:00.462) 0:21:06.255 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:10:56 -0400 (0:00:00.226) 0:21:06.482 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:10:58 -0400 (0:00:01.605) 0:21:08.088 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:10:59 -0400 (0:00:01.113) 0:21:09.201 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:10:59 -0400 (0:00:00.198) 0:21:09.399 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:10:59 -0400 (0:00:00.255) 0:21:09.655 *********** ok: [managed-node14] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:11:01 -0400 (0:00:01.343) 0:21:10.998 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:11:01 -0400 (0:00:00.253) 0:21:11.252 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:11:01 -0400 (0:00:00.248) 0:21:11.501 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:11:01 -0400 (0:00:00.227) 0:21:11.728 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:11:02 -0400 (0:00:00.298) 0:21:12.027 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:11:02 -0400 (0:00:00.206) 0:21:12.233 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:11:02 -0400 (0:00:00.217) 0:21:12.451 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:11:02 -0400 (0:00:00.237) 0:21:12.688 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:11:02 -0400 (0:00:00.158) 0:21:12.846 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:11:03 -0400 (0:00:00.268) 0:21:13.114 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:11:03 -0400 (0:00:00.269) 0:21:13.383 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:11:03 -0400 (0:00:00.264) 0:21:13.648 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:11:03 -0400 (0:00:00.235) 0:21:13.883 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:11:04 -0400 (0:00:00.242) 0:21:14.126 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task 
path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:11:04 -0400 (0:00:00.223) 0:21:14.349 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:11:04 -0400 (0:00:00.179) 0:21:14.529 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:11:04 -0400 (0:00:00.291) 0:21:14.820 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:11:04 -0400 (0:00:00.109) 0:21:14.930 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 08:11:05 -0400 (0:00:00.315) 0:21:15.245 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:11:05 -0400 (0:00:00.298) 0:21:15.543 *********** ok: [managed-node14] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:11:05 -0400 (0:00:00.383) 0:21:15.927 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:11:06 -0400 (0:00:00.288) 0:21:16.215 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:11:06 -0400 (0:00:00.361) 0:21:16.577 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023432", "end": "2025-06-15 08:11:07.965855", "rc": 0, "start": "2025-06-15 08:11:07.942423" } STDOUT: 
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:11:08 -0400 (0:00:01.608) 0:21:18.186 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:11:08 -0400 (0:00:00.306) 0:21:18.492 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:11:08 -0400 (0:00:00.395) 0:21:18.887 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:11:09 -0400 (0:00:00.319) 0:21:19.207 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:11:09 -0400 (0:00:00.282) 0:21:19.489 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:11:09 -0400 (0:00:00.315) 0:21:19.804 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:11:10 -0400 (0:00:00.237) 0:21:20.042 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:11:10 -0400 (0:00:00.307) 0:21:20.349 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:11:10 -0400 (0:00:00.173) 0:21:20.523 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 15 June 2025 08:11:10 -0400 (0:00:00.167) 0:21:20.690 *********** changed: [managed-node14] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:483 Sunday 15 June 2025 08:11:12 -0400 (0:00:01.792) 0:21:22.482 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node14 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 15 June 2025 08:11:13 -0400 (0:00:00.596) 0:21:23.079 *********** ok: [managed-node14] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 15 June 2025 08:11:13 -0400 (0:00:00.320) 0:21:23.399 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:11:13 -0400 (0:00:00.296) 0:21:23.696 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:11:14 -0400 (0:00:00.333) 0:21:24.029 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:11:14 -0400 (0:00:00.256) 0:21:24.285 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:11:14 -0400 (0:00:00.583) 0:21:24.869 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:11:15 -0400 (0:00:00.236) 0:21:25.106 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:11:15 -0400 (0:00:00.413) 0:21:25.519 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:11:15 -0400 (0:00:00.262) 0:21:25.782 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:11:16 -0400 (0:00:00.247) 0:21:26.030 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:11:16 -0400 (0:00:00.543) 0:21:26.573 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:11:21 -0400 (0:00:05.043) 0:21:31.617 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:11:21 -0400 (0:00:00.304) 0:21:31.922 *********** ok: [managed-node14] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:11:22 -0400 (0:00:00.267) 0:21:32.189 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:11:27 -0400 (0:00:05.438) 0:21:37.628 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:11:28 -0400 (0:00:00.380) 0:21:38.009 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:11:28 -0400 (0:00:00.182) 0:21:38.191 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:11:28 -0400 (0:00:00.266) 0:21:38.457 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:11:28 -0400 (0:00:00.167) 0:21:38.625 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:11:32 -0400 (0:00:03.938) 0:21:42.564 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { 
"name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": 
"dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service": { "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { 
"name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:11:35 -0400 (0:00:02.985) 0:21:45.550 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:11:35 -0400 (0:00:00.320) 0:21:45.871 *********** changed: [managed-node14] => (item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-6cef51e7-40d1-4541-a5c4-6934aa486a81", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-6cef51e7-40d1-4541-a5c4-6934aa486a81 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2025-06-15 08:06:54 EDT", "StateChangeTimestampMonotonic": "2517957418", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": 
"0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", 
"Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:11:39 -0400 (0:00:03.604) 0:21:49.475 *********** fatal: [managed-node14]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 15 June 2025 08:11:44 -0400 (0:00:05.363) 0:21:54.839 *********** fatal: [managed-node14]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:11:45 -0400 (0:00:00.298) 0:21:55.137 *********** changed: [managed-node14] => 
(item=systemd-cryptsetup@luks\x2d6cef51e7\x2d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d6cef51e7\\x2d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node14] => (item=systemd-cryptsetup@luk...d40d1\x2d4541\x2da5c4\x2d6934aa486a81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "name": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d40d1\\x2d4541\\x2da5c4\\x2d6934aa486a81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 15 June 2025 08:11:48 -0400 (0:00:03.570) 0:21:58.707 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 15 June 2025 08:11:48 -0400 (0:00:00.219) 0:21:58.927 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Sunday 15 June 2025 08:11:49 -0400 (0:00:00.275) 0:21:59.202 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 15 June 2025 08:11:49 -0400 (0:00:00.194) 0:21:59.397 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989472.1967206, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749989472.1967206, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1749989472.1967206, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "196242495", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 15 June 2025 08:11:50 -0400 (0:00:01.350) 0:22:00.747 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:507 Sunday 15 June 2025 08:11:51 -0400 (0:00:00.221) 0:22:00.969 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:11:51 -0400 (0:00:00.595) 0:22:01.564 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:11:51 -0400 (0:00:00.348) 0:22:01.912 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:11:52 -0400 (0:00:00.250) 0:22:02.163 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", 
"stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:11:52 -0400 (0:00:00.591) 0:22:02.754 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:11:53 -0400 (0:00:00.262) 0:22:03.017 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:11:53 -0400 (0:00:00.329) 0:22:03.346 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:11:53 -0400 (0:00:00.188) 0:22:03.535 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:11:53 -0400 (0:00:00.278) 0:22:03.813 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:11:54 -0400 (0:00:00.588) 0:22:04.406 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:11:59 -0400 (0:00:04.883) 
0:22:09.290 *********** ok: [managed-node14] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:11:59 -0400 (0:00:00.231) 0:22:09.521 *********** ok: [managed-node14] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:11:59 -0400 (0:00:00.319) 0:22:09.841 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:12:05 -0400 (0:00:05.469) 0:22:15.310 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:12:06 -0400 (0:00:00.983) 0:22:16.294 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:12:06 -0400 (0:00:00.241) 0:22:16.535 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:12:06 -0400 (0:00:00.271) 0:22:16.807 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:12:07 -0400 (0:00:00.174) 0:22:16.981 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:12:11 -0400 (0:00:04.520) 0:22:21.502 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": 
{ "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:12:14 -0400 (0:00:02.862) 0:22:24.364 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:12:14 -0400 (0:00:00.458) 0:22:24.823 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:12:15 -0400 (0:00:00.173) 0:22:24.996 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": 
null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:12:28 -0400 (0:00:13.573) 0:22:38.570 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:12:28 -0400 (0:00:00.293) 0:22:38.864 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989397.2178633, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1749989397.2148633, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749989397.2148633, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:12:30 -0400 (0:00:01.518) 0:22:40.383 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:12:31 -0400 (0:00:01.522) 0:22:41.905 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:12:32 -0400 (0:00:00.112) 0:22:42.018 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", 
"/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:12:32 -0400 (0:00:00.337) 0:22:42.355 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:12:32 -0400 (0:00:00.184) 0:22:42.539 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:12:32 -0400 (0:00:00.271) 0:22:42.811 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:12:34 -0400 (0:00:01.309) 0:22:44.120 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:12:36 -0400 (0:00:01.964) 0:22:46.085 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:12:37 -0400 (0:00:01.253) 0:22:47.339 *********** skipping: [managed-node14] => (item={'src': 
'/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:12:37 -0400 (0:00:00.256) 0:22:47.596 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:12:39 -0400 (0:00:01.579) 0:22:49.175 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989410.3988383, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1749989402.709853, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 163578076, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1749989402.7088528, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1006700211", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:12:40 -0400 (0:00:01.177) 0:22:50.353 *********** changed: [managed-node14] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6224d09e-0240-4706-9b78-c965f75b5dc3', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:12:42 -0400 (0:00:01.729) 0:22:52.083 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:524 Sunday 15 June 2025 08:12:44 -0400 (0:00:01.918) 0:22:54.002 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14 TASK [Print out pool information] ********************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 15 June 2025 08:12:44 -0400 (0:00:00.462) 0:22:54.465 *********** ok: [managed-node14] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 15 June 2025 08:12:44 -0400 (0:00:00.334) 0:22:54.799 *********** skipping: [managed-node14] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 15 June 2025 08:12:45 -0400 (0:00:00.913) 0:22:55.713 *********** ok: [managed-node14] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6224d09e-0240-4706-9b78-c965f75b5dc3" }, "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "size": "4G", "type": "crypt", "uuid": "b6a563dd-7860-41f1-a91d-d91af2303c43" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }
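The block device info above captures the stack the role assembled: /dev/sda is an LVM physical volume (LVM2_member) in pool foo; the logical volume /dev/mapper/foo-test1 holds a LUKS container (fstype crypto_LUKS); and the opened mapping /dev/mapper/luks-6224d09e-... carries the xfs filesystem mounted at /opt/test1. Note the two distinct UUIDs: 6224d09e-... identifies the LUKS container, while b6a563dd-... is the filesystem inside it. A minimal sketch of eyeballing the same layering by hand, assuming only stock lsblk on the node (the test gathers this with its own helper module instead):

    - name: Show the device stack under the test disk (illustrative only)
      ansible.builtin.command: lsblk -o NAME,TYPE,FSTYPE,SIZE,MOUNTPOINT /dev/sda
      register: stack
      changed_when: false  # read-only query

    - name: Print the disk -> lvm -> crypt -> xfs layering
      ansible.builtin.debug:
        var: stack.stdout_lines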
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 15 June 2025 08:12:47 -0400 (0:00:01.603) 0:22:57.317 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002247", "end": "2025-06-15 08:12:48.479786", "rc": 0, "start": "2025-06-15 08:12:48.477539" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 15 June 2025 08:12:48 -0400 (0:00:01.428) 0:22:58.745 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002302", "end": "2025-06-15 08:12:50.006545", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:12:50.004243" } STDOUT: luks-6224d09e-0240-4706-9b78-c965f75b5dc3 /dev/mapper/foo-test1 -
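The crypttab line read back above has the standard fields: the mapper name (luks-<LUKS UUID>), the backing device (the LV holding the container), and "-" in the password field, meaning no key file is recorded, so unlocking relies on a passphrase prompt (or a keyslot unlocked some other way) at activation time. Note also that the fstab entry above mounts the opened mapping /dev/mapper/luks-... rather than a filesystem UUID: the xfs UUID only becomes visible once the container is open. A minimal sketch of maintaining such an entry outside the role, assuming the community.general collection is installed:

    - name: Ensure the crypttab entry for the opened LUKS mapping
      community.general.crypttab:
        name: luks-6224d09e-0240-4706-9b78-c965f75b5dc3  # mapper name
        backing_device: /dev/mapper/foo-test1            # LV holding the LUKS container
        password: "-"                                    # '-' = no key file on record
        state: present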
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 15 June 2025 08:12:50 -0400 (0:00:01.425) 0:23:00.171 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node14
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 15 June 2025 08:12:50 -0400 (0:00:00.336) 0:23:00.508 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 15 June 2025 08:12:50 -0400 (0:00:00.209) 0:23:00.717 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023537", "end": "2025-06-15 08:12:52.102397", "rc": 0, "start": "2025-06-15 08:12:52.078860" } STDOUT: 0
TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 15 June 2025 08:12:52 -0400 (0:00:01.615) 0:23:02.333 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 15 June 2025 08:12:52 -0400 (0:00:00.342) 0:23:02.676 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node14
TASK [Set test variables] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 15 June 2025 08:12:53 -0400 (0:00:00.485) 0:23:03.162 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 15 June 2025 08:12:53 -0400 (0:00:00.569) 0:23:03.731 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }
TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 15 June 2025 08:12:55 -0400 (0:00:01.312) 0:23:05.044 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }
TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 15 June 2025 08:12:55 -0400 (0:00:00.311) 0:23:05.252 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }
TASK [Verify PV count] *********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 15 June 2025 08:12:55 -0400 (0:00:00.251) 0:23:05.563 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
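The member checks above pin down that VG foo is backed by exactly the devices the test requested: one PV, /dev/sda, of type disk. A rough standalone equivalent, assuming the node's lvm2 supports --select (the role itself compares values gathered by its own info module rather than shelling out like this):

    - name: List the PVs that back VG 'foo'
      ansible.builtin.command: pvs --noheadings --select vg_name=foo -o pv_name
      register: foo_pvs
      changed_when: false

    - name: Expect exactly one member device, /dev/sda
      ansible.builtin.assert:
        that:
          - foo_pvs.stdout_lines | map('trim') | list == ['/dev/sda']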
TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 15 June 2025 08:12:55 -0400 (0:00:00.234) 0:23:05.814 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 15 June 2025 08:12:56 -0400 (0:00:00.250) 0:23:06.049 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 15 June 2025 08:12:56 -0400 (0:00:00.250) 0:23:06.299 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Sunday 15 June 2025 08:12:56 -0400 (0:00:00.116) 0:23:06.416 *********** ok: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed
TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Sunday 15 June 2025 08:12:56 -0400 (0:00:00.256) 0:23:06.672 *********** ok: [managed-node14] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.14.207 closed. MSG: non-zero return code
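The grow-to-fill task above is a capability probe, not a test failure: it runs a check on the managed node, gets rc 1 (the installed blivet lacks the feature), and the task still reports ok because the test overrides failure handling; the dependent verification that follows is then skipped. The "Shared connection ... closed" STDERR is ordinary SSH teardown noise. A minimal sketch of the pattern, with a hypothetical probe script name (the real test ships its own helper):

    - name: Probe whether the installed blivet supports PV grow-to-fill
      ansible.builtin.script: files/check_blivet_feature.py pv_grow_to_fill  # hypothetical helper
      register: grow_probe
      failed_when: false      # non-zero rc is data, not an error
      changed_when: false

    - name: Later tasks can branch on the probe result
      ansible.builtin.debug:
        msg: "grow-to-fill supported: {{ grow_probe.rc == 0 }}"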
TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:74 Sunday 15 June 2025 08:12:58 -0400 (0:00:01.613) 0:23:08.286 *********** skipping: [managed-node14] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:84 Sunday 15 June 2025 08:12:58 -0400 (0:00:00.233) 0:23:08.519 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node14
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 15 June 2025 08:12:59 -0400 (0:00:00.628) 0:23:09.148 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 15 June 2025 08:12:59 -0400 (0:00:00.390) 0:23:09.539 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 15 June 2025 08:12:59 -0400 (0:00:00.339) 0:23:09.878 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 15 June 2025 08:13:00 -0400 (0:00:00.212) 0:23:10.091 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 15 June 2025 08:13:00 -0400 (0:00:00.257) 0:23:10.348 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 15 June 2025 08:13:00 -0400 (0:00:00.256) 0:23:10.604 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 15 June 2025 08:13:00 -0400 (0:00:00.243) 0:23:10.848 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 15 June 2025 08:13:01 -0400 (0:00:00.298) 0:23:11.146 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 15 June 2025 08:13:01 -0400 (0:00:00.380) 0:23:11.527 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 15 June 2025 08:13:01 -0400 (0:00:00.291) 0:23:11.818 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 15 June 2025 08:13:02 -0400 (0:00:00.291) 0:23:12.110 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:87 Sunday 15 June 2025 08:13:02 -0400 (0:00:00.178) 0:23:12.289 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node14
TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 15 June 2025 08:13:02 -0400 (0:00:00.400) 0:23:12.689 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node14
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 15 June 2025 08:13:03 -0400 (0:00:00.986) 0:23:13.676 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set LV segment type] *****************************************************
task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 15 June 2025 08:13:04 -0400 (0:00:00.341) 0:23:14.018 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 15 June 2025 08:13:04 -0400 (0:00:00.306) 0:23:14.324 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 15 June 2025 08:13:04 -0400 (0:00:00.304) 0:23:14.629 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 15 June 2025 08:13:04 -0400 (0:00:00.323) 0:23:14.952 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 15 June 2025 08:13:05 -0400 (0:00:00.284) 0:23:15.237 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 15 June 2025 08:13:05 -0400 (0:00:00.290) 0:23:15.527 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:90 Sunday 15 June 2025 08:13:05 -0400 (0:00:00.290) 0:23:15.818 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node14 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 15 June 2025 08:13:06 -0400 (0:00:00.673) 0:23:16.492 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node14 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 15 June 2025 08:13:07 -0400 (0:00:00.618) 0:23:17.110 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 15 June 2025 08:13:07 -0400 (0:00:00.308) 0:23:17.418 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 15 June 2025 08:13:07 -0400 (0:00:00.288) 0:23:17.708 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 15 June 2025 08:13:07 -0400 (0:00:00.258) 0:23:17.966 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:93 Sunday 15 June 2025 08:13:08 -0400 (0:00:00.351) 0:23:18.318 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node14 TASK [Set test variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 15 June 2025 08:13:08 -0400 (0:00:00.510) 0:23:18.828 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 15 June 2025 08:13:09 -0400 (0:00:00.256) 0:23:19.085 *********** skipping: [managed-node14] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 15 June 2025 08:13:09 -0400 (0:00:00.320) 0:23:19.406 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node14 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 15 June 2025 08:13:09 -0400 (0:00:00.555) 0:23:19.961 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 15 June 2025 08:13:10 -0400 (0:00:00.409) 0:23:20.370 *********** ok: [managed-node14] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 15 June 2025 08:13:10 -0400 (0:00:00.225) 0:23:20.595 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 15 June 2025 08:13:10 -0400 (0:00:00.228) 0:23:20.824 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 15 June 2025 08:13:11 -0400 (0:00:00.303) 0:23:21.127 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 15 June 2025 08:13:11 -0400 (0:00:00.336) 0:23:21.463 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 15 June 2025 08:13:11 -0400 (0:00:00.243) 0:23:21.707 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:96 Sunday 15 June 2025 08:13:11 -0400 (0:00:00.252) 0:23:21.959 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node14 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 15 June 2025 08:13:12 -0400 (0:00:00.527) 0:23:22.487 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node14 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 15 June 2025 08:13:13 -0400 (0:00:00.502) 0:23:22.989 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 15 June 2025 08:13:13 -0400 (0:00:00.197) 0:23:23.186 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 15 June 2025 08:13:13 -0400 (0:00:00.233) 0:23:23.420 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 15 June 2025 08:13:13 -0400 (0:00:00.290) 0:23:23.711 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 15 June 2025 08:13:14 -0400 (0:00:00.290) 0:23:24.001 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 15 June 2025 08:13:14 -0400 (0:00:00.317) 0:23:24.319 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 15 June 2025 08:13:14 -0400 (0:00:00.196) 0:23:24.516 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:99 Sunday 15 June 2025 08:13:14 -0400 (0:00:00.272) 0:23:24.789 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node14 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 15 June 2025 08:13:15 -0400 (0:00:00.584) 0:23:25.373 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 15 June 2025 08:13:15 -0400 (0:00:00.312) 0:23:25.686 *********** skipping: [managed-node14] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 15 June 2025 08:13:16 -0400 (0:00:01.031) 0:23:26.718 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 15 June 2025 08:13:16 
-0400 (0:00:00.203) 0:23:26.921 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 15 June 2025 08:13:17 -0400 (0:00:00.178) 0:23:27.100 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 15 June 2025 08:13:17 -0400 (0:00:00.171) 0:23:27.271 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 15 June 2025 08:13:17 -0400 (0:00:00.220) 0:23:27.492 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }
TASK [Clean up test variables] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:102 Sunday 15 June 2025 08:13:17 -0400 (0:00:00.171) 0:23:27.663 *********** ok: [managed-node14] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 15 June 2025 08:13:17 -0400 (0:00:00.092) 0:23:27.756 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 15 June 2025 08:13:18 -0400 (0:00:00.211) 0:23:27.968 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
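The fact set just above drives the rest of the volume verification: each entry in _storage_volume_tests names a subset file, and the harness includes test-verify-volume-<subset>.yml once per entry. That is why the next banner still shows the unrendered {{ storage_test_volume_subset }}: include banners are printed before the loop variable is templated. A minimal sketch of that dispatch, inferred from the includes that follow:

    - name: Run test verify for {{ storage_test_volume_subset }}
      ansible.builtin.include_tasks: test-verify-volume-{{ storage_test_volume_subset }}.yml
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset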
TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 15 June 2025 08:13:18 -0400 (0:00:00.300) 0:23:28.268 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14 included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 15 June 2025 08:13:19 -0400 (0:00:01.051) 0:23:29.320 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 15 June 2025 08:13:19 -0400 (0:00:00.201) 0:23:29.521 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 15 June 2025 08:13:19 -0400 (0:00:00.265) 0:23:29.787 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Sunday 15 June 2025 08:13:20 -0400 (0:00:00.234) 0:23:30.021 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Sunday 15 June 2025 08:13:20 -0400 (0:00:00.216) 0:23:30.237 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Sunday 15 June 2025 08:13:20 -0400 (0:00:00.162) 0:23:30.400 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:13:20 -0400 (0:00:00.225) 0:23:30.625 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:13:20 -0400 (0:00:00.172) 0:23:30.798 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.241) 0:23:31.039 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.246) 0:23:31.285 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.158) 0:23:31.444 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.121) 0:23:31.566 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.243) 0:23:31.809 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:13:21 -0400 (0:00:00.111) 0:23:31.921 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:13:22 -0400 (0:00:00.135) 0:23:32.057 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:13:22 -0400 (0:00:00.211) 0:23:32.268 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:13:22 -0400 (0:00:00.219) 0:23:32.487 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:13:22 -0400 (0:00:00.200) 0:23:32.688 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:13:23 -0400 (0:00:00.319) 0:23:33.008 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:13:23 -0400 (0:00:00.361) 0:23:33.369 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989548.109576, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989548.109576, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 251684, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989548.109576, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:13:24 -0400 (0:00:01.366) 0:23:34.735 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:13:24 -0400 (0:00:00.229) 0:23:34.965 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 15 June 2025 08:13:25 -0400 (0:00:00.315) 0:23:35.281 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 15 June 2025 08:13:25 -0400 (0:00:00.241) 0:23:35.522 *********** ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 15 June 2025 08:13:25 -0400 (0:00:00.311) 0:23:35.833 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 15 June 2025 08:13:26 -0400 (0:00:00.304) 0:23:36.138 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 15 June 2025 08:13:26 -0400 (0:00:00.306) 0:23:36.445 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989548.2545757, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989548.2545757, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 269376, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1749989548.2545757, "nlink": 1, "path": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 15 June 2025 08:13:28 -0400 (0:00:01.637) 0:23:38.082 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 15 June 2025 08:13:32 -0400 (0:00:04.686) 0:23:42.769 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010350", "end": "2025-06-15 08:13:34.085660", "rc": 0, "start": "2025-06-15 08:13:34.075310" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 6224d09e-0240-4706-9b78-c965f75b5dc3 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 944862 Threads: 2 Salt: 
94 bc b0 bc 37 dc 90 d8 57 2a 71 df 06 84 d1 48 c1 b8 2c 7a 4b 1e ee c7 8e 87 a0 07 5f fd 80 08 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 112993 Salt: c6 67 ef 8b f4 0a 73 02 a8 38 0b 8a b9 02 f3 0a c4 9d 8e 55 15 29 89 18 08 66 43 5b fe 24 17 62 Digest: e9 47 8b 59 f9 2c 41 b3 5c f3 f0 2b 0c 1b f4 5c bc 87 bd 31 db 2e ac 98 fb f7 e6 a1 24 c7 d9 c3
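The luksDump output above is the on-disk receipt for what the role requested: Version 2 confirms encryption_luks_version: luks2; the single keyslot holds a 512-bit key for aes-xts-plain64 (two 256-bit XTS halves); the keyslot is sealed with argon2i, a memory-hard KDF (the pbkdf2 entry under Digests only integrity-checks the header); and the 16777216-byte data offset explains why the usable filesystem is slightly smaller than the LV. A minimal sketch of checking the format version the same way, assuming the device path from above:

    - name: Dump the LUKS header of the backing LV
      ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false  # read-only inspection

    - name: Assert the container is LUKS2, as the test requested
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')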
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 15 June 2025 08:13:34 -0400 (0:00:01.596) 0:23:44.365 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 15 June 2025 08:13:34 -0400 (0:00:00.277) 0:23:44.642 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 15 June 2025 08:13:34 -0400 (0:00:00.301) 0:23:44.944 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 15 June 2025 08:13:35 -0400 (0:00:00.267) 0:23:45.211 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 15 June 2025 08:13:35 -0400 (0:00:00.248) 0:23:45.459 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Sunday 15 June 2025 08:13:35 -0400 (0:00:00.352) 0:23:45.811 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Sunday 15 June 2025 08:13:36 -0400 (0:00:00.202) 0:23:46.014 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Sunday 15 June 2025 08:13:36 -0400 (0:00:00.247) 0:23:46.262 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-6224d09e-0240-4706-9b78-c965f75b5dc3 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Sunday 15 June 2025 08:13:36 -0400 (0:00:00.231) 0:23:46.493 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Sunday 15 June 2025 08:13:36 -0400 (0:00:00.203) 0:23:46.697 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Sunday 15 June 2025 08:13:37 -0400 (0:00:00.348) 0:23:47.046 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:13:38 -0400 (0:00:01.005) 0:23:48.051 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:13:38 -0400 (0:00:00.332) 0:23:48.384 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:13:38 -0400 (0:00:00.234) 0:23:48.618 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:13:38 -0400 (0:00:00.281) 0:23:48.899 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:13:39 -0400 (0:00:00.234) 0:23:49.134 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:13:39 -0400 (0:00:00.200) 0:23:49.334 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:13:39 -0400 (0:00:00.292) 0:23:49.627 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:13:39 -0400 (0:00:00.261) 0:23:49.888 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:13:40 -0400 (0:00:00.250) 0:23:50.139 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:13:40 -0400 (0:00:00.332) 0:23:50.471 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:13:40 -0400 (0:00:00.220) 0:23:50.691 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:13:40 -0400 (0:00:00.271) 0:23:50.962 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:13:41 -0400 (0:00:00.198) 0:23:51.161 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:13:42 -0400 (0:00:01.569) 0:23:52.731 *********** ok: [managed-node14] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
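Both parses above land on the same byte count: the requested "4g" (LVM shorthand) and the actual LV size normalize to 4294967296 bytes, i.e. 4 GiB, which is what the size assertion ultimately compares. A minimal sketch of that normalization with Ansible's stock human_to_bytes filter (binary units assumed; the test relies on the same idea internally):

    - name: Sizes in different notations agree once normalized to bytes
      ansible.builtin.assert:
        that:
          - ('4G' | human_to_bytes) == 4294967296
          - ('4 GiB' | human_to_bytes) == ('4G' | human_to_bytes)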
(0:00:00.407) 0:23:54.505 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:13:44 -0400 (0:00:00.136) 0:23:54.641 *********** ok: [managed-node14] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:13:45 -0400 (0:00:01.077) 0:23:55.719 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:13:46 -0400 (0:00:00.306) 0:23:56.026 *********** skipping: [managed-node14] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:13:46 -0400 (0:00:00.297) 0:23:56.324 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:13:46 -0400 (0:00:00.228) 0:23:56.553 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:13:46 -0400 (0:00:00.299) 0:23:56.852 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:13:47 -0400 (0:00:00.313) 0:23:57.166 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:13:47 -0400 (0:00:00.173) 0:23:57.339 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:13:47 -0400 (0:00:00.241) 0:23:57.580 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:13:47 -0400 
(0:00:00.224) 0:23:57.804 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:13:47 -0400 (0:00:00.137) 0:23:57.942 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:13:48 -0400 (0:00:00.239) 0:23:58.181 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:13:48 -0400 (0:00:00.267) 0:23:58.449 *********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:13:48 -0400 (0:00:00.154) 0:23:58.603 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:13:49 -0400 (0:00:00.370) 0:23:58.973 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:13:49 -0400 (0:00:00.120) 0:23:59.094 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:13:49 -0400 (0:00:00.178) 0:23:59.273 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:13:49 -0400 (0:00:00.254) 0:23:59.527 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:13:49 -0400 (0:00:00.280) 0:23:59.808 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 
08:13:50 -0400 (0:00:00.272) 0:24:00.081 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:13:50 -0400 (0:00:00.248) 0:24:00.329 *********** ok: [managed-node14] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:13:50 -0400 (0:00:00.251) 0:24:00.580 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:13:50 -0400 (0:00:00.222) 0:24:00.803 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:13:51 -0400 (0:00:00.207) 0:24:01.010 *********** ok: [managed-node14] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.026895", "end": "2025-06-15 08:13:52.353520", "rc": 0, "start": "2025-06-15 08:13:52.326625" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:13:52 -0400 (0:00:01.561) 0:24:02.572 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:13:52 -0400 (0:00:00.188) 0:24:02.760 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:13:53 -0400 (0:00:00.208) 0:24:02.969 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:13:53 -0400 (0:00:00.238) 0:24:03.208 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }
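
The lvs query above is shaped so its output needs almost no parsing: --noheadings, --nameprefixes and --unquoted make each field come back as a shell-style LVM2_*=value token (here LVM2_SEGTYPE=linear), and the follow-up task just lifts those tokens into the storage_test_lv_segtype fact before the segment-type assertion. A trimmed-down version of the same query (sketch; the test requests more fields than shown):

- name: Get information about the LV (sketch)
  command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,segtype foo/test1
  register: lv_info
  changed_when: false   # a read-only query; never report "changed"

TASK [Set expected cache size]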
************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:13:53 -0400 (0:00:00.329) 0:24:03.537 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:13:53 -0400 (0:00:00.272) 0:24:03.810 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:13:54 -0400 (0:00:00.296) 0:24:04.107 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Sunday 15 June 2025 08:13:54 -0400 (0:00:00.350) 0:24:04.458 *********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:13:55 -0400 (0:00:00.907) 0:24:05.365 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:527 Sunday 15 June 2025 08:13:55 -0400 (0:00:00.250) 0:24:05.616 *********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 15 June 2025 08:13:56 -0400 (0:00:00.717) 0:24:06.334 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 15 June 2025 08:13:56 -0400 (0:00:00.259) 0:24:06.593 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 15 June 2025 08:13:56 -0400 (0:00:00.255) 0:24:06.849 *********** skipping: [managed-node14] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node14] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node14] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 15 June 2025 08:13:57 -0400 (0:00:00.597) 0:24:07.446 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 15 June 2025 08:13:57 -0400 (0:00:00.331) 0:24:07.778 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 15 June 2025 08:13:58 -0400 (0:00:00.313) 0:24:08.091 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 15 June 2025 08:13:58 -0400 (0:00:00.309) 0:24:08.401 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 15 June 2025 08:13:58 -0400 (0:00:00.182) 0:24:08.583 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 15 June 2025 08:13:59 -0400 (0:00:00.417) 0:24:09.001 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 15 June 2025 08:14:04 -0400 (0:00:05.158) 0:24:14.160 *********** ok: [managed-node14] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 15 June 2025 08:14:04 -0400 (0:00:00.378) 0:24:14.538 *********** ok: [managed-node14] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 15 June 2025 08:14:04 -0400 (0:00:00.258) 0:24:14.797 *********** ok: [managed-node14] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 15 June 2025 08:14:10 -0400 (0:00:05.244) 0:24:20.042 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node14 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 15 June 2025 08:14:10 -0400 (0:00:00.513) 0:24:20.555 *********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 15 June 2025 08:14:10 -0400 (0:00:00.225) 0:24:20.780 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 15 June 2025 08:14:11 -0400 (0:00:00.387) 0:24:21.168 *********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 15 June 2025 08:14:11 -0400 (0:00:00.161) 0:24:21.330 *********** ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 15 June 2025 08:14:15 -0400 (0:00:04.471) 0:24:25.802 *********** ok: [managed-node14] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": 
"dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": 
"systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": 
"systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 15 June 2025 08:14:18 -0400 (0:00:02.842) 0:24:28.645 *********** ok: [managed-node14] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 15 June 2025 08:14:19 -0400 (0:00:00.418) 0:24:29.063 *********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 15 June 2025 08:14:19 -0400 (0:00:00.179) 0:24:29.243 *********** changed: [managed-node14] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 15 June 2025 08:14:25 -0400 (0:00:05.930) 0:24:35.173 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 15 June 2025 08:14:25 -0400 (0:00:00.229) 0:24:35.402 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989557.1135588, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a4c6bd729c8c9f9bc8ce195ccc2cea1d93ff3299", "ctime": 1749989557.1105587, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904586, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1749989557.1105587, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "2912082900", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 15 June 2025 08:14:26 -0400 (0:00:01.123) 0:24:36.526 *********** ok: [managed-node14] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 15 June 2025 08:14:28 -0400 (0:00:01.965) 0:24:38.491 *********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 15 June 2025 08:14:28 -0400 (0:00:00.259) 0:24:38.750 *********** ok: [managed-node14] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 15 June 2025 08:14:29 -0400 (0:00:00.330) 0:24:39.081 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 15 June 2025 08:14:29 -0400 (0:00:00.315) 0:24:39.397 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 15 June 2025 08:14:29 -0400 (0:00:00.340) 0:24:39.737 *********** changed: [managed-node14] => (item={'src': '/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-6224d09e-0240-4706-9b78-c965f75b5dc3" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 15 June 2025 08:14:31 -0400 (0:00:01.518) 0:24:41.256 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set 
up new/current mounts] *********** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 15 June 2025 08:14:33 -0400 (0:00:01.918) 0:24:43.174 *********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 15 June 2025 08:14:33 -0400 (0:00:00.369) 0:24:43.543 *********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 15 June 2025 08:14:33 -0400 (0:00:00.215) 0:24:43.759 *********** ok: [managed-node14] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 15 June 2025 08:14:35 -0400 (0:00:02.003) 0:24:45.762 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989570.0055342, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "52113c5a462c2612cd6ff53aa144ce4fa3b0d242", "ctime": 1749989561.76755, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 318767239, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1749989561.76555, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1098798974", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 15 June 2025 08:14:37 -0400 (0:00:01.528) 0:24:47.291 *********** changed: [managed-node14] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-6224d09e-0240-4706-9b78-c965f75b5dc3', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-6224d09e-0240-4706-9b78-c965f75b5dc3", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 15 June 2025 08:14:38 -0400 (0:00:01.625) 0:24:48.916 *********** ok: [managed-node14] TASK [Verify role results] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:537 Sunday 15 June 2025 08:14:40 -0400 (0:00:01.928) 0:24:50.845 *********** included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node14
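
The crypttab update above reports found: 1 and "1 line(s) removed": with the device gone, the single entry for luks-6224d09e-0240-4706-9b78-c965f75b5dc3 is dropped from /etc/crypttab. The role iterates over the crypts list blivet returned; as a one-off edit, the same removal could be expressed like this (a sketch, not the role's actual implementation):

- name: Manage /etc/crypttab to drop the stale entry (sketch)
  lineinfile:
    path: /etc/crypttab
    # match the entry by its mapper name at the start of the line
    regexp: '^luks-6224d09e-0240-4706-9b78-c965f75b5dc3\s'
    state: absent

TASK [Print out pool information] ********************************************** task path: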
TASK [Print out pool information] **********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Sunday 15 June 2025 08:14:41 -0400 (0:00:00.620) 0:24:51.465 ***********
skipping: [managed-node14] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Sunday 15 June 2025 08:14:41 -0400 (0:00:00.223) 0:24:51.689 ***********
ok: [managed-node14] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=1Y9WCB-x61J-PNpb-KWfI-65dd-u3gi-UoLrvq", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Sunday 15 June 2025 08:14:41 -0400 (0:00:00.274) 0:24:51.963 ***********
ok: [managed-node14] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Sunday 15 June 2025 08:14:43 -0400 (0:00:01.443) 0:24:53.406
***********
ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002195", "end": "2025-06-15 08:14:44.621498", "rc": 0, "start": "2025-06-15 08:14:44.619303" }

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Sunday 15 June 2025 08:14:44 -0400 (0:00:01.439) 0:24:54.845 ***********
ok: [managed-node14] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003943", "end": "2025-06-15 08:14:46.985689", "failed_when_result": false, "rc": 0, "start": "2025-06-15 08:14:45.981746" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Sunday 15 June 2025 08:14:47 -0400 (0:00:02.402) 0:24:57.248 ***********

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Sunday 15 June 2025 08:14:47 -0400 (0:00:00.124) 0:24:57.372 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node14

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 15 June 2025 08:14:47 -0400 (0:00:00.313) 0:24:57.686 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
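The fstab and crypttab reads above feed the assertions that follow: after the removal, neither file may still reference the volume. A standalone sketch of the same check (the task names and registered variable are hypothetical; the file paths, mount point, and LUKS name come from this log):

- hosts: managed-node14
  tasks:
    - name: Read the files the role may have edited
      command: cat {{ item }}
      register: storage_test_file_reads   # hypothetical name
      changed_when: false
      loop:
        - /etc/fstab
        - /etc/crypttab

    - name: Assert the removed volume is referenced in neither file
      assert:
        that:
          - "'/opt/test1' not in storage_test_file_reads.results[0].stdout"
          - "'luks-6224d09e' not in storage_test_file_reads.results[1].stdout"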
TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 15 June 2025 08:14:47 -0400 (0:00:00.151) 0:24:57.837 ***********
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node14
included: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node14

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Sunday 15 June 2025 08:14:48 -0400 (0:00:00.969) 0:24:58.807 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Sunday 15 June 2025 08:14:49 -0400 (0:00:00.172) 0:24:58.979 ***********
ok: [managed-node14] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Sunday 15 June 2025 08:14:49 -0400 (0:00:00.223) 0:24:59.203 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Sunday 15 June 2025 08:14:49 -0400 (0:00:00.275) 0:24:59.478 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Sunday 15 June 2025 08:14:49 -0400 (0:00:00.163) 0:24:59.641 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Sunday 15 June 2025 08:14:50 -0400 (0:00:00.368) 0:25:00.010 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] ************************************** task path:
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Sunday 15 June 2025 08:14:50 -0400 (0:00:00.163) 0:25:00.173 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Sunday 15 June 2025 08:14:50 -0400 (0:00:00.121) 0:25:00.295 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Sunday 15 June 2025 08:14:50 -0400 (0:00:00.441) 0:25:00.737 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Sunday 15 June 2025 08:14:51 -0400 (0:00:00.283) 0:25:01.020 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Sunday 15 June 2025 08:14:51 -0400 (0:00:00.279) 0:25:01.300 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 15 June 2025 08:14:51 -0400 (0:00:00.314) 0:25:01.614 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 15 June 2025 08:14:51 -0400 (0:00:00.334) 0:25:01.949 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 15 June 2025 08:14:52 -0400 (0:00:00.246) 0:25:02.195 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 15 June 2025 08:14:52 -0400 
(0:00:00.284) 0:25:02.480 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 15 June 2025 08:14:52 -0400 (0:00:00.180) 0:25:02.660 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 15 June 2025 08:14:52 -0400 (0:00:00.276) 0:25:02.937 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 15 June 2025 08:14:53 -0400 (0:00:00.233) 0:25:03.171 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 15 June 2025 08:14:53 -0400 (0:00:00.220) 0:25:03.392 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 15 June 2025 08:14:53 -0400 (0:00:00.176) 0:25:03.568 *********** ok: [managed-node14] => { "changed": false, "stat": { "atime": 1749989664.8533537, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1749989664.8533537, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35804, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1749989664.8533537, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 15 June 2025 08:14:54 -0400 (0:00:01.243) 0:25:04.812 *********** ok: [managed-node14] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 15 June 2025 08:14:55 -0400 
(0:00:00.289) 0:25:05.101 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Sunday 15 June 2025 08:14:55 -0400 (0:00:00.275) 0:25:05.377 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Sunday 15 June 2025 08:14:55 -0400 (0:00:00.156) 0:25:05.534 ***********
ok: [managed-node14] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Sunday 15 June 2025 08:14:55 -0400 (0:00:00.256) 0:25:05.790 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Sunday 15 June 2025 08:14:56 -0400 (0:00:00.272) 0:25:06.062 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Sunday 15 June 2025 08:14:56 -0400 (0:00:00.198) 0:25:06.260 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Sunday 15 June 2025 08:14:56 -0400 (0:00:00.332) 0:25:06.593 ***********
ok: [managed-node14] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do lsrpackages: cryptsetup
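With cryptsetup confirmed present, the following steps would collect LUKS metadata if the volume were still encrypted; here they are skipped because encryption was just removed. For an encrypted volume, the same information can be gathered directly, sketched below (the device path and host come from this log; the task is illustrative and not part of the role):

- hosts: managed-node14
  tasks:
    - name: Dump the LUKS2 header of the device under test
      command: cryptsetup luksDump /dev/sda
      register: storage_test_luks_dump   # hypothetical name
      changed_when: false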
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Sunday 15 June 2025 08:15:00 -0400 (0:00:04.263) 0:25:10.857 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Sunday 15 June 2025 08:15:01 -0400 (0:00:00.292) 0:25:11.149 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Sunday 15 June 2025 08:15:01 -0400 (0:00:00.211) 0:25:11.361 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Sunday 15 June 2025 08:15:01 -0400 (0:00:00.212) 0:25:11.573 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Sunday 15 June 2025 08:15:01 -0400 (0:00:00.221) 0:25:11.794 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Sunday 15 June 2025 08:15:02 -0400 (0:00:00.289) 0:25:12.084 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Sunday 15 June 2025 08:15:02 -0400 (0:00:00.114) 0:25:12.198 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Sunday 15 June 2025 08:15:02 -0400 (0:00:00.127) 0:25:12.326 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Sunday 15 June 2025 08:15:02 -0400 (0:00:00.237) 0:25:12.563 ***********
ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Sunday 15 June 2025 08:15:02 -0400 (0:00:00.266) 0:25:12.830 ***********
ok: [managed-node14] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Sunday 15 June 2025 08:15:03 -0400 (0:00:00.193) 0:25:13.023 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Sunday 15 June 2025 08:15:03 -0400 (0:00:00.256) 0:25:13.280 ***********
skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry]
**************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Sunday 15 June 2025 08:15:03 -0400 (0:00:00.102) 0:25:13.383 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Sunday 15 June 2025 08:15:03 -0400 (0:00:00.098) 0:25:13.481 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 15 June 2025 08:15:03 -0400 (0:00:00.131) 0:25:13.613 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 15 June 2025 08:15:03 -0400 (0:00:00.053) 0:25:13.667 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 15 June 2025 08:15:03 -0400 (0:00:00.171) 0:25:13.839 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 15 June 2025 08:15:04 -0400 (0:00:00.128) 0:25:13.967 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 15 June 2025 08:15:04 -0400 (0:00:00.214) 0:25:14.182 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 15 June 2025 08:15:04 -0400 (0:00:00.149) 0:25:14.331 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 15 June 2025 08:15:04 -0400 (0:00:00.232) 0:25:14.564 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 15 June 2025 08:15:04 -0400 (0:00:00.273) 0:25:14.837 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 15 June 2025 08:15:05 -0400 (0:00:00.180) 0:25:15.018 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 15 June 2025 08:15:05 -0400 (0:00:00.276) 0:25:15.294 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 15 June 2025 08:15:05 -0400 (0:00:00.262) 0:25:15.557 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 15 June 2025 08:15:05 -0400 (0:00:00.193) 0:25:15.750 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 15 June 2025 08:15:05 -0400 (0:00:00.176) 0:25:15.927 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 15 June 2025 08:15:06 -0400 (0:00:00.232) 0:25:16.159 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 15 June 2025 08:15:06 -0400 (0:00:00.327) 0:25:16.486 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 15 June 2025 08:15:06 -0400 (0:00:00.206) 0:25:16.693 *********** skipping: [managed-node14] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 15 June 2025 08:15:06 -0400 (0:00:00.251) 0:25:16.944 *********** skipping: [managed-node14] => {} TASK [Show test pool size] 
***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 15 June 2025 08:15:07 -0400 (0:00:00.272) 0:25:17.217 *********** skipping: [managed-node14] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 15 June 2025 08:15:07 -0400 (0:00:00.159) 0:25:17.377 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Sunday 15 June 2025 08:15:07 -0400 (0:00:00.269) 0:25:17.646 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Sunday 15 June 2025 08:15:07 -0400 (0:00:00.252) 0:25:17.898 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Sunday 15 June 2025 08:15:08 -0400 (0:00:00.176) 0:25:18.075 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Sunday 15 June 2025 08:15:08 -0400 (0:00:00.151) 0:25:18.227 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Sunday 15 June 2025 08:15:08 -0400 (0:00:00.160) 0:25:18.388 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Sunday 15 June 2025 08:15:08 -0400 (0:00:00.193) 0:25:18.581 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Sunday 15 June 2025 08:15:08 -0400 (0:00:00.312) 0:25:18.894 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Sunday 15 June 2025 08:15:09 -0400 (0:00:00.295) 0:25:19.189 
*********** skipping: [managed-node14] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Sunday 15 June 2025 08:15:09 -0400 (0:00:00.255) 0:25:19.445 *********** skipping: [managed-node14] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Sunday 15 June 2025 08:15:09 -0400 (0:00:00.213) 0:25:19.658 *********** skipping: [managed-node14] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Sunday 15 June 2025 08:15:09 -0400 (0:00:00.219) 0:25:19.878 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Sunday 15 June 2025 08:15:10 -0400 (0:00:00.165) 0:25:20.044 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Sunday 15 June 2025 08:15:10 -0400 (0:00:00.101) 0:25:20.145 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Sunday 15 June 2025 08:15:10 -0400 (0:00:00.217) 0:25:20.363 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Sunday 15 June 2025 08:15:10 -0400 (0:00:00.257) 0:25:20.620 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Sunday 15 June 2025 08:15:11 -0400 (0:00:00.348) 0:25:20.969 *********** ok: [managed-node14] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Sunday 15 June 2025 08:15:11 -0400 (0:00:00.227) 0:25:21.196 *********** ok: [managed-node14] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Sunday 15 June 2025 08:15:11 
-0400 (0:00:00.247) 0:25:21.443 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 15 June 2025 08:15:11 -0400 (0:00:00.147) 0:25:21.590 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 15 June 2025 08:15:11 -0400 (0:00:00.242) 0:25:21.833 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.189) 0:25:22.022 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.128) 0:25:22.151 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.234) 0:25:22.385 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.137) 0:25:22.523 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.179) 0:25:22.702 *********** skipping: [managed-node14] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 15 June 2025 08:15:12 -0400 (0:00:00.149) 0:25:22.852 *********** ok: [managed-node14] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Sunday 15 June 2025 08:15:13 -0400 (0:00:00.145) 0:25:22.997 *********** ok: [managed-node14] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null 
}, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node14 : ok=1229 changed=60 unreachable=0 failed=9 skipped=1068 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:51:07.429843+00:00Z", "host": "managed-node14", "message": "encrypted volume 'foo' missing key/password", "start_time": "2025-06-15T11:51:02.142650+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:51:07.664155+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T11:51:07.500698+00:00Z", "task_name": "Failed message", "task_path": 
"/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:53:09.990554+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf' in safe mode due to encryption removal", "start_time": "2025-06-15T11:53:04.750582+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:53:10.253694+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-af16ab2a-ac54-451b-94bc-b559e3fb69bf' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T11:53:10.020487+00:00Z", "task_name": "Failed message", "task_path": 
"/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:55:05.979661+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2025-06-15T11:55:00.618042+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:55:06.255150+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T11:55:06.017045+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": 
"2.9.27", "end_time": "2025-06-15T11:57:03.990443+00:00Z", "host": "managed-node14", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-06-15T11:56:58.820750+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:57:04.177332+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing 
key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T11:57:04.016299+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:59:24.379013+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa' in safe mode due to encryption removal", "start_time": "2025-06-15T11:59:18.825927+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T11:59:24.703098+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-31d2f766-d0b1-458a-9b68-cf2015a578aa' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T11:59:24.419202+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:01:42.396460+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2025-06-15T12:01:36.958715+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:01:42.552588+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": 
[], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T12:01:42.489925+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:04:13.363694+00:00Z", "host": "managed-node14", "message": "encrypted volume 'test1' missing key/password", "start_time": "2025-06-15T12:04:08.093508+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:04:13.590008+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T12:04:13.417855+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:09:06.739818+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81' in safe mode due to encryption removal", "start_time": "2025-06-15T12:09:01.128548+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:09:06.959842+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-6cef51e7-40d1-4541-a5c4-6934aa486a81' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T12:09:06.755989+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:11:44.824781+00:00Z", "host": "managed-node14", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2025-06-15T12:11:39.510312+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2025-06-15T12:11:45.115906+00:00Z", "host": "managed-node14", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2025-06-15T12:11:44.874315+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Sunday 15 June 2025 08:15:13 -0400 (0:00:00.150) 0:25:23.148 *********** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.41s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.35s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.72s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.57s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match 
the specified state -- 13.54s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.42s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.97s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.94s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.93s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.78s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.67s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.59s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.54s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.53s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.47s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 5.47s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 5.46s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 5.44s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.44s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.44s /tmp/collections-Ps3/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
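Note on the "encrypted volume 'test1' missing key/password" failures above: in both failing invocations the volume sets "encryption": true while "encryption_key" and "encryption_password" are null, so blivet has nothing to format the LUKS2 layer with. A minimal sketch of an invocation that supplies the missing passphrase follows; the role name and pool/volume parameters are taken from the module_args in this log, while the play framing and the vaulted variable name are illustrative only.

- name: Create a LUKS2-encrypted volume with a passphrase (sketch)
  hosts: managed-node14
  tasks:
    - name: Run the storage role
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                fs_type: xfs
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                # encryption: true requires a passphrase or key file;
                # leaving both null reproduces the error in this log.
                encryption_password: "{{ vaulted_luks_password }}"  # illustrative variable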
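The "cannot remove existing formatting ... in safe mode" failures all occur in invocations where module_args show "safe_mode": true: the role refuses to destroy an existing filesystem or LUKS layer, which is exactly what adding or removing encryption on an already-formatted device requires. A sketch of the explicit opt-out, assuming the role's storage_safe_mode variable; the pool/volume values mirror the failing removal attempt in the log, the rest is illustrative.

- name: Toggle encryption on an existing volume (sketch)
  hosts: managed-node14
  tasks:
    - name: Re-run the storage role with safe mode disabled
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false  # permit the destructive reformat
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                type: lvm
                fs_type: xfs
                size: 4g
                mount_point: /opt/test1
                encryption: false  # removing the LUKS layer recreates the fs
                # supplied in the log's removal attempt as well; illustrative value
                encryption_password: "{{ vaulted_luks_password }}"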
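Each failure above is reported by a "Failed message" task at main-blivet.yml:111 rather than aborting the play, which is the signature of a test that provokes an error on purpose and then checks it. A hedged sketch of that pattern with block/rescue; the task names and the pools variable are hypothetical, and ansible_failed_result is the standard rescue-scope variable rather than anything specific to this role.

- name: Expect the storage role to refuse a destructive change (sketch)
  block:
    - name: Run the role with parameters that should trip safe mode
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools: "{{ pools_that_toggle_encryption }}"  # hypothetical variable
    - name: Fail if the role unexpectedly succeeded
      fail:
        msg: expected a safe-mode refusal, but the role succeeded
  rescue:
    - name: Verify the failure is the safe-mode refusal seen in this log
      assert:
        that:
          - "'cannot remove existing formatting' in ansible_failed_result.msg | d('')"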